
In-Memory Computing: The Need for Speed

Greg Palesano - Executive Vice President - Applications | March 18, 2013

Recently, much has been written about In-Memory Computing (IMC), but does it match the hype? Will IMC deliver the results that many have predicted? If the hype is real, what is the likely impact on big businesses and enterprise applications? This series of blogs will attempt to define IMC, demonstrate how it can be used, and explain in simple terms why it is critical for companies to investigate how to leverage it today.

Gartner defines IMC as “a computing style in which the primary data store for applications (the "data store of records") is the central (or main) memory of the computing environment (on single or multiple networked computers) running these applications.”

To look to the future, we should start by examining the past. Thinking back to the early 1990s (a mere 20 years ago), I am sitting in my cubicle, working with one of the first commercially available personal computers. Everyone else in the department is envious: I have a desktop while they are staring at green screens. The Internet was not yet prevalent, and email was largely an internal function (via the mainframe). But I was at the leading edge of productivity, challenging the very fabric of mainframe law with the newfound power of the personal computer. Wow! At the time, it felt very cool, like starring in my very own IT movie. Remember those days?

Do you also remember the memory on those “leading edge technology” PCs? If you were lucky, you had 8 megabytes of Dynamic Random Access Memory (DRAM). That does not seem like much now, but in those days, 8 megabytes was huge. Even so, applications and data were already overwhelming the PC. Think about how long it took to perform simple tasks. Did you ever have an Excel (or Lotus 1-2-3) spreadsheet with a few thousand rows? How long did it take to run a simple data sort? Why did it take so long? The computer was literally reading and writing every row to disk, because it took every ounce of memory just to run the operating system (some of you may remember that Windows 2.0 was not all that efficient!). Did you ever try running more than one application at a time? Your disk drive light would be on for hours. Rebooting took half a day. Backups went through the mainframe to tape (ugh). Shall I go on?
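The spreadsheet anecdote can be made concrete with a quick experiment. The sketch below (illustrative only, not from the original post; the table name, row count, and column layout are arbitrary) sorts the same rows two ways: once entirely in RAM with Python's built-in sort, and once by round-tripping them through an on-disk SQLite table and sorting with an ORDER BY query.

```python
import os
import random
import sqlite3
import tempfile
import time

# Some rows to sort: (id, value) pairs, standing in for spreadsheet rows.
rows = [(i, random.random()) for i in range(100_000)]

# In-memory sort: every row is already in RAM, so no I/O is involved.
t0 = time.perf_counter()
sorted_rows = sorted(rows, key=lambda r: r[1])
mem_elapsed = time.perf_counter() - t0

# Disk-backed sort: the same rows stored in an on-disk SQLite table,
# sorted by the database engine, which must read pages from disk.
path = os.path.join(tempfile.mkdtemp(), "rows.db")
con = sqlite3.connect(path)
con.execute("CREATE TABLE t (id INTEGER, val REAL)")
con.executemany("INSERT INTO t VALUES (?, ?)", rows)
con.commit()

t0 = time.perf_counter()
disk_sorted = con.execute("SELECT id, val FROM t ORDER BY val").fetchall()
disk_elapsed = time.perf_counter() - t0
con.close()

print(f"in-memory: {mem_elapsed:.4f}s, via disk: {disk_elapsed:.4f}s")
```

Both paths produce the same ordering; the difference is where the data lives while being sorted. Modern operating systems cache disk pages aggressively, so the gap on a small dataset is far less dramatic than it was on a 1990s PC, but the principle IMC exploits is the same: work on data where it sits in memory rather than shuttling it to and from disk.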

OK, let us come back to the present day. Purists might argue that the situation described above is not really related to today’s IMC. But to a business user, it is exactly the same thing–the need for speed. This is what IMC promises the world of business and enterprise applications.

From a business perspective, there are several factors causing continued pressure on the IT industry to increase computing speeds. 

  • Continued focus on instant gratification–business users, irrespective of the complexity or the size of the data involved, are unwilling to wait 5 seconds, much less minutes, for ANY transaction to complete. Users are just not interested in sacrificing time and productivity waiting for an application (and underlying infrastructure) to perform. This is the new normal, and the demand for instant response will only increase with time.

  • Big data: Access to newer and bigger data is pushing the boundaries of current technology. We now have access to terabytes of data on just about anything: grocery store buying patterns, jet engine performance statistics, online product information, and anything else we can dream up. In many cases this data existed before; only now, however, are we coming to grips with how to process such vast amounts of information.

  • Mobility: Mobile device use is remarkable and still growing. The iPhone has only existed for the last five years (and the iPad for less), but this kind of mobile device has transformed how we deal with information. More and more of what we do is in text form and available for analysis instantly. Mobile devices also provide valuable (albeit frightening) information on where and when the data was created or read.

  • Costs: Today’s world of commodity servers and inexpensive disk drives is completely different from yesterday’s world of enterprise IT. Historically, analytics ran on expensive, high-end servers and used expensive, enterprise-class disk drives. Buying a new database server was a big decision; it came with software licensing costs, as well as incremental operational needs (e.g., a database administrator). Today, adding more nodes is not a major capital expense and does not necessarily trigger new software licenses, or additional administrators. These costs will only continue to move downwards over time.

Technology is really starting to catch up.  All the major infrastructure, database, and
