Remember everything, faster
In-memory computing is going mainstream. Fleur Doidge asks what it means and whether the channel stands to gain
Imagine if you had instant, total recall of everything you'd ever learned or experienced, whether yesterday or many years ago. No more time wasted trying to remember facts or apply them, no fruitless searches for resources, and no getting distracted by the random connections thrown up by Google, for that matter.
An impossible dream? Perhaps, as some have said, the natural decay of memories over time is a blessing that prevents us from constantly reliving painful experiences, distancing us from a past that we cannot change. The same is not true of computing, however, where we are always looking to retrieve knowledge and improve information processing performance.
In-memory computing - meaning that application data is held in DRAM rather than on disk or elsewhere - has been around for many decades but, this year, a genuine buzz has begun to be heard around the possibilities, centred on the fact that information can be retrieved more quickly, allowing for higher volumes of more complex transactions.
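The speed difference is easy to demonstrate with a toy sketch (Python here, and purely illustrative - the file name and record layout are invented): a disk-oriented lookup re-reads and re-parses its store on every query, while the in-memory approach loads the data once and answers each query with a hash-table read in DRAM.

```python
import json
import os
import tempfile
import time

# Write a small dataset to disk (a stand-in for a disk-resident store).
records = {str(i): {"id": i, "value": i * 2} for i in range(10_000)}
path = os.path.join(tempfile.gettempdir(), "records.json")
with open(path, "w") as f:
    json.dump(records, f)

def lookup_from_disk(key):
    # Disk-oriented: open and parse the file on every single query.
    with open(path) as f:
        return json.load(f)[key]

# In-memory: load once up front; afterwards every lookup stays in DRAM.
with open(path) as f:
    cache = json.load(f)

def lookup_in_memory(key):
    return cache[key]

start = time.perf_counter()
lookup_from_disk("9000")
disk_time = time.perf_counter() - start

start = time.perf_counter()
lookup_in_memory("9000")
mem_time = time.perf_counter() - start

print(f"disk: {disk_time:.6f}s  memory: {mem_time:.6f}s")
```

Even at this trivial scale the in-memory lookup is orders of magnitude faster; real in-memory platforms apply the same principle to entire working datasets.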
Gartner says, for example, that it's finally going mainstream, with the in-memory data grid market now on track to reach $1bn (£686m) within the next three years. This it puts down to the fact that application infrastructure technologies are now maturing rapidly against a backdrop of cheaper semiconductor production.
Until recently, only cutting-edge segments of sectors such as financial trading, telecoms, defence, logistics and online entertainment could justify the cost and potential complexity.
Massimo Pezzini, vice president and fellow at Gartner, opines that the market is reaching a point now where companies that don't harness the benefits of in-memory processing may lose out to more innovative competitors.
"Relentless declines in DRAM and NAND flash memory prices, the advent of SSD technology and the maturation of specific software platforms have enabled in-memory to become more affordable and effective for IT organisations," he said.
In-memory can help customer organisations develop apps that run advanced queries or do complicated transactions on very large datasets much faster and in more scalable ways. The technology is already present in many database management systems, data grids, analytics offerings, application servers, complex-event processing platforms, and messaging infrastructures - but not everywhere.
Currently, according to Gartner, only about 10 per cent of midsize and large organisations have in-memory computing somewhere in their infrastructure. However, this is expected to at least triple over the next few years.
"Vendors that don't have an established presence in the in-memory computing market, or those looking to expand their offering, will acquire small players with advanced technology or an established presence in the market, which will lead to market consolidation," Pezzini says.
"During the next two to three years, in-memory will become a key element in the strategy of organisations focused on improving effectiveness and business growth. Organisations looking for cost containment and efficiency will also increasingly embrace it."
Rich Phillips, UK channel sales director at SAP, says its whole HANA database platform evolution incorporates in-memory computing - starting with the vanilla database, then moving those advances across the portfolio to include ERP, CRM, HR and the rest.
As Phillips (pictured, left) notes, it's not the speed of in-memory recall per se that's important, although that may be the essential enabling factor - it's what you then do with that extra speed.
From SAP's perspective, this means supporting complex analytical processes and techniques that previously were very difficult, if not impossible, to achieve - helping customers make better decisions faster.
"I think everybody is going to be quite surprised where in-memory takes us, because there's so much that can be done. It is one of those game changers," he confirms. "People are saying ‘wow, I know what we can do'."
Integrators, VARs, and other business partners can and should all benefit ultimately - not just vendors and end-user customers, he says.
"We're encouraging partner organisations and working closely with them to develop solutions or use cases using the HANA capability. A good example of that is our Business Innovation Odyssey - which is kind of like a Dragon's Den for channel partners," says Phillips.
"If a channel partner has a game-changing use case, SAP will help them deliver that, we will invest and we will work with national enterprise groups and others to take it to market."
Challenging pace of innovation
The biggest challenge for partners of a vendor today is the sheer pace of innovation, Phillips notes, so SAP is putting heavy emphasis on supporting VARs wherever that support is needed, including providing qualified technical staff to help with a project.
"And it is important that we can demonstrate to customers why they should invest, and what they are going to get out of that investment," Phillips says.
John Callan is senior director of global product marketing at QlikTech - another company spreading the current gospel on in-memory computing.
He confirms there has definitely been a recent increase in investment, and that niche players may no longer hold all of the related cards. For QlikTech, in-memory means its QlikView dashboard offering can associate various different types of data in real time, carrying out a dynamic aggregation that can allow the end-user customer to perform faster, more accurate analytics.
"We are creating associations between a variety of different data sets - such as an Oracle database, a Teradata warehouse, Excel spreadsheets and so on - to get a single view across all those data sets," says Callan.
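Conceptually, that kind of association amounts to joining heterogeneous sources on a shared key and aggregating on the fly, all in memory. A minimal Python sketch - the source names, fields and figures are invented for illustration, and this is not QlikView's API - might look like this:

```python
from collections import defaultdict

# Hypothetical extracts from two different sources:
# order rows from a sales database, and a region mapping from a spreadsheet.
db_orders = [
    {"customer": "Acme", "amount": 1200},
    {"customer": "Birch", "amount": 450},
    {"customer": "Acme", "amount": 300},
]
spreadsheet_regions = [
    {"customer": "Acme", "region": "North"},
    {"customer": "Birch", "region": "South"},
]

# Associate the sources on the shared "customer" key...
region_of = {row["customer"]: row["region"] for row in spreadsheet_regions}

# ...then aggregate dynamically across the combined view.
totals = defaultdict(int)
for order in db_orders:
    totals[region_of[order["customer"]]] += order["amount"]

print(dict(totals))  # {'North': 1500, 'South': 450}
```

Because both structures live in memory, re-running the aggregation with a different grouping or filter is near-instant - which is what makes the dashboard experience feel interactive.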
Consumerisation of technology is one phenomenon helping drive the move to in-memory, he adds - not just BYOD but the expectations people now have as a result of familiarity with consumer applications such as Google.
Of course, few end users can expect to achieve the economics of scale of Google in their local business activities - but it still creates an expectation of more information aggregation being possible, and in a shorter space of time.
This is just as important in the SMB as in the larger enterprise or public sector body, and in-memory technologies can bring the price down enough for particularly cost-conscious organisations to reap the benefits, he says. Many more companies have access to large volumes of data than yet know what to do with them.
"For the channel, you must explain what it means to the customer," Callan says. "What is it about the in-memory approach that makes it so different? You need to talk about combining and analysing data from multiple sources. Rather than saying ‘hi, we have an in-memory solution', you need to articulate the benefits, you need to parse it."
Gerard Marlow, general manager for business development at storage disti Hammer, says storing information on the RAM of a server rather than a hard disk drive certainly suits some needs around quick access and data analysis.
"At Hammer, bespoke server solutions can be modified according to each individual requirement, so in-memory computing may be the best option for some specifications, and therefore does have a place in the channel," he says.
The storage market is an ever-changing arena, Marlow concludes, in the drive to answer questions posed by bottlenecking and other technology shortcomings. There are often other answers as well.
"One solution does most definitely not fit all," he confirms.
Gartner adds that a lack of standards and skills, coupled with architectural complexity and the perennial security and management concerns, may hold back the adoption rate of in-memory technologies.
However, as the analyst firm points out, the ability to perform batch processing in seconds or minutes rather than hours - and, as a result, to support rapidly changing business requirements more easily - is definitely valuable to customers in the short, medium and long term.