Developers hold the key to unlocking the potential of next-gen computing

Jaikumar Vijayan, freelance writer
As vendors launch new technology initiatives aimed at the cloud and big data applications of the future, the big question is whether they can attract the software developer community to the new platforms.

Organizations are generating and consuming increasingly massive volumes of data, a trend that is expected to soon strain, and eventually overwhelm, the ability of current-generation computing technologies to keep up.

In the next few years, global data volumes are expected to grow exponentially. Analyst firm IDC expects that by 2020, the overall digital data universe will be roughly 40,000 exabytes, or a staggering 40 billion terabytes, in size. Some, including HP, predict that in about a decade the data universe will grow to brontobyte scale, where each brontobyte is equivalent to one billion exabytes.
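
To put those prefixes in perspective, here's a quick back-of-the-envelope check in C. It assumes decimal (SI) units, with one exabyte equal to 10^18 bytes and one terabyte equal to 10^12 bytes, plus the commonly cited, if informal, definition of a brontobyte as 10^27 bytes.

```c
/* Back-of-the-envelope unit arithmetic for the figures above, assuming
 * decimal (SI) prefixes: 1 TB = 1e12 bytes, 1 EB = 1e18 bytes, and
 * 1 brontobyte = 1e27 bytes (an informal but commonly cited definition). */
#include <stdio.h>

int main(void) {
    double terabyte   = 1e12;   /* bytes */
    double exabyte    = 1e18;   /* bytes */
    double brontobyte = 1e27;   /* bytes */

    double idc_2020 = 40000.0 * exabyte;            /* IDC's 40,000 EB estimate */
    printf("40,000 EB = %.0f billion TB\n",
           idc_2020 / terabyte / 1e9);              /* prints 40 */

    printf("1 brontobyte = %.0f billion EB\n",
           brontobyte / exabyte / 1e9);             /* prints 1 */
    return 0;
}
```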

Surging data volumes drive next-generation hardware

Technology vendors have launched different initiatives to enable the huge improvements in processor speeds, networking capabilities, and storage capacity that this kind of data growth will require.

IBM, for instance, has embarked on a $3 billion initiative to develop next-generation chip technology that it says will be capable of handling the cloud, big data, and cognitive system requirements of the future.

Canadian firm D-Wave Systems is betting on quantum computing technologies to build machines large and powerful enough to run next-generation, specialized applications and workloads.

HP has hitched its wagon to a new system dubbed The Machine that HP says will be completely different from the processor-centric machines of the past six decades. Others, including Intel and Samsung, are working on so-called non-volatile phase change memory as a replacement for the DRAM technologies in current systems.

Fundamental technology shifts

The efforts are all designed to introduce fundamental, performance-boosting changes to the technologies that people have used for decades to run their computational workloads.

IBM's initiative, for instance, involves scaling semiconductor process geometries down from today's 22 nanometers to a monumentally challenging seven nanometers and below by the end of the decade, improving performance, reducing energy consumption, and delivering other gains. Going forward, IBM will focus on post-silicon-era technologies like carbon nanotubes to overcome some of the physical limitations of silicon semiconductors.

"Cloud and big data applications are placing new challenges on systems," the company noted when it first disclosed plans for its next generation processors. "Bandwidth to memory, high-speed communication and device power consumption are becoming increasingly challenging and critical."

Breaking the mold with The Machine

HP's plan with The Machine is to find a way past the limitations of the processor, memory, and disk-based hierarchies used in computers since their invention.

Almost all computers in use today work by constantly shuttling data at high speeds between storage, memory, and cache as needed, using copper connections. The operating system, programs, and data are basically stored on a hard disk, loaded into memory when the program is run, and then put back into storage when the processing is complete.
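
Roughly speaking, that cycle looks like the following C sketch: the program pulls a file from persistent storage into a buffer in DRAM, works on the in-memory copy, and then writes the result back to disk. The file name and the processing step are placeholder assumptions, not details from any of the systems discussed here.

```c
/* A minimal sketch of the conventional load-process-store cycle.
 * The file name and the "processing" step are hypothetical placeholders. */
#include <stdio.h>
#include <stdlib.h>

int main(void) {
    FILE *in = fopen("data.bin", "rb");          /* 1. data lives on disk */
    if (!in) return 1;

    fseek(in, 0, SEEK_END);
    long size = ftell(in);
    rewind(in);

    unsigned char *buf = malloc(size);           /* 2. copy it into DRAM */
    if (!buf || fread(buf, 1, size, in) != (size_t)size) return 1;
    fclose(in);

    for (long i = 0; i < size; i++)              /* 3. process the in-memory copy */
        buf[i] ^= 0xFF;                          /*    (placeholder transformation) */

    FILE *out = fopen("data.bin", "wb");         /* 4. write the result back to disk */
    if (!out || fwrite(buf, 1, size, out) != (size_t)size) return 1;
    fclose(out);
    free(buf);
    return 0;
}
```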

HP wants to eliminate this process entirely by collapsing the separate memory, persistent storage, and cache tiers that exist today into a single, huge tier of non-volatile memory built on memristors. By eliminating the need to constantly move data between memory and storage, memristors are expected to handle more information faster than today's RAM technologies while consuming significantly less power in the process.

The Machine will use photonics (fiber optics) instead of copper to attach the memristor-based memory directly to the CPU, making it much faster than current-generation computers.

The software opportunity, and challenge

For developers, such ambitious vendor efforts present both an opportunity and a challenge. If harnessed well through software, some of the emerging technologies could deliver substantially better performance than today's systems. For example, HP's memristor technology would eliminate the need for developers to write code that shuttles data back and forth between memory and storage, thereby improving performance.

But many of the products are still years away from commercial availability, and there's no telling what changes might happen along the way. Tools for helping developers move their software to the new platforms are scarce and still emerging.

"Like most commercial businesspeople, developers have an aversion to new technologies that promise more than they ever deliver," says Charles King, principal analyst at Pund-IT.

The history of IT is replete with examples where this has happened, he says. "The enterprise version of that occurred when proponents claimed that ARM-based servers were poised to eat Intel's breakfast, lunch, and dinner," but nothing of the sort happened, King says. Vendors will have their work cut out for them in trying to engage and interest developers in the emerging platforms, he notes.

Vendor efforts to garner developer support

The vendors themselves certainly appear cognizant of the critical need to pull developers along as they go.

D-Wave held a funding round last year mainly to raise money for software development and for building an application ecosystem around its emerging quantum computing technologies. The company says it plans to use the $30 million it raised to accelerate the development of critical industry applications and to hire more developers for its quantum computers.

Over the next few years, it wants to get developers engaged in key areas such as drug research, financial services, and machine learning, because that's where it sees its machines being most valuable.

HP, for its part, has committed to building a completely new operating system dubbed Carbon for The Machine over the next several years. But instead of waiting for that to become available, the company has released a version of Linux called Linux++ that developers can immediately use to familiarize themselves with the new programming models and tools that will be needed for The Machine.

In an interview at HP Discover in Barcelona last year, Rich Friedrich, senior research director at HP Labs, described Linux++ as critical to the success of The Machine. According to Friedrich, one of The Machine's primary tasks will be to do real-time analytics against many diverse data sets and to conduct longitudinal studies over high-volume data sets that may be hundreds of terabytes or even a few petabytes in size.

"Today's operating systems can't handle that much memory and performance and scalability," he said. "So we have an opportunity to make some fundamental changes to these operating systems and provide that kind of scalability to users who want to build applications for these next-generation workloads."

Easing the path

The goal of Linux++ is to maintain the interfaces that programmers are used to today, while modifying the internal components of the operating system where needed. "For example, if we have to manage a petabyte of main memory, how do we do that and how do we make sure Linux OS does that in a scalable, high-performance way?" Friedrich asked.

Like most disruptive technologies, Linux++ will require some amount of code rework. But the goal is to try to maintain the existing programming model to the extent possible, while also taking advantage of The Machine's load and store capabilities, according to Friedrich. "That means no more file system calls for doing file read and file write," he noted in one example.
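
HP hasn't spelled out that programming model in detail, but today's Linux offers a rough analogy: memory-mapping a file lets a program update persistent data with ordinary loads and stores rather than explicit read and write system calls. The sketch below illustrates that general idea only; the mmap() approach, file name, and record layout are stand-ins, not The Machine's actual API.

```c
/* A rough analogy for load/store access to persistent data, using Linux
 * mmap() as a stand-in for byte-addressable non-volatile memory. This is a
 * sketch of the general idea, not HP's programming model; the file name and
 * record layout are hypothetical. Assumes records.dat exists and is at least
 * sizeof(struct record) bytes. */
#define _POSIX_C_SOURCE 200809L
#include <fcntl.h>
#include <sys/mman.h>
#include <unistd.h>

struct record { long counter; };                 /* hypothetical persistent record */

int main(void) {
    int fd = open("records.dat", O_RDWR);        /* persistent backing store */
    if (fd < 0) return 1;

    /* Map the data into the address space once... */
    struct record *r = mmap(NULL, sizeof *r, PROT_READ | PROT_WRITE,
                            MAP_SHARED, fd, 0);
    if (r == MAP_FAILED) return 1;

    /* ...then update it with ordinary loads and stores: no read()/write()
     * calls shuttling bytes between a disk buffer and a memory buffer. */
    r->counter += 1;

    msync(r, sizeof *r, MS_SYNC);                /* flush the change to the medium */
    munmap(r, sizeof *r);
    close(fd);
    return 0;
}
```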

Jim Handy, general director at Objective Analysis, says that since The Machine's processor is new, it will require new compilers, and application programs will need to be recompiled to run on it. "But if the system calls are all Linux calls, then the application programs shouldn't require much rewriting," he says. Even so, they would still need to be requalified.

However minimal the disruption might be, developers will still need to port their software to the new system at some point, which is a lot of work, Handy says. "And then they will need to do quality testing to make sure that the new ports are reliable. HP is going to have to show them that all this effort won't be a waste of time."

Scaling down to speed up

HP has also scaled down some of its original specifications for The Machine in a bid to bring a version of the product to market quickly so developers can have a go at it. The company's current plan is to introduce a prototype of The Machine sometime next year featuring conventional DRAM technologies and running Linux++, rather than the more futuristic memristors and Carbon OS.

"In the end, HP will probably be producing a scalable x86 machine that has a lot of memory, sort of like a Superdome on steroids," says Rich Fichera, vice president and principal analyst at Forrester Research. Or, it could end up developing a very high-performance cluster architecture of four- and eight-socket nodes, connected with a very high-speed interconnect like Intel's Omni-Path, he says.

Based on HP's proclamations to date, the initial versions of The Machine, at least, will be built entirely with commodity parts and run Linux. The Machine will use the same CPUs, memory, and interconnects used by other players in the market.

"So the original vision of The Machine, which was actually pretty neat and addressed the right set of problems, will be transformed into a somewhat better mousetrap that will potentially face competition from other vendors designing similar systems with similar underlying technology," Fichera says.

Over the next few years, HP will release new, increasingly sophisticated prototypes based on different kinds of technologies, like phase change memory, before it finally evolves to a fully memristor-based system running Carbon.

It's too early to say whether efforts like HP's phased rollout of The Machine and D-Wave's investment in software development will succeed at attracting developer interest to the technologies. What's clear, according to analysts like King, is that without developer support, many of these efforts will fail to live up to their full potential.
