HPE Unveils ‘World’s Largest’ Single Memory Computer

Hewlett Packard Enterprise (HPE) has built the largest single-memory computing system in the world at 160TB of RAM.

To put this into perspective, 160 terabytes (TB) of random access memory (RAM) is roughly the same amount of memory as you’d find in 80,000 of the latest iPhones!
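
As a rough sanity check on that comparison, here’s a back-of-the-envelope calculation; the 2 GB-per-handset figure is an assumption based on the iPhones shipping at the time, not a number from HPE.

```python
# Back-of-the-envelope check: how many phones' worth of RAM is 160 TB?
# Assumption: roughly 2 GB of RAM per handset (typical of 2017-era iPhones).
machine_ram_tb = 160
gb_per_tb = 1024               # using binary terabytes
ram_per_phone_gb = 2           # assumed figure, not quoted by HPE

phones = machine_ram_tb * gb_per_tb / ram_per_phone_gb
print(f"{phones:,.0f} phones")  # ~81,920 -- in line with the 80,000 quoted
```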

Simply known as ‘The Machine’, the 160TB computer runs on a Linux-based operating system that has been tuned to prioritize RAM over raw processing power. Rather than focusing on how fast it can process data, The Machine has been designed around how much data it can hold and work on at any one time. “With the exploding growth in data, computer architectures are hitting a wall in how to deal with all that data,” said Mark Potter, chief technology officer at HPE.
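
As a loose illustration of that design emphasis (a hypothetical Python sketch, not HPE’s actual software), the difference is essentially between streaming a data set through a small amount of RAM in chunks and holding the whole thing in memory at once:

```python
# Hypothetical sketch: processor-centric vs. memory-centric analysis.
# Names and sizes are illustrative only; this is not HPE's software.

def stream_in_chunks(path, chunk_size=64 * 1024 * 1024):
    """Conventional approach: only a small chunk lives in RAM at a time."""
    total = 0
    with open(path, "rb") as f:
        while chunk := f.read(chunk_size):
            total += sum(chunk)          # work on one slice, then discard it
    return total

def hold_in_memory(path):
    """Memory-driven approach: the entire data set sits in RAM at once,
    so any part of it can be revisited without going back to storage."""
    with open(path, "rb") as f:
        data = f.read()                  # feasible only with enough RAM
    return sum(data)
```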

HPE attempts to convince us that in just three years’ time, no one will be able to install Windows updates due to their vast size. (I’m kidding; or am I?)

Ambitious research project 

The Machine is, at this stage, still just a prototype, and just one part of HPE’s so-called “memory-driven computing” effort, an ambitious research project attempting to rethink how computer systems are designed and built, given the limitations of current technology.

Memory at the forefront

The Machine puts memory, not the processor, at the forefront of its core system, reversing the design that systems follow today and have followed since the 1970s. From then until recently, with the demise of Moore’s Law, the processor has always been the key component. HPE claims that The Machine is only the first in a new wave of computers that will deliver huge jumps in performance and efficiency.

Massive data sets

Its main selling point, according to HPE, is the size of the data sets it can analyze. “If you think about all the data coming at us, being able to look at larger data sets will help us solve problems that we aren’t able to solve today,” said Potter.

1 exabyte on the way?

HPE also thinks it can scale The Machine’s architecture far past its current 160 terabyte limit, and sees no problem reaching into the exabyte range (1 billion gigabytes… that’s an awful lot of iPhones) of working RAM and beyond.
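
To get a feel for that scale, here is a quick, purely illustrative calculation of how many of The Machine’s current 128 GB modules a single exabyte of working memory would take:

```python
# Rough scale of an exabyte of RAM, in decimal units (1 EB = 1 billion GB).
exabyte_gb = 1_000_000_000     # one exabyte expressed in gigabytes
dimm_size_gb = 128             # size of the modules in the current prototype

dimms_needed = exabyte_gb / dimm_size_gb
print(f"{dimms_needed:,.0f} modules")  # ~7.8 million of today's 128 GB DIMMs
```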

Just three of the 128 GB modules used by ‘The Machine.’

The Machine’s 160 terabytes isn’t one single chunk of memory, whatever HPE’s smoke and mirrors might suggest. Instead, the RAM is distributed across 1,280 dual in-line memory modules, each one a gigantic 128 gigabytes. They’re connected by what HPE calls the fabric, which uses photonics to move data between memory and processor. The Machine also has 40 processors installed to control the data analysis.
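
The headline figure checks out, assuming binary units; a minimal arithmetic check:

```python
# Verify the prototype's headline figure: 1,280 DIMMs at 128 GB each.
dimm_count = 1280
dimm_size_gb = 128

total_gb = dimm_count * dimm_size_gb           # 163,840 GB
total_tb = total_gb / 1024                     # binary terabytes
print(f"{total_gb:,} GB = {total_tb:.0f} TB")  # 163,840 GB = 160 TB
```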

The potential applications for The Machine and its future progeny could take computing, and the physical boundaries that currently limit it, into new territory. HPE cited the advantages that a memory-based computer could bring to fields such as deep learning, artificial intelligence, astronomy, and medical research, where significant advances depend on analyzing and processing massive amounts of data.