
Last year about this time, HP announced a revolutionary technological change – persistent memory. Today’s memory is volatile, meaning that if it loses power, its contents are lost. As a result, operating systems were designed to move data from hard drives into random access memory, work on it there, and write it back as needed.

What Is Persistent Memory?

In computer science, persistent memory is any method or apparatus for efficiently storing data structures such that they can continue to be accessed using memory instructions or memory APIs even after the end of the process that created or last modified them. Often confused with non-volatile random-access memory (NVRAM), persistent memory is instead more closely linked to the concept of persistence in its emphasis on program state that exists outside the fault zone of the process that created it. – Wikipedia
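
The key phrase in that definition is “even after the end of the process.” As a minimal sketch of what that means in practice – assuming a persistent-memory region exposed as an ordinary file (the path /mnt/pmem/state and the struct layout are hypothetical, and the portable msync() stands in for the cache-flush “persist barrier” a real persistent-memory stack would use) – a data structure can live directly in mapped memory and survive from one run of the program to the next:

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    struct state {
        unsigned magic;   /* marks an initialized region */
        long     runs;    /* how many times this program has executed */
    };

    #define MAGIC 0xC0FFEEu

    int main(void) {
        /* Hypothetical path: a file on a persistent-memory-backed mount. */
        int fd = open("/mnt/pmem/state", O_CREAT | O_RDWR, 0644);
        if (fd < 0) return 1;
        if (ftruncate(fd, sizeof(struct state)) != 0) { close(fd); return 1; }

        /* Map the structure itself; no serialization layer in between. */
        struct state *s = mmap(NULL, sizeof *s, PROT_READ | PROT_WRITE,
                               MAP_SHARED, fd, 0);
        if (s == MAP_FAILED) { close(fd); return 1; }

        if (s->magic != MAGIC) {   /* first run ever: initialize in place */
            s->runs = 0;
            s->magic = MAGIC;
        }
        s->runs++;                     /* an ordinary store into durable state */
        msync(s, sizeof *s, MS_SYNC);  /* portable stand-in for a persist barrier */

        printf("this program has run %ld times\n", s->runs);
        munmap(s, sizeof *s);
        close(fd);
        return 0;
    }

Run it twice and the count keeps climbing: the struct outlives the process that wrote it, which is exactly the property the definition describes.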

For contrast, consider how things work today: let’s walk through the process of opening a Word document. The Word application temporarily opens the file in memory. There you can continue to edit the document – but nothing is saved permanently until you click save. As you interact with the document, the CPU performs its computations by pushing and pulling the data in memory. When you click save, the file is written back from your computer’s memory to permanent storage on the hard drive; until then, you are not interacting with the stored version of the file at all. This arrangement exists because permanent storage takes a long time to read and write, so memory is needed as a temporary caching mechanism in between.
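
In code, that round trip looks something like the following C sketch; the file name document.txt and the five-byte “edit” are placeholders for illustration, not anything from the article:

    #include <stdio.h>
    #include <stdlib.h>
    #include <string.h>

    int main(void) {
        /* "Open": copy the document from disk into volatile memory. */
        FILE *f = fopen("document.txt", "rb");   /* hypothetical file */
        if (!f) return 1;
        fseek(f, 0, SEEK_END);
        long size = ftell(f);
        rewind(f);

        char *buf = malloc(size);                /* the in-memory working copy */
        if (!buf) { fclose(f); return 1; }
        fread(buf, 1, size, f);
        fclose(f);

        /* "Edit": changes touch only the volatile copy; lose power now
         * and they vanish, because nothing on disk has been updated. */
        if (size >= 5) memcpy(buf, "HELLO", 5);

        /* "Save": write the whole buffer back to durable storage. */
        f = fopen("document.txt", "wb");
        if (!f) { free(buf); return 1; }
        fwrite(buf, 1, size, f);
        fclose(f);
        free(buf);
        return 0;
    }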

Let’s take it up a notch. When you turn on your laptop or server, your operating system shuttles settings and information from storage into memory before anything else can happen. But what if it didn’t have to?

Of course, I’m oversimplifying the process, but I’m doing that to illustrate the model around which software has been engineered for the last couple of decades. That’s all about to change!

Persistent memory is a holy grail, the hoped-for love child of DRAM and non-volatility that meshes forgetful memory and persistent storage together. – Chris Mellor, The Register

Even with solid-state drives (SSDs), the latency of reading and writing still required memory to take up the caching slack. Persistent memory, however, is a huge step forward, sitting somewhere between today’s memory and solid-state drives (a sketch of the resulting access model follows the list below):

  • Persistent memory is non-volatile.
  • Persistent memory is much faster than SSD.
  • Persistent memory has much lower latency than SSD.
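
What that list amounts to is byte-addressability: a program can update durable data with ordinary CPU loads and stores instead of block-sized read() and write() system calls. Here is a minimal sketch under that assumption, again with a hypothetical path and msync() as the portable stand-in for a real persist barrier (on a persistent-memory DAX filesystem you would map with MAP_SYNC and flush CPU caches directly):

    #include <fcntl.h>
    #include <stdio.h>
    #include <sys/mman.h>
    #include <unistd.h>

    #define REGION_SIZE 4096   /* one page of the persistent region */

    int main(void) {
        /* Hypothetical path: a file on a persistent-memory-backed mount. */
        int fd = open("/mnt/pmem/counter", O_CREAT | O_RDWR, 0644);
        if (fd < 0) return 1;
        if (ftruncate(fd, REGION_SIZE) != 0) { close(fd); return 1; }

        long *counter = mmap(NULL, REGION_SIZE, PROT_READ | PROT_WRITE,
                             MAP_SHARED, fd, 0);
        if (counter == MAP_FAILED) { close(fd); return 1; }

        /* A plain CPU store: no read()/write() system calls, no block-sized
         * transfers, just a byte-addressable update in place. */
        (*counter)++;

        /* Make the store durable before trusting that it survived. */
        msync(counter, sizeof *counter, MS_SYNC);
        printf("counter is now %ld\n", *counter);

        munmap(counter, REGION_SIZE);
        close(fd);
        return 0;
    }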

Micron and Intel introduced 3D XPoint Persistent Memory and provided the following latency stats:

[Figure: 3D XPoint Persistent Memory Latency]

The technology will transform how operating systems are developed as well as how data center and cloud computing servers are built. Windows Server 2016 has already supported persistent memory since late last year.

Shahin Khan recently listed persistent memory among his top data center predictions:

…as the gap between CPU speed and storage speed separates apps and data, memory becomes the bottleneck. In comes storage class memory (mostly flash, with a nod to other promising technologies), getting larger, faster and cheaper. So, we will see examples of apps using a Great Wall of persistent memory, built by hardware and software solutions that bridge the size/speed/cost gap between traditional storage and DRAM. Eventually, we expect programming languages to naturally support byte-addressable persistent memory.

As machine learning, deep learning, and artificial intelligence become reality, an advance in the utility of the cloud becomes necessary – and persistent memory is it. Hold on tight!

About Tony Johnson

Innovative helps you balance your business requirements, service levels, staff and infrastructure to make your IT as effective as possible. Tony Johnson is Vice President of Operations at Innovative and has been helping clients optimize their IT spend and operations since 1983.

