DRAM-less SSDs – Frequently Asked Questions

Answering Age-Old Questions Regarding DRAM-less SSDs
There are usually two primary camps when it comes to DRAM-less SSDs: either shun them entirely, or accept them as a viable budget option.

The divide comes down to use cases. Spec-focused enthusiasts often cite basic design compromises, failure rates, and subpar benchmarks, while actual users attest to their reliability and usability. Add the fact that most reputable brands (Samsung and Seagate being notable exceptions) have a dedicated line of DRAM-less SSDs, and we are left scratching our heads in confusion.

But while we don't have hard usage data to draw conclusions from, there is still plenty we can infer from today's SSD technologies to help answer some of the age-old questions about DRAM-less SSDs.

Why is DRAM important in an SSD again?

When your computer needs to access data on an SSD, the DRAM cache acts as a mapping buffer, letting the controller locate the sections containing the requested data quickly and easily. The result, at least from an operational standpoint, is relatively consistent performance, thanks to a dedicated component whose sole job is to act as a data "catalog".

Because there is less delay when reading and writing data, a DRAM cache also makes the SSD feel more responsive.

One of the commonly cited fatal flaws of DRAM-less SSDs is that they generate a lot of extra read and write cycles, because the data mapping table is stored in, and accessed directly from, the drive's own NAND flash. Thus, a DRAM cache is also frequently credited with extending an SSD's operational lifespan, since the drive's read and write cycles (including random ones) are dedicated only to actual data operations.
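The mapping-buffer idea above can be sketched as a toy model. This is purely illustrative (the latency figures and the two-flash-access pattern are assumptions, not any vendor's firmware): a drive with its logical-to-physical map in DRAM pays one cheap DRAM lookup plus one flash read, while a DRAM-less drive may need an extra flash access just to fetch the mapping entry.

```python
# Hypothetical sketch: compare read latency with the mapping table in DRAM
# versus stored in NAND. All numbers are illustrative assumptions.

FLASH_READ_US = 50.0   # assumed NAND page-read latency, microseconds
DRAM_READ_US = 0.1     # assumed DRAM lookup latency, microseconds

def lookup_with_dram(l2p_cache: dict, logical_page: int) -> float:
    """Whole mapping table lives in DRAM: one cheap lookup, one flash read."""
    physical_page = l2p_cache[logical_page]   # DRAM hit (essentially free)
    return DRAM_READ_US + FLASH_READ_US       # total latency for the data read

def lookup_dramless(logical_page: int) -> float:
    """Mapping entry itself sits in NAND: two flash accesses per read."""
    # First flash read fetches the mapping entry, second fetches the data.
    return FLASH_READ_US + FLASH_READ_US

cache = {7: 1042}                    # logical page 7 -> physical page 1042
print(lookup_with_dram(cache, 7))    # ~50.1 us
print(lookup_dramless(7))            # ~100.0 us
```

The doubled flash traffic in the DRAM-less path is also where the extra read/write cycles mentioned above come from.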

TL;DR: DRAM affects operational stability, with a heavy emphasis on response times and longevity.

Is it a waste to save a bit of money with a DRAM-less SSD?

That said, critics who call DRAM-less SSDs inferior often dismiss the practical merits of the technologies those drives actually use. The focus is solely on the lack of DRAM, rarely answering why the DRAM is absent (aside from cost), and never drawing a line between the appropriate applications of each type of SSD according to the technologies they employ.

The relatively lower cost of SSDs today is largely thanks to Triple-Level Cell (TLC) NAND flash technology. TLC-based SSDs achieve a lower cost per bit of data by storing three-bit values in a single cell. However, precisely because of this squeeze of values within one cell, TLC has narrower data distributions and voltage margins.
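The trade-off is simple arithmetic: a cell holding n bits must reliably distinguish 2^n charge levels within the same voltage window, so margins shrink as cost per bit drops. A minimal sketch (the level counts follow directly from the definition; the cost framing is a rule of thumb, not datasheet figures):

```python
def cell_levels(bits_per_cell: int) -> int:
    """Distinct charge levels a NAND cell must reliably distinguish."""
    return 2 ** bits_per_cell

for name, bits in [("SLC", 1), ("MLC", 2), ("TLC", 3)]:
    # More levels in the same voltage window => narrower margin per level,
    # but also more bits stored per physical cell (lower cost per bit).
    print(f"{name}: {bits} bit(s)/cell -> {cell_levels(bits)} voltage levels")
```

TLC's eight levels versus SLC's two is exactly why TLC needs the error-margin safeguards described next.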

As such, some form of Single-Level Cell (SLC) caching is almost always implemented to keep the SSD's data "intact". This… is the technology that allows cheaper TLC SSDs to forego DRAM while delivering performance close to that of far more expensive SLC-based drives, with a (theoretical) usable lifespan roughly matching how long users typically keep storage media before replacing it (somewhere between three and five years).
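How SLC caching behaves in practice can be sketched with a toy model. The cache size and speed figures below are assumptions for illustration (real drives vary, and many size the SLC cache dynamically): writes land in a small, fast SLC region, and only once it fills does the drive drop to its native TLC write speed.

```python
# Toy model of SLC write caching. All figures are illustrative assumptions.

SLC_CACHE_GB = 20      # assumed SLC cache size
SLC_SPEED = 2000       # MB/s while the cache has room (assumed)
TLC_SPEED = 450        # MB/s once writes go straight to TLC (assumed)

def effective_write_speed(transfer_gb: float) -> float:
    """Average MB/s for one large sequential write of `transfer_gb`."""
    if transfer_gb <= SLC_CACHE_GB:
        return SLC_SPEED                               # fits in the cache
    # Weighted average: cached portion at SLC speed, the rest at TLC speed.
    slc_time = SLC_CACHE_GB * 1024 / SLC_SPEED
    tlc_time = (transfer_gb - SLC_CACHE_GB) * 1024 / TLC_SPEED
    return transfer_gb * 1024 / (slc_time + tlc_time)

print(effective_write_speed(10))    # small transfer: full cached speed
print(effective_write_speed(100))   # large transfer: blended, much lower
```

This is why DRAM-less TLC drives benchmark well on everyday workloads but slow down visibly during very large sustained writes.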

Hence, the limitations of DRAM-less SSDs are not as catastrophic as critics claim. Better yet, for pure read operations the drawbacks are often negligible, so long as the NAND cells themselves aren't filled to capacity or haven't degraded too much from long-term use. Typical loading times for most games, for example, differ by only a few seconds, and even that gap isn't really attributable to the lack of DRAM anyway.

So no, saving a bit of money by purchasing a DRAM-less SSD is not a waste, if your objective is to use the drive within its typical warranty period (usually three years). In fact, if it is not a boot drive and is only meant to hold software and games that you read from often, its service lifespan can stretch significantly further.

Do NVMe SSDs have DRAM too, and is it important for them?

The DRAM situation in NVMe SSDs is more or less the same: some have it, others don't. Why exactly do these more advanced, post-AHCI SSDs skip DRAM despite their higher prices, you ask? Cost is half the answer: with DRAM on board, these NVMe drives would have their prices jacked up significantly, like the Samsung 980 Pro, and wouldn't be able to compete economically.

The other half is that DRAM-less NVMe drives employ a Host Memory Buffer (HMB) to take over the functions of traditional onboard DRAM. This is practically DRAM, only borrowed from your PC's system memory (RAM). The inherent communication delay between components makes HMB slower than dedicated onboard DRAM, but it is still operationally faster than keeping the mapping table solely in the drive's own flash. Plus, its size is usually just in the tens of megabytes (MB), since it is only used to map the drive's storage data.
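Why tens of megabytes is enough for a partial map can be seen with a common rule of thumb (an approximation, not a spec requirement): a full logical-to-physical map needs roughly 4 bytes per 4 KiB page, i.e. about 1 MiB of map per 1 GiB of capacity. A tens-of-MB HMB therefore holds only the "hot" slice of the map for a large drive.

```python
# Rough HMB sizing sketch. The 4-bytes-per-4-KiB-page figure is a common
# rule of thumb for flash translation layers, not a datasheet value.

PAGE_SIZE_KIB = 4
ENTRY_BYTES = 4

def full_map_mib(capacity_gib: int) -> float:
    """Approximate size (MiB) of a complete logical-to-physical map."""
    pages = capacity_gib * 1024 * 1024 / PAGE_SIZE_KIB  # number of 4 KiB pages
    return pages * ENTRY_BYTES / (1024 * 1024)          # map size in MiB

print(full_map_mib(1024))   # ~1024 MiB of map for a 1 TiB drive
hmb_mib = 64                # assumed buffer granted by the host OS
print(f"A {hmb_mib} MiB HMB covers ~{hmb_mib / full_map_mib(1024):.0%} of the map")
```

So the drive keeps only the most frequently referenced mapping entries in the HMB and falls back to flash for the rest, which is why HMB narrows, but does not fully close, the gap with a DRAM-equipped drive.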

Another reason is that some NVMe drives are simply designed efficiently enough that they essentially no longer need DRAM (or even HMB, for that matter) to achieve the same level of performance and reliability. The most famous example of a top-quality DRAM-less NVMe SSD is the Western Digital WD Blue SN550, for several reasons:

  • It is based on the powerful (proprietary) SSD controller design of its older brother, the WD Black SN750.
  • It uses one of the later-developed flash technologies (introduced around 2017), 96-layer BiCS4 flash, which provides read performance slightly faster than traditional low-end NVMe drives.
  • Lastly, its controller is designed with Static RAM (SRAM), allowing it to handle much larger write operations without breaking a sweat, almost as if it had actual DRAM.

In fact, Western Digital is so confident in the WD Blue SN550 that it bundles the drive with a five-year warranty, something usually reserved for SSDs with DRAM.
