4 Ways to Overcome the Embedded Memory Market Disruption
Recent price developments in DRAM and flash memory are doing more than shocking purchasing departments; they have the potential to disrupt business models and damage the entire embedded industry. Many industrial applications require product lifecycles exceeding 10 years, so they rely on the long-term availability of memory at stable prices. Unfortunately, switching to newer processor generations is not a quick fix, as extensive qualification and certification requirements rule out rapid transitions. These are difficult challenges, but here are four strategies embedded application designers can use to mitigate the impact of current DRAM prices. First, let's look at how we got here.
The current situation
The DRAM industry has consolidated from a market with over 15 manufacturers in the 1990s to a business dominated by three leading players: Micron, Samsung and SK Hynix. This consolidation was driven by the high investment required for the continuous development of DRAM technologies and the construction of new production facilities.
More recently, driven by unprecedented investment from the AI industry, IC manufacturers are making a strategic shift toward market leadership in high-speed products such as HBM for datacenter applications. This shift is accompanied by massive investments in new manufacturing plants dedicated solely to AI. While this has boosted pricing power and profitability for the leading chipmakers, the rest of the industry faces a downturn in investment and continues to struggle with the ongoing memory market disruption.
The consequence of this is that the "big three" in the DRAM industry are losing interest in the production of DDR4 and DDR5 memory solutions that are mainly used for industrial and embedded devices. Industry sources anticipate additional price hikes for those DRAM architectures. Meanwhile, pricing strategies will keep adapting to the evolving market conditions.
The impact on embedded
First, the current memory market is forcing system builders to raise prices because of volatile memory costs, supply shortages, and the resulting domino effect within the value chain. Furthermore, we will see increased pressure to move to newer processor architectures that support the latest memory technology; this is necessary to mitigate the mid-term shortage of older memory technologies and to avoid further cost increases for legacy memory. Adopting mainstream memory is the best approach to overcome this difficult situation in the future.
So, if you are in the embedded or industrial sector, how can you minimize the impact of the current memory market situation on your environment? Here are four ideas.
The memory you need, not what you want
At this point, it is important to ask whether an embedded application really needs as much memory capacity as currently implemented. Many system builders over-provision memory to preserve flexibility for future updates, or to avoid the extra effort of optimizing their software environment or AI models. However, now is a good time to focus on implementing only the memory you actually need. The current situation will also influence the entire system configuration of future products.
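Right-sizing starts with measuring what a workload actually uses rather than guessing. A minimal sketch for Unix-like systems using Python's standard `resource` module; the 8 MiB buffer here is a hypothetical stand-in for the real application's working set:

```python
import resource

# Hypothetical workload: allocate a buffer comparable to the application's
# hot data, then touch each page so it is actually resident in DRAM.
buf = bytearray(8 * 1024 * 1024)  # 8 MiB working set
for i in range(0, len(buf), 4096):
    buf[i] = 1

# Peak resident set size of this process so far. Note the units differ by
# platform: Linux reports ru_maxrss in kilobytes, macOS in bytes.
peak = resource.getrusage(resource.RUSAGE_SELF).ru_maxrss
print(f"peak RSS (raw ru_maxrss units): {peak}")
```

Running such a probe against representative workloads gives a measured ceiling to provision against, instead of a safety margin chosen by habit.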
Leverage memory-optimized OSs
The big question is which embedded OS is the right choice for the system, and how much memory bandwidth Edge AI applications and integrated graphics really require. In the current memory market, it is crucial to move away from memory-hungry general-purpose operating systems toward memory-optimized embedded OSs such as Ubuntu Pro, ctrlX OS and Kontron OS.
Memory defines the CPU
System builders will also face a situation in which the processor is no longer the starting point of a new project; instead, the required memory defines the processor platform. Cache sizes and memory speeds have a major impact on response times. Flexible designs built on a modular principle, such as Computer-on-Modules (COMs), therefore offer a great benefit in terms of scalability: by exchanging a module within the same standard, designers can switch from one processor platform to another without changing the overall system design.
More flash, less RAM
Faster flash mass storage can also help reduce the DRAM capacity required for specific applications, for example by keeping large data sets on flash and paging them into memory only as needed.
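One common way to trade flash for DRAM is memory-mapping: instead of reading an entire file into RAM, the OS pages data in from flash on demand, so only the pages actually touched occupy DRAM. A minimal sketch using Python's `mmap` module; the generated 4 MiB file is a hypothetical stand-in for a large data set on a fast flash device:

```python
import mmap
import os
import tempfile

# Create a sample data file to stand in for an on-flash data set.
path = os.path.join(tempfile.mkdtemp(), "dataset.bin")
with open(path, "wb") as f:
    f.write(b"\x00" * (4 * 1024 * 1024))  # 4 MiB

# Approach 1: read() pulls the whole file into DRAM at once.
with open(path, "rb") as f:
    whole = f.read()  # ~4 MiB resident immediately

# Approach 2: mmap lets the OS fault pages in from storage on demand,
# so only the pages covering the accessed range occupy DRAM.
with open(path, "rb") as f:
    with mmap.mmap(f.fileno(), 0, access=mmap.ACCESS_READ) as mm:
        chunk = mm[1024:2048]  # touches only the page(s) under this slice
```

How much DRAM this saves depends on the access pattern: it pays off for sparse or sequential access to large data sets, while random access to the whole file ends up resident anyway.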
Conclusion
All in all, system builders need to factor the memory configuration into their system cost model far more than in the past. With COMs, they gain the flexibility and adaptability required to meet current and upcoming market challenges.