As AI computing power demand surges, the three major memory manufacturers compete over HBM4
Samsung Electronics, SK Hynix, and Micron Technology have opened a new round of competition over HBM4. Driven primarily by AI demand, the three memory giants will remain each other's biggest rivals, and HBM4 is set to become their next battleground on the high-performance computing track.
HBM (High Bandwidth Memory) is a memory technology designed for CPUs and GPUs. It vertically stacks multiple DRAM dies and packages them together with the processor, forming a large-capacity, very wide memory array. By breaking through capacity and bandwidth bottlenecks, HBM is regarded as a next-generation DRAM solution, and it fits the broader semiconductor trend toward miniaturization and integration.
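To give a rough sense of why such a wide, stacked interface matters, the sketch below estimates per-stack peak bandwidth from interface width and per-pin data rate. The formula is the standard width-times-rate estimate, and the example figures (a 1024-bit interface at 6.4 Gb/s per pin, roughly HBM3-class) are illustrative assumptions, not numbers quoted from this article.

```python
def peak_bandwidth_gbps(interface_width_bits: int, pin_rate_gbps: float) -> float:
    """Estimate per-stack peak bandwidth in GB/s.

    interface_width_bits: total data pins of the stack (e.g. 1024 for an HBM3-class stack).
    pin_rate_gbps: per-pin data rate in Gb/s.
    """
    # Bandwidth = bus width (bits) * per-pin rate (Gb/s) / 8 bits per byte
    return interface_width_bits * pin_rate_gbps / 8


# Illustrative HBM3-class figures (assumed, not from the article):
# a 1024-bit interface at 6.4 Gb/s per pin works out to ~819 GB/s per stack.
print(peak_bandwidth_gbps(1024, 6.4))  # ~819.2
```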
Over the past decade, HBM has been upgraded through successive generations and has become one of the technical cornerstones of high-performance computing. Since early 2023, large AI models exemplified by ChatGPT have generated enormous demand for computing power, making HBM one of the few thriving segments of the memory chip market. Even though HBM3E is still undergoing performance validation, competition over HBM4 technology has already begun among the major memory manufacturers.
Reports indicate that all major memory manufacturers are developing HBM4 around a wider 2048-bit interface. Samsung and SK Hynix have disclosed HBM4 timetables, and Micron has begun developing its next-generation HBM memory, provisionally named HBMnext. As AI applications continue to deepen, the HBM4 era is approaching.
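Using the same rough estimate as in the earlier sketch, the significance of the 2048-bit interface is easy to see: at an identical per-pin rate, doubling the bus width doubles per-stack bandwidth. The pin rate used below is an assumed placeholder; the article does not state HBM4 signaling speeds.

```python
# Assumed placeholder pin rate; HBM4 signaling speeds are not given in the article.
pin_rate = 6.4  # Gb/s per pin

for width in (1024, 2048):
    gbps = width * pin_rate / 8
    print(f"{width}-bit interface @ {pin_rate} Gb/s/pin -> ~{gbps:.0f} GB/s per stack")
# 1024-bit -> ~819 GB/s; 2048-bit -> ~1638 GB/s (same pin rate, double the width)
```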