Hot Chips 33: Samsung demonstrates latest advancements in PIM technology

Devdiscourse News Desk | Seoul | Updated: 24-08-2021 09:18 IST | Created: 24-08-2021 09:18 IST
Image Credit: Samsung

At the Hot Chips 33 annual conference, Samsung today showcased its latest advancements in processing-in-memory (PIM) technology, including the first successful integration of its PIM-enabled High Bandwidth Memory (HBM-PIM) into a commercialized accelerator system.

Since its introduction in February 2021, the HBM-PIM (Aquabolt-XL) has been tested in the Xilinx Virtex UltraScale+ (Alveo) AI accelerator, where it delivered a nearly 2.5X system performance gain while reducing energy consumption by more than 60%, Samsung said in a press release on Tuesday.
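In rough terms, the idea behind HBM-PIM is that simple compute units sit inside the memory banks and run operations such as multiply-accumulate where the data already lives, so far fewer bytes have to cross the memory bus to the host. The Python sketch below is a purely illustrative model of that effect for a dot product; the `pim_dot` helper and the assumed command and result sizes are hypothetical and do not represent Samsung's actual programming interface.

```python
import numpy as np

def host_dot(a: np.ndarray, b: np.ndarray):
    """Conventional flow: both operand vectors cross the memory bus to the
    host or accelerator, which then computes the dot product locally."""
    bytes_moved = a.nbytes + b.nbytes      # full operands travel over the bus
    return float(a @ b), bytes_moved

def pim_dot(a: np.ndarray, b: np.ndarray):
    """Hypothetical PIM flow: the multiply-accumulate runs next to the DRAM
    banks, so only a small command and the scalar result cross the bus."""
    command_bytes = 64                     # assumed size of the PIM command
    result_bytes = 4                       # assumed size of the returned scalar
    bytes_moved = command_bytes + result_bytes
    return float(a @ b), bytes_moved       # same math, modelled as done in-memory

if __name__ == "__main__":
    a = np.random.rand(1 << 20).astype(np.float32)
    b = np.random.rand(1 << 20).astype(np.float32)
    _, host_bytes = host_dot(a, b)
    _, pim_bytes = pim_dot(a, b)
    print(f"host path: {host_bytes:,} bytes moved; PIM path: {pim_bytes:,} bytes moved")
```

The toy byte counts only make the qualitative point that PIM attacks data movement rather than arithmetic; the 2.5X gain and 60% energy saving above are Samsung's own system-level measurements.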

"HBM-PIM is the industry's first AI-tailored memory solution being tested in customer AI-accelerator systems, demonstrating tremendous commercial potential. Through standardization of the technology, applications will become numerous, expanding into HBM3 for next-generation supercomputers and AI applications, and even into mobile memory for on-device AI as well as for memory modules used in data centers," said Nam Sung Kim, senior vice president of DRAM Product & Technology at Samsung Electronics.

Next, the Acceleration DIMM (AXDIMM) minimizes large data movement between the CPU and DRAM by bringing processing into the DRAM module itself. With an AI engine built into the buffer chip, the AXDIMM can perform parallel processing across multiple memory ranks instead of accessing just one rank at a time, greatly enhancing system performance and efficiency.

The AXDIMM is claimed to offer approximately twice the performance in AI-based recommendation applications and a 40% decrease in system-wide energy usage.
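Recommendation inference spends much of its time gathering and pooling rows of large embedding tables, which is why rank-level parallelism inside the DIMM pays off there. The sketch below is a purely illustrative Python model of that pattern: the `Rank` class, the number of ranks, the table sharding, and the use of a thread pool as a stand-in for per-rank engines are all assumptions made for the example, not the AXDIMM programming model.

```python
import numpy as np
from concurrent.futures import ThreadPoolExecutor

NUM_RANKS = 4        # assumed number of DRAM ranks behind the buffer chip
EMBED_DIM = 64       # assumed embedding width

class Rank:
    """Toy stand-in for one DRAM rank holding a shard of an embedding table."""
    def __init__(self, rows: int):
        self.table = np.random.rand(rows, EMBED_DIM).astype(np.float32)

    def gather_sum(self, indices: list[int]) -> np.ndarray:
        # In the AXDIMM concept, this gather-and-pool step would run in the
        # buffer chip's AI engine next to the rank, not on the host CPU.
        return self.table[indices].sum(axis=0)

def pooled_embedding(ranks: list[Rank], lookups: list[list[int]]) -> np.ndarray:
    """Issue one gather-and-sum per rank in parallel, then combine the partial
    sums -- the rank-parallel pattern the in-DIMM engine is meant to enable."""
    with ThreadPoolExecutor(max_workers=len(ranks)) as pool:
        partials = pool.map(lambda rl: rl[0].gather_sum(rl[1]), zip(ranks, lookups))
    return np.sum(list(partials), axis=0)

if __name__ == "__main__":
    ranks = [Rank(rows=10_000) for _ in range(NUM_RANKS)]
    lookups = [list(np.random.randint(0, 10_000, size=32)) for _ in range(NUM_RANKS)]
    print(pooled_embedding(ranks, lookups).shape)   # -> (64,)
```

The sketch only shows the access pattern; the roughly 2X performance and 40% energy figures above come from Samsung's own claims for the AXDIMM.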

Further, Samsung's LPDDR5-PIM mobile memory technology can provide independent AI capabilities without data center connectivity, with simulation tests showing that the technology can more than double performance while reducing energy usage by over 60% in voice recognition, translation and chatbot applications.

Lastly, Samsung announced plans to expand its AI memory portfolio by teaming up with other industry leaders to complete the standardization of the PIM platform in the first half of 2022.
