Faster Memory: A.I. Helps Samsung Double HBM Speeds

High-bandwidth memory, or HBM, is already fast, but Samsung wants to make it even faster. The South Korea-based technology giant has announced its HBM-PIM architecture, which doubles the speed of high-bandwidth memory by leveraging artificial intelligence.

PIM, which stands for processing-in-memory, leverages the capabilities of artificial intelligence to speed up memory, and Samsung expects its HBM-PIM technology to be used in applications such as data centers and high-performance computing (HPC) machines in the future.

“Our groundbreaking HBM-PIM is the industry’s first programmable PIM solution tailored for diverse AI-driven workloads such as HPC, training, and inference,” Kwangil Park, senior vice president of memory product planning at Samsung Electronics, said in a statement. “We plan to build upon this breakthrough by further collaborating with AI solution providers for even more advanced PIM-powered applications.”

A potential customer for Samsung’s HBM-PIM is Argonne National Laboratory, which “hopes to use the technology to solve problems of interest.” The lab noted that HBM-PIM addresses the memory bandwidth and performance challenges of HPC and AI computing by delivering impressive performance and power gains.

According to Samsung, HBM-PIM works by placing a DRAM-optimized AI engine inside each memory bank (a storage subunit), enabling parallel processing and minimizing data movement.

“When applied to Samsung’s existing HBM2 Aquabolt solution, the new architecture is able to double system performance while reducing energy consumption by more than 70%,” the company said. “HBM-PIM does not require any hardware or software changes, allowing for faster integration into existing systems.”

This differs from existing solutions, which are all based on the von Neumann architecture. In that sequential approach, a separate processor and a separate memory unit carry out all data-processing tasks, so data must travel back and forth between them, often creating bottlenecks when handling large data volumes.
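The contrast above can be sketched in a toy simulation. This is purely illustrative and not Samsung's implementation: it models the von Neumann case as shipping every word over the bus to a central processor, versus a PIM-style case where each bank reduces its own data in place and only small partial results cross the bus.

```python
# Illustrative sketch (assumed model, not Samsung's actual HBM-PIM design):
# compare how many words cross the memory bus in a von Neumann pipeline
# versus a processing-in-memory pipeline, for a simple sum over 4 banks.

banks = [list(range(i, i + 4)) for i in range(0, 16, 4)]  # 4 memory banks

# von Neumann: every operand travels across the bus to a central processor.
bus_traffic_vn = sum(len(b) for b in banks)   # 16 words moved
total_vn = sum(x for b in banks for x in b)   # computed centrally

# PIM-style: each bank computes its partial sum "in memory";
# only the 4 partial results travel to the host.
partials = [sum(b) for b in banks]
bus_traffic_pim = len(partials)               # 4 words moved
total_pim = sum(partials)

assert total_vn == total_pim                  # same answer, far less traffic
print(bus_traffic_vn, bus_traffic_pim)
```

In this toy model the result is identical either way; what changes is the data movement, which is exactly the bottleneck the next paragraph says HBM-PIM eliminates.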

By eliminating that bottleneck, Samsung’s HBM-PIM could become a useful tool in a data scientist’s arsenal. Samsung says HBM-PIM is now being tested inside AI accelerators by AI solution partners, with validation expected to be completed in the first half of 2021.
