
Today – 23 April 2026

JEDEC previews LPDDR6, proving that datacenters have stolen the mobile memory standard

JEDEC is shifting the focus of LPDDR6 from mobile platforms to datacenters – Thanks AI

Last year, JEDEC unveiled its JESD209-6 LPDDR6 memory standard, a new high-bandwidth memory type designed for phones, laptops, and other mobile platforms. Now, JEDEC has previewed new updates to the standard, shifting its focus from mobile platforms to "datacenter and […]

The post JEDEC previews LPDDR6, proving that datacenters have stolen the mobile memory standard appeared first on OC3D.

Rambus Quietly Builds The Missing Piece Of AI Servers As SOCAMM2 Becomes The Favorite AI Memory Standard

23 April 2026 at 12:20

Rambus Rolls Out A Vital Component For LPDDR5X SOCAMM2 Memory, The Key Enabler of Next-Gen AI Datacenters

Rambus has announced its LPDDR5X SOCAMM2 memory chipset, a vital component to enable the next-gen compact memory for AI datacenters.

SOCAMM2 Lays The Key Foundation of Next-Gen AI Datacenters As Rambus Gears Up Its LPDDR5X-Based Chipset For Launch

Press Release: Rambus, a premier chip and silicon IP provider making data faster and safer, today announced a SOCAMM2 (Small Outline Compression Attached Memory Module) chipset designed to enable low-power, high-performance LPDDR5X-based memory modules for AI server platforms. The SOCAMM2 chipset represents the first step in a broader Rambus roadmap of LPDDR-based server module solutions, reflecting the company's ongoing collaboration with industry partners to support new memory architectures optimized […]

Read full article at https://wccftech.com/rambus-builds-missing-piece-of-ai-servers-socamm2-favorite-ai-memory-standard/

JEDEC Previews LPDDR6 Memory With SOCAMM2 Modules & 512 GB Capacities, Clearing the Path for Next-Gen AI Servers

23 April 2026 at 11:45


JEDEC has previewed its LPDDR6 memory standard, powering future AI datacenters & mobile platforms with 512 GB capacities & SOCAMM2 variants.

JEDEC's LPDDR6 SOCAMM2 Modules Are Going To Be A Mouth-Watering Piece of Memory Technology For AI Datacenters

Today, JEDEC unveiled a new set of features for its upcoming LPDDR6 memory standard "JESD209-6". The new memory standard will play a vital role in powering future AI datacenters, PCs, and mobile platforms. LPDDR6 will not just provide a more power-efficient memory solution, but it will also offer increased performance and higher capacities than existing LPDDR5 and LPDDR5X standards. Memory makers are […]

Read full article at https://wccftech.com/jedec-lpddr6-memory-socamm2-modules-512-gb-capacities/
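As a rough illustration of the performance claim above: peak theoretical memory bandwidth scales linearly with per-pin data rate and bus width. The sketch below uses illustrative, assumed figures (an 8533 MT/s LPDDR5X module and a hypothetical faster LPDDR6 module, both on a 64-bit bus); none of these numbers come from the JEDEC preview itself.

```python
# Back-of-envelope bandwidth sketch. All data rates and bus widths here are
# illustrative assumptions, NOT figures from the JEDEC LPDDR6 announcement.

def peak_bandwidth_gbs(data_rate_mtps: float, bus_width_bits: int) -> float:
    """Theoretical peak bandwidth in GB/s: transfers/s * bits per transfer / 8 bits per byte."""
    return data_rate_mtps * bus_width_bits / 8 / 1000

# Hypothetical LPDDR5X module: 8533 MT/s over a 64-bit bus.
lpddr5x = peak_bandwidth_gbs(8533, 64)

# Hypothetical LPDDR6 module: a higher assumed data rate over the same width.
lpddr6 = peak_bandwidth_gbs(10667, 64)

print(f"LPDDR5X: {lpddr5x:.1f} GB/s, LPDDR6: {lpddr6:.1f} GB/s")
```

The point of the sketch is only that a modest per-pin data-rate bump compounds across the wide, multi-module configurations (such as SOCAMM2) that AI servers deploy.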

Yesterday – 22 April 2026

OpenAI Patent Reveals Custom AI Chip With 20 HBM Stacks, Using Intel EMIB-Style Bridges to Smash Current Limits

22 April 2026 at 13:50


OpenAI has published a new patent in which it discloses an AI chip housing several compute chiplets, surrounded by a large number of HBM memory stacks.

One Compute Chiplet, Several HBM Memory Stacks: This Could Be OpenAI's Future AI Chip Plans

In a new patent titled "Non-Adjacent Connection of High-Bandwidth Memory Chiplets, I/O Chiplets, And Compute Chiplets Through Embedded Logic Bridges", OpenAI shares plans for an AI chip solution that is going to house several HBM chiplets and compute chiplets, all connected using Embedded Logic Bridges. The research proposes the idea of leveraging these embedded logic bridges for high-speed interconnects […]

Read full article at https://wccftech.com/openai-patent-custom-ai-chip-hbm-memory-stacks-using-intel-emib-like-bridges/
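To see why stack count is the headline number in this patent: aggregate HBM bandwidth and capacity both grow linearly with how many stacks a package can host, which is exactly what non-adjacent, bridge-connected placement is meant to increase. The per-stack figures below are illustrative assumptions, not values from the OpenAI patent.

```python
# Sketch of aggregate HBM scaling with stack count. Per-stack bandwidth and
# capacity are illustrative assumptions, NOT figures from the OpenAI patent.

def hbm_aggregate(stacks: int, bw_per_stack_tbs: float, cap_per_stack_gb: int):
    """Return (total bandwidth in TB/s, total capacity in GB) for a package."""
    return stacks * bw_per_stack_tbs, stacks * cap_per_stack_gb

# A conventional package limited to stacks adjacent to the compute die...
bw8, cap8 = hbm_aggregate(8, 1.0, 24)

# ...versus a bridge-connected layout allowing non-adjacent stacks, e.g. 20.
bw20, cap20 = hbm_aggregate(20, 1.0, 24)

print(f"8 stacks: {bw8:.1f} TB/s, {cap8} GB; 20 stacks: {bw20:.1f} TB/s, {cap20} GB")
```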
