
This PCIe AI Accelerator Card Can Run 700B LLMs Locally With 384 GB Memory at Just 240W, Less Than Half The Power of RTX PRO 6000 Blackwell

A circuit board labeled HTX301 Evaluation Platform features an HTX301 chip in the center.

A Taiwanese company has announced a new PCIe AI accelerator card that can run 700B-parameter LLMs locally at just 240W, removing the need for large GPU clusters.

Taiwanese Company Unveils Its PCIe AI Accelerator That Devalues Large-Scale AI Installations by Running 700B LLMs on a Single Card

Skymizer, a Taiwan-based company specializing in AI software and hardware, has announced its brand-new solution, the HTX301. The HTX301 is designed for on-prem AI, pairing a PCIe add-in-card design with large-scale AI performance at a sub-250W TDP. Some of the highlights of the card include: The company says that the […]

Read full article at https://wccftech.com/this-pcie-ai-accelerator-card-packs-384-gb-memory-run-700b-llms-240w/

Anthropic Eyes UK Startup’s Fusion Tech Promising 100x Faster AI Inference at One-Tenth the Cost of NVIDIA’s Groq

Anthropic, the creator of Claude AI, is reportedly in early talks with a UK startup whose SRAM-based tech can boost AI inference by 100x and cut costs by 10x.

Anthropic Reportedly in Early Talks With Fractile, a UK-Based Startup Working on a Fusion Architecture as an AI Inference Booster

Currently, Anthropic sources its chips from several companies, including NVIDIA, Google, and Amazon. This mix lets the company keep running its AI infrastructure without the concerns often associated with relying on a single chipmaker. But as compute demand intensifies in the AI space, many AI firms are now […]

Read full article at https://wccftech.com/anthropic-sets-eyes-on-uk-startup-tech-speeds-up-ai-inference-100x-reduces-costs-10x/
