Alternative AI Compute

Major cloud providers and AI developers, including AWS and OpenAI, are rapidly adopting novel hardware configurations to meet surging AI demand. The shift involves integrating specialized accelerators and exploring disaggregated architectures beyond traditional GPU clusters, fundamentally altering data center compute designs.
Amazon Web Services is collaborating with chip maker Cerebras on AI inference disaggregation technology, to be deployed in AWS data centers via the Amazon Bedrock platform.
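For context, inference disaggregation typically means splitting a request's compute-heavy prefill phase (processing the prompt) from its latency-sensitive decode phase (generating tokens one at a time), so each phase can run on hardware suited to it. The sketch below is purely illustrative and assumes nothing about the AWS/Cerebras implementation; all names (`Request`, `Pool`, `route`) are hypothetical.

```python
from dataclasses import dataclass, field

@dataclass
class Request:
    prompt_tokens: int    # tokens processed during prefill
    max_new_tokens: int   # tokens produced during decode

@dataclass
class Pool:
    """A hypothetical hardware pool (e.g. compute-dense vs. memory-bandwidth-rich)."""
    name: str
    handled: list = field(default_factory=list)

def route(request: Request, prefill_pool: Pool, decode_pool: Pool) -> dict:
    """Dispatch the two inference phases to different hardware pools."""
    # Prefill: one large, batch-friendly pass over the whole prompt;
    # favors raw compute throughput.
    prefill_pool.handled.append(("prefill", request.prompt_tokens))
    # Decode: token-by-token generation; favors memory bandwidth and
    # low latency, which is where SRAM-heavy accelerators are pitched.
    decode_pool.handled.append(("decode", request.max_new_tokens))
    return {
        "prefill_on": prefill_pool.name,
        "decode_on": decode_pool.name,
        "total_tokens": request.prompt_tokens + request.max_new_tokens,
    }
```

The point of the split is that the two phases have opposite bottlenecks, so a scheduler can size and scale each pool independently instead of provisioning one GPU fleet for both.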
RISC-V proponent SiFive has adopted Nvidia's proprietary NVLink Fusion interconnect technology, a decision that casts doubt on the future viability of competing interconnect standards like UALink.
OpenAI has committed to deploying 750 megawatts of Cerebras' large, SRAM-heavy accelerators through 2028, aiming to speed up ChatGPT inference and real-time agent performance.