As AI workload demands continue to accelerate, Cloud Service Providers, System OEMs, and IP/Silicon vendors need a scalable, high-performance interconnect to support these advanced workloads. The UALink 200G 1.0 Specification answers that need with a low-latency, high-bandwidth interconnect designed for efficient communication between accelerators and switches within AI computing pods, enhancing performance, optimizing power and cost efficiency, and promoting interoperability and supply chain diversity.
Room 201

Nafea Bshara

Amber Huffman
Amber Huffman is a Principal Engineer in Google Cloud responsible for leading industry engagement efforts in the data center ecosystem across servers, storage, networking, accelerators, power, cooling, security, and more. Before joining Google, she spent 25 years at Intel, serving as an Intel Fellow and VP. Amber is the President of NVM Express, serves on the Boards of Directors of the Open Compute Project Foundation (OCP) and Ultra Accelerator Link (UALink), and chairs the RISC-V Software Ecosystem (RISE) Project. She has led numerous industry standards to successful adoption, including NVM Express, Open NAND Flash Interface, and Serial ATA.