AMD CTO Predicts $500B AI Market by 2028, Says Openness Is Key to Scale

At the 2025 OCP Global Summit, AMD Chief Technology Officer and Executive Vice President Mark Papermaster underscored a clear message: artificial intelligence will only sustain its exponential growth if the industry rallies around open collaboration and shared standards. He pointed to a market that could reach $500 billion by 2028 and emphasized that interoperability across the entire AI stack is the key to getting there.

The AI boom has spotlighted a hard truth. Training state-of-the-art models, deploying them at scale, and keeping costs and power consumption under control require more than raw computing muscle; they demand an open, interoperable foundation that lets hardware and software from different vendors work seamlessly together. That means well-defined standards for how accelerators communicate, how data flows through memory and storage, how networks move massive workloads across clusters, and how developers compile, optimize, and deploy models without being locked into a single platform.

This is where the OCP community plays a vital role. By advancing open designs for data center infrastructure—racks, power, cooling, networking, and server layouts—OCP helps AI builders scale responsibly. As models get larger and inference demand surges, standardized building blocks reduce integration friction, speed up deployment, and improve energy efficiency across facilities. Open reference designs and shared best practices give companies a faster path from pilot to production while keeping long-term total cost of ownership in check.

For developers, open collaboration lowers the barrier to innovation. Consistent interfaces, portable model formats, and widely supported runtimes make it easier to move from prototyping to production across on‑premises clusters and multiple clouds. When compilers, kernels, and libraries follow common standards, teams can optimize for performance without sacrificing flexibility. This not only accelerates iteration cycles but also makes AI more accessible to smaller organizations that can’t afford bespoke integrations.

For enterprises, the benefits are just as tangible. A standards-based approach offers pragmatic insurance against rapid change. It allows organizations to combine best-of-breed components—CPUs, GPUs, accelerators, memory, storage, and networking—without rebuilding their stack each time they upgrade. It encourages healthy competition, which drives down costs and pushes the performance-per-watt envelope. And it fosters transparency around benchmarking and deployment practices, helping leaders make data-driven decisions as they scale.
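The performance-per-watt comparison that this kind of open competition enables can be sketched as a simple ranking. A minimal illustration in Python follows; the accelerator names and throughput/power figures are hypothetical, not measurements of any real product:

```python
from dataclasses import dataclass

@dataclass
class Accelerator:
    name: str
    throughput_tflops: float  # sustained throughput under a given workload
    power_watts: float        # board power draw under that same load

    @property
    def perf_per_watt(self) -> float:
        # Higher is better: useful work delivered per unit of power.
        return self.throughput_tflops / self.power_watts

# Hypothetical candidates in a standards-based, mix-and-match evaluation.
candidates = [
    Accelerator("vendor_a", 180.0, 500.0),  # 0.36 TFLOPS/W
    Accelerator("vendor_b", 150.0, 350.0),  # ~0.43 TFLOPS/W
]

best = max(candidates, key=lambda a: a.perf_per_watt)
print(best.name)  # vendor_b
```

The point of the sketch: with interchangeable components and transparent benchmarks, the comparison reduces to a like-for-like metric rather than a vendor's marketing claim.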

Sustainability is another pillar of the open approach. As AI clusters consume more power, shared standards for telemetry, power delivery, cooling, and workload scheduling become essential. By co-optimizing hardware and software, organizations can reduce energy use, improve utilization, and extend the life of their infrastructure. Publishing and adopting open efficiency metrics creates a common language that helps the entire ecosystem progress faster.
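One widely adopted open efficiency metric of the kind described above is Power Usage Effectiveness (PUE), which compares a facility's total energy draw to the energy consumed by IT equipment alone. A minimal sketch of the calculation, with illustrative figures rather than real facility data:

```python
def pue(total_facility_kwh: float, it_equipment_kwh: float) -> float:
    """Power Usage Effectiveness: total facility energy / IT equipment energy.

    1.0 is the theoretical ideal (all power goes to compute); the overhead
    above 1.0 is cooling, power conversion losses, lighting, and so on.
    """
    if it_equipment_kwh <= 0:
        raise ValueError("IT equipment energy must be positive")
    return total_facility_kwh / it_equipment_kwh

# Illustrative example: 1,200 MWh drawn by the facility over a period,
# of which 1,000 MWh was consumed by the IT equipment itself.
print(f"PUE = {pue(1200, 1000):.2f}")  # PUE = 1.20
```

Because the definition is public and vendor-neutral, PUE gives operators exactly the kind of common language the open approach depends on: two facilities built from different components can still report comparable numbers.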

The strategic takeaway from Papermaster’s remarks is straightforward: if the AI market is to reach its full potential by 2028—potentially as high as $500 billion—growth must be built on a foundation of openness. That means:

– Prioritizing interoperable hardware and software to avoid vendor lock-in.
– Supporting common model formats, frameworks, and runtime interfaces to streamline development.
– Embracing open data center designs that improve scalability and energy efficiency.
– Participating in community-driven standards that make benchmarking and deployment more transparent.

The momentum behind generative AI, advanced analytics, and edge intelligence is undeniable. But the pace of innovation creates complexity, and complexity is where closed ecosystems struggle. Open collaboration turns that complexity into a competitive advantage by letting experts across vendors, cloud providers, research institutions, and enterprises contribute to a shared foundation. The result is faster innovation, lower costs, greater sustainability, and broader access to cutting-edge capabilities.

Papermaster’s call to action aligns with what many AI leaders are already experiencing on the ground: the projects that scale, deliver ROI, and keep optionality intact are the ones that are built on open, modular, and standards-driven architectures. As the industry races toward the next wave of AI breakthroughs, the path forward is clearer than ever—work together, align on shared standards, and build an ecosystem where innovation compounds for everyone.