Arm Transforms Into AI Platform Leader with New Product Strategy and Ecosystem

Arm is evolving from a chip IP supplier to a platform-first company, launching new product families tailored for AI workloads across infrastructure, PCs, mobile, automotive, and IoT. With record revenues and growing adoption in cloud and automotive sectors, Arm’s energy-efficient designs and expanded software ecosystem position it as a foundational player in scalable AI compute.

Published May 16, 2025 at 12:14 AM EDT in Artificial Intelligence (AI)

Arm, the UK-based chip designer renowned for its system-on-chip (SoC) architectures powering giants like Nvidia, Amazon, and Alphabet, is undergoing a strategic transformation. Traditionally a supplier of chip intellectual property (IP) without manufacturing hardware, Arm is shifting to become a platform-first company focused on AI workloads. This move aligns with the explosive growth of AI in enterprise and cloud environments, where Arm’s energy-efficient designs offer a competitive edge.

Arm’s CEO Rene Haas highlighted the rising energy demands of data centers, which currently consume around 460 terawatt hours annually and could triple by 2030. Arm’s low-power chip designs and optimized software are positioned to curb this growth, making AI training and inference more sustainable. This environmental and economic imperative drives Arm’s new platform approach, integrating hardware, software, and firmware to deliver scalable, efficient AI compute solutions.

From IP Supplier to Platform Provider

Arm’s new product naming strategy reflects its platform-centric vision. The company introduced distinct product families targeting specific markets:

  • Neoverse for infrastructure
  • Niva for PCs
  • Lumex for mobile
  • Zena for automotive
  • Orbis for IoT and edge AI

The Mali GPU brand remains as a component within these platforms. Additionally, Arm revamped its product numbering to align with platform generations and performance tiers such as Ultra, Premium, Pro, Nano, and Pico, enhancing transparency for customers and developers.

Strong Financial Performance Fuels Expansion

Arm’s Q4 fiscal 2025 results underscore the success of its strategy, with $1.24 billion in revenue—a 34% year-over-year increase. Licensing revenue surged 53% to $634 million, while royalty revenue rose 18% to $607 million, driven by widespread adoption of Armv9 architecture and Compute Subsystems across smartphones, cloud infrastructure, and edge AI.

The mobile sector stood out, with smartphone royalty revenue growing about 30% despite less than 2% growth in global shipments. Arm also secured its first automotive Compute Subsystem agreement with a leading electric vehicle manufacturer, signaling strong growth potential in automotive AI and self-driving technologies.

Cloud providers including AWS, Google Cloud, and Microsoft Azure continue expanding their use of Arm-based silicon for AI workloads, reinforcing Arm’s growing influence in data center computing.
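As a rough illustration (not drawn from Arm's announcement), the sketch below shows how a service that might be deployed on either x86 or Arm-based cloud instances can detect an aarch64 host and query its vector extensions at runtime. The getauxval call and HWCAP flags are standard Linux/glibc facilities rather than anything specific to the platforms described here.

```c
/*
 * Illustrative sketch: detect an Arm-based (aarch64) Linux host and report
 * which vector extensions the kernel exposes. Useful when one binary or
 * container image may run on either x86 or Arm cloud instances.
 */
#include <stdio.h>

#if defined(__aarch64__)
#include <sys/auxv.h>   /* getauxval() */
#include <asm/hwcap.h>  /* HWCAP_* capability bits reported by the kernel */
#endif

int main(void) {
#if defined(__aarch64__)
    unsigned long caps = getauxval(AT_HWCAP);
    printf("Running on aarch64\n");
    printf("  Advanced SIMD (NEON): %s\n", (caps & HWCAP_ASIMD) ? "yes" : "no");
    printf("  SVE:                  %s\n", (caps & HWCAP_SVE)   ? "yes" : "no");
#else
    printf("Not an aarch64 build; using generic code paths.\n");
#endif
    return 0;
}
```

In practice, a deployment pipeline would use this kind of check (or its build-system equivalent) to select Arm-optimized binaries or libraries when an instance reports the relevant features.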

Expanding Software Ecosystem and Developer Support

Complementing its hardware platforms, Arm is enhancing its software tools and ecosystem. Its GitHub Copilot extension, now free for all developers, enables optimized coding for Arm architectures. Over 22 million developers build on Arm, and its Kleidi AI software layer has surpassed 8 billion installs, demonstrating broad adoption.
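To make the idea of Arm-optimized code concrete, here is a minimal, hypothetical sketch of a NEON-vectorized kernel with a portable scalar fallback. It is illustrative only; it does not use Arm's Kleidi libraries or Copilot tooling, which production AI workloads would more likely rely on instead of hand-written intrinsics.

```c
/*
 * Minimal sketch: y[i] += a * x[i], vectorized with NEON intrinsics on
 * aarch64 and falling back to plain scalar C elsewhere. The kernel shape
 * is hypothetical and chosen only to show the style of optimization.
 */
#include <stddef.h>
#include <stdio.h>

#if defined(__aarch64__)
#include <arm_neon.h>
#endif

static void saxpy(float a, const float *x, float *y, size_t n) {
    size_t i = 0;
#if defined(__aarch64__)
    float32x4_t va = vdupq_n_f32(a);          /* broadcast a into 4 lanes */
    for (; i + 4 <= n; i += 4) {
        float32x4_t vx = vld1q_f32(x + i);    /* load 4 floats from x */
        float32x4_t vy = vld1q_f32(y + i);    /* load 4 floats from y */
        vy = vfmaq_f32(vy, va, vx);           /* fused multiply-add: vy += va * vx */
        vst1q_f32(y + i, vy);                 /* store 4 results */
    }
#endif
    for (; i < n; i++)                        /* scalar tail (and non-Arm fallback) */
        y[i] += a * x[i];
}

int main(void) {
    float x[8] = {1, 2, 3, 4, 5, 6, 7, 8};
    float y[8] = {0};
    saxpy(2.0f, x, y, 8);
    for (int i = 0; i < 8; i++)
        printf("%.1f ", y[i]);
    printf("\n");
    return 0;
}
```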

This integrated approach aims to meet rising demand for energy-efficient AI compute from edge devices to cloud data centers, positioning Arm as a foundational platform for the AI-driven future.

Implications for AI and Data Leaders

Arm’s platform shift offers AI and data decision makers clearer, more efficient pathways for selecting compute architectures tailored to AI workloads. Predefined platforms like Neoverse and Lumex simplify integration, accelerate development cycles, and optimize cost-performance tradeoffs for training and inference tasks.

Engineers managing AI pipelines benefit from modular platform tiers that align compute capabilities with workload demands, easing pipeline standardization across edge and cloud environments. Data infrastructure teams gain from scalable designs that support high-throughput pipelines and faster custom silicon development.

Security professionals will find that a consistent architecture across edge and cloud simplifies enforcing end-to-end protections, which is critical for AI workloads that require both high performance and robust access controls.

Overall, Arm’s evolution signals a new era where it provides full-stack foundations for building and scaling AI systems, moving beyond component IP to delivering integrated platforms that address the complexity and energy demands of modern AI.
