
Wallaroo.AI and Ampere join forces to deliver energy- and cost-efficient cloud AI

Wallaroo.AI, the company scaling production machine learning (ML) from the cloud to the edge, and Ampere, the modern semiconductor company, have announced a strategic collaboration aimed at creating optimized hardware and software solutions that improve efficiency, sustainability, and cost per inference for cloud AI.

As the demand for AI continues to boom, so does its cost and energy consumption, creating a significant need for optimization on these fronts.

With graphics processing units (GPUs) in high demand for training AI models, and with larger ML models requiring ever more of them, organizations face a mounting AI price tag. Beyond the expense, the companies note that MIT Technology Review has reported that training a single AI model can consume more energy in a year than 100 U.S. homes. Clearly, while implementing proprietary AI offers significant benefits, it comes at a remarkable cost.

To address these challenges, Wallaroo.AI and Ampere are combining their technologies to tackle the AI energy and cost dilemma.

The collaboration centers on the integration of key Ampere and Wallaroo.AI technologies: Ampere's built-in AI acceleration and Wallaroo.AI's efficient inference server, derived from the Wallaroo Enterprise Edition platform for production ML. The partnership leverages Ampere's sustainability advantages to engineer low-code/no-code ML software solutions and customized hardware, giving organizations simple-to-implement AI at both lower cost and lower energy consumption.

“This Wallaroo.AI/Ampere solution allows enterprises to deploy easily, improve performance, increase energy efficiency, and balance their ML workloads across available compute resources much more effectively,” said Vid Jain, chief executive officer of Wallaroo.AI. “All of which is critical to meeting the huge demand for AI computing resources today while also addressing the sustainability impact of the explosion in AI.”

Wallaroo.AI’s inference server optimizes AI and ML workload costs by using currently available, advanced CPUs, delivering performance comparable to that of more expensive alternatives. These CPUs, specifically the Ampere Altra family of processors, increase the energy efficiency of AI and ML inference workloads.

According to benchmark results, the integration delivered as much as a 6x improvement over containerized x86 solutions on specific models, such as the open source ResNet-50 model.
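For readers who want a rough sense of what CPU-based inference benchmarking on a model like ResNet-50 looks like in practice, the sketch below times forward passes with PyTorch and torchvision. It is a minimal, generic illustration only; it does not use the Wallaroo.AI inference server or any Ampere-specific optimizations, and the throughput it reports depends entirely on the host CPU.

```python
# Minimal sketch: timing CPU-only ResNet-50 inference with PyTorch/torchvision.
# Illustrative only; this is not the Wallaroo.AI/Ampere stack.
import time
import torch
import torchvision.models as models

model = models.resnet50(weights=None)  # random weights are enough for a timing run
model.eval()

batch = torch.randn(8, 3, 224, 224)    # small synthetic image batch

with torch.no_grad():
    model(batch)                        # warm-up pass so startup costs don't skew timing
    start = time.perf_counter()
    for _ in range(10):
        model(batch)
    elapsed = time.perf_counter() - start

print(f"Avg latency per batch: {elapsed / 10 * 1000:.1f} ms "
      f"({8 * 10 / elapsed:.1f} images/sec)")
```

Running a harness like this on different CPU instance types is one simple way to compare cost and energy per inference across hardware.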

“Through this collaboration, Ampere and Wallaroo.AI are combining cloud native hardware and optimized software to make ML production within the cloud much easier and more energy-efficient,” said Jeff Wittich, chief product officer at Ampere. “That means more enterprises will be able to turn AI initiatives into business value more quickly.”

To learn more about this strategic collaboration, please visit https://wallaroo.ai/ or https://amperecomputing.com/.
