FedML closes on $11.5M funding round to help enterprises build custom large language models
FedML announced it has closed $11.5 million in seed funding to expand development and adoption of its distributed MLOps platform, which helps companies efficiently train and serve custom generative AI and large language models using proprietary data.
The oversubscribed funding round was fueled by growing investor interest and market demand for large language models, popularized by OpenAI, Microsoft, Meta, Google, and others.
FedML offers a “distributed AI” ecosystem that empowers companies and developers to work together on machine learning tasks by sharing data, models, and compute resources—fueling waves of AI innovation beyond large technology companies, according to the vendor.
Unlike traditional cloud-based AI training, FedML empowers distributed machine learning via both edge and cloud resources, through innovations at three AI infrastructure layers:
- A powerful MLOps platform that simplifies training, serving, and monitoring generative AI models and LLMs on large-scale clusters of GPUs, smartphones, and edge servers;
- A distributed and federated training/serving library that supports models in any distributed setting, making foundation model training and serving cheaper and faster while also leveraging federated learning to train models across data silos; and
- A decentralized GPU cloud that reduces training/serving costs and saves time on complex infrastructure setup and management via a simple “fedml launch job” command.
“We believe every enterprise can and should build their own custom AI models, not simply deploy generic LLMs from the big players,” said Salman Avestimehr, co-founder and CEO of FedML, and inaugural director of the USC + Amazon Center on Secure and Trusted Machine Learning. “Large-scale AI is unlocking new possibilities and driving innovation across industries, from language and vision to robotics and reasoning. At the same time, businesses have serious and legitimate concerns about data privacy, intellectual property and development costs. All of these point to the need for custom AI models as the best path forward.”
FedML recently introduced FedLLM, a customized training pipeline for building domain-specific large language models on proprietary data.
FedLLM is compatible with popular LLM libraries such as Hugging Face and DeepSpeed, and is designed to improve the efficiency, security, and privacy of custom AI development. To get started, developers need to add only a few lines of source code to their applications; the FedML platform then manages the complex steps of training, serving, and monitoring the custom LLMs, according to the company.
The $11.5 million seed round includes $4.3 million in a first tranche previously disclosed in March, plus $7.2 million in a second tranche that closed earlier this month.
The seed round was led by Camford Capital, with additional investment from Road Capital, Finality Capital Partners, PrimeSet, AimTop Ventures, Sparkle Ventures, Robot Ventures, Wisemont Capital, LDV Partners, Modular Capital, and the University of Southern California (USC). FedML previously raised nearly $2 million in pre-seed funding.
For more information about this news, visit www.fedml.ai.