AWS partners with Hugging Face to democratize access to NLP models

AWS is collaborating with Hugging Face to make natural language processing (NLP) models simpler to train and optimize on AWS.

Thanks to the new HuggingFace estimator in the SageMaker SDK, customers can easily train, fine-tune, and optimize Hugging Face models built with TensorFlow and PyTorch.
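As a rough sketch (not from the article) of what that looks like in practice, the snippet below launches a managed fine-tuning job with the SageMaker Python SDK; the training script, IAM role, instance type, and framework versions are illustrative assumptions:

from sagemaker.huggingface import HuggingFace

# Hypothetical fine-tuning script and IAM role; the framework versions
# are illustrative, not prescribed by the article.
huggingface_estimator = HuggingFace(
    entry_point="train.py",                 # your own fine-tuning script
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    role="arn:aws:iam::123456789012:role/SageMakerRole",
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    hyperparameters={"epochs": 3, "model_name": "distilbert-base-uncased"},
)

# Launch the managed training job on data already staged in S3.
huggingface_estimator.fit({"train": "s3://my-bucket/train"})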

This should be especially useful for customers who want to customize Hugging Face models to increase accuracy on domain-specific language in financial services, life sciences, media and entertainment, and more.

Founded in 2016, Hugging Face, a startup based in New York and Paris, makes it easy to add state-of-the-art Transformer models to your applications. Thanks to its popular transformers, tokenizers, and datasets libraries, you can download and predict with more than 7,000 pre-trained models in 164 languages. How popular? With over 42,000 stars on GitHub and 1 million downloads per month, the transformers library has become the de facto place for developers and data scientists to find NLP models.
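To give a sense of how little code "download and predict" takes, here is a minimal transformers example; the input sentence is invented, and the model choice is left to the library's default:

from transformers import pipeline

# Downloads a default pre-trained sentiment model from the Hugging Face
# hub on first use, then runs inference locally.
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face models are easy to use."))
# e.g. [{'label': 'POSITIVE', 'score': 0.9998}]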

Customers are already using Hugging Face models on Amazon SageMaker. For example, Quantum Health is on a mission to make healthcare navigation smarter, simpler, and more cost-effective for everybody.

NLP datasets can be huge, which can lead to very long training times. To help users speed up training jobs and make the most of their AWS infrastructure, AWS worked with Hugging Face to add the SageMaker Data Parallelism Library to the transformers library.
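On the SageMaker side, the library is typically switched on through the estimator's distribution argument; a hedged sketch, reusing the illustrative values from the earlier snippet:

from sagemaker.huggingface import HuggingFace

huggingface_estimator = HuggingFace(
    entry_point="train.py",                  # illustrative training script
    instance_type="ml.p3dn.24xlarge",        # multi-GPU instance, illustrative
    instance_count=2,
    role="arn:aws:iam::123456789012:role/SageMakerRole",
    transformers_version="4.6",
    pytorch_version="1.7",
    py_version="py36",
    # Enable SageMaker's data parallelism library for this job; the
    # transformers Trainer can then spread each training step across GPUs.
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

huggingface_estimator.fit({"train": "s3://my-bucket/train"})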

For more information about this partnership, visit aws.amazon.com or https://huggingface.co/.
