
Amazon Boosts AI Development in Partnership with NLP Provider Hugging Face

The partnership aims to make it easier for developers and companies to leverage machine learning models and ship NLP features faster.


Hugging Face, a leading provider of open-source natural language processing (NLP) and generative AI models, has announced a strategic partnership with Amazon Web Services (AWS), naming AWS as its preferred cloud provider going forward.

Through the partnership, the Hugging Face community gains access to AWS’s machine learning services and infrastructure. This includes the cloud machine learning platform Amazon SageMaker, the custom training chip AWS Trainium and the inference accelerator AWS Inferentia, which can be used to train, fine-tune and deploy models.

The partnership also provides purpose-built deep learning containers (DLCs), pre-configured container images that speed up the deployment of Hugging Face’s library of over 7,000 NLP models.

Thousands of Pre-trained NLP Models

The DLC enables developers to use Hugging Face’s Transformers library with just a few lines of code. Transformers is an open-source framework that provides thousands of pre-trained models for various NLP tasks, such as text classification, sentiment analysis, question answering, summarization and text generation.
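To illustrate how little code these tasks require, the sketch below uses the Transformers pipeline API; the summarization checkpoint is just one example of the many pre-trained models on the Hugging Face Hub.

```python
# Minimal sketch of the Transformers pipeline API for two of the NLP tasks
# mentioned above; the summarization checkpoint is an illustrative example.
from transformers import pipeline

# Sentiment analysis with a default pre-trained classifier
classifier = pipeline("sentiment-analysis")
print(classifier("Hugging Face models are now easier to run on AWS."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99...}]

# Summarization with an explicitly chosen pre-trained model
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")
print(summarizer(
    "Hugging Face and AWS have announced a partnership that gives developers "
    "access to SageMaker, Trainium and Inferentia for training, fine-tuning "
    "and deploying NLP models.",
    max_length=30, min_length=10))
```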

The DLC also supports distributed training across multiple GPUs and instances using Amazon SageMaker’s built-in data parallelism library. This allows developers to scale up their training jobs without worrying about the underlying infrastructure.
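A hedged sketch of what this looks like with the SageMaker Python SDK’s HuggingFace estimator is shown below; the training script, S3 path, instance choice and version pins are placeholders that would need to match an actual setup.

```python
# Sketch: launching a distributed fine-tuning job on SageMaker using the
# Hugging Face DLC via the SageMaker Python SDK. Script name, S3 path,
# instance type and framework versions are illustrative placeholders.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()  # IAM role used by the training job

estimator = HuggingFace(
    entry_point="train.py",            # your Transformers training script
    source_dir="./scripts",
    role=role,
    instance_type="ml.p3.16xlarge",    # multi-GPU instance
    instance_count=2,                  # scale out across instances
    transformers_version="4.6.1",      # selects the matching DLC image
    pytorch_version="1.7.1",
    py_version="py36",
    hyperparameters={"epochs": 3, "model_name_or_path": "distilbert-base-uncased"},
    # SageMaker's built-in data parallelism, handled by the platform
    distribution={"smdistributed": {"dataparallel": {"enabled": True}}},
)

estimator.fit({"train": "s3://my-bucket/train"})  # placeholder S3 input
```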

Additionally, the partnership enables developers to use AWS’s custom-built chips for machine learning training and inference. AWS Trainium is a new chip designed to deliver high performance and low cost for training workloads, while AWS Inferentia provides high throughput and low latency for inference workloads.

By using these chips with Hugging Face’s models, developers can reduce their costs and improve their performance for both training and inference.
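As a rough illustration of the inference side, the sketch below compiles a Hugging Face model for Inferentia with the AWS Neuron SDK for PyTorch; the checkpoint, sequence length and output file name are assumptions made for the example.

```python
# Sketch: compiling a pre-trained Hugging Face model for AWS Inferentia using
# the Neuron SDK (torch-neuron). Checkpoint, sequence length and output file
# are illustrative; Neuron compiles for fixed input shapes.
import torch
import torch_neuron  # registers torch.neuron.trace
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-sst-2-english"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, torchscript=True)

example = tokenizer("Compile me for Inferentia",
                    padding="max_length", max_length=128,
                    truncation=True, return_tensors="pt")

# Trace the model against a fixed-shape example input
neuron_model = torch.neuron.trace(
    model, example_inputs=(example["input_ids"], example["attention_mask"]))

neuron_model.save("model_neuron.pt")  # ready to serve on an inf1 instance
```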

The partnership between AWS and Hugging Face is expected to benefit both parties. For AWS, it will attract more developers who are interested in using open-source deep learning models on its platform. For Hugging Face, it will expand its reach and impact in the machine learning community.


Markus Kasanmascheff
Markus is the founder of WinBuzzer and has been playing with Windows and technology for more than 25 years. He holds a Master’s degree in International Economics and previously worked as Lead Windows Expert for Softonic.com.