NLP Customer Success Stories
Gilead
Gilead’s PDM (pharmaceutical development and manufacturing) team chose Amazon Web Services (AWS), adopting Amazon Kendra, a highly accurate intelligent search service powered by ML. With support from AWS, the PDM team built a data lake within 9 months and then a search tool in only 3 months, completing its project well within its estimated 3-year timeline.
Since launching its enterprise search tool, users across the PDM team have substantially reduced manual data management tasks and cut the time it takes to search for information by approximately 50 percent. This has fueled research, experimentation, and pharmaceutical breakthroughs.
Amazon Kendra is a turnkey AI solution that, when configured correctly, is capable of spanning every single domain in the organization while being straightforward to implement.
-- Jeremy Zhang, Director of Data Science and Knowledge Management, Gilead Sciences Inc.
Latent Space
Latent Space is a company that specializes in the next wave of generative models for businesses and creatives, combining fields that have long had little overlap: graphics and natural language processing (NLP).
Amazon SageMaker’s unique automated model partitioning and efficient pipelining approach made our adoption of model parallelism possible with little engineering effort, and we scaled our training of models beyond 1 billion parameters, which is an important requirement for us. Furthermore, when training on a 16-node, eight-GPU setup with the SageMaker model parallelism library, we recorded a 38% improvement in efficiency compared to our previous training runs.
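The quote above refers to the SageMaker model parallelism library. As a rough illustration, here is a minimal sketch of how that library can be enabled on a SageMaker training job; the entry point, instance types, and parameter values are illustrative assumptions, not Latent Space's actual configuration.

```python
# Minimal sketch: enabling the SageMaker model parallelism library on a
# PyTorch training job. Script names, instance types, and parameter values
# below are assumptions for illustration only.
import sagemaker
from sagemaker.pytorch import PyTorch

role = sagemaker.get_execution_role()

smp_options = {
    "enabled": True,
    "parameters": {           # library configuration (example values)
        "partitions": 4,       # number of model partitions per replica
        "microbatches": 8,     # pipeline micro-batches
        "pipeline": "interleaved",
        "optimize": "speed",
    },
}

estimator = PyTorch(
    entry_point="train.py",           # hypothetical training script
    role=role,
    instance_type="ml.p4d.24xlarge",  # 8 GPUs per instance
    instance_count=16,                # 16-node setup as described above
    framework_version="1.13",
    py_version="py39",
    distribution={
        "smdistributed": {"modelparallel": smp_options},
        "mpi": {"enabled": True, "processes_per_host": 8},
    },
)

estimator.fit({"train": "s3://my-bucket/train/"})  # hypothetical S3 path
```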
AI Language Services
Amazon Lex
Amazon Lex provides automatic speech recognition (ASR) and natural language understanding (NLU) capabilities so you can build applications and interactive voice response (IVR) solutions with engaging user experiences. Now, you can programmatically provide phrases as hints during a live interaction to influence the transcription of spoken input.
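As a rough sketch of how runtime hints can be supplied on a spoken turn, the snippet below passes hint phrases in the session state of a Lex V2 `RecognizeUtterance` call; the bot IDs, intent and slot names, and audio file are placeholders.

```python
# Minimal sketch: supplying runtime hints for a spoken turn with Lex V2.
# Bot IDs, intent/slot names, and the audio capture are placeholders.
import base64
import gzip
import json

import boto3

lex = boto3.client("lexv2-runtime")

# Hint phrases that bias transcription of the next utterance.
session_state = {
    "runtimeHints": {
        "slotHints": {
            "CheckBalance": {                  # hypothetical intent
                "AccountHolderName": {         # hypothetical slot
                    "runtimeHintValues": [
                        {"phrase": "Priya Raman"},
                        {"phrase": "Joao Almeida"},
                    ]
                }
            }
        }
    }
}

# RecognizeUtterance expects sessionState as gzipped, base64-encoded JSON.
encoded_state = base64.b64encode(
    gzip.compress(json.dumps(session_state).encode("utf-8"))
).decode("utf-8")

with open("utterance.pcm", "rb") as audio:     # placeholder audio file
    response = lex.recognize_utterance(
        botId="BOTID1234",                     # placeholder
        botAliasId="ALIASID12",                # placeholder
        localeId="en_US",
        sessionId="user-1234",
        requestContentType="audio/l16; rate=16000; channels=1",
        responseContentType="text/plain; charset=utf-8",
        sessionState=encoded_state,
        inputStream=audio,
    )

# The transcript in the response is also gzipped and base64 encoded.
transcript = gzip.decompress(
    base64.b64decode(response["inputTranscript"])
).decode("utf-8")
print(transcript)
```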
Amazon Comprehend
Amazon Comprehend is a natural language processing (NLP) service that uses machine learning to find insights and relationships such as people, places, sentiments, and topics in unstructured text. You can use Amazon Comprehend’s ML capabilities to detect and redact personally identifiable information (PII) in customer emails, support tickets, product reviews, social media, and more. Now, Amazon Comprehend PII supports 14 new entity types, with localized support for entities within the United States, Canada, the United Kingdom, and India. Customers can now detect and redact 36 entity types to protect sensitive data.
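As a minimal sketch of the detect-and-redact workflow, the snippet below calls Comprehend's PII detection API and masks each detected span; the sample text is made up.

```python
# Minimal sketch: detect PII entities with Amazon Comprehend and redact them.
import boto3

comprehend = boto3.client("comprehend")

text = "Please update my records: my name is Jane Doe and my phone is 555-0100."

result = comprehend.detect_pii_entities(Text=text, LanguageCode="en")

# Replace each detected span with its entity type, working right to left so
# that earlier character offsets stay valid.
redacted = text
for entity in sorted(result["Entities"], key=lambda e: e["BeginOffset"], reverse=True):
    redacted = (
        redacted[: entity["BeginOffset"]]
        + f"[{entity['Type']}]"
        + redacted[entity["EndOffset"] :]
    )

print(redacted)  # e.g. "... my name is [NAME] and my phone is [PHONE]."
```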
Amazon Lex
Amazon Lex is a service for building conversational interfaces into any application using voice and text. Starting today, you can give Amazon Lex additional information about how to process speech input by creating a custom vocabulary. A custom vocabulary is a list of domain-specific terms or unique words (e.g., brand names, product names) that are more difficult to recognize. You create the list and add it to the bot definition, so Amazon Lex can use these words when determining the user’s intent or collecting information in a conversation.
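One way to populate a custom vocabulary programmatically is sketched below, assuming the Lex V2 models API's batch item operations; the bot ID, phrases, and weights are placeholders, and the vocabulary can also be managed from the console or by importing a file.

```python
# Minimal sketch: adding domain-specific terms to a Lex V2 custom vocabulary
# for a bot locale. Bot ID and phrases are placeholders; the locale is assumed
# to already have a custom vocabulary.
import boto3

lex_models = boto3.client("lexv2-models")

response = lex_models.batch_create_custom_vocabulary_item(
    botId="BOTID1234",        # placeholder
    botVersion="DRAFT",
    localeId="en_US",
    customVocabularyItemList=[
        {"phrase": "Xarelto", "weight": 3},   # hypothetical brand name
        {"phrase": "Entresto", "weight": 3},
        {"phrase": "Solborne", "weight": 1},  # made-up product name
    ],
)

print(response.get("errors", []))  # any items that failed validation
```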
NLP on Amazon SageMaker
- Detect social media fake news using graph machine learning with Amazon Neptune ML. The spread of misinformation and fake news on social media platforms has posed a major challenge to the well-being of individuals and societies. Therefore, it is imperative that we develop robust and automated solutions for early detection of fake news on social media. Traditional approaches rely purely on the news content (using natural language processing) to mark information as real or fake. However, the social context in which the news is published and shared can provide additional insights into the nature of fake news on social media and improve the predictive capabilities of fake news detection tools.
- Fine-tune transformer language models for linguistic diversity with Hugging Face on Amazon SageMaker. Today, natural language processing (NLP) examples are dominated by English, the native language of only 5% of the human population and spoken by only 17%. In this post, we summarize the challenges of low-resource languages and experiment with different solution approaches covering over 100 languages using Hugging Face transformers on Amazon SageMaker (a minimal fine-tuning sketch follows this list).
- Run text classification with Amazon SageMaker JumpStart using TensorFlow Hub and Hugging Face models. In this post, we provide a step-by-step walkthrough on how to fine-tune and deploy a text classification model, using pretrained models from TensorFlow Hub. We explore two ways of obtaining the same result: via JumpStart’s graphical interface on Studio, and programmatically through JumpStart’s APIs.
- Build a custom Q&A dataset using Amazon SageMaker Ground Truth to train a Hugging Face Q&A NLU model. One NLU problem of particular business interest is the task of question answering. In this post, we demonstrate how to build a custom question answering dataset using Amazon SageMaker Ground Truth to train a Hugging Face question answering NLU model.
- Achieve hyperscale performance for model serving using NVIDIA Triton Inference Server on Amazon SageMaker. In this post, we look at best practices for deploying transformer models at scale on GPUs using Triton Inference Server on SageMaker. First, we start with a summary of key concepts around latency in SageMaker, and an overview of performance tuning guidelines. Next, we provide an overview of Triton and its features as well as example code for deploying on SageMaker. Finally, we perform load tests using SageMaker Inference Recommender and summarize the insights and conclusions from load testing of a popular transformer model provided by Hugging Face.
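As referenced in the fine-tuning item above, here is a minimal sketch of training and deploying a multilingual transformer with the Hugging Face estimator in the SageMaker Python SDK; the training script, model choice, container versions, and S3 paths are assumptions, not the post's exact setup.

```python
# Minimal sketch: fine-tune a multilingual transformer on SageMaker with the
# Hugging Face estimator, then deploy it. Script, versions, and paths are
# illustrative assumptions.
import sagemaker
from sagemaker.huggingface import HuggingFace

role = sagemaker.get_execution_role()

estimator = HuggingFace(
    entry_point="train.py",          # hypothetical script using transformers' Trainer
    source_dir="./scripts",
    role=role,
    instance_type="ml.p3.2xlarge",
    instance_count=1,
    transformers_version="4.26",     # example DLC versions
    pytorch_version="1.13",
    py_version="py39",
    hyperparameters={
        "model_name_or_path": "xlm-roberta-base",  # multilingual backbone
        "epochs": 3,
        "per_device_train_batch_size": 16,
    },
)

estimator.fit({"train": "s3://my-bucket/multilingual-data/train/"})  # hypothetical data

# Deploy the fine-tuned model to a real-time endpoint and run one prediction.
predictor = estimator.deploy(initial_instance_count=1, instance_type="ml.g4dn.xlarge")
print(predictor.predict({"inputs": "An example sentence to classify."}))
```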
Content Moderation design patterns with AWS managed AI services
From startups to large organizations, modern web and mobile platforms fuel businesses and drive user engagement through social features. Online community members expect safe and inclusive experiences where they can freely consume and contribute images, videos, text, and audio. The ever-increasing volume, variety, and complexity of user-generated content (UGC) make it challenging to scale traditional human moderation workflows to protect users.
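As a minimal sketch of a first-pass automated check with one of the managed AI services these design patterns build on, the snippet below runs Amazon Rekognition image moderation on an uploaded object; the bucket, key, and confidence threshold are placeholders, and flagged items would typically be routed to a human review step.

```python
# Minimal sketch: first-pass image moderation with Amazon Rekognition.
# Bucket, key, and threshold are placeholders.
import boto3

rekognition = boto3.client("rekognition")

response = rekognition.detect_moderation_labels(
    Image={"S3Object": {"Bucket": "my-ugc-bucket", "Name": "uploads/photo123.jpg"}},
    MinConfidence=60,  # only return labels at or above this confidence
)

flagged = [
    (label["Name"], round(label["Confidence"], 1))
    for label in response["ModerationLabels"]
]

if flagged:
    print("Needs review:", flagged)  # e.g. send to a human moderation queue
else:
    print("No moderation labels detected")
```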
Watch a presentation of the demo on YouTube
Read more about content moderation design patterns with AWS managed AI services