Large language models (LLMs) have come to dominate the field of natural-language processing, so it’s no surprise that they also dominate the research that Amazon scientists...
Knowledge distillation is a popular technique for compressing large machine learning models into manageable sizes, to make them suitable for low-latency applications such as voice assistants…
Three years ago, Alexa began using an industry-leading self-learning model that learns to correct improperly phrased or misheard customer queries without human involvement. The model detects…
What if artificial intelligence could help an aspiring author write a novel? Or coach people to improve the quality of their writing? Could machines learn how…
In recent years, machine translation systems have become much more accurate and fluent. As their use expands, it has become increasingly important to ensure that they…
Miguel Ballesteros, a principal applied scientist with Amazon Web Services’ AI Labs, is a senior area chair for semantics at this year’s Conference on Empirical Methods…
Amazon’s more than 40 papers at this year’s Conference on Empirical Methods in Natural-Language Processing (EMNLP) — including papers accepted to EMNLP’s new industry track —…