Teaching large language models (LLMs) to reason is an active topic of research in natural-language processing, and a popular approach to that problem is the so-called...
Large language models (LLMs) have been around for a while but have really captured the attention of the public this year, with the advent of ChatGPT...
Because they are everywhere, large language models are a major topic of conversation at this year’s meeting of the Association for Computational Linguistics (ACL). Yang Liu,...
Deep-learning models are data driven, and that data may contain sensitive information that requires privacy protection. Differential privacy (DP) is a formal framework for ensuring the...
The general chair of this year’s meeting of the European Chapter of the Association for Computational Linguistics (EACL) is Alessandro Moschitti, a principal scientist in the...
In recent years, and even recent months, there have been rapid and dramatic advances in the technology known as generative AI. Generative AI models are trained...
At this year’s International Conference on Learning Representations (ICLR), Amazon CodeWhisperer — the automatic-code-generation service from Amazon Web Services — is sponsoring the second Workshop on...
The Association for the Advancement of Artificial Intelligence’s annual Conference on Artificial Intelligence (AAAI) received around 9,000 paper submissions this year, which required a proportionally large...
The machine learning models that power conversational agents like Alexa are typically trained on labeled data, but data collection and labeling are expensive and complex, creating...