Will future generative AI models be larger or smaller?
The current wave of generative AI models is built on the Transformer architecture and epitomized by large language models (LLMs). Despite their prominence, however, LLMs have inherent drawbacks and constraints. In response, researchers are ramping up development of small language models (SLMs) that could be a game changer for generative AI.
Contact
E-mail: kyara@nri.co.jp