The current wave of generative AI is built on the Transformer architecture, epitomized by the rise of large language models (LLMs). Despite their prominence, however, LLMs have inherent drawbacks and constraints. In response, researchers are accelerating the development of small language models (SLMs), which could be a game changer for generative AI.

About the Author

  • Ryoji Kashiwagi

    Expert Researcher

    Financial Market & Digital Business Research Department