21st AIAI 2025, 26 - 29 June 2025, Limassol, Cyprus

SHAKTI: A 2.5 BILLION PARAMETER SMALL LANGUAGE MODEL OPTIMIZED FOR EDGE AI AND LOW-RESOURCE ENVIRONMENTS

Syed Abdul Gaffar Shakhadri, Kruthika KR, Rakshit Aralimatti

Abstract:

  We introduce Shakti, a 2.5 billion parameter language model specifically optimized for resource-constrained environments such as edge devices, including smartphones, wearables, and IoT systems. Shakti combines high-performance NLP with optimized efficiency and precision, making it ideal for real-time AI applications where computational resources and memory are limited. With support for vernacular languages and domain-specific tasks, Shakti excels in industries such as healthcare, finance, and customer service. Benchmark evaluations demonstrate that Shakti performs competitively against larger models while maintaining low latency and on-device efficiency, positioning it as a leading solution for edge AI.
