Congratulations to Associate Professor Lu Wei and his team for their successful launch of new AI Model “TinyLlama”
It’s called TinyLlama, and it has taken the research world by storm because of how much power it packs.
Developed by Associate Professor Lu Wei of the Singapore University of Technology and Design (SUTD), research assistant Mr Zhang Peiyuan, and PhD students Mr Zeng Guangtao and Mr Wang Tianduo, TinyLlama is a 1.1-billion-parameter open-source small language model that has outperformed other open-source models of comparable size across several benchmarks. TinyLlama was pre-trained on a total of three trillion tokens of data within just four months.
Current large language models (LLMs) such as ChatGPT or Google Bard, developed by large technology firms such as OpenAI or Google, run on thousands or even tens of thousands of graphics processing units (GPUs) and require users to connect online to their massive servers. TinyLlama, in contrast, was trained on just 16 GPUs and takes up only 550MB of Random Access Memory (RAM). In other words, TinyLlama can readily be deployed on mobile devices, enabling everyone to carry a “mini ChatGPT” in their pocket wherever they go.
According to Marktechpost, a California-based Artificial Intelligence news platform with a community of over 1.5 million AI professionals and developers, TinyLlama’s performance in commonsense reasoning and problem-solving tasks highlights the potential of smaller models to achieve high performance when trained with a substantial amount of data. It also opens up new possibilities for research and application in natural language processing, especially in scenarios where computational resources are limited.
More about TinyLlama – https://lnkd.in/gUSujpZC