There are several LLMs that you can run locally on your PC. However, when it comes to smartphones, your options are more limited: you'll either need to run a small language model or have a premium device with enough processing power to handle a full LLM. Either way, it's possible to chat with language models locally and offline. Here's everything you need to know about running LLMs locally on Android.
Running LLMs Locally on Android: Llama 3, Gemma, and More
Large language models are, well, large and require heavy processing power. But even if your Android device has the resources to run small language models (SLMs) and LLMs, you still need an app that lets you experiment with them in a user-friendly interface.
This is where an app like MLC Chat comes in handy. Use the steps below to run LLMs locally on your Android device using the MLC Chat app.
In the MLC Chat app, you'll find a list of available models to download. Tap the download link next to the model you want and wait for the download to finish.
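For developers curious about what MLC Chat is doing under the hood, here is a rough Kotlin sketch modeled on the MLC LLM Android SDK's MLCEngine, which exposes an OpenAI-style chat completions interface. Treat it as illustrative only: exact package names, class names, and signatures can differ between releases, and modelPath and modelLib are placeholders for a model you have already downloaded and compiled.

```kotlin
// Illustrative sketch based on the MLC LLM Android SDK (MLCEngine).
// Names and signatures may vary by release; modelPath and modelLib
// are placeholders for your locally downloaded model artifacts.
import ai.mlc.mlcllm.MLCEngine
import ai.mlc.mlcllm.OpenAIProtocol
import kotlinx.coroutines.runBlocking

fun chatOnce(modelPath: String, modelLib: String) = runBlocking {
    val engine = MLCEngine()

    // Load the model weights and compiled model library from local storage.
    engine.reload(modelPath, modelLib)

    // Send a single user message; the engine streams the reply back
    // over a channel, chunk by chunk, OpenAI-style.
    val channel = engine.chat.completions.create(
        messages = listOf(
            OpenAIProtocol.ChatCompletionMessage(
                role = OpenAIProtocol.ChatCompletionRole.user,
                content = "Explain what a small language model is in one sentence."
            )
        )
    )

    // Accumulate the streamed deltas into the full response text.
    var reply = ""
    for (response in channel) {
        reply += response.choices.firstOrNull()?.delta?.content?.asText() ?: ""
    }
    println(reply)
}
```

The MLC Chat app wraps this same engine in a chat UI, so everything (model loading, inference, and token streaming) happens on-device with no network connection required.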