Top 4 Ways to Run LLMs Locally on Android and iOS
In my previous blog, I explored the technical rabbit hole of running Llama.cpp in Termux on an old Android phone. While that was a rewarding experiment, let's be honest: it wasn't exactly "plug-and-play".

Series
In this series, I explore ways to run different AI models, from LLMs to Stable Diffusion, locally on all sorts of consumer hardware: Android phones, iPads, and consumer laptops or desktops.

Last week I wrote about repurposing an old Android phone to run local AI models. In this follow-up I address the biggest obstacle to running the device 24/7, overheating, and how I transformed it into…

This post covers a much more technical and involved way to run local LLMs on Android via a terminal setup. If you want an easy-to-use, less technical way, I have it covered in this latest blog.
