- Hugging Face has launched HuggingSnap, an iOS app that can analyze and describe what your iPhone’s camera is seeing.
- The app works offline and never sends data to the cloud.
- HuggingSnap is imperfect, but it demonstrates what can be done entirely on-device.
Giving AI the ability to see is becoming increasingly common, with tools such as ChatGPT, Microsoft Copilot, and Google Gemini all adding vision features. Hugging Face has just dropped its own spin on the idea with a new iOS app called HuggingSnap, which looks at the world through your iPhone’s camera and describes what it sees without ever connecting to the cloud.
Think of it as a personal tour guide that knows how to keep its mouth shut. HuggingSnap runs completely offline using Hugging Face’s in-house vision model, SmolVLM2, enabling instant object recognition, scene descriptions, text reading, and general observations about your surroundings, without any of your data being sent out to the internet.
This offline capability makes HuggingSnap particularly useful in situations where connectivity is spotty. Whether you’re wandering in the desert, traveling abroad without reliable internet, or simply standing in one of those grocery stores where cell service mysteriously disappears, having that capability on your phone is a real blessing. Plus, the app claims to be highly efficient, meaning it won’t drain your battery the way cloud-based AI models can.
HuggingSnap looks at my world
I decided to give the app a whirl. First, I pointed it at my laptop screen while my browser was open to my TechRadar biography. The app did a solid job of transcribing the text and explaining what it saw. It drifted from reality, though, as it interpreted the headlines and other details around my bio. HuggingSnap believed that references to new computer chips in a headline indicated what powers my laptop, and seemed to think that some of the names in the headlines were other people using my laptop.
Then I pointed my camera at my son’s playpen, full of toys I hadn’t cleaned up yet. Again, the AI did a good job with the broad strokes, describing the play area and the toys inside. It got the colors and even the textures right, identifying stuffed toys versus blocks. But it stumbled on some of the details. For example, it called a bear a dog and seemed to believe a stacking ring was a ball. Overall, I’d call HuggingSnap’s AI great for describing a scene to a friend, but not quite good enough for a police report.
See the future
HuggingSnap’s on-device approach stands out from your iPhone’s built-in abilities. While the device can identify plants, copy text from images, and tell you whether the spider on your wall is the kind that should make you move, it almost always has to send some information to the cloud.
HuggingSnap is remarkable in a world where most apps want to track everything down to your blood type. That said, Apple is investing heavily in on-device AI for its future iPhones. But for now, if you want privacy with your AI vision, HuggingSnap may be perfect for you.