Hugging Face Launches HuggingSnap: An On-Device Google Lens Alternative for iOS Users

The developers at Hugging Face have launched HuggingSnap, an iOS alternative to Google Lens that runs entirely on your device. The app offers somewhat fewer features, but in return users do not have to share their data with a corporation.

HuggingSnap is built on the SmolVLM2 machine learning model, which can recognize objects in images and answer user questions about them. On first launch, the app downloads the model to the smartphone and then runs it offline. Users can pose questions by voice or text, and a dedicated button generates a description of a photo.

For instance, HuggingSnap can translate text, explain what it sees, count specific objects in an image, or provide information about unfamiliar items. It is advisable to ask questions in English. Besides live camera photographs, the model can also process videos and images already stored on the device.

HuggingSnap is compatible with devices running iOS 18 or later. There are also versions available for Mac (macOS 15.0+) and Apple Vision (visionOS 2.0+). The app itself is only 13.5 MB, but after downloading the model, it occupies about 1 GB.