Hey, It Works!
Using Gemini's vision capabilities for body language analysis
· Daniel Rosehill

I built a body language analysis app using Google AI Studio's vibe coding interface and Gemini's multimodal vision. Upload a photo, get expert-level analysis.

Google recently launched a "vibe coding" interface in AI Studio that lets you describe an app idea in natural language and have Gemini build it for you. I wanted to test it out, and the use case I landed on was body language analysis — upload a photo of people and get expert-level interpretation of what their body language is saying.

The result is the Gemini Body Language Analyst, a proof-of-concept web app powered by Gemini 2.5 Pro's multimodal vision capabilities.

Repo: danielrosehill/Gemini-Body-Language-Analyst (TypeScript, updated Oct 2025). Test app "vibe coded" in Google AI Studio: analyse body language from photo plus context. Tags: ai-studio, gemini-apps, poc, vibe-coded.

What it does

You upload one or more photos containing people, optionally provide context about the relationships and scenario, and the app returns a comprehensive analysis covering:

- facial expressions
- eye contact and gaze direction
- posture and positioning
- gestures and hand movements
- proxemics (the spatial relationships between people)
- overall emotional state and intentions
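Under the hood, this kind of flow boils down to sending the image and a prompt to Gemini's public generateContent REST endpoint. The sketch below is my own illustration, not the repo's actual code: the function names, prompt wording, and the `.env`-style `GEMINI_API_KEY` variable are assumptions, while the endpoint and request shape follow the documented Gemini API.

```typescript
// Hypothetical sketch of the app's core call (not copied from the repo).
// The prompt text and function names are assumptions for illustration.

const GEMINI_ENDPOINT =
  "https://generativelanguage.googleapis.com/v1beta/models/gemini-2.5-pro:generateContent";

// Builds the analysis prompt, folding in user-supplied context when present.
export function buildPrompt(context?: string): string {
  const base =
    "You are an expert body language analyst. For each person in the photo, " +
    "describe facial expression, eye contact and gaze direction, posture, " +
    "gestures, proxemics, and overall emotional state.";
  return context ? `${base}\n\nContext: ${context}` : base;
}

// Sends one base64-encoded JPEG plus the prompt; returns the model's text reply.
export async function analysePhoto(
  base64Jpeg: string,
  context?: string,
): Promise<string> {
  const res = await fetch(
    `${GEMINI_ENDPOINT}?key=${process.env.GEMINI_API_KEY}`,
    {
      method: "POST",
      headers: { "Content-Type": "application/json" },
      body: JSON.stringify({
        contents: [
          {
            parts: [
              // Inline image data plus the text prompt in one user turn.
              { inline_data: { mime_type: "image/jpeg", data: base64Jpeg } },
              { text: buildPrompt(context) },
            ],
          },
        ],
      }),
    },
  );
  if (!res.ok) throw new Error(`Gemini request failed: ${res.status}`);
  const json = await res.json();
  return json.candidates?.[0]?.content?.parts?.[0]?.text ?? "";
}
```

The optional context string is simply appended to the system-style prompt, which is how the "relationships and scenario" input can steer the interpretation without any extra plumbing.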

The vibe coding experience

What made this particularly interesting was the development process itself. Instead of writing code, I described what I wanted in plain English through Google AI Studio's app builder interface. The initial concept was straightforward: use vision as the modality, with advanced tagging of people and expert body language analysis. Gemini generated a functional web application from that description.

The experience demonstrates how rapidly you can prototype AI-powered applications when the barrier to entry is just describing what you want: from concept to working app in minutes rather than hours.

Try it yourself

The repo includes everything needed to run it locally — just clone, install dependencies, add your Gemini API key, and start the dev server. It's a nice example of both Gemini's vision capabilities and the vibe coding workflow. Check it out on GitHub.
