A Living Dialogue with a Shop Window: An R&D Experiment for a Bakery
Developed and tested a proof of concept for an interactive, AI-powered shop window, confirming the technical hypothesis behind a unique customer experience.
Canvas Idea
To create an experimental interactive shop window for a bakery that uses AI to recognize the gender and age of a passerby and deliver a personalized audio greeting.
Canvas Challenge
The core problem for any retail spot is the "Curse of the Anonymous Shopper." A shop window is a silent tool that can't establish a personal connection. The challenge was to turn a passive passerby into an engaged guest before they even walked in the door, creating a "wow effect" and an instant emotional connection.
Stroke Solution
The key decision, the "one precise stroke," was the development of a "Living Dialogue with a Shop Window." The system combined two technologies:
- AI Recognition: An integrated camera with a neural network instantly and discreetly identified a passerby's gender and approximate age.
- Personalized AI Voice: The system then synthesized a unique voice greeting tailored to the recognized demographic.
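To illustrate the personalization step, here is a minimal sketch of how a recognized (gender, age bracket) pair could be mapped to a greeting line. The bracket labels, greeting texts, and the pick_greeting helper are illustrative assumptions, not the phrases or code used in the actual experiment.

```python
# A minimal sketch of the demographic-to-greeting mapping.
# Age brackets, labels, and greeting texts are illustrative assumptions.

GREETINGS = {
    ("female", "20-35"): "Good afternoon! Our almond croissants just came out of the oven.",
    ("male", "20-35"): "Hi there! Fresh sourdough is on the shelf right now.",
    ("female", "56+"): "Good day! May we tempt you with a warm apple strudel?",
}

DEFAULT_GREETING = "Hello! Come in, the bread is still warm."


def pick_greeting(gender: str, age_bracket: str) -> str:
    """Return a greeting for the recognized demographic, with a safe fallback."""
    return GREETINGS.get((gender, age_bracket), DEFAULT_GREETING)


if __name__ == "__main__":
    print(pick_greeting("female", "20-35"))
```

A real deployment would likely tune the brackets and phrasing to the bakery's audience; the sketch only shows the shape of the mapping.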
Value Metrics
- Technical Hypothesis Confirmed: The prototype proved that the combination of AI recognition and speech synthesis could work in a real-world retail environment to create a unique user experience.
- Invaluable Insights Gained: The local experiment allowed us to gather data on how real people react to this type of interaction, identifying potential issues and opportunities for future projects.
- Demonstrated Innovative Potential: The project became a powerful internal case study, showcasing our team's ability to rapidly build and test bold, unconventional solutions at the intersection of multiple technologies.
Tech Stack
- Computer Vision: Python with OpenCV for video stream processing
- AI/ML: PyTorch with a pre-trained face detector (e.g., MobileNet-SSD) and a lightweight model for real-time age and gender estimation
- Speech Synthesis: Google Cloud Text-to-Speech API for high-quality voice generation
- Hardware: Single-board computer (e.g., Raspberry Pi 4/5) with a USB camera and speakers
- Orchestration: A single Python script to manage the camera input, AI inference, and API calls
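As a rough illustration of that orchestration, the sketch below assumes OpenCV for camera capture and the official Google Cloud Text-to-Speech Python client for synthesis. The estimate_demographics helper is a hypothetical placeholder for the age/gender model, and the greeting text, voice settings, and playback command are assumptions rather than the exact configuration used in the experiment.

```python
import time

import cv2  # OpenCV: video capture from the USB camera
from google.cloud import texttospeech  # Google Cloud Text-to-Speech client


def estimate_demographics(frame):
    """Hypothetical placeholder for the age/gender model.

    In the prototype this step would run the pre-trained detector and
    classifier (e.g., via PyTorch) and return a tuple such as
    ("female", "20-35"), or None when nobody is in front of the window.
    """
    return None  # plug the real model in here


def synthesize(text, client):
    """Turn a greeting into MP3 bytes with Google Cloud Text-to-Speech."""
    response = client.synthesize_speech(
        input=texttospeech.SynthesisInput(text=text),
        voice=texttospeech.VoiceSelectionParams(
            language_code="en-US",  # assumption: the real greetings may be in another language
            ssml_gender=texttospeech.SsmlVoiceGender.FEMALE,
        ),
        audio_config=texttospeech.AudioConfig(
            audio_encoding=texttospeech.AudioEncoding.MP3
        ),
    )
    return response.audio_content


def main():
    cap = cv2.VideoCapture(0)                # USB camera on the single-board computer
    tts = texttospeech.TextToSpeechClient()  # requires Google Cloud credentials

    try:
        while True:
            ok, frame = cap.read()
            if not ok:
                continue

            result = estimate_demographics(frame)
            if result is None:
                continue  # nobody recognized in this frame

            gender, age_bracket = result
            # In practice the greeting would be chosen from a (gender, age bracket)
            # mapping like the one sketched in the Stroke Solution section.
            greeting = "Good afternoon! Come in, the bread is still warm."
            audio = synthesize(greeting, tts)

            with open("greeting.mp3", "wb") as f:
                f.write(audio)
            # Playback is an assumption; any local player works, e.g.:
            # subprocess.run(["mpg123", "greeting.mp3"])

            time.sleep(10)  # cool-down so the window doesn't greet non-stop
    finally:
        cap.release()


if __name__ == "__main__":
    main()
```

Running everything in a single loop on the single-board computer keeps the prototype simple; latency and privacy handling (frames are processed in memory and never stored) would be the first things to harden in a production version.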
“The question was simple: 'Can we make a shop window talk to people?' We took a camera, a recognition neural network, and a speech synthesizer. For one week, an ordinary window in an ordinary bakery became a small portal to the future. It was pure R&D, an experiment for its own sake. It's projects like these, unafraid to be just a bold idea, that push technology forward.”
— The Development Team