Atmosphere - AI Music Generation for Smart Glasses

Project Overview

Atmosphere is an AI-powered music generation system that creates personalized soundtracks by analyzing your visual surroundings in real time. Built by a four-person team for HackMIT 2025, the system pairs Mentra smart glasses with modern AI services to deliver seamless, adaptive music that responds to your environment.

Technical Implementation

The system is built around a straightforward pipeline: the Mentra glasses capture an image every 3 seconds, which the Claude Vision API processes for scene analysis. The backend, built with Express.js and TypeScript, orchestrates the AI services, including Suno AI for music generation. Real-time communication runs over WebSocket connections, while Redis handles caching and session storage. The companion app, developed in React, provides the user interface and music history tracking.
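The capture → analyze → generate loop can be sketched as below. This is an illustrative outline, not the project's actual code: `captureFrame`, `analyzeScene`, and `buildMusicPrompt` are assumed names standing in for the Mentra camera feed, the Claude Vision call, and the prompt handed to Suno AI.

```typescript
// Hedged sketch of one pass through the Atmosphere pipeline.
// All function names below are illustrative assumptions.

type SceneAnalysis = { setting: string; mood: string };

// Turn a scene analysis into a Suno-style text prompt (illustrative only).
function buildMusicPrompt(scene: SceneAnalysis): string {
  return `${scene.mood} instrumental music for a ${scene.setting} setting`;
}

async function captureFrame(): Promise<Buffer> {
  // Placeholder: the real system pulls a frame from the glasses every 3 s.
  return Buffer.alloc(0);
}

async function analyzeScene(_image: Buffer): Promise<SceneAnalysis> {
  // Placeholder for a Claude Vision API call returning structured scene data.
  return { setting: "city street", mood: "upbeat" };
}

async function runPipelineOnce(): Promise<string> {
  const frame = await captureFrame();
  const scene = await analyzeScene(frame);
  return buildMusicPrompt(scene); // this prompt would be sent to Suno AI
}
```

In the real system this loop repeats on a 3-second timer, with the backend pushing status updates to the companion app over WebSocket.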


Team Contributions

Team Members: Srinivas (Full Stack & Hardware Integration), Ayat (App Development & Hardware Integration), Zayyan (Vision Analysis & Backend Integration), Eric (Presentation & Suno Integration)

My Contributions: I led the complete frontend development, building the React companion app from scratch with real-time status monitoring and music history tracking. I assisted with hardware integration for the Mentra glasses and implemented API calls for seamless data streaming between the glasses and backend services. I also designed the UI/UX interface and helped optimize the communication pipeline.
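The real-time status monitoring and music history tracking mentioned above could be modeled as a small reducer driven by incoming WebSocket messages. This is a hypothetical sketch; the event shapes and field names are assumptions, not the app's actual data model.

```typescript
// Hedged sketch: companion-app state updated from backend push events.
// Event shapes below are assumptions for illustration.

type StatusEvent =
  | { type: "status"; phase: "capturing" | "analyzing" | "generating" | "playing" }
  | { type: "track"; title: string; url: string };

interface AppState {
  phase: string; // current pipeline phase shown in the status UI
  history: { title: string; url: string }[]; // generated-track history
}

// Pure reducer: easy to call from a Socket.IO message handler,
// and trivial to unit-test without a live connection.
function reduce(state: AppState, event: StatusEvent): AppState {
  switch (event.type) {
    case "status":
      return { ...state, phase: event.phase };
    case "track":
      return {
        ...state,
        history: [...state.history, { title: event.title, url: event.url }],
      };
  }
}
```

Keeping the state transition pure like this separates UI concerns from the WebSocket plumbing, which is one common way to structure such a React app.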

Technical Stack

Frontend: React, TypeScript, HTML5/JavaScript
Backend: Node.js, Express.js, Bun runtime
Communication: Socket.IO, WebRTC
AI Services: Claude Vision API, Suno AI
Infrastructure: Redis, Docker, Sharp (image processing)
Hardware: Mentra Smart Glasses
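Since the stack lists Redis for caching, one plausible use is memoizing scene-analysis results so near-identical consecutive frames (the same room across several 3-second captures) skip a repeat Claude Vision call. The sketch below is an assumption about the design, with an in-memory `Map` standing in for Redis and a placeholder `hashImage` where the real system might hash a Sharp-downscaled thumbnail.

```typescript
// Hedged sketch of frame-level caching; a Map stands in for Redis,
// and `hashImage` is an illustrative placeholder.

const cache = new Map<string, string>();

function hashImage(image: Buffer): string {
  // Placeholder key: a real system might hash a downscaled thumbnail
  // (e.g. produced with Sharp) to tolerate minor pixel differences.
  return image.toString("base64").slice(0, 24);
}

async function analyzeWithCache(
  image: Buffer,
  analyze: (img: Buffer) => Promise<string>,
): Promise<string> {
  const key = hashImage(image);
  const hit = cache.get(key);
  if (hit !== undefined) return hit; // cache hit: no API call made
  const result = await analyze(image);
  cache.set(key, result); // with Redis, this would carry a TTL
  return result;
}
```

Swapping the `Map` for a Redis client with a short TTL would give the same behavior across backend restarts and multiple server processes.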

Atmosphere Project

Repository: github.com/IBS27/atmosphere