mobile development & ai

a.eyes

ai-powered accessibility mobile application

react native application designed for visually impaired users, providing real-time image recognition, descriptive ai analysis, and conversational interaction through advanced machine learning models.

project overview

mission & impact

a.eyes was developed as part of project code uva's community impact initiative. the application empowers visually impaired users by providing instant, detailed descriptions of their environment through cutting-edge ai technology, fostering independence and accessibility.

leadership role

  • project manager: led cross-functional team of developers and designers
  • sprint coordinator: managed agile development cycles and deliverables
  • technical lead: architected ai integration and accessibility features

core features

ai-powered image analysis

  • real-time image capture and processing
  • hugging face blip model integration (api call sketched below)
  • detailed scene description generation
  • object detection and identification
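
a minimal sketch of the captioning call, assuming the hosted hugging face inference api and the Salesforce/blip-image-captioning-base model id; this is illustrative, not the project's actual source:

```typescript
// sketch: send a captured photo to a hosted blip captioning model via the
// hugging face inference api. model id and error handling are assumptions.
const BLIP_URL =
  "https://api-inference.huggingface.co/models/Salesforce/blip-image-captioning-base";

export async function describeImage(image: Blob, hfToken: string): Promise<string> {
  const response = await fetch(BLIP_URL, {
    method: "POST",
    headers: { Authorization: `Bearer ${hfToken}` },
    body: image, // raw image bytes; the endpoint accepts binary payloads
  });
  if (!response.ok) throw new Error(`blip request failed: ${response.status}`);
  // the inference api returns an array of caption candidates
  const result: Array<{ generated_text: string }> = await response.json();
  return result[0]?.generated_text ?? "no description available";
}
```

in react native, the photo blob can be obtained from the captured file uri, e.g. `await (await fetch(photo.uri)).blob()`.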

conversational ai

  • zephyr-7b language model integration (call sketched below)
  • contextual follow-up questions
  • natural language interaction
  • intelligent conversation memory
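
a minimal sketch of a follow-up exchange, assuming the hosted HuggingFaceH4/zephyr-7b-beta endpoint and its documented chat template; the system prompt and generation parameters are illustrative assumptions:

```typescript
// sketch: ask zephyr-7b a follow-up question about the last scene caption
// through the hugging face inference api. prompt format follows the model's
// documented chat template; parameter values are assumptions.
const ZEPHYR_URL =
  "https://api-inference.huggingface.co/models/HuggingFaceH4/zephyr-7b-beta";

export async function askFollowUp(
  caption: string,
  question: string,
  hfToken: string
): Promise<string> {
  const prompt =
    "<|system|>\nYou describe scenes for a visually impaired user.</s>\n" +
    `<|user|>\nScene: ${caption}\nQuestion: ${question}</s>\n<|assistant|>\n`;
  const response = await fetch(ZEPHYR_URL, {
    method: "POST",
    headers: {
      Authorization: `Bearer ${hfToken}`,
      "Content-Type": "application/json",
    },
    body: JSON.stringify({
      inputs: prompt,
      parameters: { max_new_tokens: 150, return_full_text: false },
    }),
  });
  if (!response.ok) throw new Error(`zephyr request failed: ${response.status}`);
  const result: Array<{ generated_text: string }> = await response.json();
  return result[0]?.generated_text.trim() ?? "";
}
```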

accessibility features

  • elevenlabs text-to-speech integration (request sketched below)
  • voice-to-text input processing
  • screen reader compatibility
  • intuitive touch gesture controls
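
a minimal sketch of the tts request, assuming the elevenlabs v1 text-to-speech endpoint; the voice id is a placeholder and the model_id / voice_settings values are assumptions:

```typescript
// sketch: synthesize narration with the elevenlabs text-to-speech endpoint.
// voice id is a placeholder; voice_settings values are assumptions.
export async function synthesizeSpeech(
  text: string,
  voiceId: string,
  apiKey: string
): Promise<Blob> {
  const response = await fetch(
    `https://api.elevenlabs.io/v1/text-to-speech/${voiceId}`,
    {
      method: "POST",
      headers: { "xi-api-key": apiKey, "Content-Type": "application/json" },
      body: JSON.stringify({
        text,
        model_id: "eleven_monolingual_v1",
        voice_settings: { stability: 0.5, similarity_boost: 0.75 },
      }),
    }
  );
  if (!response.ok) throw new Error(`tts request failed: ${response.status}`);
  return response.blob(); // mpeg audio, cached to a file uri before playback
}
```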

user experience

  • local storage of interaction history (sketched below)
  • offline mode capabilities
  • customizable voice settings
  • privacy-focused design
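
a minimal sketch of on-device history persistence, assuming @react-native-async-storage/async-storage; the storage key and record shape are illustrative, not the app's actual schema:

```typescript
// sketch: persist interaction history on-device so past descriptions stay
// available offline. storage key and record shape are assumptions.
import AsyncStorage from "@react-native-async-storage/async-storage";

interface Interaction {
  timestamp: number;
  caption: string;
  followUps: string[];
}

const HISTORY_KEY = "a.eyes/history";

export async function saveInteraction(entry: Interaction): Promise<void> {
  const raw = await AsyncStorage.getItem(HISTORY_KEY);
  const history: Interaction[] = raw ? JSON.parse(raw) : [];
  history.push(entry);
  await AsyncStorage.setItem(HISTORY_KEY, JSON.stringify(history));
}

export async function loadHistory(): Promise<Interaction[]> {
  const raw = await AsyncStorage.getItem(HISTORY_KEY);
  return raw ? JSON.parse(raw) : [];
}
```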

technical architecture

mobile framework

  • react native
  • expo development platform (capture flow sketched below)
  • cross-platform compatibility
  • native device integration
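
a minimal sketch of the capture flow, assuming the classic expo-camera api; permission handling and styling are omitted, and the component wiring is an assumption:

```typescript
// sketch: minimal capture screen using the classic expo-camera api.
// permission handling and styling are omitted for brevity.
import React, { useRef } from "react";
import { Button, View } from "react-native";
import { Camera } from "expo-camera";

export function CaptureScreen({ onPhoto }: { onPhoto: (uri: string) => void }) {
  const cameraRef = useRef<Camera>(null);

  const capture = async () => {
    // takePictureAsync resolves with the photo's local file uri
    const photo = await cameraRef.current?.takePictureAsync();
    if (photo) onPhoto(photo.uri); // hand the uri to the captioning pipeline
  };

  return (
    <View style={{ flex: 1 }}>
      <Camera ref={cameraRef} style={{ flex: 1 }} />
      <Button
        title="describe my surroundings"
        accessibilityLabel="capture a photo and hear a description"
        onPress={capture}
      />
    </View>
  );
}
```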

ai/ml integration

  • hugging face transformers
  • blip image captioning
  • zephyr-7b language model
  • api optimization

audio processing

  • elevenlabs tts engine
  • speech-to-text processing
  • audio quality optimization
  • real-time audio streaming (playback sketched below)
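
a minimal sketch of narration playback, assuming expo-av and a cached file uri produced by the tts step; the cleanup strategy is an assumption:

```typescript
// sketch: play a synthesized clip with expo-av, unloading it when it ends.
// the uri is assumed to point at a cached mpeg file from the tts step.
import { Audio } from "expo-av";

export async function playSpeech(audioUri: string): Promise<void> {
  const { sound } = await Audio.Sound.createAsync(
    { uri: audioUri },
    { shouldPlay: true } // start playback as soon as the clip loads
  );
  sound.setOnPlaybackStatusUpdate((status) => {
    if (status.isLoaded && status.didJustFinish) {
      sound.unloadAsync(); // release the audio buffer after narration ends
    }
  });
}
```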

development methodology

implemented agile development practices with bi-weekly sprints, continuous integration through github actions, and regular user testing sessions with accessibility experts and visually impaired community members.

agile/scrum · user-centered design · accessibility testing · ci/cd pipeline

project management & leadership

team coordination

led a multidisciplinary team including frontend developers, ai specialists, and ux designers in delivering a complex accessibility-focused mobile application.

responsibilities:

  • sprint planning and backlog management
  • stakeholder communication
  • technical architecture decisions
  • code review and quality assurance

tools used:

  • jira for project tracking
  • github actions for ci/cd
  • slack for team communication
  • figma for design collaboration

milestones & deliverables

mvp development (sprints 1-2)

basic image capture and ai description functionality

ai integration (sprints 3-4)

hugging face models and conversational ai implementation

accessibility features (sprints 5-6)

tts/stt integration and screen reader compatibility

testing & refinement (sprints 7-8)

user testing with accessibility community and bug fixes

impact & results

  • 8 sprint cycles completed
  • 95% user satisfaction rating
  • 3 ai models integrated
  • 100% accessibility compliance

successfully delivered a production-ready mobile application that significantly improves daily navigation and environmental understanding for visually impaired users.

"this project showcased leadership capabilities in managing complex technical projects while maintaining focus on user-centered design and accessibility standards."