INTRODUCTION
Helping BART commuters navigate easily and get timely updates, improving ride satisfaction and catering to diverse user needs.
My Role
I conducted auto-ethnography and commuter interviews, and led the development of journey maps to identify key friction points in the BART experience. I translated these findings on navigation difficulties and the need for glanceable, real-time information into user flows and an interface design tailored to the Apple Watch.
Skills
UI Design (Wearables), UX Design, Auto-Ethnography, User Interviews, Surveys, Usability Testing, Synthesis, Journey Mapping, Information Architecture, Prototyping
Duration
3 Weeks
Design Team
The Problem
The current mobile app and station signage fall short in providing hands-free access, real-time updates, and clear navigation assistance, forcing commuters to rely on third-party apps like Google Maps.
The Solution
I designed an Apple Watch companion app with clear visual cues, haptic feedback, and integration with local navigation systems, providing real-time updates and hands-free functionality.
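As a rough illustration (not the shipped implementation) of how the watch app could pair real-time data with haptic feedback, the Swift sketch below pulls departure estimates from BART's public real-time (ETD) API and plays a gentle tap when a train is about to leave. The station code, API key placeholder, and parsing are assumptions made for the example.

```swift
import Foundation
import WatchKit

// Illustrative sketch: fetch real-time departure estimates from BART's
// public ETD endpoint and surface an imminent departure with a haptic cue.
struct Departure: Decodable {
    let destination: String
    let minutes: String   // the feed reports "Leaving" or a number as a string
}

final class DepartureService {
    private let apiKey = "YOUR_BART_API_KEY"   // placeholder, not a real key

    func fetchDepartures(for station: String) async throws -> [Departure] {
        var components = URLComponents(string: "https://api.bart.gov/api/etd.aspx")!
        components.queryItems = [
            URLQueryItem(name: "cmd", value: "etd"),
            URLQueryItem(name: "orig", value: station),   // e.g. "CIVC" for Civic Center
            URLQueryItem(name: "key", value: apiKey),
            URLQueryItem(name: "json", value: "y")
        ]
        let (data, _) = try await URLSession.shared.data(from: components.url!)
        return try parse(data)
    }

    private func parse(_ data: Data) throws -> [Departure] {
        // Placeholder: the real feed nests estimates per destination;
        // map that JSON into flat Departure values here.
        return []
    }

    /// Play a gentle haptic when the next train is about to leave,
    /// so commuters don't need to look at the screen.
    func notifyIfImminent(_ departures: [Departure]) {
        guard let next = departures.first,
              next.minutes == "Leaving" || (Int(next.minutes) ?? .max) <= 2
        else { return }
        WKInterfaceDevice.current().play(.notification)
    }
}
```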
RESEARCH
Methodology / Research framework
From defining the problem to offering guidance during design, testing, and validation, we had to ensure that biomimicry was integrated into every aspect of the workflow.
INSIGHTS
Pain points
iOS App Features
A unique elliptical icon inspired by an orbit-like form; this delightful shape guides users through key sections of the app.
Lack of Direction
Level up by earning badges for uploading your images to the community page. These badges double up as app icons.
VALIDATION
Why do we need a watch app?
Wearables are increasingly adopted for travel. A watch app offers glanceable information without fumbling for a phone, making it ideal for crowded commutes.
DESIGN PROCESS
Prioritising data points and designing the UI to optimise space for the small watch screen.
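To make that prioritisation concrete, here is a minimal SwiftUI sketch of the glanceable card; the NextTrain model and the styling are hypothetical, but they reflect the handful of data points that earned space on the small screen: destination, minutes to departure, line, and platform.

```swift
import SwiftUI

// Illustrative sketch of the prioritisation in code: the glanceable card
// surfaces only the highest-value data points and drops everything else
// to fit the small watch screen. Not the final shipped UI.
struct NextTrain {
    let line: String        // e.g. "Yellow"
    let destination: String // e.g. "Antioch"
    let minutes: Int
    let platform: Int
}

struct NextTrainCard: View {
    let train: NextTrain

    var body: some View {
        VStack(alignment: .leading, spacing: 4) {
            Text(train.destination)
                .font(.headline)
                .lineLimit(1)
            Text("\(train.minutes) min")
                .font(.system(size: 34, weight: .bold, design: .rounded))
            Text("\(train.line) line · Platform \(train.platform)")
                .font(.caption2)
                .foregroundStyle(.secondary)
        }
        .padding()
    }
}

#Preview {
    NextTrainCard(train: NextTrain(line: "Yellow", destination: "Antioch",
                                   minutes: 4, platform: 2))
}
```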
I conducted on-site research at multiple BART stations (Civic Center, Berkeley, MacArthur, Montgomery, and more).
SOLUTION
Bud.ai fits into design and engineering teams and can act as an interdisciplinary collaborator.
Whether you're sketching initial concepts or refining final prototypes, Bud’s suite of interactive tools — from advanced prompting controls to downloadable files — seamlessly integrates into your workflow.
1/3
To begin a project, use the robust input framework to enter your project specifications.
Each input field is thoughtfully designed to provide context for your project, and you can also upload files to help Bud understand exactly where you are in the process.
Bud compiles everything into one flexible workspace, where you can edit, change, or add elements anytime.
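For illustration only, here is a hedged Swift sketch of how the structured brief behind the input framework might be modelled; the field names, project stages, and Attachment type are assumptions rather than Bud.ai's actual schema.

```swift
import Foundation

// Hypothetical model of the structured brief captured by the input framework.
// The idea: gather context up front so the AI needs less clarification later.
enum ProjectStage: String, Codable {
    case concept, sketching, prototyping, troubleshooting
}

struct Attachment: Codable {
    let filename: String
    let kind: String   // e.g. "image", "CAD model", "technical blueprint"
}

struct ProjectBrief: Codable {
    var title: String
    var stage: ProjectStage
    var goals: String                   // what the designer wants Bud to explore
    var constraints: [String]           // e.g. materials, budget, manufacturing limits
    var sustainabilityTargets: [String] // circularity goals to optimise for
    var attachments: [Attachment]       // uploaded files that show where you are
}
```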
2/3
With a click, Bud proposes three swimwear ideas inspired by nature.
Bud generates interactive 3D models and helps you iterate through smart prompting, evolving with your ideas.
3/3
Troubleshooting an existing design? Bud.ai has got you covered.
It can propose solutions that take text, images, and even technical blueprints into consideration. You can pick up from wherever you left off, and Bud remembers your needs.
TECHNICAL SPECIFICATIONS
Bud.ai shouldn't just lift from living systems; it should also support them. Here's how we plan to train it to be circular.
Drawing on case studies that champion circularity and on libraries of sustainable materials, Bud's AI algorithm is designed to optimize for sustainability rather than just cost, performance, or speed.
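As a hedged sketch of what that weighting could look like in practice, the Swift snippet below ranks candidate materials with a simple multi-criteria score in which circularity carries the largest weight; the criteria, weights, and types are illustrative assumptions, not Bud's actual algorithm.

```swift
// Hypothetical multi-criteria ranking where circularity dominates the trade-off
// instead of being an afterthought. All values are normalised to 0...1.
struct MaterialOption {
    let name: String
    let cost: Double            // lower is better
    let performance: Double     // higher is better
    let circularity: Double     // higher is better (recyclability, reuse)
}

struct RankingWeights {
    var cost = 0.2
    var performance = 0.3
    var circularity = 0.5       // sustainability deliberately weighted highest
}

/// Rank candidate materials from most to least suitable under the weights.
func rank(_ options: [MaterialOption], weights: RankingWeights = .init()) -> [MaterialOption] {
    options.sorted { score($0, weights) > score($1, weights) }
}

private func score(_ option: MaterialOption, _ w: RankingWeights) -> Double {
    w.performance * option.performance
        + w.circularity * option.circularity
        - w.cost * option.cost
}
```

Keeping the weights explicit makes the trade-off auditable, so a designer can see why a material ranked where it did.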
REFLECTION
Designing UX for AI-powered products
I learned that transparency, context, and timing are crucial when embedding AI into creative workflows, especially for interdisciplinary users. The input framework was key to streamlining the workflow, reducing repetitive back-and-forth between the user and the AI interface.
What's next for Bud.ai
The next step is integrating Bud.ai into existing design tools through plugins, meeting users directly in their preferred environments. We're also focused on strengthening the solution exploration canvas and running real-world tests to better understand its impact and refine its capabilities.