I am currently a Staff Research & Engineering Lead at Apple. I am a hybrid software and hardware engineer, researcher, and rapid prototyper interested in the future of machine perception, mobile and wearable devices, and robotics. I hold a PhD in computer science from Georgia Tech, where I was a Google PhD Fellow. Previously, I was a research scientist and software project lead at the augmented reality startup North Inc., developing experiences for Focals everyday smart glasses. I have worn a computer with a head-up display in my daily life for 7+ years.
My research interests lie at the intersection of AI/ML and intelligent systems. I design and build devices, systems, and algorithms that enable new applications in everyday computing, wearables, and AR/VR. My work focuses on contextual understanding, gesture and activity recognition, and novel user experiences, drawing on custom electronics and sensors, applied machine learning, computer vision, HCI, interface design, and physical prototyping.
Past projects include: contextual awareness for smart glasses; silent speech recognition using tongue movement and inner-ear deformations; early prototypes of the Fitbit Relax guided-breathing feature, which uses heart rate variability and now ships on millions of trackers; a search and rescue robot with non-contact, non-invasive Doppler radar vital-sign sensing; and non-voice acoustics for rapid input on smartwatches.
I’m always excited to collaborate on research-backed products and prototypes that reimagine how people sense, interact with, and communicate through technology.