Ph.D. Student, ECE+NLP
UC Santa Barbara
I am an incoming ECE PhD student in the NLP Lab at the University of California, Santa Barbara. In 2020 I completed my MS in Computer Engineering at Arizona State University, with research centered on speech processing, natural language processing, deep learning methods for low-resourced populations with neurological disorders, and affective computing.
I am excited about language technologies and the benefits they may bring to society; I am also apprehensive about how these technologies may be applied to nefarious ends in the future. Long term, my aim is to work in the space of maximizing the societal benefits and minimizing the societal harms of natural language understanding.
Currently, I am completing an Amazon Applied Science internship on the Alexa Edge ML team in Pittsburgh. This is the same team I worked with last summer, and we are hoping to complete two conference submissions by my internship's end.
I will be starting my PhD at UCSB in September. I am very lucky to have been granted the NSF Graduate Research Fellowship for 2020-2025!

Past
Previously, I completed my MS as a member of the Center for Cognitive Ubiquitous Computing and Brain Behavior Analytics Labs at ASU, where I pursued research on the characterization of dysarthric speech using both traditional machine learning and deep learning.
At the NSF Center for Efficient Vehicles and Sustainable Transportation Systems at ASU, I developed synchronous multisensor data-collection platforms for automobiles and processed LiDAR data for use in neural networks, continuing the work I did with the center for my EE senior design project.
I took an Applied Scientist Intern position at Amazon on the Alexa Hybrid Science team in Pittsburgh, PA, between May and August 2019. In this position I developed end-to-end spoken language understanding (SLU) neural networks for intent classification.
I worked for Aural Analytics as a Research Engineering Intern part time between December 2018 and May 2019, developing ASR- and DSP-based methods for analyzing speech to track and potentially detect early stage degenerative brain disease.
I have worked in The Luminosity Lab, an ASU strategic initiative that rapidly builds student-led teams to attempt ambitious solutions to big problems, where I helped produce projects ranging from drone swarm communication software to chat agents for online learning platforms.
Additionally, I have spent a summer at General Dynamics Mission Systems running software-level testing for the HOOK3 combat survival radio; worked in the Social Robotics Lab at the National University of Singapore, developing an emotive dialogue management system; and worked in the ASU Engineering Tutoring Center.