Hi, I'm Amanda Swearngin.

I’m currently an AIML Research Scientist at Apple. I develop human-centered AI tools that accelerate developers’ work, and train machine learning models for understanding user interfaces. I also build large-scale pipelines for collecting human-labeled datasets to train these models. I have a strong interest in building better tools to help developers make their interfaces more accessible. I have developed and released production-level code for multiple Apple features, including Screen Recognition and Accessibility Inspector.

Achievements

  • Accessibility Developer Tools: Developed a large-scale system to collect and report accessibility issues for mobile apps, used for internal accessibility reporting across multiple prominent Apple apps and released as an Xcode feature in Accessibility Inspector.
  • UI Understanding: Developed an ML-driven internal framework (Swift and Python) used by 20+ teams and products to recognize UI elements.
  • Screen Recognition: Shipped the first on-device ML model to recognize and announce iOS app UIs for VoiceOver, benefiting 20M+ blind users. CHI Best Paper award.
  • Recognized as an expert in accessibility developer tools.

Graduate School: I earned a Ph.D. in Computer Science at the University of Washington. I was advised by Amy Ko and James Fogarty. I researched systems and interfaces for UX/UI designers that apply techniques from diverse areas including program analysis, synthesis, constraint solving, and machine learning.

Through this research, I created systems to help interface designers explore and adapt alternative and example interfaces, and analyze the usability of an interface without needing to collect any data. For this research, I collaborated with industry researchers through internships with Adobe Research and Google, and have conducted over 100 interviews and study sessions with interface designers. My research was supported by the National Science Foundation Graduate Research Fellowship.

Industry: Previously, I spent 3 years as a full-time software development engineer at Microsoft, where I helped build a web interface framework for Microsoft Dynamics and specialized in user interface layout, patterns, and visual regression testing.

Research Experience

Since 2020 Apple
Research Engineer/Scientist
I develop human-centered AI tools that accelerate developers’ work, train machine learning models for understanding user interfaces, and build large-scale pipelines for collecting human-labeled datasets to train these models. I have developed and released production-level code for multiple Apple features, including Screen Recognition and Accessibility Inspector.
2015 - 2019 Code & Cognition Lab, Fogies Lab, University of Washington
Graduate Student Researcher
Advisor(s): James Fogarty, Amy Ko
I designed and developed 4 systems applying constraint solving, data-driven design, and machine learning to help UI/UX designers work with examples and alternatives and conduct usability evaluations within interactive design tools, in collaboration with industry partners (i.e., Adobe, Google).
2019 Microsoft Research
Research Intern (Ideas Group)
Mentor(s): Shamsi Iqbal
I conducted a company-wide survey on information capture from mobile devices, and collaborated with two product teams to develop a cross-device system (i.e., mobile and desktop) for capturing and linking document-related information (e.g., photos, bookmarks).
2018 Google, Inc.
Student Researcher & Intern (Reflection Group, Google Research)
Mentor(s): Yang Li
I developed a crowdsourcing interface, collected a dataset of over 20k labels through Mechanical Turk, and constructed a deep neural network model (TensorFlow) to automatically predict the tappability of mobile interfaces, helping designers evaluate the tappability of their interfaces without needing to collect any data.
2016, 2017 Adobe Research
Research Intern
Mentor(s): Mira Dontcheva, Wilmot Li, Morgan Dixon, Joel Brandt
I created Rewire, an interactive system that helps designers leverage example screenshots by automatically inferring a vector representation with editable shape and style properties; I built 3 novel design-assistance modes and evaluated Rewire with 16 interface designers.
2010 - 2012 University of Nebraska-Lincoln
Graduate Research Assistant
Advisor(s): Myra B. Cohen
I designed and implemented CogTool-Helper, which uses UI testing frameworks to automatically create storyboards for predictive human performance modeling of user interfaces, helping designers evaluate human performance and detect performance regressions in their interfaces.

Software Engineering Experience

2012 - 2015 Microsoft Corporation
Software Development Engineer II, Software Development Engineer in Test (SDET)
I primarily designed, developed, and tested web client framework features for Dynamics AX, Microsoft’s cloud-based ERP, and was the primary developer for client layout and UX patterns. I also developed a visual regression testing framework to validate the product across browsers and environments, and integrated it into the build system.
2010 Cerner Corporation
Software Engineering Intern
I conducted performance analyses and implemented C++ performance improvements that were put into production in Cerner’s core application (PowerChart), and conducted static analysis runs to improve code quality.
2009 Cerner Corporation
Software Engineering Intern
I designed a UI and built an interactive patient summary web app for the iPhone using JavaScript, CSS, and HTML.