Join us at 2:30 p.m. Friday, February 19, for a virtual Info Sci Colloquium with Angelique Taylor, who will present "Perception and Decision-Making Systems for Human-Robot Teaming in Safety-Critical Environments."
Angelique Taylor is a Ph.D. candidate in Computer Science and Engineering at UC San Diego. Her research lies at the intersection of computer vision, robotics, and health informatics. She develops systems that enable robots to interact and work with groups of people in safety-critical environments. She has received the NSF GRFP, the Microsoft Dissertation Award, the Google Anita Borg Memorial Fellowship, the Arthur J. Schmitt Presidential Fellowship, a GEM Fellowship, and an award from the National Center for Women & Information Technology (NCWIT). More information on her research can be found at angeliquemtaylor.com.
Title: "Perception and Decision-Making Systems for Human-Robot Teaming in Safety-Critical Environments"
Abstract: In this talk, I will present my current and future work on developing perception and decision-making systems that enable robots to team with groups of people. My core focus is on problems that robots encounter in human-robot teaming, including perception of human groups and social navigation, particularly in safety-critical environments.
First, I will discuss how I developed computer vision methods that enable robots to detect and track their teammates in real-world environments. Most group perception methods rely on fixed, overhead cameras (i.e., an exo-centric, third-person perspective) to sense groups of people, which makes them impractical for mobile robots in most settings. I have developed a group detection and tracking system designed for ego-centric (i.e., first-person) sensing, which is better suited to mobile robots and enables them to enter any environment and accomplish their goals without requiring external sensing.
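The abstract does not describe the method itself, so the following is only a rough illustration of the ego-centric group-perception idea, not the speaker's system. It is a minimal sketch that assumes an upstream person detector has already produced ground-plane positions from the robot's first-person camera; the names `PersonDetection`, `group_people`, and the `max_gap` threshold are hypothetical.

```python
from dataclasses import dataclass
from itertools import combinations

# Hypothetical detection type: 2D ground-plane positions (meters) of people
# seen from the robot's ego-centric camera. In a real pipeline these would
# come from a person detector plus depth estimation, with persistent track IDs.
@dataclass(frozen=True)
class PersonDetection:
    track_id: int
    x: float  # lateral offset from the robot, meters
    y: float  # distance ahead of the robot, meters

def group_people(detections, max_gap=1.5):
    """Cluster detections into groups by spatial proximity.

    Two people are placed in the same group if they stand within `max_gap`
    meters of each other (single-linkage clustering via union-find).
    """
    parent = {d.track_id: d.track_id for d in detections}

    def find(i):
        while parent[i] != i:
            parent[i] = parent[parent[i]]  # path halving
            i = parent[i]
        return i

    def union(i, j):
        parent[find(i)] = find(j)

    for a, b in combinations(detections, 2):
        if ((a.x - b.x) ** 2 + (a.y - b.y) ** 2) ** 0.5 <= max_gap:
            union(a.track_id, b.track_id)

    groups = {}
    for d in detections:
        groups.setdefault(find(d.track_id), []).append(d.track_id)
    return list(groups.values())

# Example: three people ahead of the robot; two stand close together.
frame = [PersonDetection(0, -0.4, 2.0),
         PersonDetection(1, 0.6, 2.2),
         PersonDetection(2, 3.5, 4.0)]
print(group_people(frame))  # [[0, 1], [2]]
```

Because the clustering runs on detections from the robot's own viewpoint, no fixed overhead camera or other external sensing is assumed.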
Next, I will discuss this work contextualized within a real-world application: human-robot teaming in healthcare. I am developing systems for hospital Emergency Departments (EDs), where frontline healthcare workers have been overwhelmed by the COVID-19 pandemic. I will describe my work characterizing ED care delivery and staff workflow to enable robots to operate in these challenging environments. Building on this, I designed a social navigation system that enables robots to account for the severity of patients' conditions while navigating the ED, to avoid interrupting care delivery. My work will enable robots to operate in safety-critical, human-centered environments, and ultimately help improve patient outcomes and alleviate clinician workload.
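To make the acuity-aware navigation idea concrete, here is a minimal sketch under stated assumptions, not Taylor's actual planner: the ED is modeled as a grid, a plain Dijkstra search plans the route, and cells near high-acuity patients incur an extra traversal penalty so the robot detours around the most critical bedsides. The names `plan_path`, `cell_cost`, and `acuity_weight` are hypothetical.

```python
import heapq

def plan_path(grid_size, start, goal, patients, acuity_weight=5.0, radius=2):
    """Dijkstra over a grid where cells near high-acuity patients cost more.

    `patients` maps (row, col) cells to an acuity score in [0, 1]; cells
    within `radius` of a patient pick up a penalty proportional to acuity.
    """
    rows, cols = grid_size

    def cell_cost(cell):
        r, c = cell
        penalty = 0.0
        for (pr, pc), acuity in patients.items():
            if abs(r - pr) <= radius and abs(c - pc) <= radius:
                penalty += acuity_weight * acuity
        return 1.0 + penalty  # base traversal cost plus proximity penalty

    frontier = [(0.0, start, [start])]
    visited = set()
    while frontier:
        cost, cell, path = heapq.heappop(frontier)
        if cell == goal:
            return cost, path
        if cell in visited:
            continue
        visited.add(cell)
        r, c = cell
        for step in ((r + 1, c), (r - 1, c), (r, c + 1), (r, c - 1)):
            nr, nc = step
            if 0 <= nr < rows and 0 <= nc < cols and step not in visited:
                heapq.heappush(frontier, (cost + cell_cost(step), step, path + [step]))
    return float("inf"), []

# Example: a 6x6 ward with one critical patient (acuity 0.9) mid-corridor;
# the planner prefers routes that skirt the penalized region when possible.
cost, path = plan_path((6, 6), start=(0, 0), goal=(5, 5), patients={(2, 3): 0.9})
print(round(cost, 1), path)
```

The penalty term stands in for whatever severity signal the real system consumes; the point of the sketch is only that patient acuity can be folded into the navigation cost so that care delivery is not interrupted.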