Please join us for the Information Science Colloquium with guest Shiri Azenkot. Shiri Azenkot is an Assistant Professor at the Jacobs Technion-Cornell Institute at Cornell Tech who is broadly interested in human-computer interaction and accessibility. She recently received her PhD in Computer Science from the University of Washington, where she focused on eyes-free input on mobile devices using gestures and speech. Shiri received two Best Paper awards from ACM's ASSETS conference and has presented her work at other top HCI conferences (CHI and UIST). She received the University of Washington graduate student medal, a National Science Foundation Graduate Research Fellowship, and an AT&T Labs Graduate Fellowship. Shiri also holds a BA in computer science from Pomona College and an MS in computer science from the University of Washington.
Title: Eyes-Free Input on Mobile Devices
Abstract: I will discuss new methods and studies that aim to improve eyes-free data entry for blind mobile device users. Currently, mobile devices are generally accessible to blind people, but text entry is almost prohibitively slow. Studies show that blind people enter text on an iPhone at a rate of just 4 words per minute.
I will present *Perkinput*, a chording text entry method where users touch the screen with one to three fingers at a time in patterns based on Braille. Instead of soft keys, Perkinput uses concepts from signal detection theory to determine the user's input. Based on Perkinput, I developed *PassChords*, a touchscreen authentication method that has no audio feedback. Unlike current eyes-free input methods, PassChords doesn't echo a user's input, so it won't broadcast the user's password for others to hear. Finally, I will discuss another modality for eyes-free input: speech. I conducted a survey and a study to determine the patterns and challenges of using speech input to compose paragraphs on mobile devices. I will conclude by presenting current work on eyes-free methods for correcting speech recognition errors.