Chances are your phone spends more time in a pocket (or a jacket, bag, or purse) than in your hand. While we might typically think of all this time as “not using our phone”, it represents an important opportunity for human-computer interaction. In this talk, I’ll describe work we have been doing on context sensing, context sharing, context APIs, and even touch input while your phone is in your pocket.
Bio: I am a Researcher in the Computational User Experiences (CUE) group at Microsoft Research. My general research interests are Human-Computer Interaction (HCI) and Ubiquitous Computing (UbiComp). I spend most of my time creating new human-computer input and output techniques. I also write my bios in the first person. The broad goal of my work is to enable computing to aid people throughout every aspect of their lives. My focus toward this goal is the concept of Always-Available Computing: the idea that computing can and should be at our fingertips no matter where we are or what we are doing. In 2010 I completed my PhD in the Computer Science & Engineering department at the University of Washington, where I was advised by Professor James Landay and Dr. Desney Tan. In my dissertation, I created new human-computer interfaces by exploring techniques to harness the untapped bandwidth of the human body for physiological interfaces to computing, focusing on muscle-computer interfaces. This work led to many publications and media coverage, and to my being honored as one of Technology Review's 2010 Young Innovators Under 35.
Note: Due to technical issues, the rest of the talk was not recorded; the remaining slides will be posted as a separate video file. Slides can be viewed at vimeo.com/80535518 with the password "slides".