Professor of Computer Science at Tufts University
Robert Jacob is a Professor of Computer Science at Tufts University, where his research interests are new interaction modes and techniques and user interface software; his current work focuses on implicit brain-computer interfaces. He has been a visiting professor at the University College London Interaction Centre, Université Paris-Sud, and the MIT Media Laboratory. Before coming to Tufts, he was in the Human-Computer Interaction Lab at the Naval Research Laboratory. He received his Ph.D. from Johns Hopkins University, and he is a member of the editorial board of the journal Human-Computer Interaction and a founding editorial board member of ACM Transactions on Computer-Human Interaction. He has served as Vice-President of ACM SIGCHI, Papers Co-Chair of the CHI and UIST conferences, and General Co-Chair of UIST and TEI. He was elected to the ACM CHI Academy in 2007 and as an ACM Fellow in 2016.
The current focus in my research group is on a new generation of brain-computer interfaces. Brain-computer interaction has made dramatic progress in recent years, but its main application to date has been for physically disabled users. Our research in real-time measurement and machine-learning classification of functional near-infrared spectroscopy (fNIRS) brain data lets us develop, use, and evaluate brain measurement as input to adaptable user interfaces for the larger population.
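As a rough illustration of the classification step described above (not the group's actual pipeline), the sketch below trains a standard classifier on synthetic features standing in for fNIRS-derived measures. The specific features, the two-class low-vs-high workload framing, and the choice of a linear SVM are all illustrative assumptions.

```python
import numpy as np
from sklearn.svm import SVC
from sklearn.model_selection import train_test_split
from sklearn.pipeline import make_pipeline
from sklearn.preprocessing import StandardScaler

rng = np.random.default_rng(0)

# Synthetic stand-in for per-trial fNIRS features (e.g., mean and slope of
# oxygenated-hemoglobin change over a time window, per channel). Real data
# would come from the fNIRS device after filtering and artifact removal.
n_trials, n_features = 200, 8
X_low = rng.normal(0.0, 1.0, (n_trials // 2, n_features))   # "low workload" trials
X_high = rng.normal(1.0, 1.0, (n_trials // 2, n_features))  # "high workload" trials
X = np.vstack([X_low, X_high])
y = np.array([0] * (n_trials // 2) + [1] * (n_trials // 2))

X_train, X_test, y_train, y_test = train_test_split(
    X, y, test_size=0.25, random_state=0, stratify=y)

# Standardize features, then fit a linear SVM -- a common choice for
# small-sample physiological classification. The trained model could then
# label incoming windows of brain data in real time.
clf = make_pipeline(StandardScaler(), SVC(kernel="linear"))
clf.fit(X_train, y_train)

accuracy = clf.score(X_test, y_test)
print(f"held-out accuracy: {accuracy:.2f}")
```

In a real-time setting, the fitted classifier's per-window predictions, rather than a one-off accuracy score, would drive the interface adaptation.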
We are using brain input to obtain information about users and their context directly and effortlessly from their brain activity, and we then use that information to adapt the user interface in real time. We are creating and studying these new user interfaces, with emphasis on domains where we can measure their efficacy.
We are now also broadening this work to include other forms of lightweight, passive, real-time adaptive user interfaces, based on physiological or other measurements. Our focus continues to be on the design of subtle and effective interfaces that make judicious use of the measurements we can obtain.