Virtual assistants such as Siri have changed the way people interact with computers by enabling collaboration with humans through natural speech-based interfaces. However, relying on speech alone as the medium of communication is a limitation; non-verbal aspects of communication also play a vital role in natural human discourse. It is therefore necessary to understand how gesture and other non-verbal channels are used in order to apply them to the development of computer systems. We conducted an exploratory study to identify how humans use gesture and speech to communicate when solving collaborative tasks. We highlight differences in gesturing strategies in the presence and absence of speech, and show that combining gesture with speech resulted in faster task completion times than speech alone. Based on these results, we present implications for the design of gestural and multimodal interactions.

Isaac Wang, Pradyumna Narayana, Dhruva Patil, Gururaj Mulay, Rahul Bangar, Bruce Draper, Ross Beveridge, and Jaime Ruiz. 2017. Exploring the Use of Gesture in Collaborative Tasks. In Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems (CHI EA '17). ACM, New York, NY, USA, 2990-2997. DOI: https://doi.org/10.1145/3027063.3053239

@inproceedings{wang2017exploring,
 author = {Wang, Isaac and Narayana, Pradyumna and Patil, Dhruva and Mulay, Gururaj and Bangar, Rahul and Draper, Bruce and Beveridge, Ross and Ruiz, Jaime},
 title = {Exploring the Use of Gesture in Collaborative Tasks},
 booktitle = {Proceedings of the 2017 CHI Conference Extended Abstracts on Human Factors in Computing Systems},
 series = {CHI EA '17},
 year = {2017},
 isbn = {978-1-4503-4656-6},
 location = {Denver, Colorado, USA},
 pages = {2990--2997},
 numpages = {8},
 url = {},
 doi = {10.1145/3027063.3053239},
 acmid = {3053239},
 publisher = {ACM},
 address = {New York, NY, USA},
 keywords = {communication, gesture interaction},
}