Francisco Ortega

Assistant Professor at Colorado State University (CSU)

Lecture Information:
  • December 2, 2022
  • 2:00 PM
  • CASE 241 & Zoom

Speaker Bio

Francisco R. Ortega is an Assistant Professor at Colorado State University (CSU) and Director of the Natural User Interaction Lab (NUILAB). Dr. Ortega earned his Ph.D. in Computer Science (CS) in the fields of Human-Computer Interaction (HCI) and 3D User Interfaces (3DUI) from Florida International University (FIU) under Dr. Naphtali Rishe and Dr. Armando Barreto. He also held the positions of Postdoctoral Researcher and Visiting Assistant Professor at FIU from February 2015 to July 2018. Broadly speaking, his research has focused on multimodal and unimodal (gesture-centric) interaction, including gesture recognition and gesture elicitation (a form of participatory design). His main research aim is to improve user interaction through (a) multimodal elicitation, (b) the development of interaction techniques, and (c) improved augmented reality visualization techniques. The primary domains for this work include immersive analytics, assembly, Navy use cases, and collaborative environments using augmented reality headsets. His research has produced over 76 peer-reviewed publications, including books, journal articles, conference and workshop papers, and magazine articles, in venues such as IEEE TVCG, ACM PACMHCI, ACM ISS, ACM SUI, and IEEE 3DUI, among others. He is the first author of the book Interaction Design for 3D User Interfaces: The World of Modern Input Devices for Research, Applications, and Game Development (CRC Press). Dr. Ortega has experience with multiple government-funded projects. For example, he was a co-PI on the DARPA Communicating with Computers project. He is currently the PI of a three-year ONR effort titled Perceptual/Cognitive Aspects of Augmented Reality: Experimental Research and a Computational Model, and he was recently awarded a new ONR grant titled Assessing Cognitive Load and Managing Extraneous Load to Optimize Training.
He has also been funded by the National Science Foundation and is a subawardee on an ONR project with Virtual Reality Rehab. During his initial tenure-track appointment at CSU, from August 2018 to August 2022, Dr. Ortega brought in over $3.3 million in external funding, with $2.8 million as principal investigator. Finally, Dr. Ortega is committed to diversity and inclusion, and his mission is to increase the number of underrepresented minorities in CS.


Abstract

The next wave of human-computer interaction technology includes augmented reality (AR) head-mounted displays (HMDs). While controllers remain the most common way to interact in virtual reality (VR), most AR (and some VR) HMDs include midair gesture interaction, in which the user extends their arm to interact with the virtual scene. Midair gestures are more intuitive for users than controllers, but they cause fatigue. Microgestures are a different type of interaction: small, discreet gestures that may be performed as a primary task or as a secondary task carried out concurrently with another gesture or primary task. For example, a primary task may be annotating a 3D plot, while a secondary task may be changing the font size of the plot's text. Microgestures are intended to be useful in hands-on situations, such as working in a kitchen, workshop, or office, or whenever a person's hands are otherwise occupied, and they can be used to handle interruptions to primary tasks [3, 90, 116]. Microgestures have been shown to be efficient and effective for direct and subtle interaction with modern computing systems. A simple everyday example of a microgesture is the volume control on a car steering wheel, where the driver pushes a thumb upward or downward without moving their hand from the wheel. With an HMD, a similar microgesture would be rubbing the thumb and index finger together to control a virtual volume knob.

What should microgestures look like in AR? Currently, there is no definitive answer, especially for complex 3D environments like those found in immersive analytics (immersive 3D data visualization). This talk will concentrate on Dr. Ortega's prior work on AR interaction and make the case for the importance of microgestures.

This event will be webcast live. Join via Mediasite live streaming.