Multimodal Interaction Design and Prototyping
Note : The interaction design concepts and ideas are inspired by my previous exploration projects at Samsung R&D.
Due to the NDA and high-security policy, actual prototypes from Samsung are not displayed here. These prototypes were recreated as part of a rapid prototyping certification. (View Certificate)
Duration - 4 Weeks (December 2022)

Tools -
- Unity 3D for high-fidelity prototyping
- Interaction SDK
- Shapes XR for low-fidelity prototyping and bodystorming
- Blender for 3D modeling
Brief :
Analyze existing hand-tracking-based VR and MR applications and identify scope for improving the user experience.
Tasks :
- Research
- Interaction Design
- Prototyping Hand Interactions
- User Testing
Problem Statement :
For applications that use hand tracking, or when multiple applications are open in different windows, the user has to keep their arm raised for long periods to interact with them. Prolonged use with a raised arm can cause fatigue, which degrades the user experience.
Proposed Solution :
Multimodal interaction.
Apart from interactions using hand rays, users get a second option: head gaze or eye gaze activates the application window (or the part of the application) to interact with, and hand gestures then perform the interactions (navigate, button press, etc.).
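As a rough illustration of this idea, the core of the multimodal scheme is a focus model: gaze sets the interaction target, and subsequent hand gestures are routed to that target no matter where the hand is held. The sketch below is hypothetical (all names are mine, not from the actual Unity prototype, which was built with the Interaction SDK):

```python
class MultimodalRouter:
    """Routes low-effort hand gestures to whichever window gaze has focused.

    Hypothetical sketch of the gaze-to-activate, gesture-to-interact idea;
    the real prototype implemented this in Unity 3D with the Interaction SDK.
    """

    def __init__(self):
        self.focused_window = None  # set by gaze, consumed by gestures

    def on_gaze(self, window_id):
        # Eye or head gaze activates a window without any arm movement.
        self.focused_window = window_id

    def on_gesture(self, gesture):
        # Hand gestures (e.g. a pinch at waist level) act on the
        # gaze-focused window, so the arm never has to stay raised.
        if self.focused_window is None:
            return None
        return (self.focused_window, gesture)
```

Because gaze and hand input are decoupled, the hand can rest at waist level while gaze does the pointing, which is the fatigue-saving property the solution targets.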
Prototype :
A high-fidelity prototype was built in Unity 3D to test the interactions.
- The user selects the application to interact with using gaze (eye or head).
- The user performs a pinch gesture without raising the arm and starts interacting with the application.
- Pinch + Move Left/Right - horizontal scroll.
- Pinch + Move Up/Down - vertical scroll.
- Pinch + Move Front/Back - an additional axis for interaction, e.g. zooming an image in/out, or switching from month view to week view in a calendar application.
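The pinch-and-move mappings above can be sketched as a dominant-axis classifier: while a pinch is held, the hand's displacement since the pinch began is reduced to whichever axis moved most, with a small deadzone to ignore jitter. This is a hypothetical Python sketch with an assumed deadzone value; the actual prototype implemented the gestures in Unity 3D:

```python
DEADZONE = 0.02  # metres of hand travel ignored as jitter (assumed value)

def classify_pinch_move(dx, dy, dz):
    """Map hand displacement while pinching to an interaction.

    dx: left/right, dy: up/down, dz: front/back (metres).
    Returns the action for the dominant movement axis, or None
    if the hand has not moved past the deadzone.
    """
    magnitudes = {"horizontal_scroll": abs(dx),
                  "vertical_scroll": abs(dy),
                  "zoom": abs(dz)}  # front/back supplies the extra axis
    action = max(magnitudes, key=magnitudes.get)
    if magnitudes[action] < DEADZONE:
        return None
    return action
```

For example, `classify_pinch_move(0.10, 0.01, 0.0)` returns `"horizontal_scroll"`, while a displacement dominated by the front/back axis returns `"zoom"`, which an application can interpret as zooming an image or switching calendar views.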