Human Machine Interaction (HMI) for Music Production
We have developed an interface by which musicians, producers, sound designers, and engineers can interact with their display/monitor by looking (eye tracking) and by operating in mid-air (hand gestures). Here is a demo of operating FL Studio (a DAW) using our technology.
Composing music from cognitive load (Human-Music Interaction/Computer Music)
We estimated cognitive load from the participant's pupil dilation and mapped the values to musical notes in the C major scale. We then shifted the notes to create harmony. The resulting piece thus represents the participant's cognition. We have ideas for extending this to music therapy and to music for children with spasticity.
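The dilation-to-note mapping can be sketched as below. This is a minimal illustration, not the project's exact mapping: the pupil-diameter range, the bin edges, and the diatonic-third harmonization interval are all assumptions.

```python
# C major scale, C4..C5, as MIDI note numbers.
C_MAJOR_MIDI = [60, 62, 64, 65, 67, 69, 71, 72]

def dilation_to_note(d_mm, d_min=2.0, d_max=8.0):
    """Linearly bin a pupil diameter (mm) onto one of the 8 scale degrees.
    The 2-8 mm range is an assumed plausible span, not a measured one."""
    d = min(max(d_mm, d_min), d_max)  # clamp to the assumed range
    idx = int((d - d_min) / (d_max - d_min) * (len(C_MAJOR_MIDI) - 1))
    return C_MAJOR_MIDI[idx]

def harmonize(note, degrees=2):
    """Shift a scale note up by two scale degrees (a diatonic third),
    one simple way to 'shift the notes to create harmony'."""
    i = C_MAJOR_MIDI.index(note)
    return C_MAJOR_MIDI[min(i + degrees, len(C_MAJOR_MIDI) - 1)]
```

A stream of dilation samples then becomes a melody line plus a harmony line, e.g. `harmonize(dilation_to_note(3.5))`.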
Eye gaze and Hand gesture interaction
We can interact with a computer just by looking at an icon and selecting it with a (tap) hand gesture. This technology combines an eye gaze tracker and a hand movement tracker, giving an intuitive way of communicating with a computer beyond touch or mouse interaction.
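The selection loop can be sketched as follows. This is an assumed, simplified model: the icon layout, hit-box radius, and event shape are illustrative, and the real trackers would supply the gaze point and tap flag.

```python
# Assumed icon layout: name -> center in screen pixels.
ICONS = {"play": (100, 100), "stop": (200, 100)}
HIT_RADIUS = 40  # assumed circular hit box around each icon

def icon_under_gaze(gaze_xy):
    """Return the icon whose hit box contains the gaze point, if any."""
    gx, gy = gaze_xy
    for name, (cx, cy) in ICONS.items():
        if (gx - cx) ** 2 + (gy - cy) ** 2 <= HIT_RADIUS ** 2:
            return name
    return None

def on_event(gaze_xy, tap_detected):
    """Gaze picks the target; the tap gesture commits the selection."""
    target = icon_under_gaze(gaze_xy)
    return target if tap_detected else None
```

Separating "point" (gaze) from "commit" (tap) avoids the Midas-touch problem of selecting everything the user merely looks at.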
Laser tracking for Large Screen/Display Interaction
We have developed a wireless laser tracking module for interacting with large screens (projector/big TV). It works like a virtual touch on the display, and the click-and-drag option gives the ability to write or draw on the screen.
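The core localization step, finding the laser spot in a camera frame, can be sketched as a bright-pixel centroid. The frame format (2-D grayscale list) and brightness threshold are assumptions; the real module's camera pipeline is not shown here.

```python
def laser_spot(frame, threshold=200):
    """Locate the laser dot as the centroid of pixels above a brightness
    threshold. frame: 2-D list of grayscale values (0-255).
    Returns (x, y) in pixel coordinates, or None if no dot is visible."""
    xs, ys = [], []
    for y, row in enumerate(frame):
        for x, v in enumerate(row):
            if v >= threshold:
                xs.append(x)
                ys.append(y)
    if not xs:
        return None
    return (sum(xs) / len(xs), sum(ys) / len(ys))
```

Mapping that pixel coordinate through a screen homography would then yield the "virtual touch" position on the projected display.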
Object Detection using Deep learning in ADAS
Detection of on-road objects is implemented on GPU using Faster R-CNN for boosted performance in ADAS. This project was done in collaboration with TATA Elxsi.
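Faster R-CNN itself requires a trained network, so as an illustration here is a sketch of one of its standard post-processing steps, non-maximum suppression (NMS), which prunes overlapping detections. The overlap threshold is an assumed typical value, not the project's setting.

```python
def iou(a, b):
    """Intersection-over-union of two boxes given as (x1, y1, x2, y2)."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0, ix2 - ix1) * max(0, iy2 - iy1)
    area = lambda r: (r[2] - r[0]) * (r[3] - r[1])
    union = area(a) + area(b) - inter
    return inter / union if union else 0.0

def nms(boxes, scores, thresh=0.5):
    """Greedily keep the highest-scoring boxes, discarding any box that
    overlaps an already-kept box by more than `thresh` IoU."""
    order = sorted(range(len(boxes)), key=lambda i: scores[i], reverse=True)
    keep = []
    for i in order:
        if all(iou(boxes[i], boxes[j]) < thresh for j in keep):
            keep.append(i)
    return keep
```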
Line tracking Robot for Matrix redemption
We designed an autonomous robot that can explore a 3 × 4 grid and find the embedded number that leads to the escape tunnel.
Haptic feedback module for VR
We developed a haptic feedback module for interacting with objects inside Virtual Reality (VR). The module is wireless and wearable for comfort.
Autonomous Mobile Navigation Robot
A robot was designed to navigate the campus without any human intervention using machine vision and machine audition.
Environment monitoring Dashboard using IoT nodes
We developed IoT nodes, each with a set of 4 sensors (light, humidity, smoke, and temperature), and streamed the data and alert signals to a central monitor, where the data are visualized in charts. We built our own PCBs for mounting the sensors onto Arduino boards. This project is part of a Smart Manufacturing dashboard.
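The node-side alerting can be sketched as a threshold check bundled with each streamed packet. The threshold values and packet fields here are assumptions for illustration, not the deployed configuration.

```python
# Assumed alert thresholds per sensor; readings above these raise an alert.
THRESHOLDS = {"temperature": 45.0, "humidity": 80.0, "smoke": 300, "light": 900}

def build_packet(node_id, readings):
    """Bundle a node's sensor readings with any threshold-crossing alerts,
    ready to stream to the central monitoring dashboard."""
    alerts = [k for k, v in readings.items()
              if v > THRESHOLDS.get(k, float("inf"))]
    return {"node": node_id, "readings": readings, "alerts": alerts}
```

On the dashboard side, the `alerts` list drives the alert signals while `readings` feeds the charts.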
Eye gaze controlled Surveillance Camera
The camera, mounted on servos, is controlled by eye gaze tracking. This way, we can view the entire room across 180 degrees of azimuth and elevation. This project was done for the Mechatronics coursework.
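The gaze-to-servo mapping can be sketched as below. Normalized gaze coordinates and a simple linear scaling to the 0-180 degree servo range are assumptions; the actual calibration may differ.

```python
def gaze_to_servo(gx, gy):
    """Map normalized gaze coordinates (0..1 across the tracker's view)
    to (azimuth, elevation) servo angles over the 180-degree sweep."""
    clamp = lambda v: min(max(v, 0.0), 1.0)  # guard against out-of-range gaze
    return (clamp(gx) * 180.0, clamp(gy) * 180.0)
```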
VR Tobii HTC
We demonstrate the HTC VR kit with integrated Tobii eye tracking for interacting with and manipulating objects in VR.
Drowsiness Detection using Eye tracking Glasses
We developed a method to detect drowsiness and give an alert when the participant falls asleep. We used Tobii Pro Glasses to build this.
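The alert rule can be sketched as "eyes closed for too long". The per-frame eye-open signal would come from the glasses' eye tracking; the 50 Hz frame rate and 1.5 s closure threshold are illustrative assumptions, not the project's tuned values.

```python
def detect_drowsiness(eye_open, dt=1 / 50, max_closed_s=1.5):
    """eye_open: sequence of per-frame booleans sampled every `dt` seconds.
    Return True as soon as the eyes stay closed for max_closed_s or longer."""
    closed = 0.0
    for is_open in eye_open:
        closed = 0.0 if is_open else closed + dt  # blink resets the timer
        if closed >= max_closed_s:
            return True
    return False
```

Short blinks reset the timer, so only a sustained closure (falling asleep) triggers the alert.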
Dynarack - Housing Solution
A dynamic rack system was designed to store household items. It is a novel outcome of the Product Design and Prototyping contest held by the TATA Center at IIT Bombay. Racks above human reach can be brought down to the user with little mechanical effort and then pulled back up to settle in their original places.
GSM based Laptop Anti-theft System
An embedded module was designed to detect and track a stolen laptop using a GSM module driven by a microcontroller with an accelerometer.
Quadcopter Design
A quadcopter with 1 kg load-carrying capacity and 45 m flying height was designed.
Industrial Packing Robot
An autonomous line-tracking robot was designed to pick up and drop packages into their respective containers using image processing for color extraction and optical character recognition (Shaastra, IIT Madras).