Canadian Startup Revolutionizes Tech with Mind-Controlled Devices


As we immerse ourselves in a future dominated by voice-command technology such as Amazon’s Alexa, Google Assistant, and Apple’s Siri, one innovative Canadian startup is taking the concept a step further, bringing automation to a level where users won’t have to lift a finger.

Aavaa, a Montreal-born computer and electronics manufacturing firm, claims to have developed a technologically advanced headset that converts brain and biosignals into commands which appliances can interpret. This revolutionary tech grants devices the unique ability to comprehend user instructions without requiring any physical or verbal prompts.


“As we work towards pioneering brain-computer interfaces, we aim to understand the attention and intent of users as they operate their devices,” says Aavaa’s founder and CTO, neuroscientist Naeem Komeilipoor. In an interview, he explained that the technology works by monitoring head and eye movements as well as facial expressions such as blinking or jaw clenching.

Komeilipoor, formerly a researcher at the University of Montreal, explains that Aavaa’s range of products, including earpieces, glasses, and headsets, lets consumers control home appliances through eye movements alone.

Amid the allure of what seems practically like telekinesis, such as switching on your television with a single glance, Komeilipoor stresses the substantial medical implications of the technology. It can allow individuals with paralysis to steer wheelchairs, enable enhanced biometric monitoring for critical-care patients, and offer significant benefits to people with speech and hearing impairments.

Komeilipoor, sharing a personal anecdote, recounts, “Growing up, I watched my grandparents struggle with hearing aids that often fail to work optimally in noisy environments due to their inability to discern the user’s focus on specific sounds.”

He refers to this phenomenon as “auditory scene analysis” or “auditory stream segregation”: the way the human auditory system separates and organizes different sounds. Despite advances in hearing-aid technology that reduce environmental noise, these devices still lack the ability to understand the listener’s intent.

Komeilipoor emphasizes that to solve this conundrum, “researchers need to study the brain and biosignals or monitor the head and eye movement.”

Armed with attention tracking, speech enhancement, and wearable sensor technology, Aavaa’s devices aim to decode cognitive signals that computers have, until now, been unable to interpret.

Komeilipoor concludes, “The crux is not just making machines understand us. This type of technology simplifies the process immensely, so machines can effortlessly grasp our intent and deliver services accordingly.”