May #WWTO argues mind-controlled software, emotional AI the future of computing

While many past We Are Wearables Toronto events have explored the wearable industry’s impact on consumer-facing sectors like fashion, virtual reality, and photography, this month’s meetup tackled a more complex topic: brain-computer interfaces (BCIs) and affective computing, and how our minds and emotions are being leveraged to change the face of computing.

Bringing in diverse perspectives, the event hosted speakers and panelists from the neurotech and wearable startup spaces, including Dr. Kang Lee, developmental researcher at the University of Toronto and chief science officer at Nuralogix; Lana Novikova, CEO of Heartbeat.ai; and Vuk Pavlovic, research manager at True Impact.

Lee kicked off the event by stressing that if humans want AI to be effective in their daily lives, it will be critical to make AI systems not just intellectually smart, but also emotionally smart.

“If you look at all these AI systems, one thing in common is that they are intellectually very smart but they are, emotionally, very stupid,” said Lee. “If we want to have AI play a significant role in our everyday activities, I think we need to put emotional intelligence into AI.”

Lee pointed to Toronto-based Nuralogix, which is building DeepAffex, a system that uses transdermal optical imaging to read blood flow beneath the skin and detect a person’s emotional state, something Lee said will make devices and machines both intellectually and emotionally intelligent.

Novikova, who built Heartbeat.ai, an empathy analytics platform that uses natural language processing to help people understand others’ emotions, explained that as the world’s population grows and man-made crises become more common, building emotionally intelligent apps will be important.

“There are 7.5 billion people on this planet today. By 2020, we will reach 7.7 billion people,” said Novikova. “Seven in 10 will have a mobile phone with an internet connection. One in 10 will suffer from anxiety and depression. It means that if we build affordable and effective apps that would help people ease anxiety and build emotional resilience, this will make humans slightly…kinder and might prevent some man-made disasters.”

WWTO also hosted a panel featuring Graeme Moffat, vice-president of research and regulatory affairs at InteraXon; Dano Morrison, projects officer at NeuroTechTO; Alex Ni, founder and CTO at Avertus; and Mark David Hosale, associate professor in the department of computational arts at York University, who discussed the opportunities in computing unlocked by using the mind.

When WWTO founder Tom Emrich asked the panelists about BCIs as the next wave of computing, one made possible by wearable technology, Moffat explained that there are three types of BCI: passive BCI, which enriches human-machine interaction by “passively adapting software through your brain”; reactive BCI; and active BCI.

Moffat stressed that “the hardest one that we really don’t have consumer examples of” is active BCI, which would allow humans to control a computer or software simply by thinking a command.

When asked where BCIs might find applications, Ni gave examples such as helping blind people see or allowing paraplegics to walk. “There’s a lot of different applications, whether it’s clinical…marketing, consumer electronics, meditation, all kinds of different neurological research,” said Ni.

The panelists also touched on the fact that while companies like InteraXon have developed wearables such as Muse, a flexible headband with six non-contact electroencephalogram (EEG) sensors built into its loop, the field may already be moving past EEG.

“We don’t do a lot of EEG. We’ve done EEG in the past at consumer level; we don’t find this useful,” said Hosale. “What we find is that it’s actually more reliable, things like heart rate, skin conductivity, respiration, really like full range, physiological. We find…that [EEG] signals are very sensitive. They’re very hard to pick up and there’s a lot of noise, and so there’s a lot involved.”

Overall, it seems that BCI is the next stage of wearables, but according to Moffat, it will take some time before these technologies can be fully adopted.

“When it comes to brain-computer interfaces, or for that matter, any kind of device that supports interface with the brain directly…your reference point is beating its peripheral routes,” said Moffat. “It’s very, very difficult to input information to take action. If you want to build some sort of software, some sort of system for measuring emotions, you have to beat the ability of the software to look at someone’s face and say what they’re feeling.”

Amira Zubairi

Amira Zubairi is a staff writer and content creator at BetaKit with a strong interest in Canadian startup, business, and legal tech news. In her free time, Amira indulges in baking desserts, working out, and watching legal shows.
