Why ‘body language’ will be the emoji of wearable tech


Communication is core to who we are as a species. Our ability to express our ideas, thoughts, and feelings to each other is what makes us human. Every new wave of technology has furthered our ability to communicate. The printing press democratized the printed word in books and newspapers; the telephone allowed us to talk to relatives across the sea; the computer gave us a way to express ourselves faster and to even more people through weblogs; and the immediacy of email and social media was extended by the mobile phone, with the introduction of text messaging and short-form languages like T9 and emoji.

The driver of wearable technology’s capacity to change communication will be haptic feedback.

As we sit at the dawn of a new wave of computing, wearable technology will continue to evolve our use of language and our ability to communicate. But the advancements this wave will bring in how we connect with one another won’t be visual or spoken, but rather sensed or felt by our bodies, and the driver behind all of this will be haptic feedback.

Touch is a critical part of how we communicate. We shake hands when we meet someone for the first time, tap on someone’s shoulder to get their attention, hug a friend to convey how much they mean to us, and so on. But until now, technology hasn’t been able to leverage this sense, because it needs access to our body. Smart bands, smartwatches, and smart clothing (shirts, shorts, socks, underwear, and so on) equipped with sensors and haptics have that access, which will break open our ability to communicate through body language and make touch a critical new addition to our technological lexicon.

As wearables mature, and we with them, I expect we will begin to learn a whole new language, as we did with T9 and emoji: a body Morse code, if you will, the ability to feel a message without having to look down at a screen or pick up the phone to hear someone’s voice. Eventually, we will know how someone feels or what they are thinking purely through the nuances of the vibration patterns we sense through our smart clothing and accessories. And although it is still early days, we are already starting to see the beginnings of how this will come together.

Apple Watch

Perhaps the most compelling example of how haptics are furthering our ability to communicate is the Apple Watch. The most disruptive aspect of the Apple Watch is its powerful Taptic Engine and the introduction of Digital Touch. Sending your heartbeat or tapping on someone’s wrist with Digital Touch may seem like a gimmick, but it’s actually the beginning of this new ‘body language’. Feeling your partner’s heartbeat on your wrist, or the incessant tapping of a friend telling you they “want to get out of here”, is a powerful experience, one that doesn’t require pictures or words to know what is going on. After wearing an Apple Watch for a couple of weeks, you become familiar with the nuances of the vibrations on your wrist: the short “tap-tap” of the standing reminder, say, versus the more celebratory buzz of a progress update from the Activity app.

Touch is a critical part of how we communicate. But tech hasn’t been able to leverage touch until now, because it needs access to our body.

What’s exciting is that this is just the start for the Apple Watch. Apple’s recent announcement of watchOS 2 invites developers to play with the Taptic Engine, which means we will soon be able to tell whether our bus is on time or delayed without even flicking our wrist to turn on the screen. It won’t be long before the apps you use most begin to communicate with you through vibration alone.
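To make that concrete, here is a minimal sketch of what such an app might do once developers can trigger the Taptic Engine through WatchKit. The transit scenario and the BusStatus type are assumptions for illustration; only the WKInterfaceDevice haptic call is a real API, shown here in its modern Swift spelling.

```swift
import WatchKit

// Hypothetical bus-status result; this type and how it gets fetched are
// assumptions for illustration, not part of any real transit API.
enum BusStatus {
    case onTime
    case delayed
}

func notifyWrist(of status: BusStatus) {
    // WatchKit lets an app play a named haptic pattern on the wrist,
    // so the wearer can tell the two outcomes apart without looking.
    switch status {
    case .onTime:
        WKInterfaceDevice.current().play(.success)  // a single reassuring tap
    case .delayed:
        WKInterfaceDevice.current().play(.failure)  // a sharper "something's off" buzz
    }
}
```

The point isn’t the specific patterns; it’s that two distinct vibrations are enough to carry the message without a glance at the screen.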

But it isn’t just understanding a message through touch that will become possible; we will feel the emotion and sentiment of that message as well. Imagine knowing the general sentiment of your location on Twitter just by sensing it, feeling the disappointment of losing a friend who has just unfollowed you on Facebook, or experiencing the emotion of an angry-face emoji or the dancing woman (olé!).

CuteCircuit’s popular Hug Shirt is a great example of a wearable that conveys emotion through haptics. The Hug Shirt lets you send hugs to other people wearing one, even across long distances. Embedded in the shirt are sensors that capture the strength, duration, and location of the touch, along with the sender’s skin warmth and heart rate, while the shirt’s actuators recreate the sensation of that touch, including its warmth, to convey the emotion.


Wearable Experiments has also played with feeling through its smart soccer jersey, the Alert Shirt, created in partnership with Foxtel. Fans wearing the Alert Shirt are able to ‘feel’ the action of the game as it happens live. Wearable Experiments leveraged real-time sports data to let a fan feel what their favorite player is feeling on the field. If he gets tackled, you feel like you just got hit; if he is nervous, then you feel a similar nervous sensation; and if he scores, you feel his excitement. Feeling another person was something Wearable Experiments continued to explore in Fundawear, a project the company did with Durex to explore the future of foreplay. Fundawear is a set of connected underwear for a man and a woman, paired with a smartphone, that uses sensors inside the garments to allow the partners to “tease, tickle and tantalise” each other even when they are apart.

For its Navigate jacket, Wearable Experiments used haptics for a more practical application. As the name suggests, Navigate is a smart jacket that helps direct you to your destination. It does this with integrated LED lights and haptic feedback, triggered by turn-by-turn directions from a connected smartphone app. The approach is very similar to Ducere’s smart shoes, Lechal, which, when paired with Google Maps, vibrate to let the wearer know whether to turn left or right to reach their destination.
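To illustrate the idea rather than the actual design of Navigate or Lechal, here is a small sketch of how turn-by-turn directions might be mapped onto left and right vibration motors. The Actuator protocol, pulse lengths, and motor layout are all assumptions made for the sake of the example.

```swift
// Hypothetical haptic turn-by-turn mapping; not any vendor's real design.
enum Turn {
    case left, right, straight, arrived
}

protocol Actuator {
    func pulse(durationMs: Int)
}

struct HapticNavigator {
    let leftMotor: any Actuator
    let rightMotor: any Actuator

    // Translate a navigation instruction into a pattern the body can read:
    // which side buzzes tells you the direction, how long tells you the event.
    func signal(_ turn: Turn) {
        switch turn {
        case .left:
            leftMotor.pulse(durationMs: 200)    // short buzz on the left
        case .right:
            rightMotor.pulse(durationMs: 200)   // short buzz on the right
        case .straight:
            break                               // no feedback; keep walking
        case .arrived:
            leftMotor.pulse(durationMs: 500)    // long buzz on both sides
            rightMotor.pulse(durationMs: 500)
        }
    }
}
```

The design choice worth noting is that location on the body carries the meaning, so the wearer never has to translate a beep or read a screen, which is exactly the promise of these products.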

With Google’s recent unveiling of its connected clothing offering, ATAP’s Project Jacquard, I expect we will see a flurry of advancements in the use of haptics and touch within our clothing over the next few years.

If you have doubts about the power of haptics, let me leave you with this dynamic TED Talk from David Eagleman, an American neuroscientist and director of the Laboratory for Perception and Action. Eagleman’s research explores translating sound into haptic feedback so that a deaf person can understand what is being said. It’s a must-watch talk, and it will leave you believing we have the capability to realize the future of communication through touch that I’ve described here.

You feel me?
