If you're with a company looking to lead in just about any sector, the catchphrase for your business plan is "data-driven."
But we've seen how a company's quest for big data insights can hit a wall of privacy constraints, in perhaps the starkest way imaginable: a woman in the US reportedly launched a proposed class-action lawsuit against a Canadian company over the We-Vibe vibrator. The suit alleges that the company secretly collects and transmits sensitive data, like the time and duration of usage for the sex toy, along with the user's email.
With advances in wearables and the IoT making issues like these potentially more common, BetaKit chatted with Vancouver-based Boughton Law associate Elizabeth Reid about the scope of the problem and what tech companies (or any company dealing with data) can do to avoid these kinds of legal troubles.
"There are privacy issues arising all the time, particularly around wearable tech," Reid says, noting that many companies are dealing with high volumes of information, much of it personal. "The first key thing to do is, you've got to tell your customers what you're collecting. The customer has a right to know what information you're collecting, why you're collecting it and how you're going to use it." Customers should not be surprised by the data they're sharing; the company should obtain informed consent. The more intimate the information you're collecting, the more specific you need to be.
Fallout is still coming from the extramarital affair-facilitation website Ashley Madison. The hack exposed about 30 million user accounts, one of the biggest privacy breaches in modern history. The Privacy Commissioner of Canada slammed the company for sloppy security practices that made its users vulnerable.
"One of the lessons for companies here is that it shows just how long you're going to be tied up in these proceedings," Reid said. "This has been going on for over a year now. Just in terms of the management time and legal time being used up, it's considerable. These people are not focused on serving the company."
One factor that got the company into trouble: it represented to customers that the product was safe and that their privacy was absolutely protected by a discreet service. Those claims were at the heart of Ashley Madison's business model and brand, which made it all the more imperative for the company to deliver before privacy investigators got on the scene.
For tech companies and startups in particular, where every employee may wear many hats, it's critical to assign at least one person to watch for privacy issues, Reid says. "They need to have training to look for problems. If it's not specifically their responsibility, when you've got startup people juggling balls and not necessarily working with a strict chain of command, it's easy for this kind of thing to slip through the cracks. You need a privacy officer, and ideally more than one person, designated to look for privacy issues."
Companies also need to design products on the assumption that people will make mistakes with the technology. "For instance, there was one company that sent out invoices with a weblink, where a recipient didn't cut and paste it, but typed it out manually and transposed one of the numbers," Reid explains. "They suddenly had a different company's information." Privacy and data security need to be built into the user experience design.
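The failure Reid describes is the classic risk of sequential identifiers in links: transpose one digit of `invoice/1042` and you land on `invoice/1024`, a valid record belonging to someone else. A common mitigation, sketched below with hypothetical names (the article doesn't describe the company's actual system), is to key such links with long random tokens, so that a typo almost certainly matches no record at all:

```python
import secrets

def make_invoice_link(base_url: str) -> tuple[str, str]:
    """Generate an invoice link keyed by an unguessable random token.

    Unlike sequential IDs, a mistyped or transposed character in a
    32-character random token will, with overwhelming probability,
    match no record at all rather than another customer's invoice.
    """
    token = secrets.token_urlsafe(24)  # 24 random bytes -> 32 URL-safe chars
    return token, f"{base_url}/invoice/{token}"

token, link = make_invoice_link("https://example.com")
```

The server then looks up invoices by token rather than by a guessable sequence number; this is a design choice at the UX/data layer, not a substitute for authentication where the data is sensitive.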