If you’re with a company looking to lead in just about any sector, the catchphrase for your business plan is ‘data-driven.’
But we’ve seen how a company’s quest for big data insights can hit a wall of privacy constraints, in perhaps the starkest way imaginable: a woman in the US reportedly launched a proposed class-action lawsuit against a Canadian company over the We-Vibe vibrator. The suit alleges that the company secretly collects and transmits sensitive data, like the time and duration of usage for the sex toy, along with the user’s email.
With advances in wearables and the IoT making issues like these potentially more common, BetaKit chatted with Vancouver-based Boughton Law associate Elizabeth Reid about the scope of the problem and what tech companies (or any company dealing with data) can do to avoid these kinds of legal troubles.
“There are privacy issues arising all the time, particularly around wearable tech,” Reid says, noting that many companies are dealing with high volumes of information, much of it personal. “The first key thing to do is, you’ve got to tell your customers what you’re collecting. The customer has a right to know what information you’re collecting, why you’re collecting it and how you’re going to use it.” Customers should never be surprised by the data they’re sharing; the company should obtain informed consent. And the more intimate the information being collected, the more specific the disclosure needs to be.
Fallout is still coming from the hack of Ashley Madison, the extramarital affair-facilitation website. The breach exposed about 30 million user accounts — one of the biggest privacy breaches in modern history — and the Privacy Commissioner of Canada slammed the company for sloppy security practices that left its users vulnerable.
“One of the lessons for companies here is that it shows just how long you’re going to be tied up in these proceedings,” Reid said. “This has been going on for over a year now. Just in terms of the management time and legal time being used up, it’s considerable. These people are not focused on serving the company.”
One of the factors that got the company into trouble: it represented to customers that the product was safe, the service discreet, and their privacy absolutely protected. Those claims were at the heart of Ashley Madison’s business model and brand, which made it all the more imperative for the company to deliver before privacy investigators arrived on the scene.
For tech companies and startups in particular, where every employee may wear many hats, it’s critical to assign at least one person to watch for privacy issues, Reid says. “They need to have training to look for problems. If it’s not specifically their responsibility, when you’ve got startup people juggling balls and not necessarily working with a strict chain of command, it’s easy for this kind of thing to slip through the cracks. You need a privacy officer, and ideally more than one person, designated to look for privacy issues.”
Companies also need to recognize, when designing products, that people will make mistakes with how they use the technology. “For instance, there was one company that sent out invoices with a weblink, where a recipient didn’t cut and paste it, but typed it out manually and transposed digits in one of the numbers,” Reid explains. “They suddenly had a different company’s information.” Privacy and data security need to be built into the user experience design.