The problem is, most of us are complicit in handing over our private information. We shouldn’t be.

Facebook CEO Mark Zuckerberg’s recent grilling in Congress raised expectations that technology companies would take responsibility for privacy. Speaking strictly for myself, I wouldn’t get my hopes up.

Tech companies will continue to provide free services in exchange for the right to collect and commoditize our data. And most of us will continue to be complicit in handing over that private, personal information. Then, we’ll throw up our hands and say, “We can’t control it!” — especially when that data is connected to our health.

The Facebook and Cambridge Analytica scandal brought this privacy discussion front and center in the context of election data, but the same discussion applies to the dozens of companies handling data in the health realm. Consider a few that have struggled to protect user privacy:

  • Strava’s fitness tracker inadvertently revealed the locations and movements of U.S. military personnel in dangerous places.
  • The dating app Grindr shared data about users’ HIV status and location with third parties.
  • MyFitnessPal leaked usernames, email addresses and hashed passwords from 150 million accounts.

These are not isolated examples. Researchers at the AV-TEST Institute found that more than 80 percent of 60 tested Android health apps lacked proper privacy policies. One big reason may be that many of those apps demand access to photos, GPS data, device IDs, cameras, microphones and other functions unrelated to their services.

As the CEO of Cycle Technologies, I think about these issues daily. My team created Dot, a smartphone-based family-planning app that uses a proprietary algorithm to calculate a woman’s chance of pregnancy based on her period-start dates.

Per our privacy policy, we collect anonymous data to understand how users interact with our technology. We learn how they use our apps (to prevent or plan pregnancy) and collect anonymous analytics, such as how often the app is opened and the country of use. Those pieces of information help us improve our service.

What we don’t collect or see is any personal data that a user enters on her phone. The only exception is when a user joins a research study and explicitly agrees to share data.

Our product predicts a user’s daily odds of pregnancy with minimal data; that’s by design. We want to keep things simple, and we don’t want women’s data. Many fertility apps request sensitive information like weight, height, age, medical history, exercise and diet, claiming that they will “personalize” your experience or “improve accuracy.” If this describes your or your partner’s fertility app, be careful.

Usually, that data won’t improve your experience or sharpen the app’s algorithms. What’s more, app makers can turn around and sell your superfluous data to third parties such as advertising platforms, or they can stockpile it in hopes of monetizing it one day.

Even those app makers with the best intentions may leak your data inadvertently. For example, the fertility app Glow discovered a flaw that could allow hackers to access personal information simply by entering a user’s email address. Safeguarding data is hard. Don’t assume that app developers can pull that off.

In some cases, health apps become fronts for data harvesting. A study of 110 apps by researchers at MIT, Harvard and Carnegie Mellon found that 73 percent of the Android apps studied shared users’ email addresses, and 47 percent of the iOS apps shared location data. More disturbingly, three of the 30 health apps examined, including Drugs.com, were sharing search terms and user inputs like “herpes” with advertising networks.

Is that the objective of marketing “personalization”? To hit web users with herpes medication ads wherever they go?

There’s not necessarily a shadowy bald man with a white cat plotting to steal your data. Abuse often happens when well-meaning app developers try to create useful health services. Assuming no one will pay for those services, they convince users to pay instead with their time and attention, which are far more valuable resources. Alternatively, developers sell the health data.

What can we do better as developers? And what should we do as consumers and companies alike?

Let’s treat consumers as customers, not commodities. Developers, stop collecting as much data as you can tease out of users just because you hope it will be worth money someday. Don’t collect data unless it’s for the user’s benefit. Any data you collect must be safeguarded.

If your business model doesn’t work unless you share sensitive information with third parties, reconsider your model. Then draft a privacy policy that a 10-year-old can understand, and stop dressing up your business model in legalese.

As a consumer, don’t be complicit. Research any apps you intend to download. Read their privacy policies, and if they’re impenetrable, hit “CTRL+F” and search “data.” If you don’t like what you find, don’t check the boxes. If an app asks you to enter information that you think is irrelevant, don’t provide it.

Sooner rather than later, your body, your interests and your habits might belong to companies built to manipulate your lifestyle for profit. The potential abuses will be endless.

Finally, if you’re a consumer, do the inconvenient thing. Read some privacy policies. Delete some apps. And stay safe.
