Your best friend may not know when you last had sex, but it’s possible that Facebook does.
At least two menstruation-tracking apps, Maya and MIA Fem, were sharing intimate details of users’ sexual health with Facebook and other entities, according to a new report from Britain-based privacy watchdog Privacy International. In some cases, those details, which are self-recorded by users in the app, included when a user last had sex, the type of contraception used, her mood and whether she was ovulating.
The findings raise questions about the security of our most private information in an age where employers, insurers and advertisers can use data to discriminate or target certain categories of people.
Facebook spokesman Joe Osborne said advertisers did not have access to the sensitive health information shared by these apps. In a statement, he said Facebook’s ad system “does not leverage information gleaned from people’s activity across other apps or websites” when advertisers choose to target users by interest. BuzzFeed first reported the news.
Period- and pregnancy-tracking apps such as Maya and MIA have climbed in popularity as fun, friendly companions that provide insights into the often daunting world of fertility and pregnancy. They can also be used to track sexual health more generally, moods and other intimate data. But many of these apps aren’t subject to the same privacy rules that govern most health data.
That has raised privacy concerns as some of the apps have come under scrutiny as powerful monitoring tools for employers and health insurers, which have aggressively pushed to gather more data about their workers’ lives than ever before under the banner of corporate wellness. Plus, it appears the data could be shared more broadly than many users recognize, as flagged by the Privacy International study.
Several period- and pregnancy-tracking apps have been called out for sharing health data with women’s employers and insurance companies, as well as for security flaws that reveal intimate information. As a result, many women say they’ve devised strategies to use the apps without revealing all of their most sensitive information. Among those strategies: using fake names, documenting only scattered details and even inputting incorrect data.
Users and experts alike worry that the data could be exposed in security breaches, or used by employers and insurance companies to discriminate against women by increasing their premiums or not offering them leadership positions.
Deborah C. Peel, a psychiatrist and founder of the nonprofit Patient Privacy Rights, said people expect their health data to be protected by the same laws that apply in a doctor’s office, but many apps aren’t subject to those rules.
“Most people would want to make their own decisions about what’s known about their sex life, about whether it’s shared or not,” said Peel. “Right now we have no ability to do that.”
Facebook, the world’s largest social media platform with 1.2 billion daily users, is asking users to trust it with more sensitive information than ever before. Last week, the company launched Facebook Dating in the United States, a matchmaking service that suggests potential love interests to users based on preferences, interests and Facebook activity.
At the same time, Facebook has come under fire in recent years for multiple scandals involving misinformation, fake accounts and breaches of trust. That includes the 2018 revelation from a whistleblower that Facebook had allowed political consultancy firm Cambridge Analytica to improperly access data from millions of users. In that case, the data was harvested through a third-party quiz app.
In a Facebook statement included in the report, the company said its terms of service prohibit app developers from sharing health or sensitive data, and that it has been in contact with Maya and MIA to notify them of a possible violation of those terms. Facebook also said that while it has systems in place to automatically detect and delete information like Social Security numbers and passwords from the information shared by apps, the company is “looking at ways to improve our system/products to detect and filter out more types of potentially sensitive data.”
Plackal Tech, which developed Maya, said in its statement to Privacy International that it would remove the Facebook Software Development Kit from a new version of its service. Mobapp Development, the company behind MIA, published no response in the report and did not immediately comment.
LINK ORIGINAL: Washington Post