Outed by Your Own Cell Phone: Private Data Isn’t So Private

Article In The Thread
Sept. 28, 2021

For years, privacy advocates have warned about the risks of mobile app data, and recently those harms materialized in an unlikely context: Grindr and the Catholic Church. The Catholic publication The Pillar used commercially available geolocation and app-usage data from the queer dating app Grindr to identify a specific priest who had regularly used the app since 2018 and visited gay bars and private residences while doing so. The priest held a prominent oversight role in the church’s response to the 2018 sexual abuse crisis, and resigned after learning that the exposé would be published.

Clergy abuse flourished amid a culture of secrecy and cover-ups, so it is right to promote transparency and accountability to correct that culture. This news story, however, conflates institutional transparency with the surveillance and exposure of an individual’s private life: it revealed only a priest’s violation of his celibacy vow, not any abuse or institutional cover-up. It is dangerous to equate transparency with a lack of privacy. Reporting built on this type of data will only become more common, and the consequences should concern everyone.

Days after the news broke, The Pillar published another article claiming that 32 devices using location-based dating apps (including Grindr) emitted signals from non-public areas of Vatican City. Supposedly anonymous location and app-usage data is sold in aggregate and commonly used in digital advertising; with an industry built on de-anonymizing this data, however, we can expect more private scandals to be revealed by similar methods. The ad tech industry pushes the false narrative that its data practices aren’t intrusive because they don’t use people’s names, but this story demonstrates that mobile ad IDs ultimately serve the same identifying function.
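The re-identification at the heart of this story is conceptually simple. The toy Python sketch below (every identifier and coordinate is invented; this is not The Pillar’s actual method) shows how a buyer of “anonymous” ad-ID location pings can single out one device simply by matching its pings against places already publicly tied to a person, such as a home and a workplace:

```python
# Illustrative sketch of ad-ID re-identification. All data is made up.
from collections import Counter

# Pseudonymous pings from a data broker: (mobile_ad_id, lat, lon).
# Note there are no names anywhere in this dataset.
pings = [
    ("ad-7f3a", 38.8977, -77.0365),  # near the target's residence
    ("ad-7f3a", 38.9072, -77.0369),  # near the target's workplace
    ("ad-7f3a", 38.8977, -77.0365),  # residence again
    ("ad-2b91", 40.7128, -74.0060),  # some unrelated device
]

# Locations publicly associated with one specific person.
known_places = [(38.8977, -77.0365), (38.9072, -77.0369)]

def near(a, b, tol=0.001):
    """Crude proximity check (~100 m at this latitude)."""
    return abs(a[0] - b[0]) <= tol and abs(a[1] - b[1]) <= tol

# Count, per ad ID, how many pings land at the person's known places.
hits = Counter(
    ad_id
    for ad_id, lat, lon in pings
    if any(near((lat, lon), place) for place in known_places)
)

# The ad ID seen repeatedly at both places is, in effect, a name.
suspect = hits.most_common(1)[0][0]
print(suspect)  # -> "ad-7f3a"
```

Real datasets contain billions of pings, but the join is the same: a persistent ad ID plus a handful of known locations is enough to turn “anonymous” data into a dossier on one person.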

Grindr has a particularly egregious history of data abuses. In 2018, for example, it was found to be sharing information about users’ HIV status with ad tech companies, and Norway’s data protection authority later fined it for violating European privacy law. Many mobile apps collect granular, real-time geolocation data even when the app is not in use.

Unfortunately, Grindr’s terrible privacy practices are not rare among mobile apps: The U.S. military buys geolocation data from popular Muslim dating and prayer apps through defense contractors; ICE buys geolocation data to locate and deport undocumented immigrants; police buy data to track people, legally evading warrant requirements and Fourth Amendment protections; individuals can buy data on romantic interests or other personal relationships; and bounty hunters can buy real-time geolocation data to track the movements of their targets. There are virtually no restrictions in the United States on how your personal data can be captured, sold, bought, and disclosed, and the few restrictions that do exist are rarely enforced.

These surveillance capabilities are no longer limited to intelligence agencies; anyone with the resources and the will can simply buy access to records showing a person’s precise movements over many years. The Pillar’s piece rebuts claims from privacy skeptics and industry lobbyists that the data collected and circulated for surveillance advertising poses no substantial risk of harm. The harm in this case is primarily reputational, but the same type of data could be used to cause privacy harms that are physical, emotional, or relational. Many communities have experienced these harms firsthand. One of the clearest examples is the queer community, which has been denied its right to privacy and has long been policed, literally and figuratively, culturally and by the state. See: gay panics, police entrapment, and homophobic laws.


To those eager for a quick solution to systemic abuse, increased surveillance always seems like the answer. Law enforcement officials regularly claim that if they had the power to proactively monitor everyone’s communications, they could stop the spread of child sexual abuse material and catch predators. In reality, this would mean violating the privacy of all users in order to target a small subset. The ability to communicate privately matters to everyone, especially marginalized groups such as LGBTQ+ people, who are at greater risk of being targeted over private communications that are in no way illegal.

A system that enables third parties to access an individual’s private communications would undermine security and privacy for all. Catholicism’s sacrament of confession embodies this principle: When someone confesses a sin to a priest within the context of the sacrament, that communication is considered so sacred that priests are willing to be jailed or martyred to protect it. This privacy allows Catholics to seek spiritual guidance because they trust that a confession will remain between them and their priest. If such communications could be bought and sold by data brokers, that information could be used to target individuals or communities.

Catholics and non-Catholics alike would be wise to heed Pope Francis’s recent warning in his 2020 encyclical, Fratelli Tutti:

“Digital communication wants to bring everything out into the open; people’s lives are combed over, laid bare and bandied about, often anonymously. Respect for others disintegrates, and even as we dismiss, ignore or keep others distant, we can shamelessly peer into every detail of their lives.”

The data broker industry can grant the power to peer into every detail of our lives to anyone willing to pay for it. This case demonstrates what happens when that power is used against an individual to damage their reputation for conduct in their private life. Examples of organizations abusing supposedly anonymous data without repercussions will only increase as data becomes easier to purchase and reidentify. The sensitivity of geolocation data makes this threat especially urgent.

At OTI, we advocate for comprehensive federal privacy legislation that would protect personal data by limiting data collection and the purposes for which sensitive data can be used. Data brokers should be prohibited from exploiting personal data and exposing individuals to untold risks. Grindr faced legal consequences because the European Union treats privacy as a human right; it would be much more difficult to bring a case against the company in the United States because we lack comprehensive privacy legislation. And once data has been sold to a data broker, there’s no putting the genie back in the bottle. A hefty fine acts as a deterrent for the future, but it cannot undo the reckless exposure of personal data. Privacy legislation won’t undo damage that has been done, but it can put an end to the commodification and sale of our lives.

You May Also Like

Does Data Privacy Need its Own Agency? (Open Technology Institute, 2021): For now, the FTC has the authority to bring enforcement actions against companies that violate user privacy, but with comprehensive privacy legislation potentially on the way, that could change. Is it time to move beyond the FTC as the de facto privacy regulator and create a dedicated privacy agency? We evaluate what the various approaches could look like.

Reflecting on the Digital Standard (Open Technology Institute, 2020): Now more than ever, a standard for digital safety and privacy matters to everyone. From cars to cribs, consumer products are tested against standards that ensure they are safe and trustworthy; as internet-connected smart devices grow in popularity, a similar standard needs to be set for their privacy protections.

Privacy's Best Friend (Open Technology Institute, 2021): Encryption is used every day and is essential to protecting consumers’ data. Requiring companies to provide law enforcement with access to encrypted information, or restricting encryption itself, could endanger consumer privacy.


Follow The Thread! Subscribe to The Thread monthly newsletter to get the latest in policy, equity, and culture in your inbox the first Tuesday of each month.