Queer Dating Apps Need to Protect Their Users Better
March 1, 2018
In late September 2017, Egyptian authorities began a crackdown on the country’s queer communities after fans of Mashrou’ Leila, an outspoken Lebanese indie rock group with an openly gay band member, displayed a rainbow flag at the group’s concert in Cairo.
The government responded quickly in what some activists called the worst campaign against LGBTQIA+ Egyptians in decades. Security forces arrested more than 85 individuals on a range of charges, including “habitual debauchery.” Officials convicted at least 16 and issued sentences ranging from six months to six years in prison (though a handful were later released).
To find and arrest their targets, security forces, among other tactics, created fake profiles on queer dating apps like Grindr and Hornet. Though homosexuality isn’t outlawed in Egypt, authorities often lean on colonial-era codes regulating sex and morality to justify violence against LGBTQ communities and to prosecute queer people. Upon arriving for a rendezvous arranged through one of these apps, some users instead found authorities waiting for them, ready to use the meeting, along with chat logs, as evidence of illegal debauchery, immorality, promoting homosexuality, or other alleged offenses.
This was not the first time authorities or vigilantes had used gay dating apps to persecute their users. In 2014, at least three websites outed gay dating–app users in Jordan by posting their profile information, sometimes including their location. (The pages have since been taken down.) Last year, the South Korean army was suspected of using dating apps to out gay soldiers. During a terrifying homophobic purge in 2017, Chechen authorities used gay dating apps as evidence for arrests. Egyptian authorities, too, already had a history of using queer platforms to target users, with reports dating back to 2015 of luring users to meetings in order to arrest them and of stopping individuals on the street to search their phones. And with the 2018 acquisition of Grindr by a Beijing-based tech firm, some worry that the Chinese government could use sensitive data from the app to crack down on queer communities there as well. In all of these cases, simply being identified as queer could be enough to put someone at risk.
Despite these dangers—and many other recent reminders of the violence LGBTQ individuals still face around the world, including a rise in the number of deadly attacks in the U.S.—many people continue to use queer platforms like Grindr and Hornet. These apps are more than just places for dating. They act as a digital convening point for developing communities, exploring individual identities, and escaping heteronormative surroundings. The platforms can also afford a greater degree of anonymity to someone who wishes to remain in the closet in their public life.
Because of this, queer dating–app users face a hard choice: accept the risk or lose their important—and, in some instances, only—connection to their community. But the task of mitigating harm shouldn’t fall solely on these individuals. The app creators, too, bear responsibility for protecting their vulnerable users. Yet too often, intentionally or not, these developers design their platforms in ways that place the burden of digital safety and privacy on users. Thankfully, some of these companies may finally be recognizing the need to step up.
As an independent queer activist and security-and-privacy harm-reduction specialist, I often help app users mitigate the risks they face. Depending on the circumstances, I might propose simple steps, such as being more mindful of what information they share—say, remembering to blur their face and any identifying marks (tattoos, birthmarks, etc.) when sending nudes. At other times, it might involve recommending that someone adopt more technology-dependent practices, such as using anonymizing software like the Tor Browser or switching to secure and ephemeral messaging apps.
But users who can’t have a personal consult with a security-fluent activist often feel forced to take actions that may actually put them at greater risk. For example, to lessen the chance of being unintentionally outed if they share a phone with family members or friends (or, worse, because they fear being forced to turn over their device to authorities), individuals might delete and redownload the app between uses. This strategy, however, comes with lost message logs, higher cellular-data costs, and a greater chance of exposure on networks that might flag a device for downloading a queer app.
Similarly, the lack of security features (and of transparency about the security features already in place) in queer dating apps, and on some websites that serve the LGBTQ community, creates problems of its own. For example, the majority of dating apps don’t transmit pictures securely. All of the major dating apps, too, can access messages stored on company servers, meaning their contents could be compromised in the event of a government request or, if stored insecurely, a data breach.
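To make that distinction concrete, here is a minimal sketch of end-to-end encryption using the open-source PyNaCl library. It is an illustration only, not how any of the apps named here actually works: when messages are encrypted end to end, the company’s server relays and stores only ciphertext it cannot read, so a government request or a breach exposes far less.

```python
# Illustrative sketch of end-to-end encryption with PyNaCl (pip install pynacl).
# Not any dating app's actual implementation.
from nacl.public import PrivateKey, Box

# Each user generates a key pair on their own device; private keys never leave it.
alice_key = PrivateKey.generate()
bob_key = PrivateKey.generate()

# Alice encrypts to Bob's public key before the message leaves her phone.
ciphertext = Box(alice_key, bob_key.public_key).encrypt(b"meet at the cafe at 8?")

# This is all the server ever holds: opaque bytes it cannot decrypt.
stored_on_server = bytes(ciphertext)

# Only Bob, holding his private key, can recover the message.
plaintext = Box(bob_key, alice_key.public_key).decrypt(stored_on_server)
print(plaintext)  # b'meet at the cafe at 8?'
```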
To try to protect themselves, users often turn to software that only partially addresses privacy and security issues. For some, such software can also create a false sense of security. For example, using a virtual private network can help users circumvent government censorship by making it appear as if a user is accessing the web from a different location. But VPNs won’t hide a user’s location from a dating app, which relies on a phone’s built-in GPS sensor for its geolocation features. What’s more, in countries that have banned these kinds of tools, downloading or using certain known VPNs or circumvention software might land a user under increased scrutiny.
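A short sketch shows why. In the hypothetical snippet below (the function and payload are invented for illustration, not taken from any real app), the coordinates come straight from the phone’s GPS fix and travel inside the request body, so tunneling that request through a VPN changes only the IP address the server sees:

```python
# Hypothetical illustration: why a VPN doesn't hide location from a dating app.
# The coordinates are read from the device's GPS sensor and sent inside the
# request payload; a VPN only changes the network path that payload takes.
import json

def build_location_update(gps_fix):
    """Package a (latitude, longitude) GPS fix the way an app might."""
    lat, lon = gps_fix
    return json.dumps({"lat": lat, "lon": lon})

# Whether this leaves the phone directly or through a VPN tunnel, the
# server receives the same real coordinates.
payload = build_location_update((30.0444, 31.2357))  # central Cairo, for example
print(payload)  # {"lat": 30.0444, "lon": 31.2357}
```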
To be fair, though the majority of app companies refuse to acknowledge these issues, a few platforms are genuinely trying to protect users. Sometimes, though, their efforts fall short, or seem aimed more at deflecting scrutiny than at fixing the underlying problems.
In response to a series of crackdowns on LGBTQ communities in 2017, for example, Grindr and Hornet began providing safety tips in Arabic to inform users in certain parts of the Middle East about specific risks reported with the apps. Though the move was aimed at addressing the dangers these individuals face, it placed responsibility with the users without implementing changes to protect them too, such as eliminating the watermarked app logos on photos, which have been used as evidence in court and in blackmail attempts but whose removal would have cost the companies in-app branding space. It was a choice that seemed to prioritize the bottom line over security.
The same was true of changes to the geolocation features in dating apps. In 2014, security researchers revealed that, with a little effort, anyone could pinpoint a dating-app user’s location by comparing the distances the app reported from a few different vantage points. In response, some users who were trying to mask this data from the apps rooted or jailbroke their phones to fake the GPS sensor data—which actually put them at even greater risk.
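The underlying technique, sometimes called trilateration, takes little more than high school geometry. In the sketch below (flat x/y coordinates and invented numbers stand in for real latitude and longitude queries), three distance readings taken from spoofed positions are enough to solve for a user’s exact location:

```python
# Illustrative sketch of trilateration: query the app's "distance to user"
# from three spoofed positions, then intersect the three circles.
import math

def trilaterate(p1, r1, p2, r2, p3, r3):
    """Return the (x, y) point lying at distance r_i from each point p_i."""
    (x1, y1), (x2, y2), (x3, y3) = p1, p2, p3
    # Subtracting the circle equations pairwise leaves a 2x2 linear system.
    a1, b1 = 2 * (x2 - x1), 2 * (y2 - y1)
    c1 = r1**2 - r2**2 + x2**2 - x1**2 + y2**2 - y1**2
    a2, b2 = 2 * (x3 - x1), 2 * (y3 - y1)
    c2 = r1**2 - r3**2 + x3**2 - x1**2 + y3**2 - y1**2
    det = a1 * b2 - a2 * b1
    return ((c1 * b2 - c2 * b1) / det, (a1 * c2 - a2 * c1) / det)

# Pretend the app reported these distances from three fake probe positions:
target = (3.0, 4.0)  # the user's true (unknown) location
probes = [(0.0, 0.0), (10.0, 0.0), (0.0, 10.0)]
dists = [math.dist(p, target) for p in probes]
print(trilaterate(probes[0], dists[0], probes[1], dists[1], probes[2], dists[2]))
# -> (3.0, 4.0): three distance readings pin the user down exactly.
```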
The issue creates a tricky dilemma for dating-app developers whose products revolve around these location features. In response to the researchers’ findings, Grindr and Hornet both rolled out controls that let users disable the “show my distance” feature, or that turned it off automatically in “high risk” areas. But that didn’t resolve the issue: Researchers have continued to accurately locate users with similar distance-based techniques, a problem that persists as of this writing. The changes, it seemed, were more about crisis management than about a comprehensive harm-reduction approach.
Similar issues can arise for individuals accessing other types of digital content too—say, pornography and, in particular, queer pornography in places where pornographic websites are censored or banned. Pornhub, for its part, recognized this type of risk and implemented HTTPS encryption across its entire site, meaning internet service providers would lose the ability to track what type of porn a user views. (The ISP would still know if the subscriber accessed Pornhub, but not the specific pages within it.) This stands in contrast to pornography sites that haven’t implemented encrypted HTTPS connections by default, which place the burden on the user to keep this browsing private.
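A small sketch makes the distinction visible. Under HTTPS, the hostname still leaks to the network through DNS lookups and the TLS handshake, while the path identifying the specific page is encrypted in transit (the URL below is invented for illustration):

```python
# Illustrative only: which parts of an HTTPS URL the network can observe.
from urllib.parse import urlparse

url = "https://example-video-site.test/category/queer/video/12345"  # hypothetical
parts = urlparse(url)

print("Visible to the ISP: ", parts.hostname)  # the site itself
print("Encrypted in transit:", parts.path)     # the specific page within it
```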
A few dating apps, too, are opening up to the idea of changing their platforms to better safeguard users. For example, I’ve been working as a technical expert with Article 19, a human rights organization focused on defending freedom of expression and information, on an initiative that aims to bring a harm-reduction approach to protecting queer dating–app users. As part of the project, we’re working directly with Grindr and other major app platforms to recommend and implement certain tech- and design-safety changes.
To come up with these recommendations, our coalition—composed of community members, technologists, and queer rights activists—surveyed more than 400 users, held focus groups led by local LGBTQ organizations, and interviewed LGBTQ activists in Middle Eastern and North African countries. This helped us gain a better understanding of how individuals use these technologies, how aware they are of risks and prevention strategies, and how they thought we might improve dating-app platforms to better keep users safe and secure in different contexts. In addition to relaying this feedback, we also have coalition members advising these queer dating–app partners about how to implement our suggested changes.
Grindr, for its part, has already publicly adopted two of our proposed reforms, offering alternative app icons and app-specific passcodes, both of which reduce the harm that someone with physical access to a user’s phone can do. Risks still exist, of course. A user might divulge compromising information in conversation, share identifiable details in an app profile, be coerced into turning over a passcode, or be entrapped by authorities. But the changes can provide a layer of defense against simpler breaches.
Our partner apps are also considering our other recommended design and policy changes. These include allowing users to send disappearing messages or to get alerts when another user takes a screenshot, similar to the systems used on Snapchat, and providing users with up-to-date, geolocation-based safety guides instead of generic tips. Ideally, apps would tailor this information to reflect country-specific laws, practices, events, and ongoing feedback from local organizations.
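As a sketch of the first idea (an invented data structure, not any app’s actual design), a disappearing message can be modeled as a payload carrying an expiry time that clients check before displaying it, and after which they delete it:

```python
# Illustrative only: a minimal model of a disappearing message.
import time
from dataclasses import dataclass

@dataclass
class EphemeralMessage:
    body: str
    expires_at: float  # Unix timestamp after which the message is gone

    def readable(self) -> bool:
        return time.time() < self.expires_at

# A message that self-destructs 60 seconds after it is sent.
msg = EphemeralMessage(body="see you at 8", expires_at=time.time() + 60)
print(msg.body if msg.readable() else "[message expired]")
```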
With all of our proposals, we start from the idea that these features should empower queer individuals, not constrain them. Apps and advocacy organizations shouldn’t pressure particular users to take up any of these strategies. Instead, people need options for different risks. We don’t want to perpetuate a shame culture or create digital closets. But all individuals should have the opportunity to give informed consent, and platforms have a responsibility to provide users with information about the specific risks they may take on by using their technologies, as well as reasonable features that can mitigate some of those risks.
By its nature, this harm-reduction approach will not remove all of the dangers LGBTQ individuals face. And with political and technological changes, users, activists, and platforms will have to constantly respond to shifting individual needs. We know queer dating apps like Grindr, Hornet, Her, and Wapa can be powerful platforms for building strong queer communities online and off, especially in the places where it matters most. So let’s give these budding communities as much control as we can to keep them alive.
This article originally appeared in Future Tense, a collaboration among Arizona State University, New America, and Slate.