Like many queer people in Russia, Roman Gunt uses multiple LGBT+ dating apps. “They’re the main reason our community exists here,” he tells me. “Our society is still stuck in a USSR mentality, so you can’t publicly declare that you’re part of the LGBT+ community.” Apps made a big difference because they allowed gay people to communicate directly with each other, Gunt explains, making them feel more united. And that matters in Russia, where a combination of open hostility and a harmful “propaganda law” — which bans anyone from “promoting” homosexuality through art, education, or online information — has pushed queer people underground. In Russia, dating apps are about more than just hookups; they’re about freedom.
What’s more worrying is that this discrimination is a global problem. There are 70 UN member states that still criminalize same-sex activity, according to a recent report by the International Lesbian, Gay, Bisexual, Trans, and Intersex Association (ILGA). In some of these countries, homosexuality is a crime punishable by death, and LGBT+ dating apps often aren’t available. But when they are, they can “save lives” by offering sexual health advice, access to health and legal services, and a vital lifeline of connection to the community.
Yet these refuges can quickly become risky for unwary users. Roman describes his fear of online “hunters,” policemen or gang members who register on LGBT+ dating apps and pretend they’re trying to meet guys for sex. He explains that they have an ulterior motive “to plant drugs on you, so they can take you to prison. They can’t legally arrest you, so they resort to deception.” These “hunters” have recently made headlines in Egypt, Singapore, and Russia, sparking fear amongst LGBT+ communities worldwide.
Faced with such threats, users have tried to protect themselves on these apps by asking extensive questions that might help identify potential “hunters,” and by sharing safety warnings on sites like the Russian social networking and social media platform VK. One Grindr user in the conservative Southeast Asian country of Malaysia, who asked not to be named, told me that although LGBT+ apps provide a virtual safe space, there is a culture of fear in the country at large. “I could never risk uploading a profile picture,” he tells me. “Photos can be used to track you down, so I only share them privately.”
The lengths to which users of platforms like Grindr are going to protect themselves in some countries raise a question: What responsibility do these platforms have to proactively protect their users? Perhaps surprisingly, recent legal precedent, in the U.S. at least, indicates that these sites may have no duty of care at all.
In 2016, Matthew Herrick, a then-32-year-old gay man living in New York City, was targeted by a malicious ex who created numerous fake Grindr profiles using his name and image. These profiles were then used to lure men — he says more than 1,000 over the course of several months — to his home and workplace, where they were promised he would be waiting and willing to engage in rough, sometimes nonconsensual (“rape fantasy”) sex.
Herrick, along with friends and lawyers, filed more than 100 complaints against Grindr, and then followed up with a lawsuit. He was not successful; a federal judge, and later an appeals judge, both agreed that under current law Grindr was not responsible for Herrick’s safety.
The law at the center of that decision was Section 230 of the Communications Decency Act of 1996. Section 230 protects tech platforms by insulating them from the actions of their users. It’s why Twitter, YouTube, and other service providers can’t be held accountable for libel or malice spread by accounts they host. The legislation holds that such apps merely provide a platform, and that companies can’t then be held responsible for what their users do with it.
If Herrick had won his case, the internet could have been irreversibly changed. And yet, there could actually be more risks for users if platforms were made accountable for the actions of their users.
Some technology experts and digital rights activists argue that holding companies accountable for users’ posts could spark an increase in digital surveillance or automated censorship filters, because companies would be compelled to monitor users’ content more heavily. This kind of automated content moderation has been alleged to disproportionately target LGBT+ users, such as the apparent censorship of queer YouTube creators, and the deletion of accounts belonging to trans Facebook users who register under their chosen names.
If not legally, then what morally could LGBT+ dating apps be doing to better protect their users? The first step is to listen to affected communities, something initiated by an in-depth research project launched by the human rights organization Article 19, back in 2017. Its aims were to collaborate with app developers, alert them to malicious uses of their products, help developers connect to grassroots activists and focus groups, and then work alongside them to tackle each of these problems.
“The big conversation at the time was around geolocation, which we expected to see was being used to track people,” recalls Norman Shamas, an independent queer researcher involved with the project. Geolocation has come under fire on numerous occasions over the last few years; data leaks have also been exposed, and Grindr was even branded a national security risk earlier this year due to its Chinese ownership.
Yet, Norman outlines that the most worrying problems were coming from users themselves, citing Matthew Herrick’s lawsuit as a case in point. So instead of focusing advice on protecting users’ identities, through VPNs and encryption, the project shifted to harm reduction, advising that sites add panic buttons and a way for users to contact local LGBT+ groups and activists. Grindr is considering these options, Norman says, and has already implemented password protection and app logo modification, which allows users to conceal recognizable logos when searched by police officers at street checkpoints — a practice identified in Article 19’s report on Iran, Egypt, and Lebanon.
The project itself is radical not only because it treats dating apps as a form of free expression, but because it allows local communities and activists to outline exactly what they need to ensure safety. “No app is perfect,” continues Norman, who sees protecting users as foundational to running a service for the LGBT+ community. Making money from a community, they say, should not be a priority over safety, especially where apps are introduced in countries with poor civil rights records. “For me, it’s more about asking: Is there an app that’s doing this well? Why aren’t more apps engaging directly in conversations around what harm reduction looks like? Why are they not implementing certain changes?”
Grindr said in a statement to OneZero that user safety is its top priority, and pointed to Grindr for Equality, its human rights arm. Grindr declined to respond to questions about the Matthew Herrick case.
The gay social network Hornet has gained a reputation for actively protecting its users, partly because of its popularity in Russia, Turkey, and Brazil — all of which are notoriously hostile towards LGBT+ people. “Our entire organization is focused on keeping the community safe from various harms,” explains Hornet CEO Christof Wittig, citing threats from scammers, hackers — who famously hijacked an LGBT+ dating app to send threats during the 2014 Sochi Olympics in Russia — and police officers seeking to “trap” users. Hornet also has a strict, zero-tolerance hate speech policy and it partners with hackers through HackerOne, “a bug bounty program which helps us to find vulnerabilities in our infrastructure before malicious actors find them,” says Wittig.
Hornet is also advocating for the LGBT+ community. In 2016, Wittig launched the news service, Hornet Stories, to fill a human rights-shaped hole in the queer digital media landscape. His aim was to help queer people “express not just their sexual pleasures, but their interests, joys, fears, and dreams,” and to rehouse a series of talented editors whose former platforms had crumbled under economic pressure. He describes wanting to create “the equivalent of a digital gay bar” by shifting focus towards stories and communities, but he’s redefining what an LGBT+ dating app should be in the process.
The same can be said of the Chinese app Blued, which has applied for a license to sell the HIV prevention drug PrEP in China and wants to offer it at a lower price; it even has a surrogacy service in the form of Bluedbaby, inspired by founder Geng Le’s own experiences.
As these apps become more ambitious in scope, it’s important that developers understand the unique dangers that LGBT+ people face, including being outed and persecuted by “hunters,” being the target of hate crimes, and the lack of legal protections. It’s also worth noting that mainstream apps like Tinder are also targeted aggressively by hackers and used to facilitate crime, but they aren’t subjected to the same media scrutiny — despite concerns about scam bots, a lack of encryption, and even alleged murders. “The fact that we don’t scrutinize dating apps aimed at cisgender, heterosexual users in the same way, through that security lens, shows a queer negativity,” says Norman.
Ultimately, the mere existence of these apps presents a new series of risks tied to geolocation and digital security, but the platforms aren’t going anywhere — nor should they. Dating apps have become a vital tool of expression and connection, especially in communities that need support. Roman Gunt in Russia, for one, can’t imagine his world without them. “After all,” he says, “these apps are the main reason that our Russian queer community exists at all.”