
Tinder fails to protect women from abuse. But when we brush off ‘dick pics’ as a laugh, so do we

Written by Rosalie Gillett, Research Associate in Digital Platform Regulation, Queensland University of Technology

An ABC investigation has highlighted the shocking threats of sexual assault women in Australia face when “matching” with people on Tinder.

A notable case is that of rapist Glenn Hartland. One victim who met him through the app, Paula, took her own life. Her parents are now calling on Tinder to take a stand to prevent similar future cases.

The ABC spoke to Tinder users who tried to report abuse to the company and received no response, or received an unhelpful one. Despite the immense harm dating apps can facilitate, Tinder has done little to improve user safety.

Way too slow to respond

While we don’t have much data for Australia, one US-based study found 57% of female online dating users had received a sexually explicit message or image they didn’t ask for.

It also showed women under 35 were twice as likely as their male counterparts to be called an offensive name, or physically threatened, by someone they met on a dating app or website.

Tinder’s Community Guidelines state:

your offline behaviour can lead to termination of your Tinder account.

As several reports over the years have indicated, the reality seems to be that perpetrators of abuse face little challenge from Tinder (with few exceptions).

Earlier this year, the platform unveiled a suite of new safety features in a bid to protect users online and offline. These include photo verification and a “panic button” which alerts law enforcement when a user is in need of emergency assistance.

However, most of these features are still only available in the US — while Tinder operates in more than 190 countries. This isn’t good enough.

Also, it seems that while Tinder happily takes credit for successful relationships formed through the service, it distances itself from its users’ bad behaviour.

No simple fix

Currently in Australia, there are no substantial policy efforts to curb the prevalence of technology-facilitated abuse against women. The government recently closed consultations for a new Online Safety Act, but only future updates will reveal how beneficial this will be.

Historically, platforms like Tinder have avoided legal responsibility for the harms their systems facilitate. Criminal and civil laws generally focus on individual perpetrators. Platforms usually aren’t required to actively prevent offline harm.

Nonetheless, some lawyers are bringing cases to extend legal liability to dating apps and other platforms.

The UK is looking at introducing a more general duty of care that might require platforms to do more to prevent harm. But such laws are controversial and still under development.

The UN Special Rapporteur on violence against women has also drawn attention to harms facilitated through digital tech, urging platforms to take a stronger stance in addressing harms they’re involved with. While such rules aren’t legally binding, they do point to mounting pressures.

Image: Online abusers on Tinder have been reported blocking their victims, thereby deleting the entire conversation history and removing proof of the abuse. (Shutterstock)

However, it’s not always clear what we should expect platforms to do when they receive complaints.

Should a dating app immediately cancel someone’s account if they receive a complaint? Should they display a “warning” about that person to other users? Or should they act silently, down-ranking and refusing to match potentially violent users with other dates?

It’s hard to say whether such measures would be effective, or if they would comply with Australian defamation law, anti-discrimination law, or international human rights standards.

Ineffective design impacts people’s lives

Tinder’s app design directly influences how easily users can abuse and harass others. There are changes it (and many other platforms) should have made long ago to make their services safer, and make it clear abuse isn’t tolerated.

Some design challenges relate to user privacy. While Tinder itself doesn’t reveal users’ locations, many location-aware apps such as Happn, Snapchat and Instagram have settings that make it easy for users to stalk other users.

Some Tinder features are poorly thought out, too. For example, the ability to completely block someone is good for privacy and safety, but also deletes the entire conversation history — removing any trace (and proof) of abusive behaviour.

We’ve also seen cases where the very systems designed to reduce harm are used against the people they’re meant to protect. Abusive actors on Tinder and similar platforms can exploit “flagging” and “reporting” features to silence minorities.

In the past, content moderation policies have been applied in ways that discriminate against women and LGBTQI+ communities. One example is users flagging LGBTQI+ content as “adult” so it is removed, while comparable heterosexual content is left untouched.

Tackling the normalisation of abuse

Women frequently report unwanted sexual advances, unsolicited “dick pics”, threats and other types of abuse across all major digital platforms.

One of the most worrying aspects of abusive online interactions is that many women may ultimately dismiss them, even when they feel uncomfortable, uneasy or unsafe. For the most part, such poor behaviour has become a cliché, posted on popular social media pages as entertainment.

Such dismissals may happen because the threat doesn’t seem imminently “serious”, or because the woman doesn’t want to be seen as “overreacting”. Either way, this trivialises and downplays the abuse.

Messages such as unwanted penis photos are not a laughing matter. Accepting ordinary acts of abuse and harassment reinforces a culture that supports violence against women more broadly.

Thus, Tinder isn’t alone in failing to protect women — our attitudes matter a lot as well.

All the major digital platforms have their work cut out to address the online harassment of women that has now become commonplace. Where they fail, we should all work to keep the pressure on them.

If you or someone you know needs help, call Lifeline on 13 11 14.

This article by Rosalie Gillett, Research Associate in Digital Platform Regulation, Queensland University of Technology, was originally published on The Conversation and is licensed under Creative Commons Attribution-NoDerivatives 4.0 International (CC BY-ND 4.0).
