Twitter’s Former Trust and Safety Chief Is Trying to Clean Up Your Dating Apps

Yoel Roth has spent the past 16 months recovering from a very bad, very public breakup.

For two chaotic weeks after Elon Musk took control of Twitter in October 2022, Roth clung to his job as the platform’s head of trust and safety. He even won public praise from Musk for his “high integrity.” But Roth ended up walking away from the job that November, and he was quickly targeted with a torrent of harassment, driven partly by lurid accusations from Musk himself and partly by “the Twitter Files,” a dump of internal documents that revealed how Roth and other executives grappled with content moderation decisions.

Roth has kept busy consulting, teaching, and studying decentralized social networks (he now posts on Bluesky). Now, he’s getting back into what he has called the internet sanitation business, as the head of trust and safety at Match Group.

Match Group owns and operates a massive portfolio of dating apps around the world: Match, Tinder, Hinge, The League, OkCupid, BLK, Archer, Azar, Chispa, and more. Roth wrote his 2016 PhD dissertation on how gay culture, identity, and safety intertwine on location-based dating apps, with a focus on Grindr. You might say Roth’s new job is a good match.

Most discussions about online moderation revolve around open social platforms like Twitter, YouTube, and Facebook, but toxic behavior happens on dating apps too, where most of the dialogue happens in one-on-one chats. Scammers and fraudsters have thoroughly infiltrated dating apps, not just Match’s but apps across the whole industry, and Match is in a constant race to use advanced technology to fight scammers wielding the same tools. People on dating apps have suffered real-life harm and violence after meeting up with potential dates. The Medellín Tourism Observatory in Colombia linked several tourist deaths to the dating app Tinder in 2023.

So Roth, who started his new role two weeks ago, will have his hands full, and it’s unclear how many other hands will be on deck to help him manage it. He spoke with WIRED about the challenges of keeping people safe inside dating apps, and how he thinks app stores should share that responsibility, in a conversation that has been edited and condensed for clarity.

Lauren Goode: I’m curious what your job search has been like, given how public your last role ended up being. How did you end up considering trust and safety in the dating app world?

Yoel Roth: Well, I spent the better part of the last year and a half really trying to think of where I can have an active role in trust and safety. Some of that was speaking about the work we did at Twitter, and writing about it. Some of it was teaching. Some of that was doing a bunch of research on federated and decentralized social media.

But then I went back to where it all started for me, 15 years ago, when dating apps were brand-new and Tinder didn't exist yet. I was really interested in safety and privacy on these new services. I ended up writing my PhD dissertation on it. So when an opportunity to consult at Match Group presented itself, I jumped at it. And then we had a conversation around, “Should we DTR [define the relationship] and make this official and, like, go steady?” It's not an exaggeration to say this is a dream job for me.


Did Match come to you or did you come to them?

Match came to me.

They swiped right on you.

That’s right.

OK, let’s not extend that metaphor too much. What are you actually going to do as part of this new role as head of trust and safety?

Match Group is a bit unusual because it's a portfolio of brands, not just one company making one app. So we’re really thinking about what a central trust and safety team will look like. I’m a couple of weeks in, and all of this is subject to change, but one area my team is going to be responsible for is policy and standards development across the whole portfolio: determining which policies should be consistent across brands and which need to vary by brand.

Another is core safety features, like appeals and the ways users can have recourse when we make decisions. I’m also really excited about building out new protection functions for members, focused on things like scams and financial fraud.

How big will your team be?

TBD. It’s still early days.

I never assume that anyone has read my work, but you may have seen the story I wrote about my run-ins with scammers on Hinge.

Yeah, I did.

At the time that was a pretty persistent problem, which sparked a long rollout of identity verification features. How big of a problem are scammers on Match apps today?

There's never a mission-accomplished moment when it comes to spam or scams or fraud. They are persistent threats that every social media platform has to deal with. The scale of the challenge we deal with right now is already significant. We remove about 44 spam accounts every minute across Match Group apps. It’s a lot.

Each of our apps has a slightly different user base and attracts a different kind of profile, which also means each attracts a different type of bad actor. I’m still learning the ropes, but the goal is not to do this app by app. It’s to think about how we can improve performance across the portfolio.

What we can do is think about the best ways to deploy people against this: How do we understand these problems? How do we build technology that will address them? If we've caught a scammer on Tinder, how do we make sure that they're not popping up on Hinge as well?

How are you thinking about protecting marginalized groups on dating apps, or people living in places where homosexuality is criminalized?

I think that's the trick, right? There are some shared resources that make sense to have as universal, and there are other instances where differentiation within the portfolio is incredibly important. Some people are very comfortable being public about who they are and what they're looking for, putting it all out there. Others might be more reticent, especially folks in a more vulnerable position, whether because they're queer and in a country where that's potentially stigmatized, or because they come from a very religious background where online dating is less acceptable.


But there are some areas where we can and should find common ground, like fighting scams or offering people the ability to verify that they are who they say they are.

What about protecting underage users on these apps? I know the apps have age requirements, but as you know, people sneak on all forms of social media when they’re underage, and that’s another vulnerable population.

Across the board, our apps are for adults. We have a wide range of tactics and techniques that we're using to try to keep underage folks off of our apps. But I think it's worth considering these problems in the broader context of the ecosystem.

We care a lot about identifying underage users and removing them from our products. But I think there's an opportunity for app stores to play a part in this as well. Age assurance and age verification are challenges that lots of different companies are going to have to wrestle with; they're included in a number of different pieces of regulation. We’re going to keep doing what we’re doing here to keep underage users off of the platforms, but I’d like to see these challenges moved a bit upstream so we have better tools and signals to do that work.

So you’re basically saying, “Let’s pass the buck to Apple and Google”—which some people would probably say is a reasonable suggestion. But then there’s also the web. People can access Match services on the web too.

The vast majority of folks using our products are accessing them through apps rather than through the web. But I also don’t know if I would say it’s “passing the buck.” I think it’s about who is well positioned in the ecosystem to have information about somebody’s age. When you are in a position like an app store, where you have payment card information and additional information from somebody’s device, you may have more of a signal about how old someone is than an app alone would. I think it’s a shared responsibility across the board.

How do you share the responsibility, then, when your users’ physical safety is endangered? Just last week there was a Rest of World report about people in Medellín, Colombia, who have been robbed and in some cases physically harmed by people they’re meeting on dating apps. When something like this happens, how much responsibility does Match Group bear?

I think even one instance of somebody being harmed physically following a connection made on one of our apps is too much. We have a responsibility to do everything within our power to protect our members from that happening.

But there are also limits to what is knowable by a dating app, in contrast to a public-conversation platform where you can see the entire interaction in front of you. You can look at the text of a tweet and evaluate it. IRL interactions are different. Once you make the decision to meet somebody face to face, there’s a lot going on that we as a platform don't know about.


There are things we can do. When our members tell us that they've had a negative interaction, whether it's a physical safety risk, an assault, or financial fraud, we act on those reports immediately. That’s a lot of what my team is going to be doing. A second critical piece is working with law enforcement. In Colombia and around the world, we want to make sure that we are empowering local law enforcement to get bad guys off the streets and off of our apps as well. And we proactively refer relevant information to law enforcement in cases where we think there's a physical safety hazard.

I really think the trust and safety industry, collectively, needs to start approaching these as shared problems rather than something each company handles in isolation. If every company tries to solve a problem independently, each only has line of sight into what's happening on its own platform. We are much more effective when we come together as an industry to address risks.

One of the things you wrote in your dissertation about dating apps that I found interesting from a design perspective was that (and I’m paraphrasing) you liked the idea of people having more of an open space to express themselves, versus the drop-down options and other predefined structures within apps. Do you still feel that way? Why does that create a better experience?

There’s no universal answer to any element of trust and safety or any element of product design. Personally, I think open text fields are better. I like writing, I like expressing myself creatively. But a lot of people don’t want to take the time to think about exactly the right word to explain things, so they’re going to want the option of just entering some of their information and using a drop-down.

WIRED previously covered the trend of people preferring to use Google Docs instead of dating apps—just putting a link out there, sharing a public doc about themselves, sometimes entire chapters. You really do get a sense of who a person is from that.

Right, you get a sense that they’re the type of person who writes a Google Doc about their potential dating life. Which, if I were dating right now, I’d probably be a person who writes a Google Doc about that stuff.

Do you think these apps are really designed to be deleted?

I do. I met my husband on an app.

Right, coming from a person who had success! But really, in what way do you think they are actually designed to be deleted when the business model, which relies on people swiping continuously and paying monthly fees for a more “premium” experience, supports something entirely different?

There are always going to be reasons that people enter or exit the market for dating or relationships. Some people will exit it because they find a partner, are in a monogamous relationship or a marriage, and choose not to meet or date anyone else. There are also lots of different relationship types and relationship structures. We want to make sure that there are apps available to people at every step of the journey, and it's going to change over time.

I think there are lots of moments where people will get what they're looking for from one of our products, something that enriches their life, and then at a certain point, they'll say, I got what I wanted from that and I’m ready for something different. Our business model fundamentally is about offering people tools to find connections, and that is going to look very different for people at different points in their life.

Any last dating tips for people?

If I had one tip, it’s this: don't be afraid to show the weirder elements of your personality. The quirky, esoteric things that really make you who you are are the things that will help you find a match that is exactly right for you.

Updated 3/11/2024, 10:25 am ET: We've updated the story to clarify that criminals in Colombia are reportedly targeting people on dating apps generally, but not on Hinge, which doesn't operate in the country.
