Episode 506: Talking Dating in a Digital World

As of 2021, there were 30 million online dating users in the US and 321 million users worldwide. Despite this massive number of users, there’s very little policy that regulates how users behave on these platforms. Studies show that a majority of women have experienced sexual harassment online and that rarely is any action taken by law enforcement in situations where technology is being used to commit acts of gender-based violence.

Talk Policy to Me host Amy Benziger joins Ziyang Fan, the head of digital trade at the World Economic Forum, to interview Nima Elmi, head of public policy at Bumble. Most people know Bumble as the dating app that challenges outdated gender norms by only giving women the ability to send the first message when they connect with a match on the app. What you might not know is that Bumble has a female-led policy team doing amazing work to keep women safe, both online and offline. We’ll explore how her team is challenging legislators in both the U.S. and the U.K. to protect women, and how she views the future of dating in today’s digital landscape, the metaverse, and beyond.

Transcript

Amy: [00:00:00] Online dating is booming. As of 2021, there were 30 million online dating users in the U.S. and 321 million users worldwide.

Noah: [00:00:08] And Bumble is one of the biggest dating apps out there. Most people know Bumble as a dating app that challenges outdated gender norms by only giving women the ability to send the first message once they connect with a match on the app. There’s also the Bumble Bizz and Bumble BFF features that allow users to find their next business partners or adventure buddies. What you might not know is that Bumble has a female-led policy team doing amazing work to keep women safe online.

Amy: [00:00:34] And this critical work has been underreported and underrated for a long time. The Web Foundation commissioned a study finding that 52% of young women and girls have experienced online abuse, including threatening messages, sexual harassment, and the sharing of private images without consent. Even more shocking: in 74% of the countries surveyed, law enforcement bodies had failed to take appropriate action in situations where technology was being used to commit acts of gender-based violence. This begs the question: why do we allow online platforms to get a pass on facilitating behavior that’s criminal offline? And what’s being done about it?

Noah: [00:01:13] On today’s Talk Policy To Me, we’re talking keeping dating safe in a digital world.

Amy: [00:01:23] I was lucky enough to tag along with Ziyang Fan, head of digital trade at the World Economic Forum, to interview Nima Elmi, head of public policy at Bumble during Ziyang’s tech policy class at UC Berkeley, the first ever taught at the school.

Ziyang: [00:01:35] Nima, welcome. It’s so good to see you again. For those of you who don’t know, Nima and I were coworkers at the World Economic Forum before she joined Bumble. So to kick off this conversation, Nima, would you like to share a bit about yourself and your professional journey?

Nima: [00:01:55] I started my career many moons ago as a lawyer working on dispute resolution cases and regulatory reform issues. After years in practice, I transitioned into working with government, initially in a legal capacity, and then took on more and more policy issues, mostly around foreign policy and security. That led me to work on projects with the United Nations, the African Development Bank, and the African Union. I then joined the World Economic Forum, where, in my previous role as a policy expert, I led a team focused on engaging governments on issues to do with technology policy, or, as Klaus Schwab would describe it, the Fourth Industrial Revolution. Having spoken to and worked alongside over 60 governments and regional and international bodies, across heads of state, ministers, secretaries general, and EU commissioners, I was then given the opportunity to work with Bumble, which seemed a little random in the beginning, to be honest, because I hadn’t had experience of the dating sector per se when it came to technology policy issues. But I was so impressed with the organization’s ethos as a female-founded and female-led organization committed to advocating for the safety of women, both online and offline, which I thought was very unique in the industry. There are much larger platforms with a much wider reach, and yet their missions are not as proactive as Bumble’s when it comes to online safety, particularly in the space of women and girls. So it’s been an exciting journey over the last few months since joining last summer.

Amy: [00:03:49] Some quick context on Bumble. Bumble was founded in 2014 by Whitney Wolfe Herd, a former co-founder and vice president of marketing at Tinder. In one of the more public co-founder disputes in tech, Herd sued Tinder for sexual harassment, alleging that her ex-boss and ex-boyfriend, Justin Mateen, had bombarded her with threatening and derogatory text messages and wrongly stripped her of her co-founder title. The company denied any wrongdoing, but Mateen was suspended and then resigned. The suit was settled, but in the months between filing the suit against Tinder and settling, Herd experienced a torrent of sexist and violent online abuse. It was clear to her that women weren’t safe online and that women needed to be in leadership roles to do something about it. The gender dynamics of dating needed to be flipped on their head. Bumble built in protections against unsolicited photos and easy reporting methods to flag inappropriate behavior as a way of ensuring customer safety. After years of bad experiences with men on dating apps, women clamored for this type of platform. The app went live in December 2014 and garnered over 100,000 downloads in its first month. In 2020, it reached 100 million users, and in February of 2021 it went public on the Nasdaq. A pretty meteoric rise for an industry where 90% of startups fail. Okay, we’re all on the same page now. Let’s dive back into the conversation.

Amy: [00:05:05] So outside of helping people find meaningful relationships, which is amazing, you mentioned that you work both online and offline when you look at policy creation. I’d love to hear from you–what are the top two or three policy issues you work on at Bumble?

Nima: [00:05:21] So one of the areas that we’re particularly committed to at Bumble is advancing women’s digital safety. The European Women’s Lobby, one of Europe’s larger women-focused organizations, found in a study that women are 27 times more likely to suffer online harassment than men. The E.U. Agency for Fundamental Rights found that one in ten women has experienced cyber harassment since the age of 15. This includes receiving unwanted or offensive sexually explicit emails, images, or SMS messages, or inappropriate advances on social networking sites, with the highest rates among women aged 18 to 29. And so we are seeing, at least on this side of the pond, a proliferation of regulations in the EU and UK trying to codify some of the roles and responsibilities of technology platforms. We have responded publicly to consultations on legislation in the EU and the UK, like the online safety bill that’s currently going through Parliament in the UK, to ensure that gendered harms are captured as priority harms on the face of the draft laws. This affords not only greater protection to women, but also empowers the enforcement agencies to intervene and really police the prevalence of these issues in different contexts.

Ziyang: [00:06:58] You know, one area where Bumble has received a lot of recognition is your policy work in the space of cyber flashing. So can we talk about, one, what cyber flashing is, for those who are not familiar with the concept, and two, what Bumble is doing to combat it?

Nima: [00:07:19] For those who do not know, cyber flashing is the act of sending unsolicited genital images or videos on social media platforms, via AirDrop, and so on. We have been advocating for legal intervention in the US and now in the U.K. over the last few years. More recently, we’re seeing the increasing prevalence of AI-enabled intimate image abuse, which is essentially AI tools being developed to turn images of fully clothed women into deepfake nude images. And sadly, these tools don’t work if you upload an image of a man; the technology is being developed explicitly to target women and perpetuate online harms toward one specific gender. We want to make sure these harms are not only stopped, but that people have increasing awareness, and that industry works with policymakers to ensure platforms have the safety tools in place to address these issues, so those images are not created in that space.

Amy: [00:08:41] You know, it’s interesting. I’ve had conversations with friends who have said, obviously, if you walk down the street and someone flashes you, that’s illegal. It’s very clear, and everyone understands that. And then I’ve talked with those same women, and most of us have received unsolicited images. So there’s this imbalance between how clearly something is illegal and wrong offline and how prevalent that same action can be online. How do you imagine regulation tamping down on cyber flashing so that it’s as socially unacceptable to do it online as it is offline?

Nima: [00:09:12] Back in 2018, in the U.S., we commissioned a nationally representative study of Bumble users and found that one in three women reported receiving unsolicited lewd photos from someone they hadn’t met in person, and 96% of those surveyed were unhappy about it. In addition to that, as you mentioned, not only our CEO but also board members and colleagues, those who had profiles and those who didn’t, were experiencing cyber flashing. As part of that research, we were honestly in some ways dumbfounded that there was no legislation in place, as you said, that would deter this sort of digital indecent exposure, even though in almost all jurisdictions it is a crime to pull down your pants or expose yourself in the street. Yet nothing was stopping anyone from exposing themselves in DMs or text messages or over AirDrop, as I mentioned. And so we did two things. The first is that we developed our own safety tool, which we call Private Detector, and those who have used our apps may have experienced this. In essence, it is a tool that identifies content it believes to be a sexually explicit image and automatically blurs that image for the recipient, who sees a notification saying we think this is a sexually explicit image, with the option to view it or not, and to report the sender. In instances where the sender is reported, that person is removed from our platform. We have a one-strike rule, very much believing victims rather than abusers in this instance, and making sure that we really are practicing what we preach when it comes to respect and integrity and how to behave across our platforms. And so that was the first step in making sure that we were able to address the issue on our platform and in our products.
Then, at the same time, we started to consult policymakers in our home state of Texas, where we developed a bill that would essentially criminalize cyber flashing as a Class C misdemeanor, punishable by a fine. This was to encourage behavior change more broadly. It wasn’t to populate prisons; it was to highlight that what is illegal offline should be illegal online, and is just as serious in terms of the lasting impact it has on its recipients. And so we were successful: back in 2019, Whitney herself gave testimony before the Senate and the House in Texas and shared her own personal experience of cyber flashing and how it had impacted her, because it is intrinsically something that, whether you anticipated it or not, affects your sense of safety.

Ziyang: [00:12:32] Have there been any similar laws passed outside of Texas?

Nima: [00:12:36] Having had that success in Texas, we have put forward bills in California, New York, Virginia, and Wisconsin, and continue to build on that work to get to a place where everybody understands that this is not okay and no one has to experience it. And not only have we done this work in the U.S., we’ve also done it in the U.K. We launched a campaign in November of last year to criminalize cyber flashing in England and Wales. Again, we are currently going through the process of consulting the government and working with policymakers in Parliament to make sure that the online safety bill includes an offense of cyber flashing, so that we are really taking this issue seriously.

Amy: [00:13:22] Nima, I’m interested in your experience working with U.S. versus U.K. policymakers. Do you see the EU leading in this gender-based protection policy more than you see the U.S. pushing regulation forward?

Nima: [00:13:33] Someone very smart once told me, and perhaps this is distilling it a little too much, that when it comes to technology, in the US they innovate, in Asia they tend to replicate and enhance, and in Europe they very much tend to regulate. So in some ways it’s not surprising that the first attempts to regulate online safety come out of this continent, because the approach here has been, in many ways, to try to put parameters around issues of online harms through legislation. I think in the U.S. it’s a much more challenging environment, given the very partisan political context, and that navigation can sometimes be really, really challenging, with lobbyists and other hurdles in the way of actually reaching consensus. Is it surprising that the GDPR came out of the EU? No. And is it surprising that, over the next few years, a lot of the regulation around how AI is developed, how technology platforms are handled in the context of antitrust, and what content moderation online looks like will again come out of legislation from Europe? No.

Amy: [00:15:19] I think the economic angle is also one that hasn’t been talked about enough. Amazon and Meta each spent roughly $20 million to lobby Congress in 2021, which was a 7% jump from the year before. They’ve obviously been fueling that partisan rancor you spoke about to a certain extent. What are your thoughts on how all that money and power sways the regulatory space?

Nima: [00:15:40] A lot of the biggest tech platforms in the world today, particularly in the West, are U.S. companies. And so when you are trying to regulate your own growing entities, it’s really, really challenging, fundamentally because you don’t want to diminish their dominance or their success. But at the same time, you also understand the responsibility and the impact they’re having, not only on your society but on the world more generally. In some ways, some people believe it is easier for Europeans to be the regulators in this space because they don’t have that kind of skin in the game directly; a lot of these regulations will impact companies that are based or headquartered outside of Europe. So we are very much at a real crossroads right now in terms of how this will play out, even between the UK and the EU: the battle of which legislation gets passed first. If one piece of legislation is seen to be the most widely adopted, it is harder to go against the grain and have a different system to adhere to. A lot of these companies will lobby to make sure there is consistency between the two regulations, the Digital Services Act and the Online Safety Bill, in the same way that the GDPR set a standard in the EU that has actually been replicated around the world.

Ziyang: [00:17:16] So not only do you have to think about how to regulate and protect our current digital lives and platforms, you also have to think about what the virtual world will look like down the road. For example, what will dating look like in the metaverse? How is Bumble thinking about the issues that may come up, for example with NFTs? I was listening to someone say that displaying your NFT in your dating profile is the virtual or cyber equivalent of showing up to a date in a fancy car like a Porsche. So is that true, and will Bumble also be thinking about cryptocurrency policies as well?

Nima: [00:18:13] I think we’re reimagining how some of our products could function to ensure that we are capturing that social networking element in the best way possible. We’re already seeing some coverage about instances of groping in the metaverse, which is disturbing to say the least. So I think there is a conversation around the broader issues: what should the foundational rules for governing the metaverse be? For us at Bumble, that will certainly be something we keep an eye on, particularly in terms of how those rules apply to women and what that means not only for them as users and participants in the metaverse, but also for encouraging economic growth, creativity, and wider prosperity.

Amy: [00:19:23] So it’s really fascinating to think about what it means to create policy when the platforms we’re using are changing so rapidly. I loved this conversation because of the simple idea that, at a basic level, we need to live in a world where what’s illegal offline is also illegal online. Noah, you actually took the class. I’d love your takeaways on how what Nima is doing at Bumble fits into a broader look at tech policy moving forward.

Noah: [00:19:45] Yeah, one of my key takeaways from the class, something Professor Fan really drilled into us, was that tech policy needs to be grounded in a set of guiding principles. The work that Nima is doing at Bumble is a great example of policy that’s guided by the principle of women’s empowerment. It’s something you see in their product design and in their policy work. I think this could serve as a model for the industry. It’s not surprising that Bumble is doing this, given the circumstances around their founding that you discussed, but it’s really refreshing. It makes me think that the Internet doesn’t have to reflect the worst of us. It’s definitely concerning to hear that there have already been claims of sexual assault and harassment in the metaverse, which just goes to show that this isn’t how policy leaders are thinking across the industry. So I hope that, as we move forward, policy leaders in tech really anchor their guiding principles in keeping women safe online as they design new tools and the policies that govern their platforms.

Amy: [00:20:42] I love that Nima was up for coming to talk to the next generation of policymakers at UC Berkeley. We’re in a really interesting space where a lot of the decision makers in Congress aren’t digital natives. My guess is that most of them haven’t cruised a lot of dating apps. So I think it’s incredibly important to start expanding the definition of what falls under the scope of tech policy and training students to educate themselves about how to think about crafting this new type of legislation in the future. Huge thanks to Nima and Ziyang for letting me crash the party.

Noah: [00:21:13] Talk Policy To Me is a co-production of UC Berkeley’s Goldman School of Public Policy and the Berkeley Institute for Young Americans.

Amy: [00:21:20] Our executive producers are Bora Lee Reed and Sarah Swanbeck.

Noah: [00:21:23] Editing for this episode by Amy Benziger and Elena Neale-Sacks.

Amy: [00:21:27] The music you heard today is by Blue Dot Sessions and Pat Messiti Miller.

Noah: [00:21:31] I’m Noah Cole.

Amy: [00:21:32] I’m Amy Benziger.

Noah: [00:21:34] Catch you next time.

Amy: [00:21:36] Cool. Okay. I think we’re good.
