Could your new match be a deepfake?
In this episode, we talk about Google’s move to get your mobile apps to stream on their other devices. Then we speak with Aunshul Rege, associate professor with the Department of Criminal Justice at Temple University, about the rise of online dating scams and the incredible amounts of money people get swindled out of. Then we speak with Billy Perrigo, Staff Writer for TIME, about his piece titled, "Inside Facebook's African Sweatshop."
Saron Yitbarek is the founder of Disco, host of the CodeNewbie podcast, and co-host of the base.cs podcast.
Josh Puetz is Principal Software Engineer at Forem.
Aunshul Rege is an Associate Professor and Director of the Cybersecurity in Application, Research, and Education (CARE) Lab at Temple University. Her research and education projects, which have been funded by several National Science Foundation grants, focus on the relevance of the human factor in cyberattacks and security.
Billy Perrigo is a staff writer at TIME magazine focused on covering big tech and artificial intelligence.
[00:00:10] SY: Welcome to DevNews, the news show for developers by developers, where we cover the latest in the world of tech. I’m Saron Yitbarek, Founder of Disco.
[00:00:19] JP: And I’m Josh Puetz, Principal Engineer at Forem.
[00:00:21] SY: This week, we’re talking about Google’s move to get your mobile apps to stream on their other devices.
[00:00:27] JP: Then we’ll speak with Aunshul Rege, Associate Professor with the Department of Criminal Justice at Temple University, about the rise of online dating scams and the incredible amounts of money that people get swindled out of.
[00:00:38] AR: So the victim thought that they were really speaking with these military personnel, but what was actually happening is they were speaking with a deepfake.
[00:00:50] SY: Then we’ll speak with Billy Perrigo, Staff Writer for Time, about his piece titled Inside Facebook’s African Sweatshop.
[00:00:57] BP: Abramson kind of suggested to them that they were easily replaceable, that they must choose to either sign a document that revokes their name from the list of demands or lose their jobs.
[00:01:10] SY: So this week, our producer, Levi Sharpe, told us about how he wished he could have streamed the Super Bowl from his phone to his laptop or TV to watch the game. So he could use something like the NFL mobile app to stream it for free, instead of getting a subscription to Peacock. Levi’s frugalness aside, this actually might be something that you’ll easily be able to do in the near future. According to 9to5Google, you’ll soon be able to stream your Android apps from your Pixel phone to your Chromebook or PC. In Google’s Android 13 preview build for Pixel phones, there were two different “cross-device” service apps that will allow you to stream an app from your phone to a browser.
[00:01:52] JP: And this announcement comes on the heels of Apple introducing something called Universal Control in the latest iOS and macOS betas. Universal Control lets you control multiple Macs and iPads with a single keyboard and mouse. Now you don’t have to do much to set this up if you’re on the latest betas. All you have to do is place the devices, signed into the same Apple ID, physically within 30 feet of each other, and then you’re able to move your cursor from one device to another and type on them from the same keyboard, kind of the reverse of what Google is doing with the Pixel to the PC. You’re also able to drag and drop files from one device to another and copy and paste. As of now though, you can’t move app windows between the devices like you can with Chrome, and you need a Mac to make this whole system work. So for example, you can’t use Universal Control with two iPads. But it’s kind of the same idea, mixing your phones and your desktops. And we just wanted to let everybody know that these are some really interesting developments that are happening right now. Do you think you’d use something like this, Saron?
[00:02:52] SY: So I’ve got my iPad and I’ve got my Mac and the Universal Control seems like it could be interesting, but I’m trying to like imagine a situation where it would actually be useful, like move my cursor from one thing to another. I guess if you want to use it kind of like as a secondary display, as a secondary monitor, and then that could be useful. Right?
[00:03:13] JP: Yeah. Or if you’ve ever had a situation where you have a desktop computer and a laptop computer and you do the weird thing where you’ve got the laptop and then you’ve got your desktop keyboard and they’re on the same desk and you’re swiveling back and forth between the two, it’d be really nice to just control it. I actually think the phone functionality Google and Chrome are experimenting with is much more interesting to me. Microsoft had a similar feature built into Windows called Your Phone. It’s been around for a good year or two now, but it only works with some Samsung phones and some Surface devices. This would work with all Android phones. So it sounds a lot more interesting. There are lots of apps on my phone that only run on my phone. So to bring those to my desktop, that seems a lot more interesting to me.
[00:03:58] SY: Absolutely. I mean, there are so many times where I’m like, “I have the app for this,” or like, “Oh, I’m looking at this thing,” especially with the NFL example, just watching videos on your phone and being able to stream them or just move them over to a bigger screen would be so nice. Those are the moments I’m just like, “Ah! I already have the content. I have it.” It’s ready. You have access to it. Just let me make it bigger. So that would definitely be a huge one.
[00:04:25] JP: I think it’s interesting how these two companies are approaching this idea that you have mobile devices and they have different apps and different information, and you want to view it all in one spot so you don’t have to be juggling multiple places, but they’re definitely approaching it differently. So Apple’s sort of solution to this is to say, “Well, if you buy a new Apple Silicon Mac, in some certain circumstances, you can just run the phone applications on your Mac.” Now a lot of developers opt out of this, but I have a couple apps that I actually do this on and it works okay. It’s not viewing your phone though. You have to re-log in and it’s like having a second phone, but in your computer, versus just seeing your stuff from your phone, which just sounds so much nicer.
[00:05:12] SY: Yeah. Do you think that Apple will ever get to a point where we’re just actually sharing things across devices?
[00:05:18] JP: I really wonder. I mean, from a technical standpoint, I don’t see why not. Now from like, “Is this something Apple wants to do kind of perspective?”
[00:05:26] SY: Like for their business model.
[00:05:27] JP: Yeah. I don’t know. Would it really hurt their business? You’d think they’d actually sell more, right? If you had better integration. And you’re able to copy and paste between the devices right now.
[00:05:36] SY: That’s true.
[00:05:37] JP: You can make texts. You can make phone calls from it.
[00:05:39] SY: They’ve made like file sharing really easy with AirDrop.
[00:05:42] JP: Right. So I don’t see why they wouldn’t, other than maybe it’s just not a huge priority to them or maybe their vision is more, “We’ll let you just run the apps on your desktop.” I could definitely see Apple saying this, like, “We would rather that an application be developed that runs on the phone and on the Mac, and it can be the best experience on each platform versus trying to shoehorn one into the other.”
[00:06:06] SY: Right. And that would be the argument. Yeah.
[00:06:07] JP: Yeah. Google doesn’t have that option though, I mean, unless it’s on something like Chrome OS. But if we’re talking Windows and Mac, Google doesn’t have that option because they’re not making the operating system.
[00:06:17] SY: Yeah, absolutely. I think that Apple, I mean, their priority historically seems to always be best, most smooth experience, most immersive, most advanced, just quality. And I think that they would not allow any type of cross device sharing if it degraded the quality. Like you said, if you’re shoehorning apps, they wouldn’t be there for that.
[00:06:40] JP: On the other hand, though, like in terms of developer friendliness, I think Microsoft’s approach is definitely more, in my opinion, developer friendly. You only have to write your app once, versus Apple’s answer is like, “Here’s an iPad with all these different screen sizes. Here’s a phone with all these different screen sizes. Here’s a Mac now. Go write three apps under one codebase.” It could be a lot.
[00:07:00] SY: Right.
[00:07:01] JP: That’s why you see a lot of apps opting out of it and saying like, “You know, we just don’t want to deal with it.”
[00:07:05] SY: Not worth it. Yeah, absolutely.
[00:07:06] JP: Have it on your phone. Yeah. Just to add a related note, unfortunately, I think this is more argument for the most famous phone app of them all, Instagram, to not write a native desktop app. I just don’t think we’re ever going to get it.
[00:07:22] SY: No, never going to get that. Coming up next, we talk about why there’s been a massive increase in online romance scams. According to the Federal Trade Commission, victims of these crimes collectively lost around $547 million in 2021, which is six times higher than the losses in 2017, after this.
[00:08:02] SY: Here with us is Aunshul Rege, Associate Professor with the Department of Criminal Justice at Temple University. Thank you so much for joining us.
[00:08:10] AR: Thank you so much for having me.
[00:08:12] JP: So back in 2009, you authored a study entitled, “What’s Love Got to Do with It? Exploring Online Dating Scams and Identity Fraud”. And since then, there’s been some major growth in the area of online dating scams. And we wanted to compare back then to now. So can you talk a little bit about your original research, what the mission was, and what the findings were?
[00:08:35] AR: Sure. I was very curious about the intersection between organized crime and cybercrime, specifically how these scams, especially romance scams, work with technology. The focus for that piece was that I wanted to look at the scam process itself, how it unfolds, and how criminals justify what they do. Because obviously, for us, at the receiving end, it’s clearly a bad thing, but scammers find ways to justify this. So it was a combination of their inner workings, their mindsets, the entire operation. And the emphasis was we always talk about Nigerian scammers. That’s the first thing that comes to mind. But one of the things that I found is that scams can be committed by literally anyone anywhere. So you had a single person committing romance scams. You had a husband-wife team that was committing romance scams. And then of course you have the large-scale operations, like the Nigerian scammers. Not to give them a bad rap, but they are the ones that have been most extensively looked at. And so there’s more information available about them. But in Nigeria, they actually celebrate this lifestyle because they are so successful. Right? So if you are bringing money home, it’s actually not something that is admonished. This is actually okay to do because you’re bringing home money and more than enough to have a regular lifestyle. You can have a very comfortable life. And so a lot of scammers actually flaunt their earnings and you have a lot of youth that see this and they actually aspire to do this when they grow up.
[00:10:36] SY: Oh, interesting.
[00:10:36] AR: So the scammers are actually seen as role models because they are doing something right. There are many songs that also celebrate this type of activity. So one of the ones that I can think of is “I Go Chop Your Dollar.” That’s a really famous one. It was intended to sort of take a jab at the Nigerian scams. They were sort of making fun of what’s going on in that scenario because it’s successful.
[00:11:20] JP: Wow! This is bonkers. Are they targeting one person at a time? I’m just curious. Can you talk about that as well?
[00:11:28] AR: So it really depended on the scope of your operations and what you could handle. So folks that were working solo could handle maybe 40 victims at any given time that they were duping simultaneously. The husband-and-wife team handled around similar numbers, but then they were so successful they had to hire more people. And then you have the large-scale networks that are really a well-oiled machine.
[00:11:55] SY: How do these romance scams typically work? I think that for those of us who haven’t been a victim of these scams, it’s really easy to kind of sit on the sidelines and go like, “Oh, that will never happen to me. I would never be tricked.” But yet it happens to thousands of people, to so many people. And how does it happen?
[00:12:12] AR: It all comes down to sort of a three- to four-step process. So you’re going to start by creating a couple of different profiles because of course there are different types of folks out there and you want to cater to their interests. Right? So you’re going to need fake images. You’re going to need fake bios. And I want to also emphasize that these scammers are really great social engineers and they’re using principles of persuasion, psychological principles of persuasion, to convince you at each step. So the second stage is the contact and grooming stage. Right? So this is where you’re going to initiate contact. And this could be something as cheesy as reaching out and saying, “Could you please reply to this message and make me feel like the luckiest person in the world?” Right? Something as cheesy as that. Or you could come across as a little bit more mature, hearing about how their day went. Right? Those types of things. And then once you start having these conversations, you’re going to, at this point, start engaging in what is known as love bombing. And this is literally incessantly messaging the victim, showering attention all the time. So be it through the medium that you’ve been using or through texting or web messaging or whatever it may be, things are going to now escalate to, “Last night I had a dream about you and we were out on this wonderful date.” And there are playbooks for this. So for instance in the Nigerian scam network, they provide the scammers with playbooks about what to say and how to say it and how long to wait before saying it, and they have complete scripts. So the Better Business Bureau had actually come up with a piece a little while ago where they actually shared one of the scammer playbooks. So there’s a couple of them that are openly available and it’s a good 20, 30-page playbook that tells you exactly what to do.
So these are the scripts that you should use to initiate contact, here are some scripts now to engage in that love bombing, here are some YouTube songs that you can also send them. So everything from Marc Anthony’s “I Need You” to Celine Dion’s “My Heart Will Go On”. So sort of weaving these things into the narrative in your engagement with them.
[00:15:16] SY: Wow!
[00:15:17] AR: Like I said, it’s a very well-scripted, very well-oiled machine. And going back to that social engineering aspect, you’re going to have pretext, right? So what is your backstory? Who are you? What do you do? And these pretexts are important because that’s going to set the stage for the tragedies that are going to come your way in stage three. So a popular one for instance might be, I’m in the construction sector. I work abroad and I have these long hours and I am stressed out. So you start creating this pretext around your employment. And then you might have another pretext that goes hand in hand with, “I want to come see you.” So now it’s all about travel. So these two pretexts are going to intersect to start stage three, which are going to be those desperate sort of circumstances, things like medical emergencies, business problems. I lost my job. I was robbed. I need money to get office equipment. I’m trying to get money for my travel so I can come see you. And the really important thing to understand is the grooming process itself lasts for at least six months to a year.
[00:16:38] SY: Pretty big investment.
[00:16:39] AR: Right. So you’ve really, really hooked your victim into this romantically, emotionally, so that they are now going to really want to help you. And what they’ll do, they won’t flat out ask for the entire amount. They’ll say, “Look, I have this saved. I have a friend who can pitch in this much. All I need is $1,500 from you and that’s going to get me there.” And it doesn’t end. So let’s say you do get those $1,500, then that particular hurdle is done, but then all of a sudden you’re going to have a new predicament and you’re going to keep sort of staying in this loop until either the victim realizes that they’re being conned or they just run out of money. So what happens at this time is there’s this optional stage four, which is revictimization. So if let’s say the victim says, “Hey, I don’t think you really ever had any intentions of coming to see me. I think this is all a scam,” the scammer might get defensive and say, “I can’t believe you don’t believe me. I thought we really had a relationship. Don’t you have any feelings for me?” So they might try that sort of strategy. Alternatively, they might, pretending of course, say, “Yes, you’re right. You got me. But you know what? I actually did fall in love with you. And now what do I do?” And this is of course all an act as well. Alternatively, if you don’t fall for that, some of the ways that they can come back at you is to say, “Hey, you remember those late night risqué sessions that you and I had? Well, I recorded those. So if you don’t want me to leak these on porn sites or share them with your friends or family, your coworkers, and ruin your life, you better pay up.”
[00:18:42] SY: So according to the FTC, the Federal Trade Commission, victims of romance scams collectively lost around $547 million in 2021, which is oh my goodness. That is so much money. And it’s also six times higher than the losses just a few years ago in 2017. So I’m wondering, what do you attribute to this huge increase?
[00:19:04] AR: Yeah. I think one of the biggest reasons is obviously that pesky little thing called the pandemic and we’re all at home more than we were before. Folks are feeling isolated, lonely. They’re longing for that human connection. And related to that, you have this other angle, which is the usual things that kept you busy or occupied, like seeing your friends and family or your regular outings, or if you would go to the gym every evening or you had regular movie nights with your friends, all of those things have stopped. So the things that used to keep you busy have gone away and you’re stuck at home for long amounts of time, longing for that human touch. So that’s one of the biggest reasons. I think it’s also depending on age groups, because with the FTC, and it was interesting with that report, they said the numbers impacted in the younger age group have also gone up. And part of that is also for similar types of reasons: schools closed, activities are done, folks are on their devices more often. We just had a case here in Montgomery County where 17- and 18-year-old boys on Instagram were sharing intimate photos, so nudes or videos of themselves, again, engaged in very racy behavior, because they thought they were speaking with hot girls, but in actuality they weren’t. They were scammers at the other end of the line. And so of course, then this was used for sextortion and blackmail. So you have younger folks, again, looking for perhaps different things. You have older folks that are retired, empty-nesters. Maybe their spouse has passed away and they’re lonely and it’s a scary time in someone’s life. I think the usual folks that would check up on you can’t. So if you have children, they’re stuck at home with their kids during the pandemic, so they can’t come see you. So I think it’s a combination of all those types of things, people being on their devices more.
In general, yes, but also because of the pandemic and this other aspect of feeling lonely.
[00:21:33] JP: Where does the tech come into this? I’m really curious. What has changed with the tech that’s available now?
[00:21:39] AR: So obviously, the communication mechanisms, shifting to apps. Bots were used back then as well. They’re being used increasingly today, of course, with the advancements in artificial intelligence. So if you want to automate that first step of sending out those 500 emails, you can have a script that does that for you. So that’s sort of one way that you can leverage technology to make things a little bit more efficient for you. Another place where technology is being used obviously is the different platforms, where we’re going to try to get our victims off of the platforms that we’ve met them on, let’s say a dating website, and onto different platforms that isolate them, number one, and let us, let’s say, record them. So what are the latest and greatest technologies that help us do that? Which again, you might use later for blackmail or sextortion. And then you also have, obviously, how are you transferring the money. So back in the day, it would be good old fashioned bank transfers or wire transfers. Now there’s a whole bunch of cryptocurrencies that are out there. The last one, and this is the most recent one that I’ve seen, is the use of deepfakes. Scammers can use legitimate daters’ profiles on dating websites and sort of take on their identities. But what they’ve increasingly started to do is just use deepfakes to generate fake identities. So if you’re familiar with thispersondoesnotexist.com for instance, every time you refresh, you’re going to get a new person’s face that is generated by AI. But this is just one very sort of simple example. So in one particular case, in the case of a military scam, because a lot of scammers love to use profiles of US soldiers because that’s a great sort of reason for why you can’t meet in person. Right? Because you’re always abroad. But what they did is they actually found multiple videos of this one particular lieutenant, I believe it was, who was front facing.
So he had a lot of media interviews, things like that, and they pieced it together to create a deepfake. So you think he’s talking, but it’s actually not him because you have pieced together different bits and pieces of the words that he’s used to create this illusion that he’s actually saying these things when he’s not. And so what the scammer did is they used these deepfakes in a real-time conversation in Skype with the victim. So the victim thought that they were really speaking with these military personnel, but what was actually happening is they were speaking with a deepfake. So one of the things that you could do at least before is you could say, “Well, if they can’t meet you online or they can’t have a real-time conversation with you where you can see them, that’s an indication of a scam.” Well, now you’ve got deepfakes that sort of fill that loophole, right? Like, “Here’s a way that we can do that.” And it’s only going to get better as technology advances. So that to me is really, really scary. The possibilities with deepfakes are really endless.
[00:25:09] JP: Let’s pivot to talking a little bit more about how people can be proactive about this. Are there things people should be on the lookout for when they’re starting or they’re involved in online dating? Would your advice to people differ depending on their age group? Just talk to me a little bit through that.
[00:25:27] AR: So I guess I’ll start with just general things to look out for. So how do you know you’re speaking with a bot, for instance? Is it a dating bot? One of the telltale signs: if the responses are just too fast, almost instantaneous, this is just not possible for an actual human. So when you get a text message, for instance, you read it, you process, and you actually then have to type back. This takes a little bit of time. So getting an instant response means that it was scripted. It’s a bot behind it. Another giveaway is if the message seems too generic, like it’s not catered for you, but it’s for everybody, or it just doesn’t make sense sometimes. That’s another way of knowing that you’re not really speaking with a human. Also, if they ask you randomly for financial information or ask you to visit links, this is not something that I would hope a normal human would do. Then there are sort of other ways. So let’s say you were in the middle of a conversation and you think it’s a bot. Some of the ways that you can test for this: you can ask something unpredictable. So let’s say you were talking about your day and what happened at work. Just throw something completely unpredictable or random into the conversation that doesn’t make any sense and see how the person responds. If you just get another instant response that has nothing to do with what you just said, that means they’re not aware of the context. That’s not a human. Bots also aren’t great at detecting sarcasm or humor. So are they going to be able to tell that you’re actually making a joke? And the last strategy is something like using keysmashes. And what do I mean by that? So let’s say you were in the middle of texting, and what you would do is you would just randomly string together alphanumeric characters. It’s not even a real word or a phrase. And just enter that and see what kind of response you get. The average person would say, like, “What? I didn’t understand what you just said.”
[00:27:43] JP: Right. What are you talking about?
[00:27:44] AR: Right. Exactly.
[00:27:46] SY: Are you a bot?
[00:27:46] AR: Exactly. Right? So the bot is just going to come back with something completely scripted because that’s all they know. They can’t detect these types of things, or at least they’re not there yet. Then of course you have the scammers in general. So these are actual human beings, but they just happen to be conning you. So if you remember, I said the scam unfolds as a process. So in that first one, it’s a fake profile. So as horrible as it sounds, if they look too good and sound too good to be true, then they’re not real. So the average person may not look like a supermodel and have their act together. If they do, that’s awesome and I would love to meet them someday. But the average person isn’t really like that. If they engage in love bombing, so now we’re talking about the grooming stage, and while you’re engaged in this, search them up. Look them up online. Do they exist? If they don’t have an online presence, that’s a red flag. However, if scammers are stealing identities of legitimate daters, that becomes a little tricky because real daters are going to have profiles on LinkedIn or they may have a Facebook page or you might be able to get actual information about them. So that gets a little tricky. Another red flag is if, during the grooming stage, you ask to meet with them and they just can’t for whatever reason. And that’s where the pandemic really helped scammers: it was one of the reasons they could give for why they couldn’t meet you.
[00:29:23] SY: Oh, good point. Yeah.
[00:29:24] AR: So that was actually leveraged. Scammers love that. So they couldn’t meet even if they wanted to. “I of course want to see you, sweetheart, but I can’t. It’s this pandemic.” So again, if they make plans to come see you, and this is sort of those desperate circumstances, something bad just keeps happening. This person just can’t get a break. This is the most cursed person on the planet, because anything bad that can happen, that can go wrong, is going to go wrong. If they keep asking you for money, that’s problematic. And a great sort of way to test this is to check for any inconsistencies or holes in their stories, so things just don’t add up. So one way to do this, kind of like what you do with the bot, right? You take a question and you ask it today and see what they say, and ask the same question a couple of days later.
[00:30:21] JP: Oh, that’s smart.
[00:30:23] AR: Right? So are you going to get the same response or do you get something different? So it’s a little bit of sleuth work at your end. You’re going to have to play detective. And a lot of times people are hesitant to do that because that’s an indication of distrust, and scammers actually leverage that. It’s like, “Why are you asking me this? Don’t you trust me?” So again, they’re very, very clever, but you have to be clever about it.
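[Editor’s note: the bot red flags Rege describes, instant replies, generic canned text, and plowing ahead after a keysmash, can be pictured as a toy scoring function. This is only an illustrative sketch; the threshold, the `Reply` fields, and the phrase list are all assumptions, not anything a real platform or the interview specifies.]

```python
from dataclasses import dataclass

@dataclass
class Reply:
    latency_seconds: float  # time between our message and the reply
    text: str

# Canned lines a scripted bot might reuse verbatim (examples from the episode)
GENERIC_PHRASES = {
    "could you please reply to this message and make me feel like the luckiest person in the world?",
    "last night i had a dream about you and we were out on this wonderful date.",
}

def bot_suspicion_score(reply: Reply, we_sent_keysmash: bool) -> int:
    """Score a reply against the red flags from the interview: instant
    responses, generic scripted text, and ignoring a gibberish test."""
    score = 0
    if reply.latency_seconds < 2.0:
        # Too fast for a human to read, think, and type a response
        score += 1
    if reply.text.strip().lower() in GENERIC_PHRASES:
        # Verbatim canned text, not catered to this conversation
        score += 1
    if we_sent_keysmash and "?" not in reply.text and "what" not in reply.text.lower():
        # We sent random characters; a human would ask what we meant,
        # while a scripted bot just continues with its next line
        score += 1
    return score
```

An instant, canned response to a keysmash scores 3, while a slow, puzzled “What are you talking about?” scores 0.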
[00:30:54] JP: I’m curious if victims of these scams, do they have any legal recourse?
[00:30:59] AR: So for the most part, it’s game over. Once the money crosses international waters, it’s pretty much done.
[00:31:10] SY: So we talked about what we as individuals can do to protect ourselves. I’m wondering, what do you think developers of online dating apps, even things like Facebook and Instagram, where some of this stuff is happening, what can those companies do, those developers do to help mitigate this growing issue?
[00:31:28] AR: Detecting anomalies in behavior. So again, if someone just created an account and is mass spamming legitimate daters, or if the profiles seem to have some anomalies themselves in terms of bios or the images. So can these be cross-checked with some other platforms? So a combination of give us your LinkedIn, give us your credit card, but even then it gets tricky because scammers can steal this information. Having programs that detect things like response rates, types of responses, how long have these exchanges been going on for, are these getting anywhere, going anywhere? No. Okay. So those types of things I think are possible. I think it’s also important to really have an open platform to hear out the victims that have fallen for these scams. And I think this is not just for dating websites and developers, but also for law enforcement. One of the biggest issues that at least victims of romance scams have is they’re embarrassed or they’re ashamed because they’ve fallen for this. So I think developers really need to understand that this is an actual problem and develop programs and platforms that allow daters who might potentially later be victims to speak up, and ask how that feedback can be taken into consideration when you’re designing your programs and what safety features you could implement. And so I think if there’s a way to make this community driven, as opposed to being disconnected from the community that it’s impacting, that would make it a lot more worthwhile.
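[Editor’s note: the first signal Rege mentions, a brand-new account mass-messaging legitimate daters, could be flagged with a simple rate check. This is a minimal sketch under assumed inputs; the `Account` fields and the 20-contacts-per-day threshold are illustrative, not from any real platform.]

```python
from dataclasses import dataclass
from datetime import datetime

@dataclass
class Account:
    created_at: datetime
    first_contacts_sent: int  # distinct users this account has messaged first

def is_mass_contact_anomaly(account: Account, now: datetime,
                            max_contacts_per_day: float = 20.0) -> bool:
    """Flag accounts that initiate contact with far more new users per day
    of account age than a typical dater plausibly would."""
    age_days = (now - account.created_at).total_seconds() / 86400
    age_days = max(age_days, 1.0 / 24)  # avoid dividing by ~zero for brand-new accounts
    return account.first_contacts_sent / age_days > max_contacts_per_day
```

A day-old account that has cold-messaged 200 people gets flagged; a month-old account with 40 first contacts does not. A real system would combine a signal like this with the profile and conversation checks mentioned above.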
[00:33:30] SY: Thank you so much for joining us.
[00:33:32] AR: Thank you for having me.
[00:33:44] SY: Coming up next, we talk about the horrific conditions of a Facebook content moderation job after this.
[00:34:06] SY: Here with us is Billy Perrigo, Staff Writer for Time. Thank you so much for joining us.
[00:34:11] BP: Thank you for having me.
[00:34:12] SY: So tell us about what led you to write your recent piece in Time titled Inside Facebook’s African Sweatshop.
[00:34:19] BP: Yeah. So I’ve been on the Facebook beat for a while, and last year was actually the first time that I started talking to people on the content moderation side. So I’ve been reaching out to Facebook workers for as long as I can remember. Most of the time, you don’t really hear back from them because the risks of talking to a journalist are basically quite high. You can lose your very high salary if you do that. But for content moderators, I realized that the job of content moderation is invisible firstly. People don’t recognize what the job entails, and the experience of content moderation and the human toll that goes into it largely just doesn’t get talked about. So talking to these content moderators, who were actually working for WhatsApp, is the story that I worked on last year. I basically found it, firstly, much easier to get in touch with them because lots of them were unhappy about the conditions that they were working under, and therefore had a reason to blow the whistle, basically. And as for why I was interested in what goes on in Africa, I think generally speaking across the board, the role of Facebook in the global south, the developing world, is largely talked about a lot less, especially in the kind of Silicon Valley facing press. And I’ve done some work on that over the past few years. And I figured that with the war going on in Ethiopia right now and Facebook being increasingly blamed for failing to prevent incitement to violence, for example, by the whistleblower Frances Haugen, most recently, I wanted to know who is doing the content moderation for Ethiopia and for the whole of Africa. And that led me to doing a bit of Googling and I found that Facebook had signed a contract with Sama in 2019. I reached out to significant numbers of content moderators, and the proportion of those who replied was significantly higher than the proportion of Facebook employees who replied to my cold calls, and they told me something and I went from there.
[00:36:23] SY: So you mentioned that Meta contracted Sama in 2019. What is Sama and what kind of work do they do?
[00:36:32] BP: If you look at their website, Sama presents itself as an ethical AI company. It talks about how it has contributed to lifting more than 50,000 people in the developing world out of poverty. If you look into how they got to that number, they don’t just count their employees, but they count their employees’ dependents as well.
[00:36:51] JP: Oh!
[00:36:52] BP: Yeah.
[00:36:53] SY: That’s interesting.
[00:36:55] JP: That’s some creative accounting.
[00:36:56] BP: Yeah, exactly. And their founder, Leila Janah, liked to say that the best way to help people in poor countries is to give work, not to give aid. And this company was one of the first companies to develop this idea of microwork: that you could break up these big tasks that tech companies needed humans to do, because AI was not capable of doing them, and kind of farm them out to places in the developing world where labor was cheaper. The narrative behind this was that because this work did not really require much more skill than being a human, it could be used to give work to people who did not have it. But it seems that somewhere along the way, this company that was founded to help people began to care more about satisfying its biggest clients and cutting its costs as much as possible.
[00:38:00] SY: And is most of their work content moderation, or does it extend past that? You’ve talked about ethical AI. Where else does their work go?
[00:38:09] BP: Oh, yeah. So that’s actually a fantastic question. No. Most of their work is not content moderation, as far as I understand it. They have all kinds of work. For example, on their website, they like to show images of people labeling training data for autonomous vehicles.
[00:38:26] SY: Yes, I’ve heard of that.
[00:38:27] BP: So labeling whether something is a pedestrian or a tree in order to stop cars from killing people. They work for Google. And those Google knowledge panels, when you search up something such as a celebrity and you get a box on the side with their picture and some biographical details about them, maybe your assumption when you do that is, “Oh, Google’s AI is really smart. They’ve managed to create that box automatically.” But no, it’s a poorly paid person writing it.
[00:38:59] JP: So given that Sama does business with a variety of companies in a bunch of different areas, how much of their business comes from Meta and how much of their business is in the content moderation space?
[00:39:11] BP: From what I understand, there are about 200, or close to 200, people working on the Facebook contracts in Nairobi for Sama, which is a small percentage of Sama’s overall workforce. I don’t know whether or not they do content moderation for other companies, but it seems like what Sama does overall is basically whatever “human intelligence tasks”, as I believe they’re called in the industry, its clients need them to do.
[00:39:39] SY: Tell us a little bit more about the demographics of these workers, and what does the day-to-day look like for someone on this Meta content moderation team?
[00:39:50] BP: So I think the thing to point out here is that one of the crucial skills that you need to have to be a content moderator for Facebook is to be proficient in two languages. So English, firstly, and also the language of the content that Facebook is specifically hiring you to moderate. That’s the whole reason, remember, that these people are being hired in Kenya rather than in Silicon Valley or elsewhere. Or that’s the reason Facebook would like to tell you they’re being hired there: because they understand the language, they know the local context, they can more efficiently tell you whether, for example, a piece of content violates Facebook’s rules because it’s inciting violence against some protected characteristic, in a way that might only make sense to someone with familiarity with the Ethiopian context. It also happens that when you hire someone in Kenya, you can pay them literally about 10 times less than for equivalent work in the US.
[00:40:52] JP: So over the years, we’ve heard some stories about what a terrible job content moderation is. There’s been some reporting about companies like Accenture and Cognizant. And I’m curious, compared to those stories and those companies, how would you say Sama and their workplace practices compare?
[00:41:12] BP: So a lot of that reporting has been quite inspirational to me and my understanding of this, but one thing about all of those stories is that they tended to focus on people in the US. I said at the beginning that content moderation is an invisible job, and it is, but it’s less invisible for content moderators in the US than it is for content moderators in other parts of the world, including Africa. Those jobs in the US are still not very nice. They are paid fairly low wages. I think $18 an hour is the starting salary now, which is higher than minimum wage, obviously, but you need to remember that these people are in many cases being traumatized for the rest of their lives by the content that they’re seeing. The stories that came out around the 2019 era about the conditions in these US content moderation factories had the effect of forcing Facebook and its contractors into being much clearer about what is expected of content moderators. Accenture, for example, now asks at least some of its content moderators to sign a waiver before they start their jobs acknowledging that they may develop PTSD as a result of their work. At least some of the content moderators in Africa who I spoke to were not informed in their interviews even that the job would involve looking at traumatizing content, let alone the type of content that it would entail, until they accepted the job, traveled to Kenya, arrived, and signed an NDA. At that point, you’ve told your family you’re leaving, you’ve gotten on a plane, come to a different country. I suppose I’m probably obligated to point out here that if people did decide to go home, Sama does cover the costs of that. But you also have to realize in this entire context that middle-class jobs in countries like Kenya are very hard to come by. And for as low as $1.50 an hour sounds, it is triple the minimum wage in Kenya. 
So we need to be slightly nuanced in our understanding of how these wages compare with the rest of the world. And I think the most instructive way to think about it is: what value does this work hold for Facebook when it’s done? Clearly, they’re happy to pay $18 an hour in the US. So where does that surplus money go? It’s clear it’s not going to the people who have been traumatized by the work.
[00:43:46] SY: That was going to be my next question, because one of the things that really stands out about the piece is just the payment, the $1.50, which obviously for the US economy is atrocious, but we’re talking about different economies, different regions and all that. And so I was wondering, how does that compare to the local minimum wage? And you said it’s three times as much as the minimum wage there. But to your point, this is an interesting position, because on the one hand, at least based on the description, it doesn’t seem like it requires a lot of technical knowledge, degrees, advanced education, that sort of thing. It seems like a job that, as long as you’re fluent and you’re willing to work, you can do.
[00:44:34] BP: I would actually push back against that slightly. I mean, all of them are university educated, firstly. All of them are proficient in English to a professional degree, which is a pretty sought-after skill in these economies.
[00:44:47] SY: Absolutely.
[00:44:48] BP: Lots of them are young people coming out of universities, seeking jobs in an economy that often leaves people overqualified for the amount of work that is available. So it’s kind of as if this large pool of maybe even overqualified people is available to outsourcing companies who want to come to these countries and start hiring. And the only reason that the wages are able to be so low is because of the large amount of competition for these jobs, basically a surplus workforce.
[00:45:20] SY: I see.
[00:45:21] BP: One thing that I would add is the justification that Sama gave for the low wages in a 2018 interview with the BBC. Leila Janah said, “One thing that’s critical in our line of work is to not pay wages that would distort local labor markets. If we were to pay people substantially more than that, we would throw everything off.”
[00:45:40] SY: That’s an interesting argument. Is it a valid argument?
[00:45:43] BP: Yeah. Sure. Everything for whom is the question.
[00:45:47] SY: Can you talk about some of the attempts by the employees to change their working conditions and fight back, and what the results of those attempts were?
[00:45:57] BP: So about four months into Sama having Facebook content moderation staff on its office floor, they’d all been hired in early 2019, around July 2019 a group of employees came together and they started talking to each other. They were saying, “This job isn’t what I expected it to be. This job is having an impact on my mental health. Managers are not treating us in the way we expect to be treated.” And one of these employees, Daniel Motaung, is the central whistleblower in the story. If you speak to him, he’s a very charismatic person. He believes deeply in workers’ rights, which obviously helps. And very quickly, these Sama employees acknowledged that he should be their leader. So what they did is they grouped together. They created a WhatsApp group chat and decided that they would canvass opinion more widely among their colleagues to ask, what are our grievances and what do we want to ask management to do? In the group chat, they talk about what their grievances are and what they should do. And based on those discussions, Daniel goes away and he writes out a document. He studied law at university. And he writes out this document asking for a lot of procedural things, like clarity on what they could reasonably be fired for under their contracts and things like that, but the central demand was for their pay to be doubled. And so after some time, they go and present this document to the management of the company in Nairobi. And they’re taken into this meeting room. And unexpectedly to them, they told me, two executives from California joined via video conference. Now I spoke to many employees who were present in this meeting, and they said that the people who were on the video call dismissed their concerns after they read the list of demands. They said they’d done research on what a fair wage is in Kenya, that lots of people would die to get that job. 
And then they said, “We’re coming to Kenya.” Two days later, they arrive in Kenya, and Daniel is basically immediately suspended from his job. He’s accused of bullying, intimidating, and coercing his colleagues into joining up with this employee activist group, which they called the Alliance. He’s told not to come to the office, not to talk to any of his colleagues. So he’s kind of extricated from the situation. Then I’m told, according to the employees who I spoke to, employees were called into meetings with one of the executives who came from California. Her name’s Cindy Abramson. According to Sama’s most recent public filings, she was paid close to $200,000 in 2018, the year before the strike. Two employees who were involved in the strike, who were more outspoken, told me that in these meetings Cindy Abramson flattered them, said, “You show leadership potential,” and kind of suggested to them that they might be in line for promotion at the company if they could convince their colleagues to stand down. And by the way, these were all one-on-one meetings. The advice that Daniel gave to his colleagues was, “We only have power in this situation if we negotiate as a group. They can replace any one of us individually. But if we move as a group, you can’t replace people with expertise in 11 different African languages and full knowledge of Facebook’s policy programs in a week or in a month.” And because of that, they hoped to use that as leverage to get Sama to pay them more money. In these individual meetings, which Daniel had advised these employees not to go to, according to three of the kind of rank-and-file participants in the labor action, Abramson kind of suggested to them that they were easily replaceable, that they must choose to either sign a document that revokes their name from the list of demands or lose their jobs. 
Also, one thing that’s important here is that the people who are employed directly from Kenya at Sama are paid around 30% less than people who are employed from other countries. That’s because they don’t get the kind of relocation bonus that employees get if they come from other countries. And so the suggestion to them, I’m told by people with knowledge of the discussions, was that you are easily replaceable. And so with the strike kind of being decapitated and these meetings happening, the effort basically crumbled. Sama denies that there was ever a strike.
[00:51:03] SY: Oh, okay.
[00:51:04] BP: I can read a quote from them. They say, “Being a responsible employer, we wanted to see our team in person, meet with everyone face to face and address their concerns head on. It’s why we flew members of our leadership team to our offices in Nairobi and it’s a decision we stand behind.” A couple of weeks later, Daniel receives a letter. The letter tells him that he’s been fired from his job for engaging in acts that could amount to bullying, harassment, and coercion, and that led to the disruption of business activities and put the relationship between Samasource and its clients at great risk. And the client it’s referring to there is obviously Facebook.
[00:51:42] SY: Right.
[00:51:42] BP: Sama was known as Samasource until recently. It used to be a nonprofit organization, and it went through a corporate transformation that included switching to a for-profit business. At the same time, it changed its name.
[00:52:00] SY: Oh, interesting.
[00:52:01] BP: I say in my story that six people who speak Ethiopian languages left Sama in one week in January to go to a different content moderation provider based in a different country, where they were being paid significantly higher wages to work for a client that is one of Facebook’s big tech, social media competitors. So these people’s skills are in demand, actually, especially for Ethiopian language speakers, because of the civil war and the ethnic violence that is going on there: people who can speak Ethiopian languages and English to a high degree. And remember as well that this job requires you to internalize the platform rules, which can run into the thousands of pages, and decide whether a very dynamic piece of content is violating any of those rules in an extremely short amount of time. So it’s no small feat that you’re asking these people to do. And it says a lot that working conditions encouraged by a desire to drive down costs are leading people who have the capacity to do this work, and who are in short supply, to leave for greener pastures.
[00:53:15] JP: I wanted to loop back to how in the piece you talk about a lot of these workers relocating to Kenya from other countries in Africa. And I’m curious, how does Sama find these workers? I also want to dig into something you mentioned: that many of these workers were not told about the nature of the work they’d be doing. I’m really curious, how is the job described to them? How does Sama get someone to move to a new country and do this job?
[00:53:45] BP: Yeah. I think that’s a great question. And to answer it, I think I need to draw a distinction between now and when they first opened the office, when they were hiring large numbers of people on a very short timescale, which, as far as I understand it, included them not being very clear with their workers about what was expected of them, or only telling them after they arrived in Kenya. After I asked Sama some questions about their onboarding arrangements and processes, and about the allegations their employees related to me that they weren’t told about this, Sama told me that they updated their onboarding procedures after the employee activism in 2019 to be more transparent about the work. But to your question of how you get someone to travel halfway across the continent to do this work, one might consider the fact that if you told someone that they might be traumatized for life for $1.50 an hour, they might not want to take that job.
[00:54:44] JP: Right. What has Meta/Facebook’s response been to all of this?
[00:54:49] BP: I sent an email to Facebook and Meta asking questions about this story, and they told me, “We take our responsibility to the people who review content for Meta seriously and require our partners to provide industry leading pay benefits.” I believe that in their entire statement to me, they didn’t use the word Sama once, and they did not really answer any of my specific questions about the degree to which they knew that things like this were happening at Sama. I am none the wiser about how much Facebook actually knew about what was going on. But one takeaway that I included in the story is that, while Facebook may not have known the precise details, the conditions at this company are largely down to its leadership trying to please the client, as it were, at every turn. Content moderators who work for Sama log into a piece of software designed by Facebook every day. Their work lives are effectively micromanaged. Every ounce of productivity is squeezed out of them. And so even if Facebook doesn’t know quite the details of the strike and the alleged suppression of the union effort, the allegation made by Foxglove, the legal fund that is representing Daniel now in legal action against Sama, is that Facebook created the system.
[00:56:17] SY: Thank you so much for joining us.
[00:56:18] BP: Thank you. I’ve really enjoyed this, and I hope it makes some people sit up and take a closer look at what’s going on here.
[00:56:35] SY: Thank you for listening to DevNews. This show is produced and mixed by Levi Sharpe. Editorial oversight is provided by Peter Frank, Ben Halpern, and Jess Lee. Our theme music is by Dan Powell. If you have any questions or comments, dial into our Google Voice at +1 (929) 500-1513 or email us at firstname.lastname@example.org. Please rate and subscribe to this show wherever you get your podcasts.