Facebook Insider Leaks Docs; Explains 'Deboosting' 'Troll Report' & Political Targeting in Interview
- Category: Censorship / Prohibit Criticism, Free Speech / First Amendment, Uncategorized, Trigger Word
- Duration: 19:11
- Date: 2019-03-08 06:19:16
- Tags: facebook, censorship, trigger, de-platform, de-monetize, troll, twitter, sjw, msm, project veritas, zuckerberg, liberal, conservative, hate speech
Video Transcript:
And I was a content review analyst for the intellectual property department at Facebook. I handled copyright and trademark claims. And in the course of doing my duties, I noticed other things that were going on on accounts. And that is why I came forward. This is my story. Everyone is expected to be the same. Everyone's expected to not talk about these things openly. In the orientation at Facebook, they said the number one rule of Facebook is don't talk about Facebook. I see an odd note on an account. Being a curious person, I decided to look into it more. Things will only change if we, the people, are brave enough to step forward and do something bold. They may be able to stop one man, but they can't stop all of us. Be a catalyst for change. Wear a camera. Be brave. Do something. This Facebook insider contacted us some time ago and provided us with what she says are internal Facebook documents. I saw things going on that I personally found to be troubling. I knew that something had to be done about it. And so I felt that the best thing to do was to inform the public, because they had a right to know. This is Danny Ben David, a software engineer. On his website, Ben David writes, I am now working at Facebook as a software engineer. I am part of the problem, I know. In the leaked documents our insider gave us, we came across a back-end view of Mike Cernovich's page. On it, we could see Ben David running an action called action deboost live distribution. In fact, Ben David wrote the code and may have invented the word deboost. Who is Danny Ben David? From my understanding, Danny Ben David is a software engineer at Facebook. He did not work at the same location as me. But I noticed that every time I would see this deboost live stream kind of code on there, his name was always next to it. And what does it mean when these documents say action deboost live distribution? What does the term deboost mean? 
Well, I can't speak to what exactly the actions were when it says reduce live distribution. I am not a programmer. But I can read English just fine. When I see reduce live distribution, it means preventing the distribution of this live feed. Deboosting is a method of suppressing distribution. This occurs because prompts such as share this video are disabled. Interactive notifications are also disabled. And the live feed boost associated with a live streaming video on Facebook is removed. The system converts the live video into text. Machine learning classifies or identifies certain words, and bad words trigger deboosting. Action deboost appears after the word Sigma, which Project Veritas has learned is an artificial intelligence system to block potential suicide and self-harm posts. The Sigma code was written in 2017. Shortly thereafter, it seems deboost went political. And where did you see this deboost language when you were present in the Facebook facility? Where did you see this appear, on whose pages? I would see it appear on several different conservative pages. I first noticed it with an account that I can't remember. But I remember once I started looking at it, I also saw it on Mike Cernovich's page. I saw it on Steven Crowder's page, as well as the Daily Caller's page. We spoke to Steven Crowder about this and he was disappointed but not surprised. He told us that this is not the first time Facebook has targeted him. After an April 2016 Gizmodo article, Crowder's live stream was throttled. He settled with Facebook out of court. So when things are taken down and, like, actually permanently deleted from Facebook, the user will typically get a notification. The user can respond to those notifications and there are humans that actually read that. However, with these deboost live stream things, there was no warning sent to the user. These were actions that were being taken without the user's knowing. 
Did any of this show up in the tasks section of the Facebook back end? Did it show up anywhere there? No, no tasks were created when these live stream deboosts would occur. The story of using certain keywords to demote or downrank certain content is not unfamiliar. We exposed similar activities at Twitter during our investigation of them last year. It's good to run them through each and just look at the followers. It will all be like, guns, God, America, over the American flag and, like, the cross. Well, Senator, our goal is certainly not to engage in political speech. In April 2018, Facebook CEO Mark Zuckerberg was grilled by the Senate over these very notions, but with no actual proof. Overall, I want to make sure that we provide people with the most voice possible. I want the widest possible expression and I don't want anyone at our company to make any decisions based on the political ideology of the content. What Zuckerberg says in front of the Senate is completely different than what is actually happening, according to the insider. Do you think that Mark Zuckerberg knows about this activity? Now, I do not know what Mark Zuckerberg does or does not know. However, what I can say is that at some point he does need to know what's going on with the company. Has Facebook ever admitted publicly that they've done this, to your knowledge? Not to my knowledge, no. Okay. Did you check pages of people who are not conservative to see what it said? Yes. At first, I had a couple of working theories. I was like, maybe this is an independent versus mainstream thing. Maybe independent figures on the left are experiencing the same kind of deboosting, but I didn't see that. I looked at the Young Turks page. I looked at Colin Kaepernick's page. None of them had received the same deboost comment on their account. Is this your name right there, Danny Ben David? Action deboost live distribution. 
We'd like to talk to you about why this code shows up on all of these pages. And I can give these to you. Any other comment? Here, James O'Keefe. I'm James O'Keefe. Yes. Okay. Well, not much luck in talking to Mr. Ben David. This is Seiji Yamamoto, a data science manager for Facebook since 2015. Yamamoto has had some significant duties. He has worked on informed sharing ranking demotion and the news feed reduction strategy. On Facebook's private Workplace, Yamamoto makes it clear that he is on a crusade against hate speech. However, his definition of hate speech is troubling. Quote, hate speech needs to be stopped. But there's quite a bit of content near the perimeter of hate speech that we need to address as well. Unquote. What does he mean by perimeter of hate speech? Things that aren't actually hate speech, but that might offend somebody. You know, anything that is perceived as hateful, even though no court would define it as hate speech. So there was another document, as I was kind of researching through the Workplace function of Facebook, which is the internal Facebook, if you will. I came across a document called Coordinated Trolling on Facebook. And it was this troll report where they wanted to address the problem of trolls on the platform and what they could do to combat it. Seiji Yamamoto and Eduardo Arino de la Rubia are the authors of that troll report. And so I was looking through this document. And it was clearly kind of aimed at the right-wing meme culture that's become extremely prevalent in the past few years. And some of the words that appeared on there were words like SJW. You know, no person on the left talks about SJWs. That's what people on the right say. The MSM, for the mainstream media. Once again, the New York Times doesn't talk about the MSM; the independent conservative outlets are using that language. 
In Yamamoto's troll presentation deck, he writes that they have a classifier that predicts if the user is a troll. The term troll refers to an internet troublemaker. It lists flagged words such as cuck, ree, normie, lulz, and even MSM, or mainstream media, among others, to identify trolls. It's clear that Facebook is targeting the language of the right and using it to actively suppress their content. The perimeter of hate speech, indeed. The Yamamoto glossary is clearly aimed at the right. They're shifting the goalposts. You know, it's one thing if you're dropping the N word or things like that, using some kind of homophobic or racial slur. By all means, you know, that's something that a platform should not want on it. But now you're moving it to things like jokes that conservatives tend to make. And like I said, it's based largely around this meme culture. So now you're saying what jokes you are and aren't allowed to make, what kind of edgy comment. As long as it's not blatantly hateful, I don't see where Facebook has the right to be the arbiter. How exactly do the engineers at Facebook define trolling and hate speech? In one slide from Yamamoto's deck, he writes, trolls are involved in many destructive behaviors on Facebook. Yamamoto then lists one example, writing, red-pilling normies to convert them to their worldview, with a link to a YouTube video by Lauren Chen called Why Social Justice is Cancer. In this video, I would like to talk about why social justice as an ideology is so very toxic. This is Facebook's definition of destructive behavior. This constitutes hate speech and trolling. Yamamoto goes on to describe other methods of combating hate speech. He writes, introducing friction via the troll twilight zone will confuse and demoralize them. On his next slide, he defines the troll twilight zone, saying it will enact, quote, drastically limited bandwidth, auto-logouts, and comments and posts will magically fail to upload, unquote. 
Just below this, Yamamoto writes that his troll twilight zone feature will be triggered by, quote, leading up to important elections. Do you think they're trying to influence elections? Absolutely, I do. I think, even if that's not their stated intent, that like many people, they believe that the 2016 election of Donald Trump was a fluke. It was something that shouldn't have happened. It happened because all these trolls were sharing, you know, anti-Hillary memes. And meme culture, in their opinion, is what won Donald Trump the presidency, and so it's in their interest as a very homogenous company to kind of shut this down. Another tactic that Yamamoto describes sounds like outright bullying. Quote, when a user does something egregious, warranting an account suspension or deletion, we should notify the friend network. Fear of being outed as a miscreant is what regulates behavior in real life, and we should reintroduce that to the online world. Unquote. It appears that Facebook is muscling users to peer pressure them into changing their political ideology. When Project Veritas showed tactic two from the troll report to another Facebook insider, the insider thought this was particularly egregious. Some people at Facebook might say that these trolling documents were just wishful thinking, that they were an intended plan that was never implemented. What's your response to people who say that? Well, it's very possible, and even likely, that not every measure has been implemented. But there were measures in that document that I did see implemented. Like what, like which ones? For instance, they talk about assigning a troll score to accounts. So Facebook has what's called the fake account index, where they assign a score which helps them to determine whether the account is a real person or just a dummy spam account. And rightfully so, they want to delete those accounts; that's okay. 
They created the troll score so they could help identify, using parameters such as words that they would post, pictures, and whether they were friends with other trolls, and then, using that, determine whether this person should be on the platform or not. So why is this troll score a bad thing? It's a bad thing because there's no accountability, especially when they're using machine learning to do this. Whenever an individual actions an account, there is a process where at least you can send a message to Facebook. Most of the time it falls on deaf ears. However, this is all being done without the user's knowledge. There's no recourse for them. And it's being done on public figures, like Mike Cernovich, Steven Crowder. We asked Yamamoto about the insider actually seeing a troll score under the fake account index in CRT, Facebook's back-end content review tool. You actually created this report, did you not? I don't actually have a comment on this. Yamamoto issued a no-comment comment and fled the scene. I think the biggest thing is that getting the documents, getting video or, you know, still pictures of what was going on, that shows that it actually is happening. This isn't rumors. You know, they talk about how right-wingers just come up with all these crazy theories and that's not actually happening at these social media companies. They poo-poo it. But here it is. And it's in your face. After leaving her employment at Facebook, the insider came to work for Project Veritas. So, two days after the Twitter story had launched, I came to work just like I normally would. And my manager says, we need to talk to you. And not thinking it was a big deal, I go and I set down my bag and I go to a conference room. And they told me that I was being terminated. Did they give you a reason? No. I was never given a reason. I asked them. I said, can I ask why I'm being terminated? And they told me, we don't know. Did you find anything about the timing of this suspicious? 
Oh, absolutely. What was suspicious about it? What was suspicious was that it was two days after Project Veritas released their Twitter videos. And why would that be an interesting connection? Because at that point, the company knew that Project Veritas wanted to investigate the issue of censorship in Big Tech. And how would they know that you had anything to do with this, if they did know? I think it's because you and I had had some contact. Okay. And at the time that your termination with Facebook occurred, were you a full-time employee with Project Veritas? No. I was not employed in any way, shape, or form with Project Veritas. My father asked me, well, was it worth it? Was it? Yes. And I told him there was not even a moment of hesitation, because I knew what I had seen. And the fact that it was so egregious that they wouldn't even tell me why I was being terminated just reconfirmed that this is something they were trying to keep in the shadows, that they did not want the public to know, and yet the public has a right to know. Now, Project Veritas has learned that as late as last week, on this Workplace back end of Facebook, there were executives and officials talking about the need to not reward divisive content. We're not sure what divisive means. But they were also saying that they need to promote publishers whose content is aligned with the mission of Facebook. But as long as there is a conflict in the values of Facebook, as it relates to the First Amendment and being a public town square, and as long as there are human beings in rooms who make determinations, there are always going to be the types of abuses that this insider talks about. Now, as it relates to this insider, she made an enormous sacrifice, and she said it was worth it. We need people like her. We need people to follow her example, to step up, be brave, do something, and wear a camera. 
It's enormously inspiring what she did, and I hope many other people follow in her footsteps. We need you to come forward. We need you to wear a camera. Reporting from Menlo Park, California, outside of Facebook headquarters, this is James O'Keefe with Project Veritas.