The dark side of Google: content moderation issues. An exclusive interview with Daisy Soderberg-Rivkin
So I decided to start a new series of myth-breaking interviews, this time with Google employees themselves. Google Ukraine was not an option, since its main task is to profit as much as possible from ad services in Ukraine. For this reason, I started looking for ways to interview English-speaking Google employees. One of the first obstacles I faced was that current Google employees who could shed light on the company's work could not answer questions because of non-disclosure agreements. This prompted me to speak with people who are competent and can answer questions honestly. Only former Google employees, no longer bound by contractual ties, meet that criterion. Daisy Soderberg-Rivkin is one of the first specialists who was able to explain what, in her opinion, is wrong with the tech giants, and to describe the intricacies of working at Google, for which Digital World expresses its gratitude to Daisy.
Our conversation was about a crucial topic that ultimately affects almost every internet user: content moderation. I consider this topic highly relevant, and it will certainly be useful for people to learn how Google and its employees cope with this kind of work. Leaping ahead, it must be said that working in this part of Google is a demanding job and certainly not for the timid.
Daisy worked at Google for 2.5 years, between 2015 and 2017. There she was in charge of the French market and of content moderation related to child abuse and terrorism. She was responsible for processing content removal requests, which covered litigation, government requests, defamation, data protection, and other privacy issues. Daisy's team worked directly with the French data protection agency, the French regional legal team, French law enforcement, and government officials. She was also part of the counter-terrorism program, which included developing responses, tracking sources of information, and processing all incoming terrorism-related requests. Daisy Soderberg-Rivkin currently works as a policy, technology and innovation specialist at the R Street Institute. The conversation with Daisy took place in English; here are some parts of our interview.
Daisy, you worked at Google for more than two years. Would you call that period a special one? If yes, would you mind sharing what was special about it?
Google was a special time for me. I would describe it more as a significant time, because it taught me a lot about how content moderation works at big technology companies and about the things that need to change to make working conditions and the system better and more effective. It also gave me more passion for technology policy as a whole.
You were engaged in very important and challenging tasks, making decisions about content removal from Google. Which profession is closest to what you did there: a judge, a lawyer, or a prosecutor?
I don’t think it has an analog in any profession that I know of. People from many backgrounds do this work, mostly from the legal field (lawyers in their country of origin or in the US). I, for example, am not a lawyer; I come from a political science background. The job itself is not like anything else because technology companies created it. A lot of companies call us the first responders of the Internet. When something goes wrong on the Internet, we are the first ones to respond to it and to try to decide whether things stay up or come down.
There is a general opinion that people like you, engaged in the tasks you were doing, work far more than the average person: 24/7 mode, total dedication. Would you mind sharing what your day at Google used to look like and how demanding it was?
It was a very demanding job; the general opinion has that right. I think it is presented very much like a normal 09:00-to-17:00 job, but in reality it is not: we have different people working different shifts around the world, so because of the time differences people are effectively working on these issues 24 hours a day. Even someone like myself: I handled the French market and a lot of our child exploitation and terrorism content, so because of the time difference I would wake up very early, start working usually around 6 am, keep working throughout the day, and then work some more after coming home. There was, to a certain extent, no separation between home life and work life. Another reason for that is that if there is some kind of crisis, you get online and deal with it, because you are working from a computer. In another sense, your mind never escapes it, and you almost feel that you are working 24/7. Some people would develop nightmares, flashes of memory, or PTSD. You can’t erase those things even when you go home, even when you are on vacation, even when you are with your family. There is so much that reminds you of the content you are seeing; it is very difficult to escape. From a practical perspective, it is a very long and intense job. From a psychological perspective, the job itself is hard, and you have to keep yourself going and, to a certain extent, sane.
Do you think that working at Google gave you an idea or purpose beyond material income, an idea worth fighting for? What made you spend your time there?
When I first saw the job posting for content moderation, there was one sentence that really stuck out to me: “you will be working to protect free speech online”. For me that seemed a very important goal: to make sure that everyone has a voice no matter who they are. This is something very near and dear to my heart, and I figured it would be worth fighting for. I had to look at very difficult content, such as child abuse and terrorist content, things that would be hard for anyone to look at, but I kept coming to work because I was trying to make sure that people were treated fairly online. From that I also realized that there were flaws within the system as well.
Could you please elaborate on those flaws?
Sure. To begin with, content moderation is a very difficult and challenging way to govern the internet, because it puts the power to govern the internet in the hands of big private companies rather than, as we would prefer, in the hands of the people. At least in the US, these big private companies can create their own criteria and their own standards for content. Certainly they need to follow the law, but beyond that they can create their own policies. That becomes difficult when dealing with topics like terrorism, because terrorism means very different things in different places, which makes things even more challenging. Another thing is that there is no consistency across the companies: Facebook has very different policies on what you can say on its platform than Google, Twitter, or any other big company.
I would like to add two more points to this:
1) The issue of transparency. Very often people do not know why certain decisions are taken, and they are basically left out of the conversation. It is also very difficult to appeal a content moderation decision. Users have reported that it is very hard for them to reach the content moderators or the companies and ask them to review their case one more time.
2) The treatment of content moderators. Content moderators are split into two categories: a) contractors and b) full-time employees. Contractors are paid very little, and there has been a lot of news reporting on this issue; they don’t get the same benefits, they are put in very depressing conditions, and they are frequently placed in smaller countries or smaller areas, Manila being one example. Then there are full-time employees like myself, and the big problem is that even though we get the money and the benefits, looking at that content takes a hard psychological toll.
In your opinion, what are the flaws in Google's content policy rules, and what would it take to make urgent changes and improvements so that just decisions are made?
The simple answer is that it takes a lot. Many things have to be taken into consideration, and it depends on what country you are talking about; in the US, for example, there are a lot of legal barriers one has to get through to make some of these changes. Another thing is that companies have to think about political pressure, which is something that happens as well. Companies do operate under a great deal of political pressure; Facebook is a very good example of that in the US, where a lot of their actions have come as a result of political pressure. For a place like Google it is, of course, the same thing: it is a company, and its main priority is how much money it is making. That drives many of its policies. I think that right now we are seeing a great deal of pushback in the US, from the government and from users, for these companies to make very big changes. They are doing that by bringing attention to very important issues that are happening online: child exploitation, human trafficking, terrorism, and so on. Many things happening online have gotten out of control, and I think this is pushing companies in a somewhat better direction. There are also a lot of people working in organizations that are pushing them. Congress is starting to draft a lot of legislation related to the way the internet is governed, which is very important. All of the above relates to the US; it becomes a whole different story when we speak about other countries. At the other end of the spectrum we have countries like China and North Korea, which have a very different political system, a very different set of rules, and a very different treatment of human rights. I can speak about the US, but in a lot of other countries you would face a whole different set of challenges in making these changes.
What kind of knowledge, expertise, and skill set does it take to hold a position like yours at Google?
Most of the time I think they were looking for people who can work under very stressful conditions and make quick, efficient decisions. The reason is that sometimes there will be a lot of pressure on a decision: it might come from a government official, from law enforcement in certain countries, from federal authorities here in the US, or it could arise in a very intense context. A terrorist attack may have happened, and you have to take down images of dead bodies and gruesome imagery on the web as soon as possible. Those characteristics are very similar to the ones you see in law enforcement or in medical first responders: they have to be quick, think on their feet, look at very disturbing things, and be able to continue with their job. Now, in terms of the intellectual part of the job, you should already know internet policies, understand what laws apply to content moderation in the region you are covering, and understand the political situation of the country. On my team we had a lot of lawyers. I did not study law, but I have extensive experience working in politics, which usually means you are competent in the legal aspects of things as well.
While preparing to take the job, did you go through a specific training process, or did you perhaps prepare on your own? Maybe you can share some stories from that period?
I think I was prepared to see disturbing content, but I was not fully prepared for how long I would have to look at these things and for the toll it would take on my psyche. Google provided training for me, but the training was about how to handle cases, what kinds of questions users might ask, and what kinds of responses could or could not be given. Basically, I studied what the reporting timeline looked like: what part we played in the larger Google system in making sure that users felt safe, that we were complying with the laws of each country, and, while doing that, that Google looked OK from a public perspective. From my side, I was moving across the country from New York to California, so obviously my mind was on the moving process as well. I prepared by doing a lot of reading, not only on the laws I would be working with but also on how companies had responded so far and how well they were doing. I did see some criticism, but nothing like we see now; it was relatively mild, and I was just excited to be at Google. I was 23 years old, and it seemed to be the best decision of my life.
What about now? Do you think it was the right choice?
I don’t believe in regret; I think a lot of things happen for a reason. I use my experience as a content moderator to fuel my work, first in graduate school, where I worked on Internet policy, and now in my current role. I have the experience of seeing things that should be changed, and I knew that to push for those changes and for the ideas of making the system better, I had to work in public policy. I moved to the organization I work for now, the R Street Institute, which is a think tank. We provide lawmakers and the public with ideas on how we can make things better. I was lucky to have experienced it firsthand, to know exactly what it feels like and what needs to change.
To put it simply, what difficulties did you face in deciding which content should be removed and which should not?
There were a lot of difficulties associated with it. It is very easy to make decisions when you are thinking about whether something is allowed from a legal perspective or not. However, when you are dealing with the standards created by the companies, there are times when you inevitably disagree with the policy, when you think “this is not right, we need to remove this” and it is out of your control. You are part of the machine, very low down; you do not have that kind of power. Those were the most difficult decisions for me, when I would personally have made a different decision but had to follow the company's policies. Those moments were emotionally very difficult, because sometimes you are dealing with very distressing situations, like getting messages from mothers whose child was killed in a terrorist attack, or from women who were subjected to revenge porn. People who were humiliated, whose lives had been turned upside down… and you could not do anything about it. Those were the most difficult moments. At the end of the day you are dealing with humans, those are human problems, and they expect human responses.
Were there situations when management disagreed with the decisions you made or with your vision of things? Were there cases when you had to review your decisions?
In general there were such moments, though I personally never faced a situation like that. Basically, we had a chart in front of us at all times that guided us through everything. Certainly, as I have said, there are “gray areas”, and I think we encountered those moments most often with material that was not so straightforward, like terrorism content. There were moments when I was supposed to take something down and said we should launch an appeal instead. For instance, in France, law enforcement can send a report to take content down within 24 hours, and then you can launch an appeal. And there were moments when I launched the appeal because certain things did not match any definition of terrorism, and it would have been unfair to take them down. We would get into a meeting, talk about it, and make a final decision. For the most part, if we had good reasoning, our decisions were usually trusted; at the end of the day, we were the experts.
Did your colleagues and/or subordinates share your ideas and philosophy? Or did everybody have their own idea of why they were doing this work?
I think we all had a similar common goal: to make sure that users stayed safe online and that we were protecting free speech. These two things were pretty common among all of us. But if you turn to the contractors' side of things, I can almost guarantee that contractors might have similar goals initially, but the longer they stayed at Google, the harder it became to keep that initial motivation. In our case we had the ability to have a discussion, to attempt to make changes, and for this reason I think a lot of people had a common goal. Still, the longer you are there, the more that goal and that motivation weaken, because you are just exhausted; the work takes such a toll on the human brain. Another problem is the lack of control you have over the overall system and over what changes you can actually make.
After talking to Daisy, I realized that the “unbiased Google” is another stereotype in the collection of Google myths diligently broadcast to the public. For big tech giants like Google, notions such as “freedom of speech” and “human rights protection” have lost their original meaning; now they are more of a lure for young professionals. During Google's establishment and heyday there may have been a genuine fight for freedom of speech, but once it grew into a multibillion-dollar corporation with colossal influence on mankind, freedom of speech became an abstract category, fully subordinate to corporate policy. The more political and influential an entity becomes, the more restrictions it has to observe in modern society, while only nominally declaring freedom of speech to be one of its principles.
The fight for justice and freedom of speech at Google is turning into a struggle of employees within the company itself, while the company presents itself to the public as still compliant with the principles laid down at its foundation. We will often encounter this phenomenon: the principles and foundations laid down in such corporations remain in their charters only as a memory. Even within American democracy, freedom of speech is declared only nominally.
If one really wants to fight for freedom of speech and seek justice on the Internet, then, having heard Daisy's experience, one can conclude that it is achievable, but it requires a completely different professional path than being a content moderator at a big tech giant. Only small companies and independent business structures are able to fight for justice.
Digital World expresses its gratitude to Daisy for a sincere interview and for her willingness to continue the dialogue. Our acquaintance turned out to be extremely exciting. See you next time!