Movie Review: Coded Bias
A new documentary about the intersection of technology and privacy has hit Netflix. “Coded Bias” was released on Netflix on April 5, 2021 and immediately attracted buzz – it's currently high on the Top 10 (I think #4?) in the US as I write this. Even my partner noticed it and alerted me to check it out. Ironic that Netflix pushed a movie about the dangers of algorithms so hard, but here we are. So how does it stack up? Is it worth watching? Does it tackle the issues well? Is it a good resource for your non-privacy/non-techy friends and family? Here are my thoughts.
About the Director & the Film
The film was directed by Shalini Kantayya, an environmental activist “whose films explore human rights at the intersection of water, food, and renewable energy.” She holds a master’s degree in Film Direction and has received recognition from the Sundance Documentary Program, IFP Spotlight on Documentary, New York Women in Film and Television, the John D. and Catherine T. MacArthur Foundation, and the Jerome Hill Centennial. She is a Sundance Fellow, a TED Fellow, a finalist for the ABC | DGA Directing Fellowship, and a Fulbright Scholar.
Coded Bias primarily follows Joy Buolamwini, a Ghanaian-American computer scientist and Ph.D. candidate at the MIT Media Lab. According to the opening minutes of the film, Buolamwini first discovered racial bias in facial recognition algorithms when she attempted to build a proof-of-concept art project that relied on the technology. The camera almost never detected her face, no matter the lighting conditions – until she put on a plain white mask. This prompted her to dig deeper. The movie follows Buolamwini’s journey, featuring interviews with experts in the field and footage of real-life events along the way.
The Good
I think perhaps the coolest thing to me is the real-life, on-the-ground footage of certain events. For example, at several points in the movie, the filmmakers are in London alongside a civil rights group called Big Brother Watch. The group stands outside an area where the police are using facial recognition cameras – clearly marked with signs – and tries to hand out flyers and inform people of the flaws and risks of facial recognition. At one point, the crew gets firsthand footage of a man who pulled his shirt up over his face when he saw the signs; the police follow him and force him to identify himself. Later, a black teenager is pulled aside and ID’d because the cameras falsely matched him against a face database. Seeing these situations happen firsthand – not through re-enactments or interviews – really got my blood boiling. And that’s good. Humans are emotional creatures.

The 1976 film Network is about the media, its sensationalism, and its exploitative relationship with viewers. At the climax of the film, the star makes a legendary speech, at one point declaring: “I don't know what to do about the depression and the inflation and the Russians and the crime in the street. All I know is that first, you've got to get mad. I want you to get up right now and go to the window, open it, and stick your head out and yell, ‘I'm as mad as hell, and I'm not going to take this anymore!’” Personally, I think this is where we are as a society. First, we’ve got to get mad. We’ve got to touch on that human emotion that spurs people into action, where we say “enough is enough,” and I was personally blown away by the film’s ability to do that – to show firsthand, real-world, actual situations where algorithms have gone wrong. Sure, there’s plenty of “think of the bad things that might happen,” but none of that is as powerful as watching a slightly traumatized 14-year-old black kid get stopped by three plainclothes police officers who then come back and try to stop the representative from Big Brother Watch from giving the kid a flyer and explaining what the hell just happened. I’m getting mad just remembering it. Let’s move on.
Relating to that previous point, I think the film does a great job of presenting a variety of stories – real stories, not just hypotheticals. They show the two incidents in London I mentioned. They go to an apartment building in Brooklyn that tried to use facial recognition in lieu of keys and to maintain order among residents. They even visit China and ask one girl’s opinion on the Chinese use of social credit and the daily ubiquity of facial recognition. Surprisingly, she presents some very positive aspects – I admire a film that can present both sides of the argument. The film then moves to the protests in Hong Kong and shows the dark side of what China has used this technology for. The film is obviously overwhelmingly in favor of reining in algorithms and putting some regulation on them, but I still appreciate that they took even a few minutes to show the other side of the argument rather than just painting a biased “doom and gloom” picture the entire time.
The film also makes a point of continuously reminding viewers that algorithms aren’t just used by police and advertisers – they’re used everywhere. They’re used to determine your credit limit, your mortgage, your insurance rates, your employment, whether or not your resume gets seen by a person, and more. I’m glad they drove that point home. A lot of people think of privacy in terms of “well, I’ve got nothing to hide,” but the continual reminder of how much algorithms have permeated our culture shows viewers that this does affect you, even if you’re not an activist or a government employee and you live in a good neighborhood.
The Bad
The film is obviously – and ironically – biased. Of course, every documentary is. If you’ve never realized that every documentary you’ve ever seen was made with an agenda to make you think a certain way, consider this your wake-up call. Every documentary has a spin. Even Planet Earth’s goal is to make you realize how cool nature is and to make you appreciate and want to protect it. I think if the film really wanted a more balanced approach, they could’ve spent a little more time explaining the good sides of algorithms. That’s not to say I think algorithms are good – the film very clearly and plainly lays out why they’re problematic with both rhetorical and empirical evidence – but they could’ve done a slightly better job of presenting a less biased story.
I think my biggest complaint is the pacing. The clips in London that made my blood boil are few and far between. Much of the movie is spent watching Buolamwini stare at a MacBook screen while talking about how she slowly began to realize the amount of control that algorithms have over us, even in our daily lives and even here in the “land of the free.” There’s a lot of distracting jumping around between camera angles during the interviews, as if attempting to make the film feel more exciting and energetic. All it did for me was make me motion sick. (Not literally, but it was a bit disorienting.) The first 15 minutes of the film are also painfully slow; it’s not until the film gets to London – and the man who hid from the camera with his shirt – that things start to become engaging.
Final Verdict
Despite the pacing issues, I wholeheartedly recommend this film. Force yourself to watch the whole thing, even if you find it boring. The topics covered are incredibly relevant and – as mentioned – permeate every part of our daily lives. Nobody is unaffected by this issue, and it’s only in the last couple of years that major attention has come to the problems with algorithms – from facial recognition to resume-screening software – and this documentary barely scratches the surface. This technology is being used to score future criminals, rate students, determine college admissions, and more. I sometimes catch heat in the privacy community because I’m not 100% against certain technologies, and this technology is a perfect example. It has its uses – I don’t think all possible applications of it are good, but some can be – but it also has a long way to go before even those few good applications are ready. This stuff has some serious bugs that need to be worked out, and until we as a collective society can shine a light on them and have those discussions, we’ll never even get that far. This is a conversation that we as a society desperately need to have. For those who are unfamiliar with this subject, I think this documentary is an excellent starting point.
More on the Movie
You can visit Coded Bias’s official website here. It is currently viewable on Netflix.
Tech changes fast, so be sure to check TheNewOil.org for the latest recommendations on tools, services, settings, and more. You can find our other content across the web here or support our work in a variety of ways here. You can also leave a comment on this post here.