The 1964 Supreme Court case Jacobellis v. Ohio presented a highly subjective question to the justices: What is obscenity or pornography? How do you define it? Where do you draw the line? In response, Justice Potter Stewart gave us the iconic line, "I know it when I see it."
His ambiguous answer works fine for humans, who can make judgment calls on the fly, but the algorithms that rule our lives need much more concrete rules. Say you flag something as inappropriate on social media. How is artificial intelligence meant to answer a question that even the Supreme Court could not definitively pin down?
That’s where humans come in. Mary Gray, an anthropologist and co-author of the book “Ghost Work: How to Stop Silicon Valley From Building a New Global Underclass,” explores the work and lives of the real people behind online processes that internet users may assume are purely algorithmic. From analyzing medical tests to flagging questionable social media posts to identifying your rideshare driver, Gray argues that the human touch of “ghost work” is not only essential but here to stay, with its hidden workforce poised to keep growing.
Main Takeaways
- While AI can perform incredibly well with tasks that have clear parameters, such as a game of chess, humans are still better at making the tough calls and dealing with unpredictable situations. Gray shares the example of Uber wanting to verify its drivers’ identities with a current selfie matched against a photo on file. A machine trained in facial recognition can match faces fairly reliably, but it can’t compare to a human eye when it comes to added variables — a mask or a new beard, for instance. Humans, therefore, remain at the core of things like removing objectionable content from Facebook or interpreting special instructions in your GrubHub order.
- Gray suggests there are millions of people doing this “ghost work,” but we don’t actually have firm numbers. There is no tracking system for exactly how many people are in the field, how long they stay, or what their working conditions are like. A lack of visibility and regulation opens the door to worker exploitation, and this matters doubly because Gray expects the field only to keep growing. She describes it as this generation’s version of piecework, in which larger projects are broken into tasks that can be distributed online as contract work, and she suggests it may be the future of all employment.
- Computers haven’t caught up with everything that humans want them to do, and Gray doesn’t think they ever will. Everyday tasks like Google searches, social media browsing, and online ordering generate ever more data to be analyzed, and AI is essentially useless without humans to process the meaning behind the raw information. For instance, if you search for something on Google and the results don’t make sense, it often takes a human mind to get to the bottom of the disconnect. The algorithm can learn from the glitch and fine-tune future results, but it takes the human touch to guide it.
More Reading
- The internet can be a dark place sometimes, and real people are responsible for reviewing and removing posts that deal with extremely sensitive subject matter. The Verge has a deep look at what it’s like to keep parts of the internet clean — but the piece is not for the faint of heart, so read on with caution.
- You may think algorithms are impartial since they work purely on objective data, but according to this Hidden Brain episode, the people making them run the risk of programming their own biases right into the code.
- We’ve been promised for years now that self-driving cars are just around the corner, but it’s much more complicated than just getting the technology in order. Check out this Forbes piece on how engineers are tackling the infamous “trolley problem.”