Flat Eye presents a possible future that may be neither desirable nor avoidable.

To create this world and make it credible, MONKEY MOON took inspiration from the present. There is no lack of sources: online articles, social media videos, scientific journals and more. On the team's private chat, one observation came up again and again: "when reality goes beyond fiction." Things that may seem a long way off, or even impossible, in the game may in fact have already happened, and some of the game's narrative threads were directly inspired by these true stories.
As archivist for the project, my mission toward the end of the development process was to gather all of these articles into a coherent bibliography. It provides a closer look at what inspired Flat Eye, of course, but also at our present: a time of such rapid, constant change that we don't even realize it's happening anymore.
The goal of this snapshot of the world is to place Flat Eye's major themes (artificial intelligence, the future of work, social change, etc.) in their context. The bibliography sorts articles into several different categories (with frequent overlaps) and provides a summary for each. If you're only after the links and references, you'll find it all at the bottom of the page.

September 2022. The archivist.

Artificial intelligence, or rather machine learning (because that's what we're really talking about: programs that digest huge amounts of data, such as text, images, audio and video, each qualified by metadata, in order to establish logical ties between them), is nothing new. What has changed over the course of Flat Eye's development is the democratization of its use: the tools have been made public and are increasingly being used in real-life conditions, but users don't always consider the innate limits of AI beforehand.
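The principle described above can be sketched in a few lines. This is purely illustrative (the sample data and labels are invented for the example): data items are paired with metadata labels, and the program counts co-occurrences to establish statistical ties between words and labels.

```python
from collections import Counter

# Toy "training data": each item is a piece of data (text) qualified
# by metadata (a label). Real systems use millions of such pairs.
training_data = [
    ("robot assembles cars in factory", "machines"),
    ("autonomous vehicle drives itself", "machines"),
    ("gardener plants flowers in spring", "nature"),
    ("river flows through the forest", "nature"),
]

# Establish "logical ties": count how often each word co-occurs
# with each label.
ties = {}
for text, label in training_data:
    for word in text.split():
        ties.setdefault(word, Counter())[label] += 1

def classify(text):
    """Return the label whose words co-occurred most with this text."""
    scores = Counter()
    for word in text.split():
        scores.update(ties.get(word, Counter()))
    return scores.most_common(1)[0][0] if scores else None

print(classify("vehicle in factory"))    # leans toward "machines"
print(classify("flowers by the river"))  # leans toward "nature"
```

The limits the paragraph mentions are already visible here: the program knows nothing outside its training data, and any bias in the labels is faithfully reproduced, a theme several of the entries below return to.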

An AI Epidemiologist Sent the First Warnings of the Wuhan Virus

Published on January 25 2020

Seen by Flat Eye team on January 28 2020

{Content in English}

BlueDot, an "AI epidemiologist" developed by a Canadian start-up, predicted the emergence of Covid-19 10 days before the World Health Organization began talking about the disease.


Salt the data mine.

Published on February 02 2020

Seen by Flat Eye team on February 02 2020

{Content in English}

An example of data sabotage: a grocery cart containing hundreds of smartphones was pushed around the streets of Berlin, making Google Maps believe there was a traffic jam. The application redirected traffic based on this information.


How India’s data labellers are powering the global AI race

Published on March 21 2019

Seen by Flat Eye team on February 10 2020

{Content in English}

This article explores data-labeling companies in India, where employees label image after image to train autonomous-vehicle AIs. This manual task, which underpins artificial intelligence as a whole, could itself be automated in time.


Une intelligence artificielle a conçu un antibiotique surpuissant : c'est une première [An artificial intelligence has designed an ultra-powerful antibiotic: a first]

Published on February 21 2020

Seen by Flat Eye team on February 22 2020

{Content in French}

An AI used in an MIT lab identified a powerful antibiotic, halicin (named in homage to HAL, the AI in 2001: A Space Odyssey). When tested on mice and on ex vivo human cells, it effectively killed many antibiotic-resistant pathogens, including the one responsible for tuberculosis.


Never Gonna Give You Up, but an AI attempts to continuously generate more of the song - YouTube

Published on April 30 2020

Seen by Flat Eye team on May 13 2020

{Content in English}

The Jukebox tool by OpenAI is used to extrapolate the song "Never Gonna Give You Up" by Rick Astley from the refrain alone. Results vary.


AI bias in action

Published on July 28 2020

Seen by Flat Eye team on July 29 2020

{Content in English}

A quick video demonstration of AI bias: a name is entered and identified by the algorithm as female. When the title "Dr." is added to the same name, the AI labels it as male.


Note from the archivist: The tool has since been shut down.

Trying a horrible experiment... Which will the Twitter algorithm pick: Mitch McConnell or Barack Obama? (thread)

Published on September 20 2020

Seen by Flat Eye team on September 20 2020

{Content in English}

Tweet by computer scientist Tony Arcieri, who ran a little experiment to explore the racial bias of Twitter's photo-preview algorithm (browser and app). He uses portrait-format pictures of Mitch McConnell (a white Republican US senator) and Barack Obama (the Black former US president) with a big white space between them, and swaps their positions to see the impact (which image is on top, which on the bottom). Regardless of positioning, Mitch McConnell always appears alone in the preview.


Note from the archivist: The Tweet created a lot of buzz and launched a debate about racial bias in algorithms. Twitter has since modified its algorithm. Now the white space in the middle is featured in the preview.

Algorithms of Oppression: How Search Engines Reinforce Racism

Published on February 20 2018

Seen by Flat Eye team on September 20 2020

{Content in English}

Book by Safiya Umoja Noble on the racial and gender biases of search engines and how they negatively impact Black women in particular. Because these tools occupy a dominant position, they spread and reinforce those biases.


Microsoft Files Patent to Create Chatbots That Imitate Dead People

Published on January 21 2021

Seen by Flat Eye team on January 22 2021

{Content in English}

Microsoft filed a patent for chatbots able to imitate dead people. All you have to do is feed the AI their voice recordings, images and social media interactions.


Note from the archivist: Toward the end of Flat Eye's development, Amazon announced a similar feature for devices with Alexa.

Spotify has a patent for personality tracking technology – and it’s pretty creepy stuff

Published on October 07 2020

Seen by Flat Eye team on March 11 2021

{Content in English}

Analysis of a patent filed by Spotify designed to determine a listener's personality traits from what they listen to and the context in which they listen to specific tracks. Spotify then aims to develop a listener profile from the traits observed.


Note from the archivist: From the Monkey Moon forum: "It would be 'fun' if the AI assigned a personality type to players at the end of the game."