Since the rise of social networks, smartphones and online advertising, many companies have been monitoring us daily. Our behaviour, movements, social relations, interests, weaknesses and private moments are recorded, evaluated and analysed in real time. Most users try to protect their data with privacy settings. But does this offer any guarantee, given what happens behind the scenes at tech companies? An example: the iPhone X can recognize both faces and expressions with its TrueDepth camera. Apple now wants to give app developers limited access to that facial data. Would you, as a user, have given permission for this if you had anticipated this development? Maybe not. The point is that we use tools such as WhatsApp, Telegram and Facebook Messenger without thinking about who makes use of the data we generate.
More and more devices, such as smartwatches, also monitor us continuously: when we sleep, when we wake up, where we go, what we buy. And at home, voice-activated speakers like the Amazon Echo eavesdrop on private conversations. In recent years, Visa and MasterCard have begun to make information about their customers’ purchases available to the digital tracking and profiling universe.
This will only increase. “Facebook now invests in virtual reality, the next step: people immersing themselves completely in a world created by a tech company. Perhaps some people do not see the danger of such intimate entanglement with machines. But be aware: you are not only merging with those machines, but also with the companies that manage those machines.” (Franklin Foer, World Without Mind: The Existential Threat of Big Tech)
The British journalist John Lanchester calls Facebook “the largest surveillance-based company in the history of mankind.” A very creepy article about Facebook was recently written by Kashmir Hill of Gizmodo: “Behind the Facebook profile that you have built yourself is another profile, a shadow profile, assembled from the inboxes and smartphones of other Facebook users.” She sums up a series of examples of people who received the most unexpected and unwanted friend suggestions.
You are not only being watched by devices near your body or in your home. There is a huge increase in the number of sensors in public space. Cameras, motion detectors and sound meters are present in ever larger numbers in shopping streets. This is what Geonovum, an organization that makes geographical information accessible to governments, observes.
Weapons of Math Destruction
According to Cathy O’Neil, author of the bestseller ‘Weapons of Math Destruction’, whole groups are discriminated against by algorithms. In recent years, automated programs built on biased datasets have caused numerous scandals. In 2016, when a student searched Google Images for ‘unprofessional hairstyles for work’, the results showed mostly photos of black people. When the student changed the first search term to ‘professional’, Google mainly showed images of white people. But this was not the fault of biased Google programmers; rather, it reflected how people had labelled images on the internet.
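The mechanism O’Neil and the Google Images example describe can be illustrated with a minimal sketch. All data below is hypothetical, and the “model” is deliberately trivial: it just predicts the most frequent label it has seen for each feature. The point is that perfectly neutral code reproduces whatever bias is in its training labels.

```python
# Minimal sketch (hypothetical data): a neutrally written model trained on
# biased labels reproduces that bias. The bias lives in the labels, not in
# the algorithm.
from collections import Counter

# Toy "web labels": hairstyle images as (hypothetically) tagged by users.
training_data = [
    ("afro", "unprofessional"), ("afro", "unprofessional"),
    ("braids", "unprofessional"), ("braids", "unprofessional"),
    ("bob", "professional"), ("bob", "professional"),
    ("bun", "professional"), ("bun", "professional"),
]

def train(data):
    """Majority-vote classifier: predict the most frequent label per feature."""
    votes = {}
    for feature, label in data:
        votes.setdefault(feature, Counter())[label] += 1
    return {f: c.most_common(1)[0][0] for f, c in votes.items()}

model = train(training_data)
print(model["afro"])  # the model merely echoes the labels it was given
```

Nothing in `train` mentions race or professionalism; the skewed output comes entirely from the skewed tags, which is exactly why “fixing the programmers” would not have fixed the search results.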
Andrew Reece of Harvard University and Chris Danforth of the University of Vermont developed an algorithm that can use photos from Instagram to determine whether someone suffers from depression. You wouldn’t want something like that made public, would you?
The dark side
“Technology has crossed over to the dark side. It’s coming for you; it’s coming for us all, and we may not survive its advance.” A frightening phrase written by Farhad Manjoo in the New York Times. A recent example is what China wants to implement: the Chinese government plans to launch its Social Credit System in 2020. With this system, China wants to assess the trustworthiness of its 1.3 billion inhabitants.
Bas Boorsma of Cisco Northern Europe argues for a strategy. “We are dealing with two camps: technology evangelists and technology-doom thinkers. Their positions are miles apart, whereas we need a synthesis of these worlds of thought.” Boorsma argues for a ‘New Digital Deal’ in which ethics and digitization come together.
The most spectacular developments in AI come from a data-intensive technique known as machine learning. Machine learning requires a lot of data to create, test and “train” the AI; data and AI are inextricably linked. According to philosopher Nick Bostrom, author of the book ‘Superintelligence’: “We are about to develop a self-learning system that far exceeds human thinking. We only have one chance to do it right.”
According to Holger Hoos, professor of Machine Learning at Leiden University: “A human level of artificial intelligence, although intellectually fascinating, is not desirable. Instead, we need to focus on artificial intelligence complementing our capabilities and compensating for our weaknesses.” “Big Data can only quantify, not qualify, and all the information that comes out of it deserves a human judgment,” says journalist and researcher Timandra Harkness.
It is a fact that technology and technological developments cannot be stopped. “The trick is not to see technology as a big demon that opposes us and wants to push us away, but as something that shapes us into who we are,” says philosopher Peter-Paul Verbeek.
Arianna Huffington, CEO of Thrive Global, says: “We must embrace the technology to free us from the same technology so that we can reconnect with the people around us.”
As long as we, the consumers, the individuals, know what can happen to our data, we can also protect ourselves against it to a certain extent. For years, companies have claimed that they do nothing with our data, but is this really true? Maybe we should all switch to Signal (instead of WhatsApp), DuckDuckGo, WolframAlpha, etc.
Author: Erdinç Saçan, Lecturer, Fontys University of Applied Sciences