Lies, Damned Lies and Big Data
Tens or hundreds of years from now, archaeologists will not need to dig through the soil to learn how we lived. The modern artefact will no longer take the form of a footprint, a skull, or a weapon, but of data. Our children and grandchildren will spend around half of their lives in a modern, man-made soil called the internet, where they will step and leave the trails of their existence in the modern world. Our generation already leaves more than three billion trails on the internet every day, and these trails are not waiting centuries to be analysed by the modern archaeologist. Tech giants like Google and Facebook have lined up to dig through these billions of trails and learn about human behaviour, and this is where the term "big data" was born.
The argument is that social science has historically been "theory rich" and "data poor", and that we can now apply the methods of "real science" to social science, producing new validated and predictive theories to improve the world. Technically, however, big data infrastructure is only possible with great resources: people, technology, and funding. It is no surprise, then, that big data gained its traction in the hands of Google, Facebook, and Amazon, which already own their internet ecosystems. These conditions push people to hand over their personal details, actions, and habits voluntarily to these giants, on the mere belief that doing so will make the world a better place to live.
George Orwell's Nineteen Eighty-Four (1949) and Isaac Asimov's I, Robot (1950) are among the older works of literature that questioned the empathy of artificial judgement, a role big data now plays in human life. They raise further questions: Who owns these data? What algorithms are being used, and why do we believe they serve our best interests? What lies behind these algorithms? How far do these artificial judgements and predictions go before we question our humanity? How do we guard our "free will" against big data's advice and suggestions? These questions are the reason this essay positions itself as the yin, the antagonist, to this trend: to balance the optimism and to raise human conscience about the newborn, still-immature technology of big data.
Dadaism began in the early twentieth century as a new art movement concerned not only with aesthetics but also with social and political ideas, a reaction against the bourgeois class blamed for the First World War. The war, as we know, brought misery not only to the bourgeoisie but to people in general, and to the proletariat above all. Transposed onto today's big data scene, the giants play the role of the bourgeoisie while, once again, people have no control over their choices. Our only choice is to trust that a corporation like Google will keep its "don't be evil" motto and will not turn its back on us. So rather than treating big dada as a new movement in how we see data, big dada would be a by-product of the big data bourgeoisie's misuse.
My country, Indonesia, has one of the largest social media user bases in the world. Last year, a total of 90 million Indonesians were connected to the internet, and 63 million were using Facebook on mobile in 2015. With numbers this large, it is no wonder that during the last Indonesian presidential election the internet became a battlefield that tore Indonesians apart. The internet, and social media especially, served as one of the biggest campaign tools, and many of us did not realise that our "digital" behaviour plays an instrumental role in political campaign planning, one that can indirectly swing our votes. This is just one real case of how big data can shape our life as a country five years ahead, and awareness of what big data makes possible is what we need to build. Hence my research question: how do we make people aware of their vulnerability to the inception of big data into their lives?
The idea of my project is a series of interactive installations called "Your Communal Face". It involves two experiences, each of which asks us to question the ethics and social issues of big data. The first installation would record, or manually receive, input from the user, and a big data system would give back a calculated summary based on a dataset of such records. This would show how generalisations are made and invite the user to ask what this artificial judgement thinks of them. The second installation would question the user's conscience and humanity even when the data are legitimate, and how we guard our "free will" today, by recording the user's choices and actions when the big data black box produces miscalculated suggestions on certain issues. This installation would illustrate the persuasive power of numbers, particularly the use of big data to bolster weak arguments.
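To make the first installation's generalisation step concrete, here is a minimal sketch of how such a system could reduce a visitor to a "communal face". Everything here is an invented assumption for illustration only — the toy dataset, the field names, and the nearest-group logic are not part of the actual installation design:

```python
# Hypothetical sketch: a visitor's profile is matched against a small
# dataset of past records, and the system answers not with anything
# about the visitor personally, but with the *average* profile of the
# closest group -- the generalisation the installation wants to expose.
from statistics import mean

# Invented toy dataset of past visitors
DATASET = [
    {"age": 24, "hours_online": 6.0, "interest": "politics"},
    {"age": 27, "hours_online": 7.5, "interest": "politics"},
    {"age": 45, "hours_online": 2.0, "interest": "gardening"},
    {"age": 51, "hours_online": 1.5, "interest": "gardening"},
]

def communal_face(visitor_age, visitor_hours):
    """Return the averaged profile of the visitor's nearest group."""
    # Group past records by declared interest
    groups = {}
    for record in DATASET:
        groups.setdefault(record["interest"], []).append(record)

    # Distance between the visitor and a group's average profile
    def distance(records):
        return (abs(mean(r["age"] for r in records) - visitor_age)
                + abs(mean(r["hours_online"] for r in records) - visitor_hours))

    label, records = min(groups.items(), key=lambda kv: distance(kv[1]))

    # The "summary" handed back is the group average, not the individual
    return {
        "assigned_group": label,
        "average_age": mean(r["age"] for r in records),
        "average_hours_online": mean(r["hours_online"] for r in records),
    }

print(communal_face(25, 6.5))
```

A 25-year-old who spends six and a half hours online would be told, in effect, "you are a 25.5-year-old politics enthusiast" — the artificial judgement speaking about the group while pretending to speak about the person.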
"Your Communal Face" is not final yet, and it will need many collaborations to understand its impact deeply, or even to expand the idea further. I would love to study and collaborate with people who understand human-technology behaviour, along with artists, to fully understand human expression, and with engineers, to learn the real technicalities, opportunities, problems, and challenges of big data.