This is a snippet from 2019.transmediale.de/content/affects-ex-machina-unboxing-social-data-algorithms
## The script I didn't use, trimmed to 5000 chars ##
Today I want to talk about a project named ALEX, an acronym for Algorithm Exposed, and about one of its first outputs: fbtrex (facebook.tracking.exposed), a tool for the scientific analysis of social network personalisation algorithms. It works by collecting what Facebook sends to you as your timeline. Because the timeline is personalised, it can be captured as evidence and used to understand the algorithm's logic.
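To make the collection step concrete, here is a minimal sketch of what "capturing the timeline as evidence" could look like. The post fields and the function name are my own assumptions for illustration; the real extension scrapes posts from the page DOM, which is omitted here.

```javascript
// Hypothetical sketch of the collection step: given the posts a user's
// timeline displayed (simulated here as plain objects), build an evidence
// record that preserves what was shown and in which order the algorithm
// chose to show it. All field names are illustrative assumptions.
function buildEvidence(userPseudonym, posts) {
  return {
    user: userPseudonym, // a pseudonym, not the real identity
    observedAt: new Date().toISOString(),
    impressions: posts.map((post, index) => ({
      order: index + 1, // the position the algorithm assigned
      source: post.source, // the page or friend that authored the post
      text: post.text,
    })),
  };
}

// Two refreshes of the same timeline can yield different records,
// which is exactly what makes each snapshot worth preserving.
const snapshot = buildEvidence("user-42", [
  { source: "Example News", text: "Headline A" },
  { source: "A Friend", text: "Holiday photos" },
]);
console.log(snapshot.impressions.length); // 2
```

The point of the record is not the posts themselves but the *ordering*: that is the algorithm's visible decision.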
First, let me tell you how the idea started and the road so far. The project began almost two years ago as a way to understand the algorithm Facebook uses to organize users' timelines. My vision was to increase transparency behind personalization algorithms, so that we can start having more effective control over our online experience and more awareness of the information to which we are exposed. That's why the name: Algorithm "Exposed" (ALEX).
We believe this mission to be highly interdisciplinary in nature, and it is our primary intention to engage with artists, social scientists, and people outside of the technical world to capture all the ways algorithms have an impact on everyday life.
Think about a familiar medium that still transmits information today: newspapers. They all base their reporting on events, but they differ in style, focus, and approach. We are all familiar with how they report the news differently. We learn to recognize their diversity, and we decide where we want to stand on a spectrum of options. What they do is craft and organize information, pretty much as Facebook does. However, while we have many newspapers around, and we can "change" how information is organized by reading another newspaper, we cannot do the same with the "newspaper Facebook": its information can only be read one way, the Facebook way.
We know the alleged solution is to migrate to a better platform (any Facebook PR person would say "nobody forces you to stay on Facebook"), but the network effect ensures this is not in fact a viable option for many people on the network. Not only that: for every expert who has the tools to do fact-checking, get accurate information, and experiment with different networks, there are hundreds for whom Facebook is synonymous with the internet. As we have seen over at least the last couple of years, the opaqueness of Facebook's algorithms has been exploited in many ways, and perhaps the manipulation of political events is the most frightening. Sure, political manipulation is nothing new, but political manipulation that exploits users' political inclinations is different, because it is personal and automated; the only way to even notice it is to share your phone with someone else, which is not really viable. This is the technical challenge we face: enabling citizens to keep a partial copy of what Facebook selects for them. It is a fundamental step toward engaging with peers, verifying how differently you perceive a complex issue, and understanding how your personalised newspaper differs from that of your friends, colleagues, and partners.
fbtrex offers a new potential reality of how users and algorithms can coexist: it gives users agency and ownership over the algorithm, and lets them experiment with all the possible alternatives for social interaction, information, and news that an algorithm can display. We want to help users access non-algorithmically curated Facebook data and experiment with different algorithms, to compose the Facebook newspaper that best serves them.
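What "experimenting with different algorithms" could mean, in the simplest terms, is re-ranking the same set of posts by criteria the user chooses. This is my own illustrative sketch, not fbtrex code; the post fields (`publishedAt`, `likes`) are assumptions.

```javascript
// Hypothetical illustration of user-chosen ranking: the same posts,
// reordered by different, transparent criteria instead of an opaque one.
const posts = [
  { id: "a", publishedAt: "2019-01-30T10:00:00Z", likes: 5 },
  { id: "b", publishedAt: "2019-01-31T08:00:00Z", likes: 120 },
  { id: "c", publishedAt: "2019-01-29T22:00:00Z", likes: 40 },
];

const rankers = {
  // Plain reverse chronology: newest first, no personalisation at all.
  chronological: (a, b) => b.publishedAt.localeCompare(a.publishedAt),
  // Popularity: most-liked first, a deliberately different set of values.
  popularity: (a, b) => b.likes - a.likes,
};

function rank(posts, rankerName) {
  // Copy before sorting so the original order (the evidence) is untouched.
  return [...posts].sort(rankers[rankerName]).map((p) => p.id);
}

console.log(rank(posts, "chronological")); // ["b", "a", "c"]
console.log(rank(posts, "popularity")); // ["b", "c", "a"]
```

Each ranker is a small, readable statement of values; that legibility is the whole point of the contrast with a closed algorithm.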
Maybe, parallel to the implications the General Public License had for the free software world, we can think of an algorithm's public license with a similar clause. Let me put it this way: think of an algorithm as a commodity you can plug in and change according to your own preferences and design. Imagine you can change it, remix it, share it with someone else. An algorithm implements a set of values that defines priorities and urgencies. You, or people you decide to trust, should have the ability to control these values, not the opaque logic of a corporation that promises to make the world more connected.
The team and I asked people to "donate their digital body" to science. Users can still download a plugin and become part of the first community supporting research on the impact and dynamics of Automated Decision Making in our lives and society. Every Facebook access creates a different timeline, a window into the organizational choices of the algorithm. But once the user refreshes the page, that chunk of information is lost. We can capture that instant and store it in publicly owned datasets; we collect forensically valid evidence that can be used for policy-making, science, and educating people about the influence of algorithms on society.
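Once snapshots from different users sit in a shared dataset, a researcher can ask a simple question: how much of the timeline did two people actually share? A rough sketch, with the snapshot shape (arrays of post ids) and the overlap measure chosen purely for illustration:

```javascript
// Hypothetical comparison of two users' timeline snapshots: how many
// posts were shown to both, and how similar were the two timelines
// overall (Jaccard index: shared posts / all distinct posts)?
function timelineOverlap(snapshotA, snapshotB) {
  const setB = new Set(snapshotB);
  const shared = snapshotA.filter((id) => setB.has(id));
  const union = new Set([...snapshotA, ...snapshotB]);
  return { shared, jaccard: shared.length / union.size };
}

// Two users following similar pages may still see largely different feeds.
const alice = ["p1", "p2", "p3", "p4"];
const bob = ["p3", "p4", "p5", "p6"];
const result = timelineOverlap(alice, bob);
console.log(result.shared); // ["p3", "p4"]
```

Low overlap between otherwise similar users is one measurable trace of personalisation at work.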