
Fifth Lecture – Surveillance, Privacy and Data


We talked about predictive policing and discriminatory algorithmic bias in such systems, and about how to deal with surveillance. In China a large surveillance system is already established. Many people are aware that smartphones do a lot of tracking but continue using them anyway, because they are convenient and deliver a lot of benefits. Using alternatives like Tor (for browsing on PCs) may make one seem suspicious and actually attract attention. Several popular sites also discourage the use of tools like Tor.
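To make the bias worry concrete, here is a toy simulation (not from the lecture; all numbers are invented) of how a predictive-policing system can manufacture its own evidence: incidents are only recorded where officers patrol, and patrols are sent where incidents were recorded.

```python
import random

# Toy feedback-loop simulation. Both districts have the SAME true
# incident rate; district 0 merely starts with one extra recorded
# incident. All numbers are invented for illustration.
random.seed(1)

TRUE_RATE = 0.3        # identical underlying rate per patrol visit
recorded = [1, 0]      # tiny initial imbalance in the data

for week in range(52):
    # "Predictive" allocation: the district with more recorded
    # incidents gets the extra patrols (70/30 split).
    hot = 0 if recorded[0] >= recorded[1] else 1
    patrols = [30, 30]
    patrols[hot] += 40
    # Incidents are only *recorded* where officers are present,
    # so extra patrols inflate the hot district's statistics.
    for d in (0, 1):
        recorded[d] += sum(random.random() < TRUE_RATE
                           for _ in range(patrols[d]))

print("recorded incidents after one year:", recorded)
# District 0's lead keeps growing although the true rates are equal:
# the system mistakes its own patrol placement for a crime pattern.
```

If the initial imbalance correlates with the ethnicity or income of a neighbourhood, this loop becomes exactly the discriminatory bias discussed above.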

In Germany and Austria politicians talk about deploying a ‘Bundestrojaner’ (a government trojan for device surveillance). One does not know whether surveillance is really happening or whether it only looks like it is. What are the actual benefits of surveillance for people? Studies show that its impact on crime detection is negligible. One does not know what the data will be used for in the future; people and governments change. What if the totalitarian governments of the past had had our current technology? Storing large amounts of data is also problematic and dangerous: How is the data secured? What happens when it is moved? As individuals we do not know where our data travels. There is also the possibility of data being misinterpreted: people have been stopped from traveling because of a ‘joke’ tweet (such cases have already been reported). What about a future in which surveillance extends further and further into our lives?

Surveillance also exists outside of governments, e.g. a loyalty card tracking your purchases and giving you bonuses in return. In these cases surveillance is usually done for commercial reasons, such as predicting what a person will buy.

The biggest problem with surveillance is that it is usually invisible. One does not know what processes are executed or what their results are. The public only notices incidents or big public debates. Is, for instance, Google’s data agreement visible and understandable enough for most people? Currently only very few people have actually read it. We asked in our group which tools for “self-protection” people use: Ghostery, AdBlock+, NoScript. A lot of privacy-enhancing tools are not very convenient, so many people think: “I know I should, but I usually don’t.”

Surveillance often categorizes people, objects and events. This categorization is also not visible, and users are usually labeled without knowing about it. One example is the automatic detection of pregnancy from previous purchases and, based on that, the creation of product recommendations. Another example is companies monitoring women and their bathroom routines in order to detect pregnancy early. These predictions can then be used to fire them before they officially notify the company and come under maternity protection.
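As an illustration of such invisible labeling, here is a minimal sketch of purchase-based pregnancy scoring, loosely inspired by the widely reported Target case. The item weights and the threshold below are invented; a real system would learn them from historical purchase data (e.g. with logistic regression).

```python
# Minimal sketch of purchase-based pregnancy scoring. Weights and
# threshold are invented for illustration; a real system would fit
# them to historical purchase data.
INDICATOR_WEIGHTS = {
    "unscented lotion": 0.9,
    "zinc supplement": 0.7,
    "magnesium supplement": 0.7,
    "large cotton balls": 0.5,
    "washcloths": 0.3,
}
THRESHOLD = 1.5

def pregnancy_score(purchase_history):
    """Sum the weights of indicator items found in a purchase history."""
    return sum(INDICATOR_WEIGHTS.get(item, 0.0) for item in purchase_history)

def flag_customer(purchase_history):
    return pregnancy_score(purchase_history) >= THRESHOLD

# Example: the customer is silently labeled without ever being asked.
basket = ["unscented lotion", "zinc supplement", "bread", "washcloths"]
print(pregnancy_score(basket), flag_customer(basket))  # 1.9 True
```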

People act differently if they know they are being watched (panopticism), e.g. under the persecution of homosexuality. Another example: a woman posts private photos of herself without her hijab on Instagram and gets into trouble because of it. In private this is acceptable, but the photos are actually entering the public realm. The distinction between public and private is not so easy to draw in the digital world.

In a neutral net everybody can host anything. Net neutrality is also important for a fair market: getting rid of neutrality would change the power structures on the web. It has already happened that ISPs not only wanted to ‘balance’ traffic but also made a certain service (e.g. Netflix) completely unusable unless one subscribed to a higher rate.

In the future more technology will be part of our lives (e.g. the Internet of Things), and in turn more surveillance will be possible. The responsibility of technology creators needs to become a greater focus of these discussions. At the same time complexity has risen so much that it is no longer really possible to completely understand all deployed systems. Nowadays caring about privacy is becoming a competitive advantage (e.g. Apple). Having a public discussion on privacy is of great importance.


Fourth Lecture – “Algorithmic Culture” and “Erasure of human judgement through rationalization and automation”

In this meeting we had one presentation on “Algorithmic Culture”, a short introduction to “culture”, a short summary of the readings, and an “Engaging Activity” on algorithms in our daily lives.

Culture is a term that is very difficult to grasp. It could perhaps be described as the acquired cognitive and symbolic aspects of human existence. Culture isn’t bounded, i.e. one cannot draw a “circle” around it (e.g. “Austrian Culture”: What is it actually? Does it stop at national borders? …). Many algorithms tend to fragment culture and publics through sorting, filtering and the like (e.g. geolocalised search results). Culture does not only influence people; people also influence culture. This is perhaps analogous to languages and how people speak them.

“The Lives of Bots” describes Wikipedia bots that take part in editing. In this system they are actors similar to human beings. An alternative stance would be to think of them as cultural artifacts, “anything created by humans which gives information about the culture of its creator and users” (https://en.wikipedia.org/wiki/Cultural_artifact). The article “The Cathedral of Computation” focused on the culture around algorithms. It centers on technological determinism, i.e. the belief that technology ultimately advances “society” and is able to solve important social problems. It also stated that people tend to view algorithms as god-given, unchangeable and objective, which is a problem since they are in fact changeable.
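These bot actors can be observed directly in the live system: the public MediaWiki API lets anyone list recent changes that carry the bot flag. A read-only sketch (the endpoint and parameters are standard MediaWiki API; no login is required):

```python
import requests

# Read-only sketch: list recent Wikipedia edits carrying the bot flag,
# via the standard MediaWiki API.
API = "https://en.wikipedia.org/w/api.php"
params = {
    "action": "query",
    "list": "recentchanges",
    "rcshow": "bot",                  # only changes marked as bot edits
    "rcprop": "user|title|comment",
    "rclimit": 10,
    "format": "json",
}
resp = requests.get(API, params=params,
                    headers={"User-Agent": "bot-culture-demo/0.1"},
                    timeout=10)
resp.raise_for_status()

for change in resp.json()["query"]["recentchanges"]:
    # Each entry is a bot acting as an editor alongside humans.
    print(change["user"], "-", change["title"], "-", change.get("comment", ""))
```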

Next we took part in an activity intended to give us a recent, conscious experience related to the topic, in order to fuel reflection and discussion. The presenter of “Algorithmic Culture” chose to use this method instead of the usual presentation with slides. We were told to search for a recent article in Austria on a specific algorithm influencing our daily lives. We then talked about our experiences and the results we got from the different search engines we used. The various platforms offered different parameters for the search. Participants using the same platform and search query got different results due to personalization algorithms. We talked about how ranking by “relevance” is very context dependent and that something like “relevance” cannot simply be calculated. It is also a very political concept, since these rankings form a hierarchy of importance which is acknowledged by users. A lot of personalization seems to be language and country dependent, which reproduces legal borders between states in the digital realm.
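A toy example of why the same query produced different rankings for different participants (all weights and signals below are invented; real engines combine many more signals): once a personal profile enters the score, two users see different orders for the same documents.

```python
# Toy personalized ranking: the same query, scored for two users.
# Weights and signals are invented to illustrate the principle only.
DOCS = {
    "doc_at": {"terms": {"algorithm", "austria"}, "base_rank": 0.5},
    "doc_us": {"terms": {"algorithm", "usa"},     "base_rank": 0.7},
}

def score(doc, query_terms, user_profile):
    d = DOCS[doc]
    overlap = len(query_terms & d["terms"])      # classic term matching
    personal = len(user_profile & d["terms"])    # personalization boost
    return d["base_rank"] + overlap + 0.8 * personal

def ranked(query_terms, user_profile):
    return sorted(DOCS, reverse=True,
                  key=lambda doc: score(doc, query_terms, user_profile))

query = {"algorithm"}
print(ranked(query, user_profile={"austria"}))  # ['doc_at', 'doc_us']
print(ranked(query, user_profile={"usa"}))      # ['doc_us', 'doc_at']
```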

The last hour of our session was again dedicated to discussion. We talked about a paper stating that self-tracking healthy-eating apps do not really change the culture of unhealthy eating structurally. The responsibility lies with the companies and marketing departments that sell unhealthy products without really emphasising the risks involved. These apps are predominantly used by wealthier people with smartphones, since one needs time and money to really use them. Many unhealthy foods are also cheaper than healthy alternatives.

The article “algorithmic self” raised the issue of people writing posts with the intent to influence not only how other people will see them, but also how the algorithm sees them. For example, some news publishers write titles for their articles in ways that make algorithms rank them higher, so that more people will click on them.

Later on we talked about social media and its influence on people. The medium used governs how people communicate with each other, and in turn social norms develop around structures built into the system, e.g. “can you friend me on Facebook” is the new “can you give me your phone number”. It is important to make data available for discussion and research, so that different groups can inspect the systems and see whether they adhere to ethical principles. It would also be important to teach algorithmic literacy in school.

Global Trending Topics on Twitter work as a way to unify different filter bubbles. Trends are still fragmented (by country or other criteria); one can choose which fragment one desires, but on Twitter not many people do so. Creating filter bubbles protects from the outside and creates closed spaces where fewer things have to be discussed and less criticism is possible. Algorithmic systems are also becoming increasingly important in political discourse, with Facebook being accused of manipulating its streams in the U.S. against Trump, and Hillary Clinton having various issues with her emails.