Connected Courses (Pre-req): “Is my data showing in this?”

The following is part of the ConnectedCourses.org program.


 

Although we often think of trust, privacy/anonymity and security in concert, perhaps even as synonymous, they are each discrete concepts. Even a cast-iron guarantee of one doesn’t mean we can rely unequivocally on any of the others.

In thinking about networks and network fluency for our connected course, we will assume that we cannot guarantee technological security (there is a whole, forever out-of-date course in that subject alone) and we will be transparent about this at the outset. We will instead seek to build our networks with the security and trust afforded by being consistent, and so to some degree predictable.

Networks and societies as a whole cannot function without this omnipresent low-level trust and security. I have to trust other drivers to abide by traffic laws when I take my children to school. I have to trust that the teachers at their school will teach and care for them during the day. This enables me to go to work and specialise as a photography teacher of still other people’s kids. I trust that my employer will in turn pay me for doing so, and if they don’t, then I have to trust that the law will force them to. This trust and predictability provides a degree of “security”. Without it I cannot drive as efficiently on the roads, my children must be home-schooled, and I cannot make time to specialise in teaching photography because I will be too busy farming behind secure walls. Lack of trust inhibits civic engagement.

 

 “Trust makes social life more predictable, it creates a sense of community and it makes it easier for people to work together.”

Barbara Misztal

 

At the DML conference in 2013, when I proudly presented on my open courses, Nishant Shah asked me at the end about his right, as a fictional 16-year-old, “to be forgotten”. What followed was the dawning realisation that in breaking this new ground with the forty students in my class and the thousands at large online, we had trustingly packed our data like messages into bottles and cast them out into the sea, ignorant of where that data would wash up in the future and of the consequences.

I reflected on my own 16-year-old self and how I am no longer held accountable for the questionable views I held in 1988, nor for my embarrassing actions. Apart from some school reports in my parents’ attic, that data just doesn’t exist. RFID chips did not record the times I went to my high school library, and universities didn’t analyse that data to assess my likelihood of completing their degree courses. No one was storing my correspondence or analysing its contents; I wrote letters that were sealed and postcards that said nothing. I didn’t have a mobile phone to triangulate my locations, movements and modes of travel, to record my conversations and map my relationships. My buying habits weren’t analysed by insurance companies, and no prospective employer was able to drill into the soap opera of my social media profile for the backstory on my resume.

In technologically connected societies we habitually share this seemingly inconsequential data by default. We share our own, we share that of others by association, and others share ours. Mobile devices, pacemakers, hearing aids, library books, store cards, Twitter lists, Facebook friends and photographs all store and haemorrhage data that can be harvested, algorithmically cross-referenced and interpreted. Data that is “of us”, by us, but not for us, which is bought and sold by data brokers (download a list of the companies from which you can reclaim your data and opt out), the sums of which are fed back to us via an obfuscated feedback loop.

A single forty-year-old female who stops buying birth control from her supermarket, and whose sister has a history of pre-eclampsia, probably won’t draw a direct connection between her store card, Facebook friends and a denial of her mortgage application due to the statistical potential of a fall in income. Just as her partner may not connect their regular route to work or Instagram-tagged images with the store offers they receive via Twitter or in the mail.

As each of these connections reveals itself, it breeds a new low-level mistrust and threatens the distributed digital intimacies that the web might have afforded. The sister no longer shares the trauma of pre-eclampsia with her Facebook family. Disaggregated friends can’t learn of her experiences and support her with theirs. Support groups can’t form for fear of the consequences of some blurry algorithmic association, and in that instant our privacy and trust flips from being an asset into a threat.

With no established norms or protocols for dealing with “our” big inconsequential data, we’re just going to have to make this up as we go along. But as instructors and mentors in and of these digital spaces, if we don’t develop strategies and processes with our learners for mitigating possible consequences, then who will?

jw

 

Movie:

Audio only:

Ben interview transcript

Make #1 (prior to session):

Watch/listen to the interview with “Ben” the security specialist, then choose one of the following people and create a Threat and Risk Assessment Matrix (TRAM) for them prior to their participation in your open and connected class. Treat each risk as a product of Consequence, Likelihood and Vulnerability, each rated from negligible to critical. The aim then becomes to mitigate the risks to an acceptable level (e.g. you can accept a risk that is very likely and that you are very vulnerable to, provided it is mitigated to minimal consequence; in contrast, high-consequence risks need their likelihood and vulnerability surface minimised). A small illustrative sketch of this scoring appears after the list below. This is a BIG ask, and a more appropriate “Make” for most of us will be to think this through and begin conversations with our students.

  • A non-dominant female participant living at home or abroad whose local culture (societal or familial) denies her access to an education.
  • A student in your onsite class who, now released from their strict home life, is partying hard and publicly.
  • An academic researcher.

Feel free to adapt those examples or make alternatives. HERE is a TRAM example made by Ben for CCourses; this is, however, a context-specific exercise, and everyone’s risks, likelihoods and vulnerabilities will be different.
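To make the scoring concrete, here is a minimal, hypothetical sketch (in Python) of how a risk could be treated as the product of Consequence, Likelihood and Vulnerability and then compared against an acceptability threshold. The ratings, the threshold and the example threat are illustrative assumptions, not Ben’s actual matrix or values; the real exercise is the conversation with your students, not the arithmetic.

```python
from dataclasses import dataclass


@dataclass
class Threat:
    """One row of a hypothetical TRAM: ratings run 1 (negligible) to 5 (critical)."""
    name: str
    consequence: int    # how bad would it be if it happened?
    likelihood: int     # how likely is it to happen?
    vulnerability: int  # how exposed is this person to it?

    def risk_score(self) -> int:
        # Risk is treated here as the product of the three ratings (max 125).
        return self.consequence * self.likelihood * self.vulnerability


def acceptable(threat: Threat, threshold: int = 12) -> bool:
    """A risk counts as 'acceptable' once mitigation brings its score under a
    threshold. The threshold is arbitrary; a real assessment sets it per context."""
    return threat.risk_score() <= threshold


# Hypothetical before/after for one of the personas above.
before = Threat("Real identity linked to public coursework",
                consequence=5, likelihood=4, vulnerability=4)
after = Threat("Same threat, mitigated (pseudonym, no images of self, closed group)",
               consequence=5, likelihood=2, vulnerability=1)

for t in (before, after):
    verdict = "acceptable" if acceptable(t) else "mitigate further"
    print(f"{t.name}: score {t.risk_score()} -> {verdict}")
```

Running the sketch prints a before/after score for the same threat, making visible how reducing likelihood and vulnerability can bring even a high-consequence risk down to an acceptable level.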

Bonus Task 🙂

Using the concepts described by Ben, locate where the interview took place.

Deeper-dive materials:

“Ars tests internet surveillance – by spying on an NPR reporter”
Malte Spitz “Your Phone Company is watching you”
Electronic Frontier Foundation
Information security CPJ
Who is harmed by a “Real Names” policy?
https://www.torproject.org/about/overview
Tor Beginner’s Guide (Guardian)
NSA and GCHQ agents ‘leak Tor bugs’, alleges developer (good 6min overview)
Police adopt facial recognition for CCTV
Facebook facial recognition
Bruce Schneier
Julia Angwin

Background reading (to engage a youth audience, although I set this as prep for my undergraduates):

“Little Brother” Cory Doctorow
“Homeland” Cory Doctorow

Resources / Tools:

Privacy Tools by Julia Angwin
Security in a Box
https://gpgtools.org/
https://www.gnupg.org/
Tactical Technology Collective
“How does the Internet Work?” flash-card download from tacticaltech.org
Threat Modelling and security planning by Jonathan Stray
TAILS
TrueCrypt
Terms of Service; Didn’t Read (a browser plug-in)

 

 

Bonus task Spoiler

 

