A Response to The Future of Privacy in Social Media


I was totally fascinated by Danah Boyd’s “The Future of Privacy and Social Media.” I like how she framed the idea of privacy around how people act as teenagers versus how they may act as adults, and what each is willing to share with social networks. What’s even more interesting, and something Danah did not discuss, is what parents share about their children on social media that those children will feel violated by once they are old enough to understand what was posted. A child born in the mid-to-late 2000s (or later) may find it harder to get a date in the future if a prospective date can dig up all the embarrassing things their parents posted about them. Even more frustrating, children do not own these accounts, so they have no control over how their image or stories about them are shared. In ten to twenty years, it will be interesting to see how the children of the early 21st century deal with this issue of privacy that was out of their control from early in their lives.

Mommy issues...

Danah also described some of the ways teenagers manage privacy on their own terms. Most of the techniques involve practices to avoid being seen by authority figures or family members. It seemed like every privacy violation Danah described involved posts being viewed and commented on by a teen’s mother. Teens may think this is an unfortunate discomfort reserved for their teenage years, but rest assured it’s an issue that lasts long into adulthood. Although I choose not to share most personal posts on social networks, my sister shares every idiotic thing she does. There were years of Facebook wars between my mother and my sister that lasted well into my sister’s 30s. Maybe my sister needs to grow up? Maybe my mother needs to worry less and mind her own business? At any rate, things are usually easier when they decide to unfriend each other and share only more appropriate things in person. I’m 99.99% sure neither my mother nor my sister will read this blog, so let’s hope I don’t get in trouble!

Cryptic faders

The most intriguing anecdote Danah shared was about a teen who deployed several tactics to make herself visible to the public only when she wanted to be seen. This teen used cryptic text that made sense only within the culture of her peer groups. She also deactivated her Facebook account every night and reactivated it the next day to make posts, which essentially made her visible only to the people she chose, and only when she chose to be seen. I thought this was ingenious and sneaky, but who doesn’t sneak around as a teenager? Danah tipped her hat to these teens by noting that cryptic text and “fading” in and out of an active account are practices of the oppressed. Are teens oppressed?

Corporate exploitation

The final thing I’d like to point out about privacy from Danah’s talk is the idea of consent. Consent is usually understood as a mutual agreement to do something. How many times do you install software or software updates and actually read the entire “terms of use”? I just installed two pieces of software last night for another class I am taking as a MOOC. I did not have time to read all 58 pages of the “terms of use.” I just clicked the checkbox so I could install the software and complete my coursework. My thought was, “No, I don’t have time to read this document and watch the five videos required for this course module.” The next time you install an iTunes update and are required to click the checkbox agreeing to the “terms of use,” think about the millions or billions of people who did not read that agreement. Is it ethically right for companies and corporations to require users to agree to terms that are effectively inaccessible because of their length? Since users did not actually consent to the terms, but simply clicked a checkbox, are they really obligated to use the software as the terms describe? Do we expect this to become a bigger privacy and “consent” issue in the future? I think this is the broadest form of corporate exploitation society has ever known: we don’t actually consent, but because companies hold the power over the technology we need for work, life, and school, we allow them to use our data and intellectual property freely.
