Unit 2 Reflection: Consent, Extraction & Consequence

Video Reflection:


Supporting Materials:


Transcript:

Hi everybody.

So this week, in thinking through questions around the ethics of digital privacy and surveillance, I'm really interested in three ideas: consent, extraction, and consequence. And I'm starting with Dr. Chris Gilliard's idea of privacy as a human right: being able to navigate the world and the community around us without being harassed or surveilled, and without having associations drawn about what we're doing based on our movements. But in many ways, I think that genie is out of the bottle, and the price of entry to a society that overwrites consent is the absence of permission and the removal of agency, especially in public spaces. And very often we position surveillance as convenience, and data collection and harvesting as cost effective: it saves us time and it saves us money.

But what I think we're really talking about when we talk about privacy as a human right is the human right to consent. When we talk about information privacy, communication privacy, and individual privacy, consent wraps around all of these things, in terms of what we do or don't agree to. And that is, in many ways, the ethical crux of the problem. Because in individualistic societies like the United States, privacy is for sale, and who actually owns an individual's data, likeness, engagement history, or even their network has a monetary value attached to it.

So I think a lot of this is about the right to be forgotten, or the right to be ignored, especially when you don't provide consent; but it's also the right to be remembered. This isn't just information that we've shared, as Dr. Gilliard says; it's information that's extracted from us as well. And it dovetails nicely with the idea that, in answering that question in different parts of our lives, we take on accountability for some part of how we make these decisions, and how we make these choices as citizens.

But I also think it's often the case that we willingly say yes to such harvesting because we value the exchange, at least on the terms that we understand it, even if we don't have the language for it. The right to be remembered, and to leave a digital presence for others, is something we're starting to see in the emerging field of grieftech, where the value exchange counters the feelings of grief for those left behind. We hold ourselves accountable for the legacy we choose to leave behind, so in many ways the value exchange of essentially downloading a life into a product is worth it.

Now, I looked at some proposed solutions coming out of groups like the IEEE, which are really trying to get at this idea of increased agency by providing every individual with a personal data agent, which they curate to represent their terms. But these proposals require mass consensus, and cross-national consensus as well, and are obviously incredibly challenging.
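As a supporting sketch of what that might look like in practice, here is a minimal, purely hypothetical Python example of a personal data agent checking a collection request against an individual's curated terms. The IEEE letter describes the concept at a policy level and does not specify any implementation; the class names and the default-deny design here are my own assumptions.

```python
# Hypothetical sketch of the "personal data agent" idea: an agent the
# individual curates to represent their consent terms, consulted before
# any collection or use of their data. (The IEEE letter describes the
# concept, not this design.)

from dataclasses import dataclass, field


@dataclass
class ConsentTerms:
    """Terms the individual curates: which purposes and data types they allow."""
    allowed_purposes: set = field(default_factory=set)  # e.g. {"navigation"}
    allowed_data: set = field(default_factory=set)      # e.g. {"location"}


class PersonalDataAgent:
    """Answers data-collection requests on the individual's behalf."""

    def __init__(self, owner: str, terms: ConsentTerms):
        self.owner = owner
        self.terms = terms

    def permits(self, purpose: str, data_type: str) -> bool:
        # Default-deny: anything not explicitly consented to is refused.
        return (purpose in self.terms.allowed_purposes
                and data_type in self.terms.allowed_data)


# Usage: a mapping service asks to collect location data.
agent = PersonalDataAgent(
    owner="alice",
    terms=ConsentTerms(allowed_purposes={"navigation"},
                       allowed_data={"location"}),
)
print(agent.permits("navigation", "location"))   # True: within the curated terms
print(agent.permits("advertising", "location"))  # False: no consent given
```

The interesting design question, and the reason such proposals need mass consensus, is that every collector would have to be obligated to ask the agent first; default-deny only protects people if the request step is universal.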

The last piece of this is the idea of surveillance as inevitable. In many ways, we're talking about containment versus avoidance here: we're more likely to gravitate towards damage mitigation than to actually accept the consequences of our choices and do something about them upstream in the lifecycle of the product.

And legal protections don't do a lot to prevent abuse of populations that we should agree ought to be protected, especially when those populations can't represent themselves. In many ways, technological convenience starts to bend reality, in the sense that, in the case of grieftech, it counters the emotional effects of death. So even where someone has given up their privacy in advance, and that consent is present, the person who has died is not the end user; they're actually the content creator in that exchange. And interactions with grieftech are still being surveilled, and billed monthly.

So really, where I ended up was with a lot of questions, particularly about the Vision Pro. In many ways it feels like we might be just about to invite an unprecedented level of data harvesting into our lives. Is privacy really the price of digital innovation? We can't really live like hermits, but is there an equitable, fair-trade kind of environment here that would prohibit unintended data collection? This is really what the IEEE is getting at, I think. And then, perhaps: when wouldn't privacy and the right to consent be human rights?


References:

Arntson, P. (1989). Improving Citizens' Health Competencies. Health Communication. Lawrence Erlbaum Associates, Inc.

Gilliard, C. (2022). The Rise of ‘Luxury Surveillance’. The Atlantic. Retrieved from https://www.theatlantic.com/technology/archive/2022/10/amazon-tracking-devices-surveillance-state/671772/.

Gilliard, C. (2023). Unit 2.5 Guest talk: Chris Gilliard on Digital Redlining and Luxury Surveillance (53:56). [Digital Audio File]. Retrieved from https://canvas.upenn.edu/courses/1693062/pages/unit-2-dot-5-guest-talk-chris-gilliard-on-digital-redlining-and-luxury-surveillance-53-56?module_item_id=26838807.

IEEE. (2020). Letter to The Federal Trade Commission re: FTC Data Portability Workshop. [Digital File]. Retrieved from https://downloads.regulations.gov/FTC-2020-0062-0022/attachment_1.pdf.

Krieger, M. (2023). Unit 2.1 Defining privacy in data as an ethical issue (13:30). [Digital Audio File]. Retrieved from https://canvas.upenn.edu/courses/1693062/pages/unit-2-dot-1-defining-privacy-in-data-as-an-ethical-issue-13-30?module_item_id=26380202.

