Unit 1 Reflection: When Technological Experiences Transcend our Understanding of the World

Video Reflection:


Supporting Materials:


Transcript:

Hey everybody. So for my unit one reflection, what I'm really circling is this idea of when technological experiences transcend our understanding of the world. And that really starts with this idea of framing and categorical perception. I think Bateson's ideas of psychological framing, and Goffman's extension of those into notions of frames of reference, strongly align with the ethics of artificial intelligence in surfacing all-too-human questions. In particular, I'm interested in this notion of what happens when we begin to break that frame, or when the frame becomes less clear.

Goffman in particular talks about issues of governance and how we organize these things. But Romero expands even further on that in terms of how we categorize them and how we shape language around this dissonance. What I've put here in the cyan is really how this work in my thinking connects to my portfolio research project. What I'm interested in is this idea of how we wrestle with ethical questions when we often lack the capacity and language to categorize within those frames of reference, and especially, like I said, when we break the frames of reference. For example, one of the strong frames of reference we have in life is the difference between life and death. Our physical reality ends with death, but our digital reality really doesn't. When we think about this from a framing perspective, that frame is often out of focus or broken as we grasp at definition and at our relation to it. So I think there's a perceptual dissonance, building on Romero's, Goffman's and Bateson's ideas, which motivates ethical fears about the future, but also discomfort with this idea of digital memory and preservation, specifically beyond death.

Further to this, you've got the complication of attention and agency. One of the things I'm thinking about here is that there's obviously a strong commodification of attention. But whereas we might think of attention as a commodifiable resource, it's also one which is somewhat infinitely renewable. It's harvested, I think, at the expense of agency through things like attention engineering and dark patterns. But relational proximity is a powerful motivator of attention, especially the preservation of memory around relation; I'm thinking about the exponential increase in photo sharing through the rise of social platforms. So I think attention and habituation are framed as tactics of monetization, but in many ways they actively seek to remove agency through the simulation of choice. It's not actually choice; it's just highly effective prediction of what comes next. Further to that, I think our responsibility as citizens, in the Arntson context, is really to be curious and ask questions around whether this is worth it: the value exchange between attention and what we get for our attention, whether that's bought and sold by others or whether we give it freely of our own volition.

There are broad variations in cultural and faith-based relations, in particular with digital presence and whether that digital presence lives on beyond our physical presence. So we project values onto the technologies we use; I'm leaning here on the Colom work. There's obviously a large degree of questioning around what that value exchange refers to culturally, whether from an individualistic or a collectivist perspective. But we often lack the attention, language, or framing capacity to understand what that exchange means. And I'm thinking specifically of how liberal we are with clicking through terms-of-service agreements, for example.

So where I ended up after the first unit is that it's not really just an economy of attention; it's really an economy of belief. And if we extend Bateson's idea of conceptual framing here, we do have a responsibility, I believe, to be curious and to ask questions. But we also have a responsibility to ask what happens when our technological experiences transcend our understanding of the world. What happens when we break the frame by transcending something that is unfamiliar to us, like death?

It surfaces all manner of culturally defined issues, ethical issues, and legal issues. But if you believe that our attention is both commodifiable and a renewable resource, then how we think about what's inside and outside of that box, the curiosity we exhibit, and the questions we ask are absolutely critical ethical issues for how we understand these experiences.

