Unit 4 Takeaways: With Great Power Comes Great Responsibility
Unit 4 Reflection:
Ethically responsible governance in the development and distribution of artificial intelligence services is deeply shaped by the all-too-human problems of unintended consequences, fear of the future, bias, and flawed, culturally nuanced decision-making. But human problems also ache for human solutions, and what we choose to do next is critical to addressing unforeseen consequences and moving from a place of anxiety to a place of assurance. We might do this work earlier in the development cycle, as Tricarico proposes, through ethical procurement and the disclosures required to shape our own relationship with a platform (Tricarico, 2023), or reactively and swiftly, addressing what developers cannot be held responsible for failing to foresee (Peters et al., 2020). The important thing is that, wherever we stand in that development and distribution cycle, we as individuals do the work of addressing the issues of responsibility raised by our own use of these products. But how might we build a stronger culture of personal accountability, independent of broader regulation?
References:
Peters, D., Vold, K., Robinson, D., & Calvo, R. A. (2020). Responsible AI—Two Frameworks for Ethical Design Practice. IEEE Transactions on Technology and Society, 1(1), 34–47. [Digital File]. Retrieved from: https://canvas.upenn.edu/courses/1693062/files/122342365/download?download_frd=1.
Tricarico, M. (2023). Unit 4 Guest Lecture: Marisa Tricarico (25:35). [Digital Video]. Retrieved from: https://canvas.upenn.edu/courses/1693062/pages/unit-4-guest-lecture-marisa-tricarico-25-35?module_item_id=26566793.
Class Takeaway:
Artificial intelligence is reshaping social norms, questions of identity and citizenship, privacy, and the boundaries we establish between ourselves and others. As it reshapes who we are as individuals, and who we are to each other, it is even reframing how we think about the space between life and death. In the emerging field of grieftech, our capacity to digitize the storytelling of human experience, reducing it to a series of prompted responses stored on a remote server for future recall, raises profound ethical questions about faith, prolonged bereavement, and the masking of pain as a fundamental part of human experience. It also surfaces deeply problematic, culturally nuanced questions about the difference between can and should, especially when the implementation of these technologies carries the often unintended consequences of reinforcing bias, discrimination, exclusion, and hierarchy.