Essay by Zora Che: “…In sociologist Charles Cooley’s theory of the “looking-glass self,” we understand ourselves through the perceptions of others. Online, models perceive us, responding to and reinforcing the versions of ourselves that they glean from our behaviors. They sense my finger lingering, my invisible gaze made apparent by the gaps in my movements. My understanding of my digital self and my digital reality becomes a feedback loop churned by models I cannot see. Moreover, the model only “sees” me as data to be optimized for objectives I cannot uncover. That objective is something closer to maximizing my time spent on the digital product than to holding my deepest needs; the latter, perhaps, was never a mathematical question to begin with.
Divination and algorithmic opacity both appear to bring us what we cannot see. Diviners see through what is obscure and beyond our comprehension: incomprehensible pain and grief, a vertiginous lack of control, an unknowable future. The opacity of divination comes from the limitations of our own knowledge. But the opacity of algorithms comes from both the algorithm itself and the socio-technical infrastructure built around it. Jenna Burrell writes of three layers of opacity in models: “(1) opacity as intentional corporate or state secrecy, (2) opacity as technical illiteracy, and (3) an opacity that arises from the characteristics of machine learning algorithms and the scale required to apply them usefully.” As consumers of models, we interact with the first and third layers of opacity: that of platforms hiding models from us, and that of the gap between what the model is optimizing for and what may be explainable. The black-box model is an alluring oracle, interacting with us in inexplicable ways: no explanation for the daily laconic message Co-Star pushes to its users, no logic behind why you received this tarot reading while scrolling, no insight into the models behind these oracles and their objectives…(More)”.