[SIGCIS-Members] "How Social Media's Giant Algorithm Shapes our Feeds."

Paul N. Edwards pedwards at stanford.edu
Fri Oct 29 13:25:03 PDT 2021

Great discussion - thanks everyone!

My thoughts: With respect to social media and big tech, the phrases “the algorithm” and “these algorithms” now function much like the phrase “these models” in many non-scientific discussions of climate science. They're massive generalizations, or straw figures to beat up on, or invoked as symbols of power relationships, stupid science, or human bias. The reality is more complex.

As an example, O’Neil’s “an opinion embedded in math” is dead-on right for some kinds of machine learning algorithms. But it’s dead wrong for many other algorithms. An algorithm for sorting lists alphabetically or numerically is not “an opinion embedded in math” since most of the time, alphabetical and numerical order are uncontroversial, widely used standards. Yet no distinction is made. “These algorithms.”
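To make the contrast concrete, here is a minimal sketch (mine, not O'Neil's; the weights and field names are invented for illustration): a plain sort encodes only a widely shared convention, while even a toy feed-ranking function embeds contestable human judgments in its coefficients.

```python
# An "opinion-free" algorithm: sorting by a standard, uncontroversial order.
names = ["Ziewitz", "Edwards", "O'Neil", "Spertus"]
print(sorted(names))  # alphabetical order: a widely shared convention

# By contrast, even a toy ranking function embeds opinions: the weights
# below are judgments about what "matters," chosen by a person.
def rank_score(post):
    # Hypothetical weights -- each one is "an opinion embedded in math."
    return 2.0 * post["clicks"] + 5.0 * post["shares"] - 1.0 * post["age_hours"]

posts = [
    {"id": "a", "clicks": 100, "shares": 2, "age_hours": 1},
    {"id": "b", "clicks": 10, "shares": 30, "age_hours": 2},
]
ranked = sorted(posts, key=rank_score, reverse=True)
print([p["id"] for p in ranked])
```

Note that the second case still uses the uncontroversial sort; the opinions live entirely in the scoring function it is handed.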

To describe usages like these, I like Malte Ziewitz’s phrase “the *figure* of the algorithm,” which is only loosely connected to what is actually going on under the hood. Ziewitz, Malte. 2017. “A Not Quite Random Walk: Experimenting with the Ethnomethods of the Algorithm.” Big Data & Society 4 (2): 2053951717738105. doi:10.1177/2053951717738105

I see no reason to “correct” these usages most of the time – but they *can* become quite problematic when they’re really gestures at big tech, social media, human decisions leading to biased training data for ML, and so on. They obscure the complexity of how human choices and responsibilities interact with automatic activity in algorithmic systems.

For many with little understanding of how computer systems work, “the algorithm” becomes a proxy for “the programmers” and often assumes intent on the part of those programmers. But today, in many cases, programmers don't know in advance how their code will interact with all the other code in massive online systems, which in turn interact with *other* massive online systems, all of them changing on a near-daily basis. Nor do many discourses about algorithms fully take into account how social media algorithms are constantly adapting in the continual back-and-forth between the human users of these systems and “the algorithm.”
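That back-and-forth can be sketched as a toy simulation (mine, purely illustrative; the topics, weights, and update rules are invented assumptions, not how any real platform works): the system's ranking weights respond to engagement, while exposure simultaneously shifts users' interests, so neither side's behavior is fixed in advance.

```python
import random

random.seed(0)  # make the illustrative run repeatable

# Toy model of the user/algorithm feedback loop.
weight = {"cats": 0.5, "news": 0.5}    # algorithm's ranking weights
interest = {"cats": 0.5, "news": 0.5}  # users' (changing) preferences

for step in range(20):
    # The algorithm shows the currently highest-weighted topic...
    shown = max(weight, key=weight.get)
    # ...users engage probabilistically, based on their current interest...
    engaged = random.random() < interest[shown]
    # ...the algorithm adapts its weights to that engagement...
    weight[shown] += 0.05 if engaged else -0.05
    # ...and repeated exposure nudges user interest toward what is shown.
    interest[shown] = min(1.0, interest[shown] + 0.02)

print(weight, interest)
```

Even in this caricature, the final state is the joint product of both adaptations, which is why attributing outcomes to "the programmers" alone misses half the loop.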

Some ideas about all this appear in my article “We Have Been Assimilated: Some Principles for Thinking About Algorithmic Systems.” 2018. In IFIP Advances in Information and Communication Technology: Living with Monsters? Social Implications of Algorithmic Phenomena, Hybrid Agency, and the Performativity of Technology, edited by Ulrike Schultze et al., 19–27. Cham, Switzerland: Springer International Publishing. http://dx.doi.org/10.1007/978-3-030-04091-8_3

If you can’t get access directly, email me and I’ll send you a copy.



On Oct 29, 2021, at 09:16, Ellen Spertus <spertus at mills.edu> wrote:

I liked this recent tweet from Colin McMillen<https://twitter.com/mcmillen/status/1449116726407401472>:

algorithms are money laundering for unethical decision-making

I like Cathy O'Neil's discussion of algorithms<https://www.youtube.com/watch?v=heQzqX35c9A> (2:38), "an opinion embedded in math," which I shared with my Race, Gender, and Computing class.

I also shared David Malan's more conventional definition<https://www.youtube.com/watch?v=6hfOvs8pY1k> (4:57).

This email is relayed from members at sigcis.org<http://sigcis.org>, the email discussion list of SHOT SIGCIS. Opinions expressed here are those of the member posting and are not reviewed, edited, or endorsed by SIGCIS. The list archives are at http://lists.sigcis.org/pipermail/members-sigcis.org/ and you can change your subscription options at http://lists.sigcis.org/listinfo.cgi/members-sigcis.org

Paul N. Edwards<https://profiles.stanford.edu/paul-edwards>

Director, Program on Science, Technology & Society<http://sts.stanford.edu>
William J. Perry Fellow in International Security and Senior Research Scholar
Center for International Security and Cooperation<http://cisac.fsi.stanford.edu/>
Co-Director, Stanford Existential Risks Initiative<https://cisac.fsi.stanford.edu/stanford-existential-risks-initiative>
Stanford University

Professor of Information<http://www.si.umich.edu/> and History<http://www.lsa.umich.edu/history/> (Emeritus)
University of Michigan

