[SIGCIS-Members] "How Social Media's Giant Algorithm Shapes our Feeds."

Emiel van Miltenburg C.W.J.vanMiltenburg at tilburguniversity.edu
Thu Oct 28 00:07:46 PDT 2021


Hi all,

It might be relevant in this context to watch the keynote video from Charles Isbell at NeurIPS 2020: https://nips.cc/virtual/2020/public/invited_16166.html

I’ve posted the title and abstract below, but the talk also discusses machine learning systems as “algorithms with gaps” that are filled in using example data. I really like this framing because it shows how machine learning fits into the traditional programming paradigm, and it gives us a sense of (and an opportunity to discuss) the reliability of the different stages a computer goes through to solve the problem at hand.
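To make the “gaps” framing concrete, here is a toy sketch of my own (not from the talk): the same conversion task written first as a fully specified algorithm, and then as an “algorithm with a gap” whose missing parameters are filled in from example data.

# Toy illustration (my own, not from the talk): Fahrenheit-to-Celsius written
# first as a fully specified algorithm, then as an "algorithm with a gap"
# whose missing pieces (slope and intercept) are filled in from example data.

def convert_explicit(fahrenheit):
    # Every step is spelled out by the programmer.
    return (fahrenheit - 32) * 5 / 9

def fit_linear(examples):
    # The "gap": a slope and intercept we do not write down ourselves.
    # Ordinary least squares on (fahrenheit, celsius) example pairs.
    n = len(examples)
    mean_x = sum(x for x, _ in examples) / n
    mean_y = sum(y for _, y in examples) / n
    slope = (sum((x - mean_x) * (y - mean_y) for x, y in examples)
             / sum((x - mean_x) ** 2 for x, _ in examples))
    intercept = mean_y - slope * mean_x
    return slope, intercept

examples = [(32, 0), (212, 100), (50, 10), (104, 40)]
slope, intercept = fit_linear(examples)

def convert_learned(fahrenheit):
    # The overall procedure is still fixed; only the parameters came from data.
    return slope * fahrenheit + intercept

print(convert_explicit(98.6), convert_learned(98.6))

The procedure in the second version is still fixed; only the slope and intercept come from the examples, and that is exactly the stage where questions about reliability enter.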

Best wishes,
Emiel van Miltenburg

——————

Invited Talk:
You Can’t Escape Hyperparameters and Latent Variables: Machine Learning as a Software Engineering Enterprise
Charles Isbell
Tue, Dec 8th, 2020 @ 02:00 – 04:00 CET

Abstract: Successful technological fields have a moment when they become pervasive, important, and noticed. They are deployed into the world and, inevitably, something goes wrong. A badly designed interface leads to an aircraft disaster. A buggy controller delivers a lethal dose of radiation to a cancer patient. The field must then choose to mature and take responsibility for avoiding the harms associated with what it is producing. Machine learning has reached this moment. In this talk, I will argue that the community needs to adopt systematic approaches for creating robust artifacts that contribute to larger systems that impact the real human world. I will share perspectives from multiple researchers in machine learning, theory, computer perception, and education; discuss with them approaches that might help us to develop more robust machine-learning systems; and explore scientifically interesting problems that result from moving beyond narrow machine-learning algorithms to complete machine-learning systems.

On 28 Oct 2021, at 07:28, Allan Olley <allan.olley at alumni.utoronto.ca> wrote:

Hello,

My sense is that talk of when an algorithm becomes many algorithms, or the like, is an example of a sorites paradox (how many items make a heap? Take one item off a heap and it is still a heap, yet keep taking items off and eventually it is not, etc.). How many steps can you add to an algorithm before it becomes a heap of algorithms, a blob? (Or, for that matter, how many lines of code before a piece of software becomes a suite of software, and so on?)

My suspicion is that talk of "the algorithm" may have started with Google's PageRank algorithm. My sense is that the original PageRank algorithm was a proper Knuthian algorithm of definite and limited size, but of course as Google applied it to search and had to deal with various exigencies, including people trying to game the algorithm, there were endless additions and tinkering. So the scheme by which Google arranges search results is probably more like a heap of algorithms, or the Blob, than the original PageRank algorithm, but it is often called an algorithm, or "the algorithm".
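Indeed, the core of the original PageRank computation fits in a few lines. A rough sketch of the iterative version (my own simplification, leaving out dangling pages and the many later refinements):

# Rough sketch of the original PageRank iteration (simplified: no handling of
# dangling pages or personalization). links maps each page to the pages it
# links to; d is the usual damping factor.

def pagerank(links, d=0.85, iterations=50):
    pages = list(links)
    n = len(pages)
    rank = {p: 1.0 / n for p in pages}
    for _ in range(iterations):
        new_rank = {p: (1.0 - d) / n for p in pages}
        for page, outlinks in links.items():
            if not outlinks:
                continue  # a dangling page's rank is simply dropped here
            share = d * rank[page] / len(outlinks)
            for target in outlinks:
                new_rank[target] += share
        rank = new_rank
    return rank

toy_web = {
    "a": ["b", "c"],
    "b": ["c"],
    "c": ["a"],
}
print(pagerank(toy_web))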

I am guessing the popular notion of "algorithm" spread from Google's PageRank to other not wholly dissimilar systems, such as the method by which Facebook (and other social media sites) decides what we see in our feeds, or YouTube decides what videos to suggest we might want to watch, and so on.


On Thu, Oct 28, 2021 at 12:49 AM Kimon Keramidas <kimon.keramidas at nyu.edu> wrote:
Dear Paul,

I actually don’t think that’s an inappropriate use of the term; it has certainly evolved to this extent in popular use, and I would also say its use has broadened in technical application. It may seem like a blob from one perspective, but for Facebook, the system that decides a post’s position based on predictions is very much a “well-defined, finite set of steps that produces an unambiguous result.” They get exactly what they want by feeding data into that algorithm and getting a result that they can then apply to their business practices. I think that in this day and age a conception of how algorithms are conceived, executed, and reworked has to be more expansive, as technologies are increasingly integrated into complex formulaic processes such as these. For example, I am certain that there is some level of AI built into Facebook’s algorithm, and therefore a level of complexity that seems “blob-like,” but it is nonetheless conceived and executed by Facebook’s engineers with the goal of unambiguous (at least from their perspective) algorithmic results.
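To be concrete about what I mean, here is a hypothetical sketch (the signals and weights are invented, not Facebook’s actual system): the learned models supply predicted probabilities, but the ranking step itself is a perfectly definite, finite procedure.

# Hypothetical feed-ranking step (invented signals and weights, not Facebook's
# actual system). Learned models supply predicted engagement probabilities;
# the ranking itself is a definite, finite procedure over those predictions.

def score_post(predictions, weights):
    # Weighted sum of predicted engagement probabilities.
    return sum(weights[signal] * predictions[signal] for signal in weights)

def rank_feed(posts, weights):
    # posts: list of (post_id, {signal: predicted probability}) pairs.
    return sorted(posts, key=lambda item: score_post(item[1], weights), reverse=True)

weights = {"like": 1.0, "comment": 4.0, "share": 8.0}
posts = [
    ("vacation photo", {"like": 0.30, "comment": 0.05, "share": 0.01}),
    ("news article",   {"like": 0.10, "comment": 0.20, "share": 0.15}),
]
print(rank_feed(posts, weights))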

Safiya Noble’s book blows this out even further, as she argues for Algorithms of Oppression. Noble highlights how embedded social biases work their way into the construction of computer-based algorithms. They embed themselves in such a way that we could say these biases become acceptable cultural practices built into those “well-defined, finite sets of steps,” if we start analyzing the choices made in constructing algorithms from a sociological as well as a technical outlook.

And as for whether people should consider some algorithms threatening: that probably wouldn’t be a bad thing at this point. Despite a long-held skepticism towards all things Facebook, even I have been shocked by some of the blatant abuses that have been revealed over the last few weeks.

Looking forward to further conversation.

Cheers,
Kimon

Kimon Keramidas, Ph.D.
Clinical Associate Professor, XE: Experimental Humanities & Social Engagement<http://as.nyu.edu/xe.html>
Affiliated Faculty, Program in International Relations

Pronouns: He/Him

New York University
14 University Place
New York, NY 10003

Co-Director - ITMO University International Digital Humanities Research Center<http://dh.itmo.ru/en_about>
Co-Founder - The Journal of Interactive Technology and Pedagogy<http://jitpedagogy.org/>
Co-Founder - NYCDH<http://nycdh.org/>

E kimon.keramidas at nyu.edu
W http://kimonkeramidas.com

The Sogdians: Influencers on the Silk Roads
Exhibition<https://www.freersackler.si.edu/sogdians>

The Interface Experience: Forty Years of Personal Computing
Exhibition<https://www.bgc.bard.edu/gallery/exhibitions/10/the-interface-experience>

The Interface Experience: A User’s Guide
Winner of the 2016 Innovation in Print Design Award from the American Alliance of Museums
Buy Book<http://store.bgc.bard.edu/the-interface-experience-a-users-guide-by-kimon-keramidas/>

On Oct 27, 2021, at 7:52 PM, Ceruzzi, Paul <CeruzziP at si.edu> wrote:

This headline came from today's Washington Post, in a long above-the-fold article about Facebook's policies in determining what users see when they "like" a post. The article does not define the word, but describes an algorithm as "...a system that decides on a post's position on the news feed based on predictions about each user's preferences and tendencies." That sounds to me like a complex piece of software, with perhaps hundreds of lines of code, that takes in a lot of variables and produces a potentially wide range of outputs. It conjures up an image of something sinister and menacing. Not what Knuth defined as an "algorithm" in Volume One of his Art of Computer Programming. His definition has been refined over the years, but it retains the notion of a well-defined, finite set of steps that produces an unambiguous result.
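For contrast, Knuth’s opening example in that volume is Euclid’s algorithm, which looks like this in a few lines (my own transcription, in its iterative form):

# Euclid's algorithm, the example Knuth uses in Volume One to illustrate his
# definition: a well-defined, finite set of steps with an unambiguous result.

def gcd(m, n):
    while n != 0:
        m, n = n, m % n
    return m

print(gcd(544, 119))  # prints 17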

Should we be bothered that the Post (and, I assume, other newspapers) is not using the term properly? Are people now going to think of an "algorithm" as something threatening, like "The Blob" in that famous Steve McQueen movie?

Paul Ceruzzi

Tom Haigh & Paul Ceruzzi, A New History of Modern Computing (MIT Press 2021)

--
Yours Truly,
Allan Olley, PhD

http://individual.utoronto.ca/fofound
_______________________________________________
This email is relayed from members at sigcis.org<http://sigcis.org>, the email discussion list of SHOT SIGCIS. Opinions expressed here are those of the member posting and are not reviewed, edited, or endorsed by SIGCIS. The list archives are at http://lists.sigcis.org/pipermail/members-sigcis.org/ and you can change your subscription options at http://lists.sigcis.org/listinfo.cgi/members-sigcis.org
