[SIGCIS-Members] Correctness and verification

thomas.haigh at gmail.com thomas.haigh at gmail.com
Sat Jul 4 10:58:05 PDT 2020


Returning for part II of the answer, on actual bugs. 

 

There is a huge computer science literature relevant to Matt’s question, but the key words are “correctness,” “formal methods,” “specification language” and “verification” rather than “error” or “bug.” 

 

With the Pentium FDIV bug, missing entries in a lookup table deep in the floating-point division unit caused certain divisions to return slightly wrong results. That's an example of a situation where a finished system doesn't perform in line with requirements. But how are requirements expressed? Typically with a written specification that leaves many things ambiguous. Going back to the 1950s, a lot of work in systems analysis was aimed at coming up with methods to get specifications right so that the systems built from them would do what was required.
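To make the shape of that failure concrete, here is a toy sketch (in Python, and emphatically not the actual SRT division hardware): a table-driven routine in which a single corrupted entry produces wrong answers only for the inputs that happen to hit that entry, which is why such defects can survive ordinary testing. All names here are illustrative.

```python
# Hypothetical lookup table meant to hold n * n for n = 0..9.
SQUARES = [n * n for n in range(10)]
SQUARES[7] = 48  # one bogus value "deep in the processor" (should be 49)

def square_via_table(n):
    """Return n squared by table lookup, as a fixed-function unit might."""
    return SQUARES[n]

# Most operands are handled correctly; the defect surfaces only for
# inputs that consult the corrupted entry.
errors = [n for n in range(10) if square_via_table(n) != n * n]
print(errors)  # [7]
```

The analogy to the Pentium case is only structural: correct logic driven by bad stored data, visible on a small subset of operands.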

 

A problem in the final system might be a result of a mistake in specification or in implementation. The computer science answer to this was to express specifications completely and unambiguously in mathematical terms and then prove that the final software/hardware would always do what the specification said. Both tasks were enormously difficult. A lot of the history is told in MacKenzie, Donald. Mechanizing Proof: Computing, Risk, and Trust. Cambridge, MA: MIT Press, 2001, an extremely good book full of very clear explanations of difficult topics. This history includes the intervention of a philosopher, James H. Fetzer ("Program Verification: The Very Idea," Communications of the ACM, 1988), who argued that claiming to prove any correspondence between a mathematical specification and a material object is a category error. So I'm sure there are pointers there to follow up for more recent philosophical work on the topic.
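For readers who have never seen the "prove the program meets its specification" idea in practice, here is a minimal sketch in the Lean theorem prover. The function and theorem names are my own illustrations, not drawn from any of the work discussed above: the specification is stated as a theorem quantified over all inputs, and the proof obliges the implementation to satisfy it.

```lean
-- A trivial implementation.
def double (n : Nat) : Nat := n + n

-- The specification, stated mathematically: for every input n,
-- double n must equal 2 * n. The proof discharges the obligation.
theorem double_meets_spec (n : Nat) : double n = 2 * n := by
  unfold double
  omega
```

Scaling this from a one-line function to real software and hardware is exactly the "enormously difficult" part the paragraph above describes.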

 

I gave some autobiographically grounded thoughts on some of this in a recent paper, "Assembling a Prehistory for Formal Methods," which also points to some papers by Niklaus Wirth and Cliff Jones giving insider views on the history: http://www.tomandmaria.com/Tom/Writing/FormalMethodsHistoryPreprint.pdf My suggestion is that the formal methods movement was in large part an outgrowth of the original impetus and core community behind both the Algol effort and the 1968 Software Engineering Conference (which had less to do than is generally suggested with what was offered under the banner of "Software Engineering" by the 1980s). The paper appeared in a special issue on the history of formal methods, which includes other, more technical, items of possible interest: https://link.springer.com/journal/165/31/6.

 

Editing the Turing Award website (particularly ongoing work to add video clips to profiles) has also reminded me that several awards were given for work in this area, and in particular that the “model checking” approach has had a huge impact on how chip designs are tested. So Matt might want to look at the following entries:

 

https://amturing.acm.org/award_winners/hoare_4622167.cfm

https://amturing.acm.org/award_winners/sifakis_1701095.cfm

https://amturing.acm.org/award_winners/emerson_1671460.cfm

https://amturing.acm.org/award_winners/clarke_1167964.cfm
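Since "model checking" may be unfamiliar, here is a toy explicit-state sketch of the core idea (illustrative Python, not any production tool): exhaustively enumerate every reachable state of a small system and check a desired property in each one. The traffic-light model and all names are my own invention.

```python
from collections import deque

# States are (light_ns, light_ew). A light may turn green only when
# both are red; a green light may turn red at any time.
def successors(state):
    ns, ew = state
    next_states = []
    if ns == "red" and ew == "red":
        next_states += [("green", ew), (ns, "green")]
    if ns == "green":
        next_states.append(("red", ew))
    if ew == "green":
        next_states.append((ns, "red"))
    return next_states

def check(initial, invariant):
    """Breadth-first search of the full state space; return a violating
    state, or None if the invariant holds in every reachable state."""
    seen, queue = {initial}, deque([initial])
    while queue:
        state = queue.popleft()
        if not invariant(state):
            return state
        for nxt in successors(state):
            if nxt not in seen:
                seen.add(nxt)
                queue.append(nxt)
    return None

# Property: the two lights are never green at the same time.
violation = check(("red", "red"), lambda s: s != ("green", "green"))
print(violation)  # None -- the property holds in this model
```

Real model checkers handle astronomically larger state spaces with clever symbolic representations, which is what the award citations above recognize; the exhaustive-search principle is the same.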

 

Best wishes,

 

Tom

 

 

From: Members <members-bounces at lists.sigcis.org> On Behalf Of Matthew Kirschenbaum
Sent: Friday, July 3, 2020 12:55 PM
To: members <members at sigcis.org>
Subject: [SIGCIS-Members] the nature of computational error

 

Hello all,

 

I am interested in a better understanding of the nature of computational error. My sense is that actual, literal (mathematical) mistakes in modern computers are quite rare; the notorious Pentium bug of the early 1990s is the exception that proves the rule. Most bugs are, rather, code proceeding to a perfectly correct logical outcome that just so happens to be inimical or intractable to the user and/or other dependent elements of the system. The Y2K "bug," for instance, was actually code executing in ways that were entirely internally self-consistent, however much havoc the code would wreak (or was expected to wreak).
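[The self-consistent-but-wrong behavior described above can be sketched in a few lines of Python. This is a generic illustration of two-digit year arithmetic, not any particular legacy system:]

```python
def age_two_digit(birth_yy, current_yy):
    """Age computed on two-digit years, as much legacy code did."""
    return current_yy - birth_yy

# Someone born in 1960, computed in 1999 and then in 2000:
print(age_two_digit(60, 99))  # 39 -- correct
print(age_two_digit(60, 0))   # -60 -- arithmetically consistent,
                              # wrong only because the century was lost
```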

 

Can anyone recommend reading that will help me formulate such thoughts with greater confidence and accuracy? Or serve as a corrective? I'd like to read something fundamental and even philosophical about, as my subject line has it, the nature of computational error. I'd also be interested in collecting other instances comparable to the Pentium bug--bugs that were actual flaws and mistakes hardwired at the deepest levels of a system.

 

Thank you-- Matt

 


-- 

Matthew Kirschenbaum
Professor of English and Digital Studies
Director, Graduate Certificate in Digital Studies
Printer's Devil, BookLab
University of Maryland

mgk at umd.edu


