[SIGCIS-Members] documenting or diagramming human computation

David Alan Grier grier at email.gwu.edu
Wed Feb 15 08:53:07 PST 2017

	Glad to be of help.  

	I think your comment on redundancy suggests one aspect of the managerial approach to computing.  The planners did indeed tend to break the computations into natural tasks, which tended to be straight sets of calculations.  By 1870, the best practices for these computers did not involve direct redundancy.  Babbage famously wrote that two computers working on the same set of data tended to make the same errors.  By the time the BAAS created its committee on tables, the calculating community had embraced that idea and decided that all calculations should be checked by doing them in different ways.  Eventually, the most common check was to have one computer do the calculation and a second computer, usually a member of the planning committee, reverse the process using the method of finite differences.
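	To illustrate why differencing made such an effective check (this sketch is my own illustration, not drawn from Grier or the sources he cites): for a table of a smooth function computed at equal steps, the higher-order differences are small or constant, so an isolated copying or calculation error stands out as an alternating binomial pattern centered on the bad entry.

```python
def differences(values, order):
    """Apply the forward-difference operator `order` times to a list."""
    for _ in range(order):
        values = [b - a for a, b in zip(values, values[1:])]
    return values

# Hypothetical table of f(x) = x**3 at unit steps.  A cubic has constant
# third differences, so its fourth differences are identically zero.
table = [x**3 for x in range(10)]
assert all(d == 0 for d in differences(table, 4))

# Introduce a single error of +5 at one entry, as a careless computer might.
table[5] += 5

# The error surfaces in the fourth differences as the alternating binomial
# pattern 5 * (1, -4, 6, -4, 1), pinpointing the faulty entry.
print(differences(table, 4))  # prints [0, 5, -20, 30, -20, 5]
```

A checker scanning the difference column spots the spike immediately, without redoing any of the original calculation.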

	Are you familiar with Comrie’s Index to Tables?  It is the most complete critique of hand-calculation techniques.  For example, he lambasts H. T. Davis for relying on double calculation (“the worst possible method of identifying errors” is the phrase I believe he uses).  At times it is exceptionally catty.  There is one critique of a woman who calculated some table and, in so doing, bravely ignored all the best practices of the time.

	If you are looking for the mathematical principles upon which the management of computing groups was built, you have a dozen or so sources that give the basic process: Jahnke and Emde (though not all the English versions have the computing preface); Pearson’s Tracts for Computers (I think it is No. 7) that describes the general method (though it is an abstracted version and is probably best read at the end); Comrie; and the MTP prefaces.  In the latter, there is a clear break after 1943, when Cornelius Lanczos brought a much wider mathematical perspective to the group that moved it beyond Taylor’s Theorem, and Gertrude Blanch had been given a dose of Scientific Management.

	All the best,


David Alan Grier
Associate Professor of International Science and Technology Policy

Center for International Science and Technology Policy
Elliott School Of International Affairs
George Washington University
grier at gwu.edu 

How We Manage Stuff
http://HowWeManageStuff.com

Errant Hashtag:
Errantfeed.djaghe.com

> On Feb 15, 2017, at 11:34 AM, Paul Fishwick <metaphorz at gmail.com> wrote:
> David
> Thanks for this wealth of information on human computing. I’ll begin with the Almanac and then proceed to
> citations indicated in your last paragraph. On diagrams, I wonder if another reason for the lack of them is that
> planners might have operated using redundancy with a decision at the planning level. This planning
> would have involved tasking each computer with a complete set of calculations. Results would have been compared
> (a bit like a human equivalent of the Saturn Computer with its triple-redundant logic) with decisions made by
> the planner rather than in a systematic way?
> -paul
> Paul Fishwick, PhD
> Distinguished University Chair of Arts, Technology, and Emerging Communication
> Professor of Computer Science
> Director, Creative Automata Laboratory
> The University of Texas at Dallas
> Arts & Technology
> 800 West Campbell Road, AT10
> Richardson, TX 75080-3021
> Home: utdallas.edu/atec/fishwick
> Blog 1: medium.com/@metaphorz
>> On Feb 14, 2017, at 12:27 PM, David Grier <grier at email.gwu.edu> wrote:
>> Paul
>> 	I found very few diagrams in my research.  The planners tended to use grids similar to spreadsheets.  Commonly, the sheet would have some statement of the calculation in the leftmost column, and the columns to the right would be filled with results from calculations with different starting values.  You can find examples of these spreadsheets going back to Maskelyne’s papers at the 18th-century Royal Nautical Almanac.
>> 	I think that diagrams were not common for a couple of reasons.  First, the computers rarely had to make decisions as you would in a program.  Second, the algorithms that they were using tended to be variations of interpolation and required no decisions.  For example, I found no good examples of Gaussian elimination for Least Squares being mass produced.  While you can find computing offices that did it, these offices usually assigned that work to skilled individuals who didn’t need or want such guidance.  Third, the one algorithm that did require decisions, differencing for error detection, was not managed by the computing staffs.  While less skilled computers did the work, the planning staff reviewed the results and made decisions from it.  Finally, I can find no good example of a human computing group that operated in a Fordist manner, with the kind of assembly line that Richardson envisioned in his writings about weather prediction.  I can find no example of a computation that extended beyond the scale of a single worksheet and hence required the worksheets to pass through the office in a systematic way.  As far as I can tell, the offices used a modified market model: they would put worksheets in a central place and let the computers choose the ones they wanted to do.
>> 	It’s useful to compare the computing plans of L. J. Comrie in the 1920s or Gertrude Blanch in the 1930s with the industrial plans of Ralph Flanders or the office-work plans of William Leffingwell.  It is clear that the human computers were aware of the work done in industrial organization, but they were not using the kinds of flow models from industry.  The one big exception is, of course, the Columbia Astronomical group, but they were an IBM shop, and IBM shops used diagrams to guide information processing.
>> David
>>> On Feb 14, 2017, at 10:39 AM, Paul Fishwick <metaphorz at gmail.com> wrote:
>>> I saw Hidden Figures this weekend, which brought back some early days for me when I worked in
>>> Langley’s Structures Directorate as a systems analyst. After scouring the web for documents,
>>> and reading Grier’s book, I still wonder whether the computation steps were organized in some
>>> sort of diagram that would be used by a planner (?) to guide the human computers. I’ve also reviewed
>>> Pickering’s legacy, the Handbook of Human Computation, and Human Computation by Law and von
>>> Ahn. If you review Willey (1969) “Manual for Reduction of Data” by Helen H. Willey (Supervisory
>>> Mathematician), there are many equations but no visual guides as to who does what, or to the
>>> structured flow of computed variables among computers.
>>> Here is a nice NASA site with citations: https://crgis.ndc.nasa.gov/historic/Human_Computers
>>> But I do not see anything resembling a time-ordered process. Today, we might expect data flow
>>> diagrams, business process notations, or something of the sort. What did they use back then, or
>>> perhaps they created computational and data flow order without explicitly documenting it?
>>> -paul
>>> Paul Fishwick, PhD
>>> Distinguished University Chair of Arts, Technology, and Emerging Communication
>>> Professor of Computer Science
>>> Director, Creative Automata Laboratory
>>> The University of Texas at Dallas
>>> Arts & Technology
>>> 800 West Campbell Road, AT10
>>> Richardson, TX 75080-3021
>>> Home: utdallas.edu/atec/fishwick
>>> Blog 1: medium.com/@metaphorz
>>> _______________________________________________
>>> This email is relayed from members at sigcis.org, the email discussion list of SHOT SIGCIS. Opinions expressed here are those of the member posting and are not reviewed, edited, or endorsed by SIGCIS. The list archives are at http://lists.sigcis.org/pipermail/members-sigcis.org/ and you can change your subscription options at http://lists.sigcis.org/listinfo.cgi/members-sigcis.org

