[SIGCIS-Members] How the Digital Camera Transformed Our Concept of History

Peter Sachs Collopy peter at collopy.net
Mon Aug 3 22:17:23 PDT 2020


Apologies for picking up an old thread, but I’ve just been catching up on my SIGCIS reading. Tom’s excerpt from History of Modern Computing got me thinking about the relationship between continuous and discrete in television and video, and how that relates to computing and digital photography, and then I remembered I’d written something about this which I should perhaps share with all of you.

“Video and the Origins of Electronic Photography <https://resolver.caltech.edu/CaltechAUTHORS:20190831-153442412>” was published in French translation last year in the history of photography journal Transbordeur. It’s available in English on the web from Caltech’s institutional repository. Here’s the abstract:

In our historical imagination, the recent digital revolution in photography can obscure an earlier revolution which was less total but more profound. This was the analogue revolution, the translation of images into continuously varying electrical signals and magnetic fields. Electronic analogue photography manifested as television and video, as well as technologies for recording still images like the Ampex Videofile of the 1960s and the Sony Mavica of the 1980s. Key components of these new technological systems, including techniques for high-fidelity recording and vacuum tubes for video cameras, were first developed for military ends. Analogue electronics had this and much else in common with digital technologies, including the physical media on which they recorded. The deep transformation of photography in the last century, then, was not digitization but the replacement of photochemistry with electromagnetic media, analogue and digital alike.

Peter


> On Jul 10, 2020, at 08:45, Deborah Douglas <ddouglas at mit.edu> wrote:
> 
> Fascinating…
> 
> For those interested in Polaroid and digital photography, Peter Buse provides a nice synopsis in Chapter 3 of his book “The Camera Does the Rest: How Polaroid Changed Photography”.  Preoccupied with the introduction of integral film and the SX-70 in the 1970s, it seems Polaroid did not get serious about electronic imaging until the late 1970s/early 1980s.  (You start reading about digital imaging (“filmless cameras” or “electronic cameras”) in Polaroid’s press around the mid-1980s.)  Most of the records are at Baker Library at Harvard.
> 
> As for the cost of film versus digital images, you might find it interesting to include the cost of Polaroid “instant film,” as that eliminated two of those drugstore trips and provided more immediate gratification.  (When the SX-70 camera was released in 1972 it retailed for $180; a pack of film with 10 images was $6.90.)  Most serious historians of Polaroid consider the overnight Fotomat, followed by the 1-hour mini-lab, to have been the main “killers” of instant photography for consumer use.  (Another big factor: the “nod to the pod” obsession that became gospel during the Polaroid v. Kodak lawsuit also created a stultifying atmosphere within the company that worked against the development of digital technologies…or any other technologies.)
> 
> Probably more than you wanted to know about Polaroid but the great stories below got me thinking!
> 
> 
> Debbie Douglas
> 
> 
>> On Jul 10, 2020, at 1:33 AM, thomas.haigh at gmail.com wrote:
>> 
>> It's not quite the same thing, but in the Revised History of Modern Computing (with Paul Ceruzzi, coming soon from MIT Press) we've tried to integrate the history of digital imaging into the history of computing. It seems necessary, not least because digital cameras are computers in disguise (and because the images were stored, edited, and transmitted on more recognizable kinds of computer). The topic comes back in the later discussion of smartphones and device convergence in the final chapter but as a sneak preview here is the subsection “Digital Cameras” from Chapter 10: The Computer Becomes a Universal Media Device. Would be happy to hear of any errors while there is still time to fix them….
>>  
>> Tom
>>  
>> Digital Cameras
>> Television worked by dividing a picture into a grid of dots. Even the term “pixel” (for picture element), which we now associate with computer equipment, originated in the television equipment industry. Back in 1945, working on the “First Draft” EDVAC design, John von Neumann was fascinated by the potential of the iconoscope, an electronic tube used in early television cameras, as a storage device.
>> In television, however, the intensity of each dot was transmitted as an analog value. Turning pixels into numbers was the job of the frame grabber, which captured a single frame from a video input and turned it into a bitmap image. Frame grabbers were used for video production work and were built into specialist video manipulation hardware to create special effects. A related piece of hardware, the genlock, synchronized computer displays with a video signal so that computer-generated titles and graphics could be added to videos. These devices were expensive, purchased mostly by video production companies to liven up music videos, advertisements, and wedding footage with titles and special effects.[1]
>> Today digital video sensors are everywhere. The crucial development was the charge-coupled device (CCD), which combined a semiconductor with a light-sensitive layer. Fairchild Semiconductor began to sell a 100x100 light sensor in 1974. That provided the basis for an experimental digital camera at Kodak. When light was focused onto the sensor matrix, numbers could be read off the chip. Space missions had a particular need for tiny and reliable digital imaging technologies, creating pictures that could be beamed back to Earth. Techniques had been developed back in the 1960s, originally for spy satellites, to expose film and then scan it and transmit images digitally back to Earth. Being able to take high-quality digital still images directly was much simpler and faster. By 1978 a KH-11 spy satellite was using a CCD with, reportedly, an 800x800 resolution.[2] The Hubble Space Telescope, launched in 1990, used a similarly sized mirror but gave much higher resolution CCD sensors a very public showcase.[3]
>> Back on earth, the first big market was for cheaper “one-dimensional” sensors able to scan a single line. Flatbed scanners and fax machines moved the sensor across the page to capture the entire image gradually. (A similar digital scanning approach had been pioneered with the photodiode cameras of the Viking Mars landers. It worked well, albeit slowly, as neither the platform nor the landscape was moving.) Commercializing digital cameras took longer, because many more sensor elements were needed to capture an entire image at once. The technology made a brief consumer appearance in 1987, in the PXL-2000 “PixelVision” camera produced by toy company Fisher-Price. It recorded highly pixelated video onto standard audio cassettes, later becoming a favorite of hipster artists.[4] CCDs were also used in some of the analog camcorders of the 1980s, bulky devices that combined a video cassette recorder and a television camera into a single box.
>> By the mid-1990s higher resolution sensors, and the chips and memories to deal with the large files they produced, were becoming affordable. They made their way into two related kinds of product. Digital video cameras could store one hour of crisp, high-resolution footage on special tapes as 13 gigabytes of computer data. Computers fitted with a FireWire connection (also used by early iPods) could extract digital video, edit it, and write the results back to the tape without any loss of quality.
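>> A quick back-of-the-envelope check of that 13-gigabyte figure, assuming the commonly cited DV stream rate of roughly 3.6 megabytes per second for video, audio, and subcode combined (a rough sketch in Python, not an exact specification):
>> 
>>     # Rough arithmetic for the DV tape format's storage demands.
>>     bytes_per_second = 3.6e6                  # assumed ~3.6 MB/s DV stream
>>     seconds_per_hour = 60 * 60
>>     gigabytes_per_hour = bytes_per_second * seconds_per_hour / 1e9
>>     print(round(gigabytes_per_hour, 1), "GB per hour of footage")  # ~13.0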
>> The other kind of digital camera was patterned after traditional cameras. Camera manufacturers competed on “megapixels” – how many millions of pixels the sensor element handled. At the end of the 1990s most had just one or two megapixels, capturing images that looked good on screen but would appear jagged when printed out. 
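>> To make that screen-versus-print tradeoff concrete, here is a minimal sketch assuming a 1600 x 1200 frame as a stand-in for a two-megapixel sensor and 300 dots per inch as a typical photographic print resolution:
>> 
>>     # How large a print can a two-megapixel image fill at photo quality?
>>     def print_size_inches(width_px, height_px, dpi=300):
>>         return width_px / dpi, height_px / dpi
>> 
>>     print(print_size_inches(1600, 1200))   # about 5.3 x 4.0 inches at 300 dpi
>>     # The same 1600 x 1200 image more than fills a 1024 x 768 monitor of the
>>     # period, which is why it looked sharp on screen but jagged in print.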
>> Because they were optimized for still images, which took less space than video, most still cameras used chip-based flash memory cards rather than tape (though some early models used floppy disks or CDs). Flash retained data when power was turned off but could be quickly and selectively overwritten. It was introduced by Toshiba in 1987, finding early applications in computers, where it stored configuration settings and held BIOS code in an easily updatable form. The cards used in early digital cameras could store only a few megabytes but, as with other memory chips, their capacities rose into the gigabytes as transistors shrank. Because it was very compact and power efficient, high-capacity flash memory was a crucial enabling technology for the creation of new portable devices. Flash memories able to store hundreds of gigabytes ultimately replaced hard disk storage in most PCs, though this took longer than expected because magnetic disk capacities increased even faster than chip densities during the 1990s and early 2000s.
>> The digital cameras of the late 1990s were bulky, had small screens, and would deplete their batteries and fill their memory cards after taking just a few dozen images. Compared to the models available even a few years later they were terrible, but the relevant comparison was with consumer film cameras. Conventional film cartridges held only 24 or 36 pictures. Seeing those pictures cost at least ten dollars and usually took three trips to a drugstore: to buy the film, to drop it off for processing, and to collect the prints. Pocket-sized cameras forced users to squint through a plastic window, giving a vague idea of what might appear in a photograph. Larger, more expensive single-lens reflex cameras took better pictures and showed whether an image was in focus. Little wonder that most people took out their camera only for vacation trips and special occasions.
>> Even the most primitive digital cameras enabled new photographic practices. Digital cameras caught on fastest with businesses that needed to shoot images and use them immediately, for real estate sales, corporate newsletters, or identity cards. Their direct competition was Polaroid instant cameras, which had high running costs and mostly took small pictures. As prices dropped and picture quality improved, consumers began to buy digital cameras, and to take far more pictures than ever before. Vacations were now captured with hundreds of pictures, not just one or two rolls of film. Teenagers could mimic the practices of fashion photographers by taking a few dozen shots of a friend and using the best one. Since the early 2000s, daily life has been visually recorded on a scale unmatched in earlier history, a phenomenon known as “ubiquitous photography.”[5]
>> Early memory cards held only a few megabytes, needing aggressive compression to hold even a dozen images. That was provided by a new image format, the JPEG (named for the Joint Photographic Experts Group), a cousin to the MP3 format, using a lossy compression algorithm based on the discrete cosine transform to achieve similarly impressive reductions in file size. In 1991, when libjpeg, a widely used open source code module for JPEG compression, was released, it took a powerful PC to create these files. By the late 1990s the computer power could be put into a camera, though early models would be tied up for several seconds processing each image. Once the memory card was full, users moved the files onto a computer. Digital photography was another of the practices made possible by the arrival of PCs with voluminous hard drives as a standard feature of middle-class households. People who wanted to print out their photographs could still go to the drugstore, or purchase an affordable little color printer, but photographs were viewed more and more on screens. They were shared with friends and family by email, or by copying them onto a Zip disk or burning onto a CD rather than by handing over an envelope full of duplicate prints.
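>> A minimal sketch of the kind of reduction involved, assuming the Pillow imaging library as a modern stand-in for libjpeg (a synthetic gradient compresses more readily than a real photograph would):
>> 
>>     # Compare a raw 24-bit bitmap with its JPEG-compressed equivalent.
>>     import io
>>     from PIL import Image
>> 
>>     width, height = 1600, 1200                      # roughly 2 megapixels
>>     img = Image.new("RGB", (width, height))
>>     img.putdata([(x % 256, y % 256, (x + y) % 256)  # simple gradient fill
>>                  for y in range(height) for x in range(width)])
>> 
>>     raw_bytes = width * height * 3                  # ~5.8 MB uncompressed
>>     buf = io.BytesIO()
>>     img.save(buf, format="JPEG", quality=75)        # lossy DCT-based encoding
>>     print(raw_bytes, "bytes raw vs.", buf.tell(), "bytes as JPEG")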
>> Screens got bigger, images sharper, battery life longer, camera bodies smaller, and sensors better. By the mid-2000s sensors with a dozen megapixels were becoming common, enough that image quality would be limited primarily by the quality of the camera’s optics. Cameras began to use a different sensor technology, called CMOS after the chip technology it is based on. CMOS imaging was prototyped at the Jet Propulsion Laboratory, a centerpiece of the US space probe program. The new technology produced camera sensors that were cheaper, smaller, and lower powered than those based on CCDs. By 2006 a camera costing a few hundred dollars would fit in a trouser pocket, take hundreds of images without changing a battery or a memory card, and offer better image quality than any compact film-based consumer camera. Improvements under low-light conditions, taking photographs at night or indoors without a flash, were particularly dramatic.
>>  
>>  
>> -----Original Message-----
>> From: Members <members-bounces at lists.sigcis.org> On Behalf Of Brian Randell
>> Sent: Thursday, July 9, 2020 5:22 AM
>> To: Sigcis <members at sigcis.org>
>> Subject: [SIGCIS-Members] How the Digital Camera Transformed Our Concept of History
>>  
>> Hi:
>>  
>> "How the Digital Camera Transformed Our Concept of History" is the title of a paper by Allison Marsh that has just been published by IEEE Spectrum.
>>  
>> It starts:
>>  
>> > For an inventor, the main challenge might be technical, but sometimes it’s timing that determines success. Steven Sasson had the technical talent but developed his prototype for an all-digital camera a couple of decades too early.
>> > 
>> > [Photo caption: A CCD from Fairchild was used in Kodak’s first digital camera prototype.]
>> > 
>> > It was 1974, and Sasson, a young electrical engineer at Eastman Kodak Co., in Rochester, N.Y., was looking for a use for Fairchild Semiconductor’s new type 201 charge-coupled device. His boss suggested that he try using the 100-by-100-pixel CCD to digitize an image. So Sasson built a digital camera to capture the photo, store it, and then play it back on another device.
>> > 
>> > Sasson’s camera was a kluge of components. He salvaged the lens and exposure mechanism from a Kodak XL55 movie camera to serve as his camera’s optical piece. The CCD would capture the image, which would then be run through a Motorola analog-to-digital converter, stored temporarily in a DRAM array of a dozen 4,096-bit chips, and then transferred to audio tape running on a portable Memodyne data cassette recorder. The camera weighed 3.6 kilograms, ran on 16 AA batteries, and was about the size of a toaster.
>> > 
>> > After working on his camera on and off for a year, Sasson decided on 12 December 1975 that he was ready to take his first picture. Lab technician Joy Marshall agreed to pose. The photo took about 23 seconds to record onto the audio tape. But when Sasson played it back on the lab computer, the image was a mess—although the camera could render shades that were clearly dark or light, anything in between appeared as static. So Marshall’s hair looked okay, but her face was missing. She took one look and said, “Needs work.”
>> > 
>> > Sasson continued to improve the camera, eventually capturing impressive images of different people and objects around the lab. He and his supervisor, Garreth Lloyd, received U.S. Patent No. 4,131,919 for an electronic still camera in 1978, but the project never went beyond the prototype stage. Sasson estimated that image resolution wouldn’t be competitive with chemical photography until sometime between 1990 and 1995, and that was enough for Kodak to mothball the project.
>>  
>> The article ends:
>>  
>> > Digital cameras also changed how historians conduct their research. For professional historians, the advent of digital photography has had other important implications. Lately, there’s been a lot of discussion about how digital cameras in general, and smartphones in particular, have changed the practice of historical research. At the 2020 annual meeting of the American Historical Association, for instance, Ian Milligan, an associate professor at the University of Waterloo, in Canada, gave a talk in which he revealed that 96 percent of historians have no formal training in digital photography and yet the vast majority use digital photographs extensively in their work. About 40 percent said they took more than 2,000 digital photographs of archival material in their latest project. W. Patrick McCray of the University of California, Santa Barbara, told a writer with The Atlantic that he’d accumulated 77 gigabytes of digitized documents and imagery for his latest book project [an aspect of which he recently wrote about for Spectrum].
>> > 
>> > So let’s recap: In the last 45 years, Sasson took his first digital picture, digital cameras were brought into the mainstream and then embedded into another pivotal technology—the cellphone and then the smartphone—and people began taking photos with abandon, for any and every reason. And in the last 25 years, historians went from thinking that looking at a photograph within the past year was a significant marker of engagement with the past to themselves compiling gigabytes of archival images in pursuit of their research.
>> > So are those 1.4 trillion digital photographs that we’ll collectively take this year a part of history? I think it helps to consider how they fit into the overall historical narrative. A century ago, nobody, not even a science fiction writer, predicted that someone would take a photo of a parking lot to remember where they’d left their car. A century from now, who knows if people will still be doing the same thing. In that sense, even the most mundane digital photograph can serve as both a personal memory and a piece of the historical record.
>>  
>> Full story at 
>>  
>> https://spectrum.ieee.org/tech-history/silicon-revolution/how-the-digital-camera-transformed-our-concept-of-history
>>  
>> Cheers
>>  
>> Brian Randell
>>  
>> School of Computing, Newcastle University, 1 Science Square, Newcastle upon Tyne, NE4 5TG
>> EMAIL = Brian.Randell at ncl.ac.uk   PHONE = +44 191 208 7923
>> URL = http://www.ncl.ac.uk/computing/people/profile/brianrandell.html
>>  
>> 
>> [1] Commodore’s Amiga was well suited to video production, thanks to high resolution video modes that functioned well with inexpensive genlock and frame grabber hardware. Maher, The Future Was Here: The Commodore Amiga, ch. 5.
>> [2] On the history of spy satellites, see William E. Burrows, Deep Black: Space Espionage and National Security (New York: Random House, 1987).
>> [3] R. W. Smith and J. N. Tatarewicz, "Replacing a Technology: The Large Space Telescope and CCDs," Proceedings of the IEEE 73, no. 7 (July 1985): 1221-1235.
>> [4] Chris O'Falt, "Pixelvision: How a Failed '80s Fisher-Price Toy Became One of Auteurs' Favorite '90s Tools," IndieWire, 2018, https://www.indiewire.com/2018/08/pixelvision-pxl-2000-fisher-price-toy-experimental-film-camera-lincoln-center-series-1201991348/.
>> [5] Martin Hand, Ubiquitous Photography (Malden, MA: Polity Press, 2012).
> Deborah G. Douglas, PhD • Director of Collections and Curator of Science and Technology, MIT Museum; Research Associate, Program in Science, Technology, and Society • Room N51-209 • 265 Massachusetts Avenue • Cambridge, MA 02139-4307 • ddouglas at mit.edu • 617-253-1766 telephone • 617-253-8994 facsimile • http://mitmuseum.mit.edu • she/her/hers
> 
> 
> 
> 
> 
> _______________________________________________
> This email is relayed from members at sigcis.org, the email discussion list of SHOT SIGCIS. Opinions expressed here are those of the member posting and are not reviewed, edited, or endorsed by SIGCIS. The list archives are at http://lists.sigcis.org/pipermail/members-sigcis.org/ and you can change your subscription options at http://lists.sigcis.org/listinfo.cgi/members-sigcis.org
