
were projects at Data Translation trying to grab NTSC color video, and the lesson from that was that ‘basic’ components to do this gave really lousy results.

       COLORSPACE

      While Data Translation continued its research, others sold video boards that could drive color monitors and be used as 'frame grabbers'. By plugging in a camcorder, users could 'grab' video frames to incorporate into a document or pass into an application like Aldus' PageMaker.

      Pixelogic had the ProViz digitizer, SuperMac the Spectrum/8 board and RasterOps the ColorBoard 104, but it was Mass MicroSystems that promised desktop video with its ColorSpace card. Mass Micro founder Thomas Massie declared:

       We want to bring video production to the desktop.

      Trade magazines used the words ‘video’ and ‘Mac’ in the same sentence, but the early marketing claims by video enablers were misleading.

       You can do things with this board and the Mac that you could only do before with $100,000 to $150,000 systems.

      The ColorSpace card claimed to ‘capture’ live video images at a rate of 30 frames per second in full color, when in reality the only live component was the user’s monitoring of a video source on an Apple monitor. To capture video, an editor had several options.

      It was possible to capture a single frame of video immediately in 256 shades of gray, or to digitise a single frame at VHS quality, which took two to three seconds. Per frame. At that rate, a single second of the promised 30-frames-per-second video took more than a minute to digitise.

      Teams from Boston to Toledo and Texas experimented with computers and video.

      A team at the Massachusetts Institute of Technology (MIT) underscored how far there was to go before genuine full-frame video editing arrived on every Mac. The ‘Movies of the Future’ team in the MIT Media Lab succeeded in playing compressed digital movies from a Macintosh's hard drive, even though the one-minute video clip used in demonstrations had taken a week to prepare on a DEC VAX minicomputer.

      Graduate students Bill Butera and Pat Romano developed algorithms to transmit data to a video board designed by John Watlington and built with the help of Apple engineer James Lundblad.

      To accommodate the size of the custom video board, the MIT team had cut a slot in the case of the Mac II.

      Another team was pushing the boundaries of video images on a VAX at the Medical College of Ohio.

      Joseph Klingler, Lee (Tom) Andrews, Clifton Vaughan, Bruce Behrendt, Konstantyn Szwajkun, Carlos Baptista, Jacob Zeiss, Richard Leighton and Bruce Weide were applying computer technology to the medical environment, using a wide variety of workstations: a DEC VAX, Sun workstations, a Pixar Image Computer and a VICOM image processor.

      Klingler recalls:

       We used quantitative analysis on medical images at a time when many imaging modalities were just being invented. Being a college, we had most of the machines available, such as CAT scanning (now CT), NMR (now MRI), phased-array (2D) ultrasound and digital radiography (including cine-angiography), as well as the support of a group of MDs who were very research-oriented.

       We wanted to ask the question: Can we develop computer algorithms that measure information that will help the doctors make a diagnosis?

      Klingler's group acquired a Macintosh II and Behrendt created an image processing application called Image Workbench to help with the study of cardiovascular imaging.

       Our system was based on HyperCard and a standard Macintosh II to integrate hypertext retrieval, computer graphics, sound, and medical images into a single interactive environment stored on a standard hard disk.

      The 'hypermedia' approach gave medical students direct, immediate, easy traversal of the images and related text, as well as the opportunity to move at their own pace, much like the requirements of video editors. Klingler and Andrews then formed the Image Analysis Research Center within the College to pursue the use of technology in medicine.

      In time, their focus became digital editing systems for editors, not doctors. In Texas, Rush Beesley was looking to solve the problems he had encountered with video and editing. Beesley had moved into a 10,000-square-foot facility, built a 24-track audio production room, a sound stage, and off-line and on-line editing rooms (IVC-9000 2” and Bosch-Fernseh 1” VTRs), and started Sundance Technology Group.

      Soon enough his workload caused a bottleneck.

       Our decision to create ‘yet another’ editing system was based on customer need. The need for a more efficient method of creating a relationship between video and descriptive information, i.e. creating, naming and describing ‘virtual clips’. I couldn’t read my own handwriting … and having to ‘bicycle’ and cue multiple tapes manually was a tremendous waste of time that should have been spent making edit decisions.

       The ‘Quicksilver System’ was created to enhance productivity by eliminating countless hours of labor-intensive ‘off-line’ editing time. This was accomplished by using a computer, our little Mac Classic, to control VCRs (later laser disc players) and locate specific scenes on the tapes/discs by keyword search. The never-credited ‘programmer’ was a field service rep for Data General who had worked on mainframe computers. He experimented with emerging small microprocessors and built his own frame buffer based on the IMSAI 8080 platform. From there he went to the Mac.

       Traditional off-line tape editing involved copying the master 1” acquisition ‘reels’ to U-Matic cassettes, then reviewing those cassettes one at a time. I had to write a time-code number and a description for each scene, ending up with dozens of pages of scene descriptions. To find a scene I wanted to use … for example, a particular sunrise for the opening shot … I had to pore through those pages, trying to read my barely legible handwriting in search of a sunrise shot.

       Then I had to locate that cassette in my stack of tapes, put it in an available player, and manually cue to that time code to review the scene. More often than not it was one of several sunrises … but not the one I wanted. With the Mac communicating with the VCRs via RS-232 and time code, the logging process was still ‘linear’ … but under computer control. Manipulating the VCR motion control from command keys and/or with the mouse, we marked IN and OUT points for each scene (automatically stamping the Reel name/number), and typed in the scene description.

       Once this computerized logging was complete, the 'Magic' happened. To find the sunrise shot I wanted to use for the opening of the show, I just entered ‘sunrise’ in the Search field and clicked FIND. Not unlike a Google search today, the screen then displayed a list of all scenes with the word ‘sunrise’ in the description.

       I could tell by my complete description which sunrise scene I wanted. I clicked “PREVIEW” in the control software, and the prompt indicated “Put Tape 7 in VCR 2”, automatically selecting an available VCR depending upon the hardware configuration.

       Once the tape was inserted, it automatically ‘fast-forwarded’ to the IN point … where I could either manually jog the machine to trim the IN and OUT and remark … or just hit “Preview” again to see the insert shot, selecting V, A1, A2 or any combination. The time saved using keywords to locate scenes, and the auto-cueing of scenes, was enormous.

       We reduced an off-line editing session for a typical 12 minute corporate video from days to hours.
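      The mechanics Beesley describes amount to a searchable log of ‘virtual clips’: each entry pairs a reel name and IN/OUT timecodes with a free-text description, and a keyword search replaces the handwritten pages. A minimal sketch in Python, with hypothetical Scene and find names (Quicksilver’s actual code was never published), might look like this:

       from dataclasses import dataclass

       @dataclass
       class Scene:
           reel: str          # tape/reel name, stamped automatically at logging time
           tc_in: str         # IN point, SMPTE timecode (hh:mm:ss:ff)
           tc_out: str        # OUT point
           description: str   # free-text note typed in during the logging pass

       # A log built during the ‘linear’ logging pass under computer control.
       log = [
           Scene("Tape 7", "00:04:12:00", "00:04:31:15", "sunrise over lake, wide shot"),
           Scene("Tape 2", "00:11:03:10", "00:11:20:00", "sunrise, city skyline"),
           Scene("Tape 7", "00:22:45:05", "00:23:02:00", "interview, plant manager"),
       ]

       def find(keyword):
           # Return every logged scene whose description mentions the keyword.
           return [s for s in log if keyword.lower() in s.description.lower()]

       for scene in find("sunrise"):
           # In the real system this prompt drove an RS-232-controlled VCR to auto-cue.
           print(f"Put {scene.reel} in an available VCR and cue to {scene.tc_in}")

      The search replaces dozens of pages of handwritten descriptions and the auto-cue replaces manual shuttling, which is where the days-to-hours saving came from.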

      In February 1988, Bill Warner and Jeff Bedell knew they had made enough progress with the editing system prototypes to get investor feedback. Warner invited Bill Kaiser to the former machine shop in Burlington:

       I have since seen a lot of start-ups, but Avid was my first and it was by far and away
