Alright, apart from the sanitation, the medicine, education, wine, public order, irrigation, roads, the fresh water and public health, what have the Romans ever done for us?
Much like that Monty Python scene from Life of Brian, some might well ask, "What has NASA ever done for us?"
NASA is often derided for spending money on things that don't seem to have any purpose other than an academic understanding of the universe. We humans can't yet travel to our nearest planets, let alone the nearest stars, and to many people this looks like big boys' toys with no direct benefit unless you're a billionaire on a joy ride, even though NASA doesn't offer space tourism.
But the technology developed to do these seemingly far-off things in the harsh environment of space has in many cases been put to far more earthly purposes, and now appears in many of the things we use on a daily basis.
Although the list of NASA developments runs into the thousands, and 1 in every 1,000 US patents is granted to someone working at NASA, I thought I'd look at just one of the spin-offs that have made it big here on Earth: the digital camera, now found in billions of devices.
Although many of the things used by NASA are made and developed by outside contractors, sometimes the pure research that NASA does goes the other way and becomes incredibly successful products. One of these was the digital camera and the sensors used to capture images.
The first digital camera was made by Eastman Kodak in 1975, but the concept was first proposed by NASA engineer Eugene F. Lally in the 1960s.
Lally had been a keen photographer since the age of 14, and had published a solution in a photography magazine for reducing the red eye caused by photographic strobe lighting, a fix that is still used today.
During his tenure at NASA he wrote several papers about spacecraft design and methods to explore the Moon, Mars, Mercury and Jupiter. But one of his best-known ideas was the use of a mosaic array of optical sensors that would sense and digitize light signals to produce an electronic image. This could be fed into a computer that would track stars and planets during missions, much as sailors in the old days navigated by the stars using a sextant.
All modern spacecraft now use this type of star tracker as part of their navigation, but back then it was the first time optical sensors had been proposed to construct a digital image, and Lally is acknowledged to have coined the phrase "digital photography" whilst working at JPL. Even the word pixel, which is short for picture element, was first published by NASA JPL engineer Frederic Billingsley in 1965.
Although the first digital camera was built by Kodak, cameras using solid-state sensors had been around since the early 1970s.
The basis for these camera sensors was our old friend the MOSFET, but in this case in the form of Metal Oxide Semiconductor, or MOS, capacitors. When light fell onto them, the photons were converted into an electrical charge which was stored in the capacitor.
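To picture how such a pixel works, here's a toy model in Python. It's only a sketch: the quantum efficiency and full-well figures are invented for illustration, not real device parameters. Incoming photons free electrons with some probability, and the charge accumulates on the capacitor until it is full.

```python
# Toy model of a MOS capacitor pixel during an exposure. Each photon that
# hits the pixel has some probability (the quantum efficiency) of freeing
# an electron, and the freed charge accumulates until the capacitor's
# "well" is full. All numbers below are illustrative assumptions.

ELECTRON_CHARGE = 1.602e-19   # coulombs
QUANTUM_EFFICIENCY = 0.6      # fraction of photons converted (assumed)
FULL_WELL = 20_000            # electrons the capacitor can hold (assumed)

def expose(photons):
    """Return the charge (in coulombs) stored after `photons` arrive."""
    electrons = min(int(photons * QUANTUM_EFFICIENCY), FULL_WELL)
    return electrons * ELECTRON_CHARGE

print(expose(10_000))   # a mid-level exposure
print(expose(100_000))  # overexposed: the pixel clips at full well
```

Note the clipping in the second call: once the well is full, extra light is simply lost, which is why overexposed highlights in a digital photo come out as flat white.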
It was realised at the same time that it was relatively simple to build MOSFETs with capacitors in rows and clock them in such a way that they would pass the charge from one to another: basically an analogue shift register, or bucket brigade, where water is moved by passing it from one bucket to the next.
Soon, large arrays of sensors could capture an image, with the level of light falling on individual pixels converted into a charge, in a similar way to film cameras, which use grains of chemicals to record light.
To read the charge from each sensor, the analogue shift register technique was used to transfer the pixel charges one at a time from one MOSFET to another.
Once the charges were transferred to the edge of the array, they were shuffled down one at a time and read by an external analogue-to-digital converter. This created a digital numeric version of the stored charge, representing the amount of light that fell on a particular pixel, and so the digital image could be built up bit by bit.
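The readout described above can be sketched as a small simulation, assuming a simple linear ADC; this is a hypothetical model of the process, not real CCD circuitry. Charges shift row by row into a serial register, then one at a time into a single converter that turns each charge into a digital number.

```python
# Sketch of CCD-style readout: each pixel holds an analogue charge; the
# charges are shifted bucket-brigade style to the edge of the array and
# digitized one at a time by a single shared ADC. Illustrative only.

def adc(charge, full_scale=1.0, bits=8):
    """Quantize an analogue charge into a digital number (0..2**bits - 1)."""
    level = max(0.0, min(charge / full_scale, 1.0))  # clamp to the ADC's range
    return round(level * (2**bits - 1))

def ccd_readout(charges, bits=8):
    """Read a 2D array of charges the way a CCD does: row by row into a
    serial register, then pixel by pixel through one ADC."""
    rows = [row[:] for row in charges]  # copy: reading a CCD destroys the charges
    image = []
    while rows:
        serial_register = rows.pop()    # bottom row shifts into the serial register
        line = []
        while serial_register:
            # shift the register sideways, one charge reaching the ADC per clock
            line.append(adc(serial_register.pop(), bits=bits))
        line.reverse()
        image.append(line)
    image.reverse()
    return image

# A tiny 2x3 "exposure": brighter pixels hold more charge.
charges = [[0.0,  0.5,  1.0],
           [0.25, 0.75, 1.5]]   # the last pixel saturates the ADC
print(ccd_readout(charges))     # -> [[0, 128, 255], [64, 191, 255]]
```

Every pixel funnels through that single `adc` call, which is exactly why real CCDs were slow for fast-moving images: the whole frame has to queue up for one converter.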
These devices were called CCDs, or Charge Coupled Devices, and became the new standard for electronic cameras. Because each of the sensors could be larger and generated less noise, they were ideal for space and scientific purposes, as well as for high-end broadcast TV as a replacement for vacuum tube cameras.
But these were still analogue devices, and all the shuffling of electrical charges and the extra circuitry required to convert the result to digital information made them unsuitable for fast-moving images and relatively power hungry. Also, because they moved tiny charges from one MOSFET to another, they were susceptible to particle radiation, especially in space, which would introduce noise.
By 1973, Lally's paper about the optical mosaic had been picked up by Bell Labs and Fairchild Semiconductor, who wanted him to build a digital camera based on his mosaic techniques. But he was too busy at JPL to take up the offer and suggested they contact Kodak, which they did.
The first true digital camera, built by Kodak in 1975, was made up of 10,000 pixels using a Fairchild sensor and took 23 seconds to take one picture. But of course this was the first attempt, and things soon became more refined as Kodak and other companies developed the technology.
By the 1980s, active pixel sensors had been created by Olympus in Japan. These used a couple of MOSFETs to do away with the capacitor, amplifying the charge from the sensor and converting it into a voltage.
In the 1990s, NASA was looking for sensors that would be smaller, lighter and use much less power, for spacecraft on decade-long missions with limited power supplies.
The JPL team, led by Eric Fossum, based their work on the Japanese active pixel sensor technology. This, along with other on-chip functions, allowed much higher frame rates and in the process created a complete miniature imaging system with low power demands.
The NASA team worked with complementary metal-oxide semiconductor (CMOS) technology rather than the PMOS that Olympus used. This allowed image sensors to be manufactured more cheaply, and other functions could be fabricated directly on the chip, radically reducing not only the size of the cameras but also their power consumption, to less than 1/100th of that of a CCD sensor, for interplanetary spacecraft, yet maintaining scientific image quality. They could also be radiation hardened for use in deep space.
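To see why per-pixel conversion matters, here's a hypothetical sketch of an active-pixel readout, with an invented capacitance and amplifier gain. Each pixel turns its own charge into a voltage on the spot, so any window of pixels can be addressed directly instead of shifting fragile charges across the whole array.

```python
# Sketch of a CMOS active-pixel readout. Unlike a CCD, each pixel
# converts its own charge to a voltage via an in-pixel amplifier, and
# pixels can be addressed directly by row and column. The capacitance
# and gain values below are assumptions for illustration only.

PIXEL_CAPACITANCE = 1.0e-15   # farads (illustrative)
AMP_GAIN = 0.8                # in-pixel source-follower gain (illustrative)

def pixel_voltage(charge_coulombs):
    """In-pixel conversion: V = Q / C, buffered by the amplifier."""
    return (charge_coulombs / PIXEL_CAPACITANCE) * AMP_GAIN

def read_window(charges, row0, col0, rows, cols):
    """Read just a sub-window of the sensor -- the kind of random access
    that lets CMOS sensors hit high frame rates on a small region."""
    return [[pixel_voltage(charges[r][c]) for c in range(col0, col0 + cols)]
            for r in range(row0, row0 + rows)]

charges = [[0.0,     0.5e-15],
           [1.0e-15, 1.5e-15]]
print(read_window(charges, 1, 0, 1, 2))  # read only the bottom row
```

Because nothing has to be clocked across neighbouring pixels, a damaged or radiation-struck pixel corrupts only itself, and skipping straight to a region of interest costs nothing, both key advantages for spacecraft cameras.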
Frustrated by the slow adoption of the technology, and realizing that it could have benefits beyond a few spacecraft, Fossum, his wife and three other JPL engineers set up a company called Photobit to exploit the discoveries, securing an exclusive license to do so from JPL.
The company went on to create the first megapixel sensors capable of over 500 frames per second, which were used in the movie industry on films such as Star Wars Episode II and The Mummy, as well as in micro CMOS cameras like the swallowable PillCam, medical imaging, weapons imaging and biomechanics, and soon in consumer digital cameras, webcams and mobile phones. The development of the JPEG and MPEG standards soon created huge growth for this digital camera technology.
Today Photobit, through a series of mergers and acquisitions, is part of ON Semiconductor, which has supplied over 2 billion sensors and regularly ships over a million sensors per day, for everything from mobile phones to automotive applications, PillCams and many specialist optical solutions. But all the basic groundwork was done inside NASA to try and make better cameras for space.
So if you have any suggestions for NASA inventions that have made a big impact on the world, let me know in the comments. It just remains for me to say I hope you enjoyed the video, and if you did then please thumbs up, subscribe and share, and don't forget that Patreon supporters get ad-free versions of the videos before they are released on YouTube.