Miniaturization - The Biggest Little Thing to Happen in the Last 60 Years

In Technology, Videos by Paul Shillito

You’ve heard the saying “big is beautiful”, but in the modern world of digital electronics and communications, small is far more important.

It is amazing how we take the modern world for granted: we expect our internet to work at full speed, our smartphones to give us on-demand music and video in high resolution and to let us play massively multiplayer online games in real time with people we’ve never met from all over the world, and still to fit into our pockets.

Our world has changed dramatically from the one I knew as a teenager. In fact, I, like many others of my age (I’m 62), belong to the last generation to grow up before the personal computer revolution: no Internet, no mobile phones, just three channels on the TV, no cable television and no computer games.

And yet I got my first home computer in 1978, when I was 16, just a few years after modern microprocessors like the 6502, Z80, 6800 and 8080 had gone into mass production. From that point onwards the modern world we know today had started, and things would never be the same again.

But to get where we are today there had to be a major change in our technology, a point where things really started to get small, without which, technologically speaking, we would still be where we were in the 1960s.

If we look at the 20th century, the triode vacuum tube, or thermionic valve, was invented in 1906. It was similar to a light bulb in that it was a glass tube with the air removed to create a vacuum, but as well as a cathode with its heater filament, it contained a control grid and an anode.

If you applied a voltage to the grid, a proportional amount of current would flow between the cathode and the anode, creating an amplifier. If the grid is driven hard, the tube acts like a switch.

And if you have enough switches connected in the correct way, you can perform binary calculations, and you have the basis of digital computing.
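To make that idea concrete, here is a minimal sketch in Python that models each tube (or transistor) driven as a switch as a NAND gate. NAND is functionally complete, so any binary logic, including addition, can be wired up from it; the function names are purely illustrative.

```python
def nand(a: int, b: int) -> int:
    """One switch-like element: output is 0 only when both inputs are 1."""
    return 0 if (a and b) else 1

def xor(a: int, b: int) -> int:
    """Exclusive OR built from four NAND gates."""
    n = nand(a, b)
    return nand(nand(a, n), nand(b, n))

def and_(a: int, b: int) -> int:
    """AND built from two NAND gates (NAND followed by inversion)."""
    return nand(nand(a, b), nand(a, b))

def half_adder(a: int, b: int) -> tuple[int, int]:
    """Add two bits: returns (sum bit, carry bit)."""
    return xor(a, b), and_(a, b)

for a in (0, 1):
    for b in (0, 1):
        print(a, "+", b, "=", half_adder(a, b))  # 1 + 1 = (0, 1): sum 0, carry 1
```

Chain half-adders into full adders and you can add numbers of any width, which is exactly what the earliest tube-based machines did with thousands of such switches.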

World War 2 was the catalyst that drove the development of electronics and computing. One development in particular was the proximity-fused shell. This was like a normal shell fired from an artillery gun, but it had an electronic fuse that sent out a radio signal and then detected the reflection of that signal off a nearby object. The closer the object, the stronger the reflected signal, and at a preset distance the shell would detonate.

This was used to great effect by the United States Navy against Japanese aircraft in the Pacific, as well as in Europe to detonate shells above the battlefield as anti-personnel devices, and such fuses continue to be used today.

Fitting the electronics into a shell required the miniaturisation of the vacuum tubes, and also making them resistant to the G-forces involved as the shell accelerated when it was fired.

But tubes had a limit to how small they could get, and they were still delicate objects that were quite power-hungry.

And this is how things carried on up until the 1960s: circuits had become more sophisticated, but the components that built them were limited by their physical size.

If you look in the back of an old-style television set from the 1960s, you’ll see the components that made it up; an average TV would contain between 15 and maybe 30 vacuum tubes.

The first electronic programmable general-purpose computer, the ENIAC, was built in 1945 at the University of Pennsylvania to calculate artillery firing tables for the United States Army, and later to study the feasibility of the hydrogen bomb.

By the end of its operation in 1956 it contained 18,000 vacuum tubes, 7,200 diodes, 1,500 relays, 70,000 resistors, 10,000 capacitors and about 5 million hand-soldered joints.

It weighed approximately 30 tonnes, stood 8 feet (2.4 metres) tall, 3 feet (1 metre) deep and 100 feet (30 metres) long, and consumed 150 kilowatts of power.

Its processing power was equivalent to about 500 FLOPS, or floating-point operations per second, and it was a thousand times faster than other electromechanical computing devices.

But to put that into perspective, an iPhone 16 Pro Max has about 2,227 GFLOPS of processing power, which is roughly 4.5 billion times the power of the ENIAC.

One of the biggest problems with the ENIAC was that, with so many vacuum tubes and their reliability issues, technicians spent a considerable amount of time fixing them. The tube heaters failed mostly during the warm-up and cool-down periods, though this was greatly reduced with special high-reliability tubes after 1948. The longest continuous period of operation without a failure was 116 hours, or about 5 days.

However, all of this would change with the development of the transistor in 1947 as a solid-state replacement for the vacuum tube, and it was the way the transistor was fabricated that allowed miniaturisation to take off at an exponential rate.

The transistor performed the same function as a triode valve but on a piece of silicon just a few millimetres across. This huge reduction in size and weight meant it was much more robust and used much less power, which was of particular interest to the fledgling space industry, and NASA in particular.

They needed to put increasingly sophisticated spacecraft into orbit and on the moon, and that needed electronic control and computing that would not have been available before.

The Apollo moon missions would require computers in the Saturn V rocket, the command and service module and the lunar lander to perform the calculations needed to control the craft on the way to the moon, landing on the surface and returning to Earth.

Although much of the heavy computational lifting was done on mainframe computers back at NASA, with the results sent up to the spacecraft, the craft still required on-board computers to carry out most of the instructions.

Although the transistor was a huge leap forward by itself, an even bigger leap was the discovery that you could fabricate multiple transistors on the same piece of silicon and dramatically reduce the size of each one, and this would lead to the first integrated circuits.

Now the CPUs of the digital computers that NASA would use for the moon missions could fit into something the size of a small briefcase instead of filling a whole room.

The Apollo Guidance Computer, or AGC, was installed in each command module and lunar module to provide computation and electronic interfaces for guidance, navigation and control of the spacecraft. Although it was the first computer to use integrated circuits, it was a highly optimised design, and its performance would match that of computers such as the 6502-based Apple II and Commodore PET and the Z80-based TRS-80, which would come along in the late 1970s, almost 10 years later.

The AGC had processing power equal to about 14,245 FLOPS, almost 29 times the power of the ENIAC, in the size of a shoebox.

The use of integrated circuits in the AGC kickstarted the microelectronics industry and many of the early pioneers went on to form companies like Intel, Texas Instruments and Fairchild.

In fact, one of the founders of Intel, Gordon Moore, noticed that the number of transistors you could fit on a chip doubled about every 18 months; the more transistors you could fit on a chip, the more operations it could perform, and as the distance between them decreased, the faster it could run.

This became known as Moore’s Law, not a real law, just an observation, and it remained pretty accurate up to the 2000s, when progress slowed as we neared the limits of just how small physical transistors can be made.
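The observation can be sketched as a simple exponential model. The starting point below (the roughly 2,300-transistor Intel 4004 of 1971) and the 18-month doubling period are illustrative assumptions; real progress varied from product to product and, as noted, has slowed since the 2000s.

```python
def transistors(year: int, start_year: int = 1971, start_count: int = 2300,
                doubling_months: int = 18) -> float:
    """Moore's-law-style estimate: count doubles every `doubling_months`."""
    periods = (year - start_year) * 12 / doubling_months
    return start_count * 2 ** periods

# Doubling every 18 months compounds quickly:
for year in (1971, 1980, 1990, 2000):
    print(year, f"{transistors(year):.3g}")
```

Three years, for instance, is two doubling periods, so the model quadruples the count, which is why even modest-sounding doubling times produce the billions-of-transistors chips we have today.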

This ongoing miniaturisation over the last 60 years has changed the world in which we live.

The first transistors were several millimetres in size; the smallest field-effect transistors, or FETs, in the most powerful CPUs and GPUs today are smaller than 20 nanometres. Over the last 40 years this has led to an exponential increase in computing power, far more than could ever have been imagined back when the AGC was built.

With this, we gradually shifted from the analogue electronics of the 50s, 60s and 70s, dedicated circuits doing one job that could not be changed, to the digital electronics of the 80s and beyond, using general-purpose CPUs that could be programmed to do whatever job was required. Things like the digital watch, the CD player and games consoles were some of the first mass-produced digital systems.

When we converted analogue signals like music and images into digital information, CPUs could perform mathematical operations on the data to change it in the same way an analogue circuit would have done before, such as applying filters to images or effects to audio, and we can now simulate analogue processes so well that the results are often better than the originals.
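As a toy illustration of that idea, here is a digital stand-in for an analogue low-pass (smoothing) filter: a simple moving average applied to a list of samples. It is a minimal sketch, not how production audio software works, but it shows filtering reduced to plain arithmetic on digitised samples.

```python
def moving_average(samples: list[float], window: int = 3) -> list[float]:
    """Smooth a sampled signal by averaging each sample with its
    predecessors, the digital analogue of an RC low-pass filter."""
    out = []
    for i in range(len(samples)):
        lo = max(0, i - window + 1)      # shrink the window at the start
        chunk = samples[lo:i + 1]
        out.append(sum(chunk) / len(chunk))
    return out

# A rapidly alternating (noisy) signal is flattened toward its average:
noisy = [0.0, 1.0, 0.0, 1.0, 0.0, 1.0]
print(moving_average(noisy))
```

The same principle, with far more sophisticated mathematics, is behind everything from audio equalisers to image blurring.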

Soon, converting anything and everything into ones and zeros became the norm, from music to film to the control systems for everything from your central heating to spacecraft.

The reduction in the size of computer components and their increasing power meant that warehouses full of servers became the new digital factories, processing and storing all the information we generate, held and processed by tech giants such as Google, YouTube, Facebook, Amazon, Apple and Microsoft.

And now a new force is emerging, that of artificial intelligence, using not the CPUs we have in our normal computers but GPUs, the graphics processing units we use for making and watching videos or playing games.

Their unique architecture and highly parallel processing mean that the NVIDIA RTX 4090 graphics card in my computer is over 40 times more powerful than the top-of-the-range Intel i9-14900 CPU, and it is these hugely powerful processors, with some of the highest-density chips built on so-called 7-nanometre technology, that are powering the AI revolution, with hundreds of thousands of them running ChatGPT and many other AI applications.

There is a whole new universe opening up under our noses based on transistors, and most of us know virtually nothing of it; and even if we do reach the minimum size of what is possible, we will still build them in their trillions of trillions.

So thanks for watching, and if you enjoyed the video please thumb it up, share and subscribe.

Paul Shillito
Creator and presenter of Curious Droid Youtube channel and website www.curious-droid.com.