As we reach the 30th anniversary of the World Wide Web, a quiet revolution is taking place above our heads as the future of the internet expands out into space, not just up to the ISS but to the Moon and beyond, into what will become the interplanetary internet.
Now, some might ask why we need an internet in space when there is no one up there, but it's not just people this is intended for. Current communications with spacecraft that travel out into the solar system and beyond can be tortuously slow. Part of that is because radio waves are limited to the speed of light, and there is not much we can do about that, but the signals also become so weak that data rates slow to a crawl by the time they reach the outer planets, making 1990s dial-up speeds look positively fast.
New robotic probes and increasingly sophisticated spacecraft will need to relay large amounts of data back to Earth and to other spacecraft if they are working together within a space network.
At some point, whenever we get people to Mars, they are also going to need high-speed data between Earth and Mars, not just for spoken communications but the vast amounts of telemetric and video data that the mission will generate.
Even though we haven't got a base on the Moon yet, the plans for an internet covering the Earth, the Moon, the planets, and the space in between are already well under way. We have had the internet in space for a while now with a connection to the International Space Station, but for all the technology on board, the astronauts didn't have access to things like Twitter until 2010.
This may seem odd but comes down to poor connection speeds, with astronauts rating it as worse than dial-up: OK for email, sending pictures and calling home, but not much else.
The reason is that most internet traffic now travels around the world over fibre-optic cables, which allow huge amounts of data to be moved very quickly and efficiently. But the ISS is definitely not wired, and there isn't a planetary Wi-Fi network it could tap into either.
Over the last couple of years, though, SpaceX and other companies, including Facebook, have been taking what they believe are the first steps towards rectifying that situation by setting up a planet-wide Wi-Fi network.
SpaceX's plan is called "Starlink" and would use a constellation of over 4,500 small low-altitude satellites, which it hopes will be in place by 2024. Bear in mind that there are only around 1,500 active satellites in orbit now, and about 2,600 inactive ones, so this alone would roughly double the number of satellites, and that's before any others from the likes of Facebook if SpaceX's initial trials are successful.
Now, while this may be the start of the internet in space, extending it farther out is where a whole new set of problems starts.
One issue is the speed of light, whether as radio waves or laser light, and that's not something we normally worry about down here on Earth. At just shy of 300,000 km per second, a pulse of light can go around the Earth over 7 times in one second, which allows us to have real-time communications even with the other side of the world.
But as we get farther out into space, even the speed of light won't be fast enough to allow real-time communications much beyond the Moon.
At a distance of 384,000 km, there is a 2.5-second round-trip delay for a radio signal to reach the Moon and come back, and that's assuming zero processing latency. This makes real-time remote control from Earth very difficult, though not impossible, but get out to the distance of Pluto, like the New Horizons spacecraft, and that round-trip delay increases to around 9 hours.
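These delays fall straight out of distance divided by the speed of light. A quick sketch (the distances are rounded, and the Pluto figure uses the roughly 4.8 billion km range around the time of the New Horizons encounter):

```python
# Round-trip signal delay: time = 2 * distance / speed of light
C_KM_S = 299_792.458  # speed of light in km/s

def round_trip_delay_s(distance_km: float) -> float:
    """Return the round-trip light delay in seconds for a one-way distance."""
    return 2 * distance_km / C_KM_S

# Moon at ~384,000 km: about 2.6 seconds there and back
print(f"Moon:  {round_trip_delay_s(384_000):.2f} s")

# Pluto at roughly 4.8 billion km: about 9 hours there and back
print(f"Pluto: {round_trip_delay_s(4.8e9) / 3600:.1f} h")
```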
Whilst the speed of light limits our ability to have real-time communications, it's not the only problem. Because of the huge distances, the radio signals we send, even with highly directional antennas, will still spread out and become weaker the farther they travel. Radio signals follow the inverse-square law, so for each doubling of the distance you receive only one-fourth of the power. This is not so bad if you have a very large antenna or satellite dish like those here on Earth, but a spacecraft's antenna might be just a few metres across and able to capture only a very small fraction of the original signal. As the power of the signal falls, its signal-to-noise ratio worsens to the point where it becomes indistinguishable from the background noise and is lost.
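The inverse-square law is simple to put into numbers: received power scales with the square of the ratio of the distances.

```python
def relative_received_power(d1_km: float, d2_km: float) -> float:
    """Ratio of received power at distance d2 compared with distance d1,
    assuming the same transmitter (inverse-square law)."""
    return (d1_km / d2_km) ** 2

# Doubling the distance leaves one-fourth of the power
print(relative_received_power(1.0, 2.0))  # 0.25

# The same signal at Pluto (~7.5 billion km) vs Mars (~200 million km)
# arrives roughly 1,400 times weaker
print(1 / relative_received_power(200e6, 7.5e9))
```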
NASA can currently receive data from Mars, at an average distance of 200 million km from Earth, at around 1.5 Mb/s, but from Pluto, some 7.5 billion km away, the data rate is just 2 kb/s, about the same as a dial-up Usenet connection from the early 1980s, if anyone remembers those.
At the moment there are three space networks. The original, the Space Network, was set up in the '70s for communicating with satellites and spacecraft, and uses 10 geostationary satellites and two ground stations to provide 100% coverage around the Earth for craft like the ISS and the Hubble Space Telescope.
Then there is the Near Earth Network, with 17 ground-based stations, which handles communications and data for satellites in a variety of orbits up to and including the distance of the Moon.
The last and most sensitive telecommunications system is the Deep Space Network, which supports interplanetary missions and has three ground stations at roughly equidistant points around the Earth, so that no matter what the position of the Earth is, it still has a constant connection to any spacecraft out to the distance of the farthest man-made probe, Voyager 1, currently over 21 billion km away.
The trouble with all of these is that there is more demand than there is capacity. With new missions from SpaceX, NASA, ESA, India, and the UAE planned for the 2020s, as well as possible Mars missions, networks like the Deep Space Network will soon be reaching the limits of their relatively slow radio technology, even though they are constantly being upgraded.
This is where the first stages of the interplanetary internet come in. NASA is already testing a system called Delay/Disruption Tolerant Networking, or DTN, which is designed to overcome the problems of signal delays over vast distances, high levels of noise and solar interference, the limited energy resources of spacecraft, and even cyberattacks.
The idea behind DTN is that it uses an automatic "store and forward" data network. This is different from a normal internet connection, where there must be a continuous connection from the source to the destination for the packets of data to be passed from one node to another.
In a DTN, the only requirement is that the next node be available, and if it isn't, the data is stored until communications can be re-established, which could be seconds, minutes or hours later. Once all the data packets have reached the destination, they can be reassembled into their original form.
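The store-and-forward behaviour can be sketched in a few lines. This toy model is purely illustrative, not the real DTN Bundle Protocol; the node names and "bundle" strings are invented:

```python
from collections import deque

class DTNNode:
    """Toy delay-tolerant node: stores bundles until its next hop is reachable."""
    def __init__(self, name):
        self.name = name
        self.stored = deque()  # bundles waiting for the link to come up
        self.link_up = False   # is the next hop currently reachable?
        self.next_hop = None

    def receive(self, bundle):
        self.stored.append(bundle)
        self.forward()         # try to pass it on straight away

    def forward(self):
        # If the link is down, bundles simply wait in storage until
        # contact is re-established, seconds, minutes or hours later.
        while self.link_up and self.next_hop and self.stored:
            self.next_hop.receive(self.stored.popleft())

earth, relay, mars = DTNNode("Earth"), DTNNode("Relay"), DTNNode("Mars")
earth.next_hop, relay.next_hop = relay, mars

earth.link_up = True
earth.receive("bundle-1")     # reaches the relay, then waits there
print(len(relay.stored))      # 1 -- the relay-to-Mars link is still down

relay.link_up = True          # contact with Mars re-established later
relay.forward()
print(list(mars.stored))      # ['bundle-1'] delivered end to end
```

The key difference from normal internet routing is that no end-to-end path ever needs to exist at any single moment: each bundle simply waits at whichever node it has reached.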
NASA first tested a DTN in 2008 with the EPOXI spacecraft, then about 32 million km away, acting as a Mars data relay whilst EPOXI itself was on its way to encounter comet Hartley 2. This showed that spacecraft on other missions could be used to relay data instead of the normal direct communications method.
This could also speed up the data rates, because the distance from Earth to the first spacecraft node, and then from that node to the next and on to the destination, would be much less than a direct connection, enabling much higher data transmission rates between them.
So whereas a direct connection to Mars might allow up to 2 Mb/s, using a DTN through several nodes might allow 50 Mb/s from Earth to node 1, then node 1 to node 2, then node 2 to node 3, and so on to the destination. Each node would also boost the signal sent to the next one, improving the signal-to-noise ratio and minimizing errors and the need to resend data, greatly increasing the overall data throughput.
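Two effects make the relayed path faster: each shorter hop suffers far less inverse-square loss, and the end-to-end rate is then set by the slowest hop rather than by one enormously long link. A rough sketch, reusing the illustrative 50 Mb/s figure from above (these are not mission specs):

```python
def per_hop_power_gain(n_hops: int) -> int:
    """Splitting a direct link into n equal hops cuts each hop's distance
    by a factor of n, so per-hop received power improves by n**2
    (inverse-square law)."""
    return n_hops ** 2

def end_to_end_rate_mbps(link_rates_mbps):
    """With store-and-forward relaying, sustained throughput is limited
    by the slowest individual hop."""
    return min(link_rates_mbps)

# Splitting an Earth-Mars link across 5 hops gives each hop 25x the signal power
print(per_hop_power_gain(5))                   # 25

# Four 50 Mb/s hops still deliver 50 Mb/s end to end, vs 2 Mb/s direct
print(end_to_end_rate_mbps([50, 50, 50, 50]))  # 50
```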
Up to now, all of this has relied on radio.
But the future of deep-space communications looks like it will use lasers. NASA has been experimenting with lasers to transmit data between Earth and the ISS with a system called OPALS, the Optical Payload for Lasercomm Science.
Using lasers is the same method we currently use to transmit data around the internet on Earth, down fibre-optic cables. The laser light is pulsed on and off at very high frequencies in the same way, but in the OPALS system it is fired up through the atmosphere to an optical receiver on the ISS.
Because of the much shorter wavelength of light compared with a microwave radio link, the OPALS system can deliver data rates between 10 and 1,000 times those of current radio systems. Whilst lasers have much narrower divergence over the same distance than radio, and much lower error rates, one of the biggest problems is getting the beam through the Earth's turbulent atmosphere.
NASA is working with Boeing on adaptive optics, which compensate for the atmospheric distortion in real time using high-speed cameras and mirrors.
In one test between the ISS and Earth, an HD version of the Apollo 11 moon landing was beamed up by radio and then back again by laser. It took 12 hours to transfer the video up to the ISS using the current radio uplink system, but just 7 seconds to send it back to Earth with the OPALS laser system.
For long-range transmissions, lasers could also work much better than radio. In 2013 NASA conducted the Lunar Laser Communication Demonstration (LLCD), beaming a laser to a satellite in orbit around the Moon and back to Earth. The test broke all previous transmission records, with a download speed to Earth of 622 Mb/s and an upload speed of 19.44 Mb/s, some 4,800 times faster than the best radio uplink previously used.
In 2019, a new set of tests using a near-infrared laser will help improve data handling, encoding methods and tracking.
Once in space, there is no atmospheric distortion to contend with, but the very narrow spread of the beam means that targeting a spacecraft millions or billions of km away with a laser will have to be very accurate at both ends for it to work.
A radio signal sent from Mars using the Ka frequency band will have diverged to a width greater than the diameter of the Earth by the time it arrives. A laser, on the other hand, will have diverged to just a few hundred km across. This also means that less power is required to send the signal, as it isn't spreading over such a large area, and the equipment is smaller and lighter than the equivalent radio systems.
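Those footprint figures follow from diffraction: a beam leaving an aperture of diameter D spreads by roughly λ/D radians. A back-of-the-envelope check (the aperture sizes here are assumptions for illustration, not the actual flight hardware):

```python
def footprint_km(wavelength_m: float, aperture_m: float, distance_km: float) -> float:
    """Approximate beam footprint diameter at a given range, using the
    diffraction-limited divergence angle theta ~ wavelength / aperture."""
    theta_rad = wavelength_m / aperture_m
    return theta_rad * distance_km

MARS_KM = 200e6  # the average Earth-Mars distance used earlier

# Ka-band radio (~9.4 mm wavelength) from an assumed 3 m dish:
# far wider than Earth's ~12,742 km diameter
print(f"radio: {footprint_km(9.4e-3, 3.0, MARS_KM):,.0f} km")

# Near-infrared laser (1064 nm) from an assumed 30 cm telescope:
# just a few hundred km across
print(f"laser: {footprint_km(1.064e-6, 0.3, MARS_KM):,.0f} km")
```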
Future systems combining lasers with DTN networking should allow data transmission rates, even out at the edge of the solar system, to be several orders of magnitude greater than they are now, and for the first Mars colonists, whilst they will still have a delay time of 20 minutes or so on average, at least they'll have the bandwidth to stay connected to home.
If you want to be the first to connect up to the interplanetary internet, our sponsor for this video, Hover, can be the perfect place to get the right domain, like interplanetary.tech or firstinspace.com. Hover specializes in domains and has best-in-class customer support, complete with all the features you will need to set you up for planetary exploration, from finding your perfect domain name to setting up your email, so you don't need to use those Gmail addresses, and hosting your own website. If that isn't enough, you can also get 10% off your first purchase at www.hover.com/CuriousDroid using our code: CuriousDroid.