
Three cheers for the cloud. With apparently infinite elasticity in both compute power and storage, not only is it a smart move for many organizations to use it, but it is also a great place to keep Big Data and analyze it till the cows come home. Well, two cheers for the cloud, really. The limiting factor is not the space or the processing muscle. It’s the data highways for moving all that data around. Where computer memory and processing speed increase by a factor of 10, network connections struggle to improve by a factor of 2. There is both a figurative and a literal disconnect.
The Backhaul Problem Will Get Worse
Moving data around uses links between nodes that are often collectively referred to as backhaul. These links also carry the data being generated by all the ‘things’ attached to the Internet – the ‘Internet of Things’ or IoT for short. That means all the smart cars, coffee-machines, house security systems and other artefacts that are now being equipped with a chip for generating information and communicating with the rest of the world. The data generated must then enter the Internet, be routed through it and delivered to the designated end-system for crunching and action.
Don’t Connect Your Jet Engine Before You Listen to THIS Announcement
While there are many, many coffee-machines in the world, they may not generate that much data individually. “Out of sugar in T-27 hours” might be about the size of it. Cars might generate rather more. Often in daily use, there is constantly evolving mechanical and geographical data to be had, not to mention the possibility of information about driving behavior. But jet engines, according to network vendor Cisco, put most other ‘things’ to shame by generating up to 10 terabytes of data in as little as 30 minutes. That kind of volume can put a serious strain on many network links. However, Cisco (as you might have guessed) has a solution to help make the Internet of Things compatible with the cloud.
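To see why that figure strains network links, a quick back-of-envelope calculation (using decimal terabytes, an assumption on my part since Cisco’s figure doesn’t specify) turns 10 terabytes in 30 minutes into a sustained bandwidth requirement:

```python
# Back-of-envelope: sustained bandwidth needed to move 10 TB in 30 minutes.
terabytes = 10
bytes_total = terabytes * 10**12   # assuming decimal terabytes
bits_total = bytes_total * 8
seconds = 30 * 60
gbps = bits_total / seconds / 10**9
print(f"{gbps:.1f} Gbit/s sustained")  # roughly 44.4 Gbit/s
```

That is well beyond what a typical backhaul link can dedicate to a single data source, which is the heart of the problem.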
Cloud Up There, Fog Down Here
Instead of sending all the data up into the cloud and clogging up the network, Cisco’s suggestion is to use network edge routers that can tame Big Data close to where it is being generated. The company has coined the name Fog Computing for this local deployment of compute plus networking power. In practical terms, the Internetworking Operating System (IOS, not to be confused with Apple’s mobile OS, iOS) will be combined with Linux-based processing capability to produce a new architecture called IOx.
Handling the 99.999% of the Time Your Jet Engine is OK
If a jet engine (for instance) is busily generating messages confirming that all operating parameters are within bounds and OK, it may not be productive to send all that information to the cloud. A specialized IoT router with onboard computing capability – like the ones Cisco has announced – can verify locally that everything is OK and only send messages up to the cloud in the 0.001% of the time that there is exceptional information to report. So Cisco has a new networking market to play in (it’s already the dominant vendor in many others), and backhaul network links stay less clogged because the data is handled in the fog. Cautious optimism suggests that with this approach we could even be giving those full three cheers for the cloud again.
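The filter-at-the-edge idea can be sketched in a few lines. This is a minimal illustration, not Cisco’s IOx API: the parameter names, bounds, and `send_to_cloud` callback are all hypothetical.

```python
# A minimal sketch of edge-side filtering. Parameter names and bounds are
# illustrative, not taken from any real engine telemetry specification.
BOUNDS = {
    "oil_temp_c": (40.0, 120.0),
    "rpm": (0, 12000),
    "vibration_mm_s": (0.0, 4.5),
}

def anomalies(reading):
    """Return only the parameters that fall outside their allowed bounds."""
    return {
        name: value
        for name, value in reading.items()
        if name in BOUNDS and not (BOUNDS[name][0] <= value <= BOUNDS[name][1])
    }

def handle(reading, send_to_cloud):
    """Forward a reading upstream only when something is out of bounds."""
    out_of_bounds = anomalies(reading)
    if out_of_bounds:
        send_to_cloud(out_of_bounds)  # the exceptional 0.001%
    # Otherwise: drop (or aggregate) locally at the edge router,
    # keeping the backhaul link clear.
```

The design choice is the point: the vast majority of “all OK” readings never leave the edge, so backhaul traffic scales with the exception rate rather than the raw data rate.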