From the Branch to the Data Center, From the Cloud to the Fog: Questions you should ask about Fog Computing
Dr. Jim Metzler, principal – Ashton, Metzler & Associates
The last ten years have seen multiple trends around the centralization of IT resources. Back in the mid-2000s, many IT organizations began taking applications and storage out of branch offices and placing them in the company's centralized data centers. A few years later, many companies began to use applications and services from public cloud providers such as Salesforce.com and Rackspace.
The hottest trend in networking today is Software Defined Networking (SDN). One of the key premises of SDN is taking control information, which has historically been distributed in individual switches and routers, and centralizing it into the SDN controller.
So are we in a golden era of centralization? Maybe yes, but maybe no. I say maybe no because Cisco has recently introduced the concept of Fog Computing.
According to Cisco1, “Fog Computing is a paradigm that extends Cloud computing and services to the edge of the network. Similar to Cloud, Fog provides data, compute, storage, and application services to end users. The distinguishing Fog characteristics are its proximity to end users, its dense geographical distribution, and its support for mobility. Services are hosted at the network edge or even in end devices such as set-top boxes or access points.”
Whenever I evaluate the viability of any new technology or way of implementing technology, one of the first questions I ask is, “Does this solve a problem that people willingly spend money to solve?”
Cisco’s view is that Fog Computing is supposed to solve some of the problems associated with the Internet of Everything (IoE)2. We will soon have tens of billions of smart devices generating exabytes3 of actionable data, and it doesn’t make much sense to send all that data to a relatively small number of centralized locations in the cloud.
The idea is that it makes the most sense to process the growing volume of actionable data close to where it is generated. One use case Cisco cites is smart traffic lights, where a video camera can sense an emergency responder’s flashing lights and automatically turn the traffic lights green so the vehicle can pass through the intersection4.
No one sees Fog Computing as replacing Cloud Computing, but rather complementing it. For example, a Fog Computing platform could support real-time, actionable analytics and processes, and then filter the data—pushing to the Cloud only data that is global in geographical scope and time. Another option would be to push only management data to the cloud.
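To make the filtering idea concrete, here is a minimal sketch of that edge-side pattern: a fog node acts on raw sensor readings locally and in real time, and forwards only a compact summary to the cloud. Every name and threshold here is illustrative, not part of any Cisco product or API.

```python
# Hypothetical sketch of the fog-node filtering pattern: handle raw
# readings at the edge, push only an aggregate summary to the cloud.
# All names and values below are assumptions for illustration.

from statistics import mean

LOCAL_ALERT_THRESHOLD = 90.0  # assumed limit, e.g. degrees Celsius


def process_batch(readings):
    """Handle one batch of raw sensor readings at the network edge.

    Returns (local_alerts, cloud_payload): readings over the threshold
    trigger real-time local action, while the cloud receives only a
    three-field summary instead of every raw data point.
    """
    local_alerts = [r for r in readings if r > LOCAL_ALERT_THRESHOLD]
    cloud_payload = {
        "count": len(readings),
        "mean": mean(readings),
        "max": max(readings),
    }
    return local_alerts, cloud_payload


alerts, summary = process_batch([72.0, 95.5, 80.0, 91.2])
# Two readings exceed the threshold and are handled at the edge;
# the cloud sees a 3-field summary rather than 4 raw values.
```

The design choice mirrors the article's point: latency-sensitive decisions stay local, and only data with wider scope in space or time crosses the WAN.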
Another reasonable question is, “How close are we to Fog Computing going mainstream?”
At this point, the answer has to be not very close. Cisco is still investigating some key enabling concepts5 such as technologies that support workload mobility between Cloud and Fog platforms, based on policies and the capability of the underlying infrastructure. The company is also still investigating methodologies, models and algorithms to optimize the cost and performance through workload mobility between Fog and Cloud.
So, if we are not close to Fog Computing going mainstream, should we just ignore it entirely? My answer is a resounding no. IT is undergoing, and will continue to undergo, more fundamental changes than it has at any time in its history. In order to be successful in these dynamic times, IT organizations need to develop a plan for how IT will continue to support the business. That plan has to identify where IT will store and process data, how it will secure that data, the LAN and WAN architectures it will use, and how it will manage all of this. Given the volatility of the environment, the plan has to be regularly updated to include emerging trends such as Fog Computing.
- An exabyte is 10¹⁸ bytes