Lost in the fog

Is fog computing just another marketing buzz phrase or a legitimate trend for the channel to follow?

Whatever next? First we had cloud computing, which was quite fuzzy enough for most people, but of late vendors have begun talking about something that Cisco has been pleased to call "fog computing". The resultant lack of visibility is clear, in a manner of speaking.

In most realms of endeavour it's important to be able to see where you are, and the direction of travel. That is not generally the case in a fog. And if you cannot describe exactly what is going on - if "fog computing" isn't really a thing - does the concept have value beyond an attempt at repositioning a traditional networking hardware vendor?

Yet the ambiguity implied by the "fog" concept may itself drive useful change for businesses and the IT channel that serves them.

As far as we can tell, "fog computing" means localising some functionality and resources that people have spent years working out how to put into the cloud.

The cloud was everywhere but nowhere; the trouble with this, as many resellers and tech providers have found, is that the cloud itself becomes the bottleneck, and often there simply isn't the bandwidth to keep the performance levels as high as many business customers have come to expect.

So Cisco's "fog" dilutes the concept of cloud and its real-world representation somewhat: let's have a nearby fog, not just a far-away cloud. Let's put some devices, some compute resources, some storage and so on at client level, as in the past.

This might help solve the problems posed by an ever-increasing number and type of devices that are on the network - even if it seems like we're turning back the technology clock.

Jason Dance (pictured, left), managing director of datacentre and virtualisation specialist distributor Big Technology, confirms it is about moving the processing power closer to the user.

"We've seen centralised computing, and 'department' computing; we just called them different things in the 1980s and 1990s. The main difference we have now, I think, is that if you go back to the 1980s and 1990s, most of the data was created by human beings," Dance says.

The most helpful way to look at the fog concept is to examine specific use cases, he continues. In certain cases, it may well be best to move towards a localised or partially localised resource - and in other circumstances, it might all be best uploaded to a web-scale cloud platform.

"Take, for example, a jet engine: in action you might want to monitor and act on data quickly. The second part of that from a manufacturing perspective is you want to take data from lots of jet engines to assess risk," Dance says.

"Fog might be useful when you want to get information back as quickly as possible and a lot of processing power is required."

Mobile healthcare applications could benefit, as well as sensor-rich environments where small time lapses can ratchet up cost - such as the oil and gas industry.
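The edge-first pattern Dance describes can be sketched in a few lines of code: raw sensor readings are acted on locally for low-latency alerts, while only a compact summary is shipped upstream for fleet-wide risk analysis. The names, threshold and data below are illustrative assumptions, not any vendor's API.

```python
# Hypothetical sketch of fog-style processing: act on data locally,
# upload only a compact summary for fleet-wide analytics.
from statistics import mean

ALERT_THRESHOLD_C = 900  # illustrative turbine-temperature limit


def process_locally(readings):
    """Return (local_alerts, cloud_summary) for one batch of sensor data."""
    # Readings above the threshold are handled immediately at the edge.
    alerts = [r for r in readings if r > ALERT_THRESHOLD_C]
    # The summary sent to the cloud is far smaller than the raw stream.
    summary = {
        "count": len(readings),
        "mean": round(mean(readings), 1),
        "max": max(readings),
        "alerts": len(alerts),
    }
    return alerts, summary


alerts, summary = process_locally([840, 870, 910, 860])
print(alerts)   # handled at the edge, low latency
print(summary)  # uploaded to the cloud for cross-fleet analysis
```

The point of the split is exactly the one Dance makes: the time-critical decision never leaves the site, while the manufacturer still receives enough aggregated data to assess risk across many engines.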

Do we need a new term for that? Probably not - but the fog concept is certainly still a useful way to refer to a real-world trend and take that message to customers, perhaps around a diverse, converged solution set. The term "cloud" itself means different things to different people - but that doesn't stop it being useful.

One component in fog could be 'smarter' routers with more application-level functionality - so long as the security is fit for purpose.

Richard Roberts (pictured, right), managing director of the UK partner and commercial sales organisation at Cisco, agrees with much of Dance's perspective: "It's fog because it's closer to the ground, closer to the point where the data is created. It's a logical extension of cloud, and pushing value through the cloud."

When it comes to product, Cisco's 819 router range, which harnesses IOx, provides an application-enabling environment, and Roberts says more fog-related products are in the pipeline and will be available to channel partners. "There are some smart grid products as well," he confirms.

"It's an opportunity and a challenge for channel partners, around the ability to understand the core business applications with the customer and leverage the value of that locally, and maybe some requirements for very low or predictable latency," he adds.

Part of the partner challenge will concern development skills - not least because a lot of the opportunities may centre on specific vertical markets, such as the aforementioned healthcare and energy sectors. Some of these sectors have been using other ways of achieving the same results - but fog represents a great chance for them to "embrace the power of IP", says Roberts.

Few IT resellers have the right application development skills in-house, and so working more closely with ISVs is an area where Cisco is intensifying its focus in relation to edge computing. It plans to assist other partners to gain those skills as well, he notes.

Everything old is new again
What would Roberts say to those who argue that "fog computing" is just the latest marketing buzz phrase? "I would say that very little is genuinely new in IT. It is very cyclical, which is apparent if you've been around as long as I have," Roberts says.

"And although it is similar, there is real value in it; there is both a technical and physical reason to do it, and a logical reason - because of some of the costs around linkages for cloud upload."

Cisco's own paper, Fog Computing, Ecosystem, Architecture and Applications, states simply that fog computing extends cloud and its related services to the network edge. "The distinguishing fog characteristics are its proximity to end users, its dense geographical distribution, and its support for mobility. Services are hosted at the network edge or even [in] end devices such as set-top boxes or access points," the paper states.

This can slash latency and boost quality of service - essential as the Internet of Things (IoT) expands. Industrial automation, transportation, and any sector where webs of sensors or actuators are required may benefit. Fog is well positioned for real-time big data and real-time analytics.

"Fog devices are geographically distributed over heterogeneous platforms, spanning multiple management domains. Cisco is interested in [receiving] innovative proposals that facilitate service mobility across platforms, and technologies that preserve end-user and content security and privacy across domains," Cisco writes.

"To accommodate this heterogeneity, fog services are abstracted inside a container for ease of orchestration. Example container technologies are Java Virtual Machine, and Linux containers."
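The container abstraction Cisco describes boils down to giving the orchestrator one uniform contract, whatever actually hosts the service. As a purely illustrative sketch (this is not Cisco's API, and the class and command names are invented), a minimal version of that idea looks like this:

```python
# Illustrative sketch: a uniform service contract lets an orchestrator
# treat heterogeneous fog hosts (a JVM, a Linux container, ...) identically.
from abc import ABC, abstractmethod


class FogService(ABC):
    """Common contract every hosted fog service must satisfy."""

    @abstractmethod
    def start_command(self) -> str:
        ...


class JvmService(FogService):
    def __init__(self, jar: str):
        self.jar = jar

    def start_command(self) -> str:
        return f"java -jar {self.jar}"


class LinuxContainerService(FogService):
    def __init__(self, name: str):
        self.name = name

    def start_command(self) -> str:
        return f"lxc-start -n {self.name}"


# The orchestrator never needs to know which platform hosts each service.
fleet = [JvmService("analytics.jar"), LinuxContainerService("edge-cache")]
for svc in fleet:
    print(svc.start_command())
```

The design point is the one in the quote: heterogeneity is pushed behind the abstraction, so orchestration and service mobility can be written once against the contract.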

The vendor earlier this year unveiled IOx, a mash-up of Linux and its own IOS operating system specifically designed to run self-contained applications on networked devices.

"Think about the idea that every single bit of data [has] to be backhauled to a cloud-based application so it can be analysed," Cisco's senior director of IoT products and solutions marketing, Roberto De La Mora, blogs in a post introducing IOx.

"We are going to run into the 'data gravity' issue pretty fast. You can put all your data somewhere, but as it grows in size it becomes very expensive to move it around.

"It's becoming very clear that the IoT requires a different computing model, one that enables distributed processing of data with the level of resilience, scale, speed, and mobility that is required."
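De La Mora's "data gravity" point is easy to see with back-of-envelope arithmetic. All the figures below are illustrative assumptions, not measured values, but they show how quickly backhauling every raw reading dwarfs the cost of shipping edge-computed summaries:

```python
# Back-of-envelope illustration of "data gravity".
# All figures are illustrative assumptions, not measured values.
SENSORS = 10_000          # devices on one industrial site
SAMPLES_PER_SEC = 10      # readings per second per sensor
SAMPLE_BYTES = 64         # size of one raw reading
SUMMARY_BYTES = 1_024     # one per-sensor summary per minute
SECONDS_PER_DAY = 86_400

# Option 1: backhaul every raw reading to the cloud.
raw_per_day = SENSORS * SAMPLES_PER_SEC * SAMPLE_BYTES * SECONDS_PER_DAY

# Option 2: process at the edge, upload one summary per sensor per minute.
summary_per_day = SENSORS * SUMMARY_BYTES * (SECONDS_PER_DAY // 60)

print(f"raw backhaul:     {raw_per_day / 1e9:,.1f} GB/day")
print(f"edge summaries:   {summary_per_day / 1e9:,.1f} GB/day")
print(f"reduction factor: {raw_per_day / summary_per_day:,.1f}x")
```

Under these assumptions the raw stream runs to roughly 553 GB a day against about 15 GB of summaries - a reduction of around 37x, before any link costs are counted.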

Fog devices can push data to the cloud where required, or to other places on the network. In this sense, it really isn't anything new.

Adam Davison (pictured, left), director of VAD Cloud Distribution, agrees that fog computing has essentially been around for a while. However, while "fog" is a fundamentally useful concept, he says care should be taken not to rename phenomena for the sake of it, as that can simply add to market confusion.

"I guess you could say I am sitting on the fence when it comes to that naming convention," he confirms. "I think we have to be careful not to add confusion just by renaming every objective possibility of a solution that's out in the market. You could almost just call it 'edge cloud'."

"Fog" is still cloud, he points out, and cloud computing is itself still a concept with which much of the market hasn't yet got to grips.

"In terms of solutions, though, I absolutely agree with its usefulness. We have a number of products in our portfolio that leverage this type of solution - and we have done for some time," Davison says.

He cites Meraki, which is managed in the cloud and has access points, firewalls, switches and the like at the edge. Then there's Talon, whose optimised cloud services are delivered to the edge, with software sitting in the cloud and in the branch office to enhance collaboration - providing a good option for enjoying local-speed access to files in the cloud.

"Fog doesn't have to be hardware - it can be a software client," Davison says.

Clouding the issue
There's WatchDox, which is in the cloud with edge clients, and Lastline's zero-day advanced persistent threat defences with compute-in-the-cloud receivers at the edge. Exinda Networks puts appliances at the perimeter supporting virtual appliances in the cloud.

"People are often doing it (fog) without realising that's what they're doing. In the Meraki or Aerohive example, customers buy these solutions because they need controllers that facilitate managed access points and they're simple to manage," Davison explains.

"And a reason they're simple to manage is they're in the cloud where you can configure hundreds from anywhere."

Customers don't often come and say 'I want a cloud solution' - it's simply that cloud offerings these days tend to deliver the capabilities they want in the most efficient way currently available. That may mean a range of savings beyond the financial, although the opex versus capex argument, for example, remains top of mind.

As Cloud Distribution has evolved its portfolio, that is the way it has tended to move. "And it just so happens that most of them have something in the cloud and something at the edge," Davison says.

However, some commentators - such as those at managed services provider Azzurri Communications - have continued to complain that "fog computing" as a phrase has little solid content or meaning. Spokespeople were not available for interview at press time, but the firm sent CRN several quick comments through its PR company.

Steve Palmer, product marketing director at Azzurri, writes simply: "Fog computing is nothing more than marketing spin, and it is our job to point this out to our customers."

Rufus Grig, chief technology officer at Azzurri, says "marketing fog" might be a better definition. "When I first saw this term [fog computing], I checked the date to see if it was 1 April," he says. After all, cloud companies of all types are already doing similar things.

"Any sane IT leader is going to be adopting a balanced approach to cloud - with some services in the public cloud, some in its own datacentres, some in hybrid architectures, with some good old-fashioned kit on-site and in people's pockets.

"This is just, well, fog! Move along please; nothing to see here," Grig says.

David Groves (pictured, right), director of product management at Azzurri, adds: "Fog computing is basically a collection of nascent technologies that will improve the performance of the cloud but which are inevitable and don't require a new buzzword."

Marketers believe that thinking differently about a phenomenon may give a tired concept a new lease of life. Fog computing is not the first reinvention of a concept to suit an updated audience - and it won't be the last.

It's likely that we merely await the arrival of a wide range of cloudy weather-related IT terms: "storm computing", "hail computing", "snow computing", or perhaps even "rainbow computing". What might they turn out to be? Definitions on the back of a postcard, please.