Deeper visibility into Deepfield

I just watched today’s SDxCentral Nokia Deepfield DemoFriday webinar, featuring Nokia GM and Deepfield architect Dr. Craig Labovitz, who described the product and demonstrated some of its features. Nokia acquired Deepfield earlier this year, and is now disclosing more information about Deepfield and how it fits into Nokia’s IP/optical networks portfolio, which Craig and others described at last month’s IP Networks Reimagined announcement (see my recent blog post).

I’ve been tracking Deepfield since I launched my ACG practice over a year ago, and had been briefed by the company prior to the acquisition, but as Craig acknowledged, Deepfield had been fairly secretive about the product and its technology. So it was good to finally see a demonstration of the actual product and hear Craig describe its capabilities in more detail.

Raising Deepfield’s profile is a good move: Nokia’s global footprint will enable it to sell the product well beyond North America, where Deepfield is deployed by many leading service providers (Deepfield also has customers in Europe).

The premise for Deepfield is straightforward:

  1. The Internet has become much more complicated in the last 10 years, with complex network topologies, particularly in the metro area, the deployment of CDNs, the explosion of streaming video, and the adoption of real-time voice and video communications. The big shift is from the Internet as a set of pipes for best-effort bit delivery to a reliable end-to-end transport mechanism for high-quality content and services with assured quality and performance.
  2. But what tools are available for service providers to deal with this shift? Deepfield recognized early on that advances in network instrumentation, streaming telemetry and Big Data analytics made it feasible to build a software-only platform for network visibility & analytics that was more powerful and yet more cost-effective than solutions employing DPI probes and monitoring appliances.

I would encourage those who are interested to watch a replay of the webinar, but here are some of the highlights:

  1. Deepfield uses “connectors” to implement southbound interfaces that collect data from a disparate array of sources, including many sources of telemetry data from the network itself, service provider data from OSS/BSS, customer care and billing systems, and data from Deepfield’s “Cloud Genome”, which maintains an up-to-date map of all of the content sources, services and devices on the Internet.
  2. Deepfield supports a petabyte-scale Big Data analytics engine for multi-dimensional, real-time analytics. Craig demonstrated how the system tracks network traffic by content source, website and application type, as well as by network or CDN, and generates intuitive visualizations of traffic load using built-in reports and in response to ad-hoc queries.
  3. Deepfield supports four main use cases: real-time QoE, network engineering, customer care and network security/DDoS. These are implemented as Deepfield applications that leverage a set of northbound interfaces from the core analytics engine. Craig also pointed out that these same interfaces feed actionable intelligence to external systems supporting these use cases.
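To make the idea of multi-dimensional, real-time traffic analytics concrete, here is a toy sketch of aggregating flow telemetry by content source and application. All field names and record shapes are hypothetical illustrations, not Deepfield’s actual data model or API:

```python
# Toy illustration of multi-dimensional traffic analytics:
# aggregate flow telemetry records along two dimensions
# (content source and application) and sum the byte counts.
# All names here are hypothetical, not Deepfield's API.
from collections import defaultdict

def aggregate_traffic(flows):
    """Sum bytes per (content_source, application) dimension pair."""
    totals = defaultdict(int)
    for flow in flows:
        key = (flow["content_source"], flow["application"])
        totals[key] += flow["bytes"]
    return dict(totals)

flows = [
    {"content_source": "cdn-a", "application": "video", "bytes": 1200},
    {"content_source": "cdn-a", "application": "video", "bytes": 800},
    {"content_source": "cdn-b", "application": "voice", "bytes": 300},
]
print(aggregate_traffic(flows))
# {('cdn-a', 'video'): 2000, ('cdn-b', 'voice'): 300}
```

A production system would of course run this kind of group-by continuously over petabyte-scale telemetry streams rather than in-memory lists, but the dimensional roll-up is the same idea.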

It was clear from Craig’s brief demo that Deepfield’s software is a powerful tool for service providers, enabling them to gain in-depth, multi-dimensional, real-time visibility into traffic flowing across their networks and the Internet. Without this level of visibility, network operators would be flying blind, making it difficult to monitor network performance and ensure digital QoE for content and service delivery.

The webinar was light on implementation details, but Craig did say that the software can run on a cluster of Linux servers in a customer’s data center or be offered as SaaS hosted in the Amazon cloud. Naturally, I’m keen to learn more about the full stack supporting real-time Big Data analytics and how the software is typically deployed operationally by service providers. However, it was good to gain deeper visibility into Deepfield, and I look forward to learning more.


Nokia couples cloud-scale network visibility & analytics with network automation

I attended Nokia’s IP Networks Reimagined event in June, where the company announced new 7750 SR-s IP core routers based on its new FP4 network processor chip, both impressive technical achievements in their own right.

However, what really got my attention is how Nokia is integrating the technology obtained via the Deepfield acquisition to directly couple cloud-scale network visibility with Big Data analytics for security, performance and network automation.

Deepfield’s petabyte-scale Big Data analytics engine provides visibility into tens of thousands of Internet applications and services as well as billions of IP addresses, mapping what it calls the Cloud Genome. The software is currently used by many leading service providers for DDoS protection and traffic engineering.

Nokia designed the FP4 chip to look anywhere inside packets and extract real-time flow telemetry. This telemetry, along with machine data and network state provided by Nokia’s SR OS router software, feeds the Deepfield analytics engine; the insights it derives determine the actions taken by Nokia’s NSP, an SDN-based network automation and management platform.
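The closed loop described above — telemetry in, insights out, actions applied — can be sketched in miniature. This is a purely hypothetical illustration with made-up thresholds, record shapes and action formats; it is not Nokia’s or Deepfield’s implementation:

```python
# Hypothetical sketch of the closed loop: telemetry -> analytics
# insight -> automation action. The threshold, record fields and
# "rate-limit" action below are illustrative assumptions only.

DDOS_THRESHOLD_BPS = 1_000_000_000  # assumed: flag > 1 Gb/s toward one target

def analyze(telemetry):
    """Flag destinations whose aggregate traffic rate exceeds the threshold."""
    rates = {}
    for sample in telemetry:
        rates[sample["dst"]] = rates.get(sample["dst"], 0) + sample["bps"]
    return [dst for dst, bps in rates.items() if bps > DDOS_THRESHOLD_BPS]

def act(suspects):
    """Translate insights into actions an SDN controller could apply."""
    return [{"action": "rate-limit", "target": dst} for dst in suspects]

telemetry = [
    {"dst": "203.0.113.7", "bps": 600_000_000},
    {"dst": "203.0.113.7", "bps": 700_000_000},
    {"dst": "198.51.100.9", "bps": 50_000_000},
]
print(act(analyze(telemetry)))
# [{'action': 'rate-limit', 'target': '203.0.113.7'}]
```

In Nokia’s architecture, the analysis role corresponds to the Deepfield engine and the action role to NSP; the sketch just shows why the coupling matters — insight without an enforcement path is only a dashboard.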

Using real-time network visibility & analytics to derive actionable intelligence that drives network automation is the industry’s “holy grail.” Nokia has articulated its vision for achieving this goal, and I’m keen to learn more about how these three pieces fit together.

For more information about Deepfield, be sure to tune into Nokia Deepfield DemoFriday at SDxCentral this Friday, July 14, where Deepfield architect and Nokia GM Dr. Craig Labovitz will demonstrate the product’s capabilities.