Instana Blog

Date: December 30, 2019

The Data Fog of Observability

Category: Engineering

We’re coming to the end of 2019, a year in which we have seen Observability, as a malleable term or concept, gain widespread exposure and attention from software vendors and engineering communities. So I thought it might be an appropriate time to paint a picture of the current and changing landscape going into the new year.

The continued growth in complexity and the increasing rate of change have many SRE and DevOps teams turning to Observability for an improved understanding of a system (of services) landscape that is evolving from rugged to dancing.

How effective the adoption of Observability will be, based as it is on yesteryear instrumentation and measurement approaches such as tracing, metrics, and logging, is debatable. It depends on many factors, including where precisely an organization is starting from; as the saying goes, in the land of the blind, the one-eyed person is king or queen.

The question we need to ask of Observability, as currently defined and employed, is whether, once we have moved on from this initial frame of reference, it can scale up beyond the low-hanging fruit now visible and being picked off.

Exploration and Exploitation

Speaking as someone tasked with solving the hard problems in the effective and efficient monitoring and management of highly interconnected and interdependent systems, I consider the overemphasis on data, as opposed to signals and states, to be the formation of a great fog. This data fog is very likely to result in many individuals in organizations losing their way and overindulging in the exploration of data as opposed to the exploitation of acquired knowledge and understanding. This has come about while the community remains somewhat unconcerned with a steering process such as monitoring (or cybernetics). The current inhabitation of the middle ground is not sustainable going forward.

Within the thickening data fog, many will lose sight of the big picture, the changing landscape of system state, and be unable to pick up the signals needed to orientate. Not everything done with data is useful, particularly when performed aimlessly. Any insights gleaned in trekking through a thick fog of data are likely only to reveal more about the forestry than the various local maxima in the landscape or the underlying tectonic and fundamental forces of system dynamics.

Cloud computing has allowed us to collect and process massive amounts of sensory data. Still, it comes at a cost when it overloads human cognitive capacities, reduces the visibility of what is significant and should be attended to, and makes orientation nearly impossible. More than ever, there is a need for expert guidance and adaptive tooling.

Abstraction and Simulation

In 2020 there will be two broad movements to break away from the data fog in which many organizations have found themselves floundering: abstraction and simulation. Ever-increasing levels and layers of abstraction will be employed to rise above the fog and more accurately assess the situation and state of play at scale. Abstraction in the form of data reduction and new higher-order model representations will better assist teams in identifying when to continue to exploit (move and expand rapidly) or to explore (slow down and consolidate understanding).
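To make data reduction slightly more concrete, the following is a minimal, hypothetical sketch in Python (not Instana's implementation): raw measurement samples are collapsed into a small set of qualitative states, and a signal is emitted only when the inferred state changes. The threshold values, names, and windowing are illustrative assumptions.

    from enum import Enum
    from statistics import mean

    class State(Enum):
        OK = "ok"
        WARNING = "warning"
        CRITICAL = "critical"

    # Assumed thresholds, for illustration only; a real system would configure or learn these.
    def reduce_to_state(latencies_ms, warn_ms=250.0, crit_ms=500.0):
        """Collapse a window of raw latency samples into one qualitative state."""
        avg = mean(latencies_ms)
        if avg >= crit_ms:
            return State.CRITICAL
        if avg >= warn_ms:
            return State.WARNING
        return State.OK

    def state_changes(windows):
        """Emit a signal only when the inferred state changes between windows."""
        previous = None
        for window in windows:
            current = reduce_to_state(window)
            if current != previous:
                yield current
            previous = current

    # Four windows of raw samples reduce to just three signals worth attending to.
    windows = [[120, 130], [300, 310], [290, 305], [600, 620]]
    print([s.name for s in state_changes(windows)])  # ['OK', 'WARNING', 'CRITICAL']

The point of the sketch is the compression: thousands of data points become a handful of state transitions, which is what orientation above the fog actually requires.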

Orientation (of the situation) gained from abstraction will bring focus and frame effort whenever an engineering team needs to dive deep into the fog. Explorative mission briefings to the “unknown unknown” quadrant will primarily consist of intelligence based on communicated signals and inferred states, along with maps of signposts. The effectiveness and efficiency of tooling like Instana will be judged on the precise targeting of incursions into the fog, the degree of learning acquired along the way, the cost and time spent in doing so, as well as the intelligence gathered and relayed to other teams moving elsewhere across the landscape, always guided and assisted.

While abstraction looks to laws, formulae, and models to offer operational teams and their activities a more effective bird's-eye orientation, simulation attempts to recollect and reconstruct the foundational and functional fabric of the reality of systems that are expected to be under some degree of change and control.

Simulation exists beneath the data fog, with data pushed into the background and used solely to power the play-back or play-forward of execution, to be experienced and introspected immersively. Within the fog, data consisting of traces, metrics, and logs clouds the vision of the reality under execution. The fog offers up hints of software execution behavior, but the behavior is never genuinely experienced to the extent that it can be natively mapped to code, constructs, and context. Within the fog, engineering teams are always observing data, never actual execution.

Simulation, on the other hand, deals exclusively in episodic memories of the past or the projection of future potentials. While simulation is far more expansive and expressive, it is still very much simplified in the few fundamental and primitive elements of concern it contains. Teams will employ simulation for training purposes as well as for the tuning of tooling and data gathering. Perception in difficult conditions must follow and rely on experience.
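To ground the contrast between observing data and replaying execution, here is a toy sketch of the record-and-replay mechanics that simulation implies, again in Python. The Episode fields and Recorder interface are hypothetical and illustrative, not a description of any product's implementation.

    from dataclasses import dataclass
    from typing import Callable, List

    @dataclass
    class Episode:
        timestamp: float  # seconds since the start of the recording
        subject: str      # e.g., a service or thread name
        action: str       # e.g., "begin", "call", "end"
        target: str       # e.g., the method or service invoked

    class Recorder:
        """Captures episodic memories of execution for later play-back."""

        def __init__(self):
            self.episodes: List[Episode] = []

        def record(self, episode: Episode):
            self.episodes.append(episode)

        def play_back(self, observer: Callable[[Episode], None]):
            """Replay episodes in order so an observer can re-experience the execution."""
            for episode in sorted(self.episodes, key=lambda e: e.timestamp):
                observer(episode)

    recorder = Recorder()
    recorder.record(Episode(0.0, "checkout", "begin", "placeOrder"))
    recorder.record(Episode(0.5, "checkout", "call", "paymentService"))
    recorder.play_back(lambda e: print(f"{e.timestamp:>5.2f}s {e.subject} {e.action} {e.target}"))

Here the unit of analysis is an episode of execution rather than a detached data point, which is what allows behavior to be mapped back to code, constructs, and context.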

