
AI is the future of ChatOps—and the end of ChatOps as we know it

Abbas Haider Ali CTO, xMatters Inc.

At the peak of any tech cycle, one technology always seems to be the ultimate solution to a problem—which, of course, is never the case. The “next big thing” is always being cooked up somewhere to make the current solution look primitive. ChatOps is no exception.

With the explosion of tools, systems, and consoles intended to improve how we work, ChatOps was a necessary evolution to combat data overload and distill information down to the most relevant, useful components. Rather than forcing teams to manually juggle 20 different tools and systems, ChatOps provides a single, intuitive interface that integrates relevant information into the right channels. It becomes a critical link in a chain of tools, people, and processes that allows teams to get their jobs done more efficiently and effectively.
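To make that concrete, here is a minimal sketch of the pattern: one chat command fans out to several backend systems and returns a single combined reply. The tool names, command syntax, and responses are invented for illustration; a real deployment would call actual monitoring, ticketing, and deployment APIs.

```python
from typing import Callable, Dict

# Stand-ins for real integrations (monitoring, ticketing, deploy history).
# In practice these would be API clients, not hard-coded strings.
def check_monitoring(service: str) -> str:
    return f"{service}: 2 open alerts"

def check_tickets(service: str) -> str:
    return f"{service}: 1 unresolved incident"

def check_deploys(service: str) -> str:
    return f"{service}: last deploy 40 minutes ago"

INTEGRATIONS: Dict[str, Callable[[str], str]] = {
    "monitoring": check_monitoring,
    "tickets": check_tickets,
    "deploys": check_deploys,
}

def handle_command(message: str) -> str:
    """Turn a chat message like '/status payments' into one combined reply."""
    parts = message.split()
    if len(parts) != 2 or parts[0] != "/status":
        return "usage: /status <service>"
    service = parts[1]
    # One command, many systems, a single answer in the channel.
    return "\n".join(fn(service) for fn in INTEGRATIONS.values())

print(handle_command("/status payments"))
```

The value is not in any one integration but in the fan-out: responders never leave the channel where the conversation is already happening.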

While ChatOps is transforming the way we work, it’s still very much a wild and untamed beast. Implementations range from decent to disastrous, and without a definitive best practices toolkit, the vast majority of organizations must learn as they go.

With time and patience, however, we’ll refine our approach and develop a more standardized ChatOps model. We’ll gain new and wonderful capabilities and UI enhancements, courtesy of artificial intelligence (AI) and augmented reality (AR). This won’t be the death of ChatOps, just the end of ChatOps as we know it today.

How will it all go down? Based on our struggles now, here are my predictions.

1. Organizations will realize that the onboarding of new employees into an unstructured, disorganized chat world is a drag on velocity 

This is especially true for large enterprises, but smaller organizations will find the inefficiencies and bottlenecks equally intolerable. The new standard across the board, regardless of organization size, will be a refined, systematically organized chat model that immediately and contextually brings new users into the loop. While this is particularly critical at scale, smaller teams will also be following specific protocols to make onboarding as smooth and painless as possible.

2. As machine learning ops rises, people will combine the learnings and insights of a single organization’s application with those of peer organizations

The first phase of this trend will consist of massive analysis of past operational issues and service disruptions, and the actions that were taken to successfully diagnose and resolve them. These solutions will become invaluable to operations teams and developers, empowering them to execute pre-vetted resolution recommendations and creating feedback loops to strengthen machine learning ops (MLOps) signals.

As more applications and services start to self-heal, we’ll see the technical indicator of the end of current ChatOps. Its AIOps replacement will start small, but it will quickly figure out whom to engage for detecting, diagnosing, and resolving problems with minimal human intervention. It will follow an exponential intelligence trajectory similar to that of other machine learning-powered systems, which start off relatively dumb, get incrementally better, and then appear to suddenly become very good at what they do.

An example of how this might play out would be in the relationships between the different services that form an application. An untrained system could get a basic start from a CMDB or service directory; given the poor track record of those systems, however, it would frequently be incorrect in its understanding of the application. But with access to user transactions, application logs, network flows, and other data, the machine would get smart.
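One simple way a system could bootstrap a dependency map from observed traffic rather than trusting a stale CMDB is to count caller-callee pairs in transaction traces and keep only the edges seen repeatedly. The services, log records, and threshold below are invented for illustration:

```python
from collections import Counter
from typing import Dict, List, Set, Tuple

# Each record is a (caller, callee) pair observed in transaction traces.
observed_calls: List[Tuple[str, str]] = [
    ("web", "auth"), ("web", "catalog"), ("web", "auth"),
    ("catalog", "db"), ("web", "catalog"), ("catalog", "db"),
    ("web", "legacy-batch"),  # a one-off, likely noise
]

def infer_dependencies(
    calls: List[Tuple[str, str]], min_count: int = 2
) -> Dict[str, Set[str]]:
    """Keep edges seen at least min_count times; drop rare noise."""
    counts = Counter(calls)
    graph: Dict[str, Set[str]] = {}
    for (caller, callee), n in counts.items():
        if n >= min_count:
            graph.setdefault(caller, set()).add(callee)
    return graph

deps = infer_dependencies(observed_calls)
print(deps)  # web depends on auth and catalog; catalog depends on db
```

The same idea scales up: as more traces accumulate, the inferred graph converges on the real topology, correcting whatever the CMDB got wrong.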

The progress might be slow at first, but it would reach uncanny levels of accuracy very quickly. We’ve seen this at play with other systems we utilize every day, such as autocorrect on our mobile phones, Google Translate, Amazon recommendation engines, and more.

3. ChatOps will transform the user experience and interface for ops teams 

Machine learning and AR-powered ChatOps interfaces will enable faster user onboarding and more scalable teams, creating a superior experience to command-line interfaces—even those enhanced by helper bots.

Imagine how effective it will be when operations engineers can be visually interrupted for high-priority tasks, and can combine a visual interface with the power of a sophisticated natural language processing (NLP) system to take action. The killer combination of smart audio and visuals running in an AR interface will support richer, better forms of collaboration, and ultimately signal the demise of ChatOps as we know it.
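Even a toy version of that NLP layer shows the shape of the idea: map free-form chat phrases to operational actions. The keyword-scoring approach, intent names, and phrases below are all invented simplifications; a production system would use a real language model.

```python
from typing import Dict, Set

# Hypothetical intents an ops assistant might support.
INTENTS: Dict[str, Set[str]] = {
    "restart_service": {"restart", "reboot", "bounce"},
    "show_alerts": {"alerts", "alarms", "paging"},
    "run_diagnostics": {"diagnose", "diagnostics", "healthcheck"},
}

def match_intent(utterance: str) -> str:
    """Pick the intent sharing the most keywords with the utterance."""
    words = set(utterance.lower().split())
    best, best_score = "unknown", 0
    for intent, keywords in INTENTS.items():
        score = len(words & keywords)
        if score > best_score:
            best, best_score = intent, score
    return best

print(match_intent("please restart the checkout service"))  # restart_service
print(match_intent("any alerts firing right now"))          # show_alerts
```

Swap the keyword matcher for a capable NLP model and wire the intents to real runbooks, and the engineer can speak an action instead of typing a command.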

The next generation of ChatOps will deliver

As we get better at ChatOps, organizations will reach chat nirvana faster than ever. We will see all kinds of standard integrations: video and voice collaboration components, connectors to continuous integration and delivery tools (to drive richer data into operational processes), more intelligent helpers (to automate bringing users into the right channels), and reporting tools (to query data from many IT systems in the middle of a firefight). Data capture and analysis around chat activities will become the norm, allowing organizations to continuously improve the performance and results of operations teams.

In spite of all these improvements, however, the overarching principle of ChatOps will stand resolute: You should use chat to connect separate systems into a single console, and to expedite common workflows.

How would you like to see ChatOps evolve? Share your ChatOps wish list in the comments below.

