
4 things IT Ops teams need to know about data management

David Linthicum Chief Cloud Strategy Officer, Deloitte Consulting

Why is there so much confusion within IT Ops teams about data management and data operations (a.k.a. DataOps)? The answer isn't simple. No, database administrators aren't going to make this go away. Yes, it's your problem in IT Ops. You need to learn to operate multiple databases and other data storage systems. To that end, there are four things you need to know to make your life easier.

But first, let's set the table so you have a firm understanding of DataOps and its role in IT Ops.

What it is, why it matters

DataOps is typically a subordinate discipline of cloud operations (CloudOps), which may span both cloud and non-cloud systems. Organizations now also tightly couple DataOps with machine-learning operations (MLOps). And of course, data must be protected and governed, so security operations and governance operations (SecOps and GovOps, respectively) enter the picture as well. 

It's little wonder so many operations jobs chase so few candidates. Operations skills in general, and DataOps skills in particular, now top the priority lists of all Global 2000 CIOs. 

Core to current ops issues is every enterprise's intense focus on data growth. You don't need site surveys to see that data will continue to grow organically at an accelerating pace. Usage continues to expand as well, along with the deployment of key data analytics tools and AI that let data be leveraged for more strategic and tactical purposes. 

Data is growing in importance

Today, data defines a business. Enterprise data is no longer just a collection of information about customers and sales. It's the manipulation and key use of that data as a force multiplier that drives the entire business model. 

Look at the use of recommendation engines, which most major e-commerce businesses attach to their sites. These systems leverage customer and sales data, along with external data (such as demographics), to drive key recommendations that in turn drive sales. 

Based on how a user navigates a website, recommendation engines can accurately determine demographics such as gender, age, race, income, career, hobbies, military service, and more, even if the user doesn't log in or directly provide that information. 

The tracked data coupled with an AI engine can make uncannily accurate guesses and push certain products or services based upon the AI's determination of your most likely desires. Thus, sales increase. 

Of course, data can be used in ways other than recommendation engines; the list is extensive. The point is that businesses need to find the value in their data if they are to survive. The Global 2000 already understand this new fact of life. 

The DataOps hot potato 

Most IT Ops teams will tell you it's the responsibility of the database administrators to deal with DataOps. After all, in most enterprises, IT Ops teams tend to focus on infrastructure such as storage and compute. But where have all the DBAs gone, and who's minding DataOps? Sorry, ops teams, but the DataOps hot potato gets passed back to you. 

We've moved away from a simple on-site database management infrastructure with a few relational and some non-relational systems holding enterprise data. These days, organizations are using cloud-native databases that are part of a public cloud provider's service offerings. Or they use a widely distributed and complex array of databases that are leveraged for special purposes that support the business. 

There are purpose-built databases such as in-memory databases that support high performance and databases that run on devices that support edge computing. 

While the business needs this sort of complexity to more effectively leverage data, it's a lot for the operations team to deal with on an ongoing basis. This is an unexpected challenge for most in IT Ops that requires an expanded skill set. 

So, what do you need to know about data management when DataOps lands on your doorstep? How do you operate with multiple databases and other data storage systems?

The answers lie in the cloud. Here are the four major DataOps concepts every ops team should know to make their jobs easier.

1. You may not need to be an expert on a specific database to successfully operate it

Databases require access to the storage, compute, and memory they need to run effectively. You don't need the ability to design and deploy a database using a specific database technology, but you must understand what it needs for successful operations and scaling. 

Successful DataOps is not so much about knowing how the database functions as it is about knowing how the database finds its resources. 

Tradeoffs arise if you provide too many resources and thus waste money, or if you provide too few and the database crashes. This balancing act exists in the world of cloud as well as in on-premises databases. 

The good news is that you can have a database operations specialist who can operate as many as five different databases without having a specific database skill set. Indeed, most database providers have simple ops training for those who keep the databases running but who do not build and deploy solutions. 
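The resources-over-internals point can be made concrete. Below is a minimal Python sketch of a resource-headroom check an ops specialist could apply identically to any database engine; the `DbMetrics` structure and the 80/20 thresholds are illustrative assumptions, not any vendor's API.

```python
from dataclasses import dataclass

@dataclass
class DbMetrics:
    """Point-in-time resource usage for one database (illustrative fields)."""
    cpu_pct: float      # CPU utilization, 0-100
    memory_pct: float   # memory utilization, 0-100
    storage_pct: float  # storage utilization, 0-100

def resourcing_advice(m: DbMetrics,
                      high: float = 80.0,
                      low: float = 20.0) -> str:
    """Flag over- and under-provisioning with the same rule for every
    engine -- the check cares about resources, not SQL dialects."""
    hottest = max(m.cpu_pct, m.memory_pct, m.storage_pct)
    if hottest >= high:
        return "scale-up"    # risk of saturation and a crash
    if hottest <= low:
        return "scale-down"  # paying for idle capacity
    return "ok"

print(resourcing_advice(DbMetrics(cpu_pct=92, memory_pct=60, storage_pct=40)))
# -> scale-up
```

Only the metric collection differs between an on-premises relational database and a managed cloud service; the decision rule stays the same.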

2. You can no longer separate security and data  

When we dealt with traditional systems, we could wrap data with a security system that had no clue how the database operates, including how the database stored its data. That is no longer the case, and if it is the situation in your organization, you're likely to be breached very soon. 

Today's SecOps and DataOps are joined at the hip, and all databases, whether running on premises or in the cloud, need to support security services such as data encryption, key management, and the ability to work with credentialing systems such as identity and access management (IAM). 

Does this make the job of the ops teams more complex? Of course. Some good news: There are tools that can abstract away that complexity, including security tools often furnished by the database providers themselves. 

So the general operations (GenOps) pros need to understand DataOps, as well as how GenOps and DataOps relate to SecOps for data. Typically, this is accomplished by a data security operations specialist who provides SecOps for as many as 10 different databases, cloud and non-cloud. 
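One way a single specialist keeps ten heterogeneous databases honest is a uniform control checklist applied to each database's configuration. A Python sketch of the idea, assuming a simple dict-based config; the control names and the 90-day rotation policy are hypothetical, not drawn from any specific product:

```python
# Baseline controls every database must satisfy (illustrative policy).
REQUIRED_CONTROLS = {
    "encryption_at_rest": True,  # data encrypted on disk
    "tls_in_transit": True,      # connections encrypted
    "iam_auth": True,            # credentials via IAM, not static passwords
    "key_rotation_days": 90,     # managed keys rotated at least this often
}

def audit_security(config: dict) -> list[str]:
    """Return the list of control gaps for one database's config."""
    gaps = []
    for control, required in REQUIRED_CONTROLS.items():
        value = config.get(control)
        if isinstance(required, bool):
            if value is not True:
                gaps.append(f"{control} is not enabled")
        elif value is None or value > required:
            gaps.append(f"{control} exceeds the {required}-day policy")
    return gaps

# The same checklist runs against every database, cloud or on premises.
fleet = {
    "orders-db":      {"encryption_at_rest": True, "tls_in_transit": True,
                       "iam_auth": True, "key_rotation_days": 30},
    "sessions-cache": {"encryption_at_rest": False, "tls_in_transit": True,
                       "iam_auth": False, "key_rotation_days": 365},
}
for name, cfg in fleet.items():
    print(name, audit_security(cfg))
```

The payoff is that adding an eleventh database means writing one config entry, not learning an eleventh security model.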

3. DataOps can be automated  

One of the nice features of living in 2021 is that we have ops tools that have a mind of their own. These are tools that can monitor databases, including resource usage, security issues (such as resource saturation indicating an attack), and performance. 

The monitoring tools can fix issues as they emerge as well as "learn" to fix issues to provide better operations and automation going forward. 

The dilemma is that many of the people charged with DataOps may not understand that these tools have this capability. Or, in some cases, they own the tools but do not leverage them for DataOps. 

At the end of the day, infrastructure issues constitute about 90% of the problems that databases will encounter. Again, this is true both in the cloud and for on-premises systems. 

Most AIOps tools include this "mind of their own" capability. But AIOps tools' applicability toward the challenges presented by DataOps is often overlooked, or staff do not leverage AIOps tools to their fullest capabilities. My best advice: Automate all you can. Data management is no exception. 

4. Learn how to apply future-proof processes and tooling 

It's no secret that technology quickly changes. Those charged with ops responsibilities, strategically at least, need to design and build ops systems and processes that account for this constant and consistent change. 

The solution? When you build ops systems and processes, and when you choose tools, the idea is to keep volatility inside one domain. In practice, this means you select general-purpose tooling that can be applied to databases operating today, as well as databases you may leverage five years from now. 

It's tempting to pick whatever tool is native to a specific database; it's often the easiest to use with that database. It's typically a tool supported by the database, or perhaps by a public cloud provider where the database is a service. 

While this may save you some time up front, the number of tools you'll have lying around will quickly become unwieldy, and any changes to the mix of databases will make DataOps come down like a house of cards. 

A much better approach is to create common processes and select tooling that can function across any number of different databases—on premises and in the cloud—today or five years from today. If you can keep change inside a domain, the processes and tooling will be consistent even though database management changes will continue to accelerate. 
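In practice, keeping change inside a domain often looks like a thin adapter layer: the ops processes depend on one stable interface, and each database gets a small adapter behind it. A minimal Python sketch, with the adapter classes and method names invented for illustration:

```python
from abc import ABC, abstractmethod

class DatabaseOps(ABC):
    """The stable, general-purpose surface your ops processes depend on."""
    @abstractmethod
    def backup(self) -> str: ...
    @abstractmethod
    def health(self) -> str: ...

class PostgresOps(DatabaseOps):
    """Hypothetical adapter for an on-premises relational database."""
    def backup(self) -> str: return "pg_dump completed"
    def health(self) -> str: return "healthy"

class DynamoOps(DatabaseOps):
    """Hypothetical adapter for a cloud-native managed database."""
    def backup(self) -> str: return "on-demand backup requested"
    def health(self) -> str: return "healthy"

def nightly_run(fleet: list[DatabaseOps]) -> list[str]:
    # The process never changes when databases are added or swapped;
    # only a new adapter is written.
    return [db.backup() for db in fleet if db.health() == "healthy"]

print(nightly_run([PostgresOps(), DynamoOps()]))
```

Swapping a database five years from now means writing one new adapter, while the nightly process, its scheduling, and its monitoring stay untouched.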

Data management: The next ops frontier

This four-concepts approach is about adapting to a new responsibility that most ops teams now face: the need to deal with data.

With a bit of planning and understanding of what needs to get done, how, and why, transferring DataOps responsibility to GenOps should result in a happy transition. Fingers crossed.
