

Use containers, machine learning to deploy portable, smart apps

David Linthicum, Chief Cloud Strategy Officer, Deloitte Consulting

Machine learning is the new artificial intelligence (AI). Most developers don't yet understand what it is, but use cases are beginning to emerge. Meanwhile, containers offer a new way to build and deploy portable cloud applications, as well as a new way to deploy applications that incorporate machine learning. Here's how these systems work, what machine learning can do for your applications, and how to create portable machine learning applications that deploy within containers.

AI is nothing new. People have been using these systems since the 1980s. While AI encompasses a broad category of "thinking systems," machine learning is a subcategory that focuses on understanding data with algorithms, and it has more practical applications. Machine learning applications take a data-oriented approach: they marry our ability to store petabytes of data in cloud-based and traditional systems with the ability to leverage that data for all kinds of new purposes.

Machine learning: What's in it for you

Machine learning is a form of artificial intelligence that uses algorithms to learn from data. These systems build models from incoming transactional data, apply algorithms to find patterns in that data, and make predictions. The predictions created by these "thinking systems" can be as simple as providing a recommendation to a shopper on an e-commerce site or as complex as determining whether or not an automobile model should be retired.
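To make the build-a-model, find-patterns, make-predictions loop concrete, here is a minimal sketch of a shopper recommender using k-nearest neighbors in pure Python. All of the shopper data, features, and labels below are invented purely for illustration; a production system would use a real machine learning library and far more data.

```python
from collections import Counter
from math import dist

# Toy purchase histories: feature vectors are (avg. order value, visits/month),
# labels are the product category each shopper bought most. Made-up data.
training = [
    ((120.0, 2), "electronics"),
    ((110.0, 3), "electronics"),
    ((25.0, 8),  "groceries"),
    ((30.0, 10), "groceries"),
    ((60.0, 5),  "apparel"),
]

def recommend(shopper, k=3):
    """Predict a category for a new shopper via k-nearest neighbors:
    the 'model' is the stored data plus a distance-based vote."""
    neighbors = sorted(training, key=lambda row: dist(row[0], shopper))[:k]
    votes = Counter(label for _, label in neighbors)
    return votes.most_common(1)[0][0]

print(recommend((115.0, 2)))  # → electronics
```

Even this toy version shows the essential shape: historical data stands in for a trained model, and a new data point is scored against the patterns in that history.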

Developers are also applying machine learning to make predictions about wearables and the potential impact on the health care industry. Consumers now wear watches and other monitors that spin off megabytes of data each day, tracking such things as steps taken, calories burned, and heart rate. Data from these devices is typically collected on cloud-based storage systems, and consumers access their web-based user account to view simple daily reports.

Now, add machine learning to the mix. These monitoring devices can analyze health data and determine what it means for the user. The system can also review the logical meaning of the data and gain knowledge over time, based upon learned outcomes and the data patterns that lead to those outcomes.

For instance, if someone has a heart attack, a machine learning system can study the pattern of health data that led up to the event and learn to watch for that pattern in other patients. As other events occur, it can identify new patterns that help spot and prevent future negative health events. Machine learning systems may see patterns that humans have yet to identify.

The data on its own has little value. You need to apply a machine learning system and have access to plenty of historical data: the gigabytes and petabytes of data that provide patterns and outcomes. In this way, your machine learning system can gain the knowledge of thousands of physicians by culling through historical data to determine what patterns lead to what outcomes. If implemented effectively, this technology can save lives.
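As a sketch of what culling through historical data to learn which patterns lead to which outcomes can look like, here is a tiny logistic regression trained with plain gradient descent. The health records, features, and scaling below are entirely made up for illustration; a real clinical model would involve far more data, features, and validation.

```python
import math

# Hypothetical historical records: (resting heart rate, hours of sleep),
# labeled 1 if the person later had a cardiac event, 0 if not. Invented data.
history = [
    ((95, 4.0), 1), ((102, 5.0), 1), ((98, 4.5), 1), ((105, 3.5), 1),
    ((62, 8.0), 0), ((70, 7.5), 0), ((66, 7.0), 0), ((58, 8.5), 0),
]

def scale(hr, sleep):
    """Crude feature scaling so gradient descent behaves."""
    return [(hr - 80) / 20.0, (sleep - 6) / 2.0]

def train(data, lr=0.01, epochs=2000):
    """Fit a logistic-regression model with plain gradient descent.
    'Learning' here is nudging weights to fit past outcomes."""
    w, b = [0.0, 0.0], 0.0
    for _ in range(epochs):
        for (hr, sleep), label in data:
            x = scale(hr, sleep)
            p = 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))
            err = p - label
            w[0] -= lr * err * x[0]
            w[1] -= lr * err * x[1]
            b -= lr * err
    return w, b

def risk(model, hr, sleep):
    """Score a new patient against the learned pattern (0.0 to 1.0)."""
    w, b = model
    x = scale(hr, sleep)
    return 1.0 / (1.0 + math.exp(-(w[0] * x[0] + w[1] * x[1] + b)))

model = train(history)
print(round(risk(model, 100, 4.0), 2))  # a high-risk-looking profile
```

The point is not the algorithm but the pipeline: historical outcomes go in, a model of the pattern comes out, and new data is scored against it.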

That's just one business case. Machine learning systems can consider many past and real-time data points to determine the most efficient inventory levels to maintain or make sales predictions that look at past, present, and predictive patterns of data.

As with their AI forebears, machine learning systems carry huge overhead. But today you have the option to run these systems in the cloud. Amazon Web Services, for example, supports machine learning with algorithms that read native AWS data, such as data in the Relational Database Service, the Redshift data warehouse service, and the Simple Storage Service. Google has supported predictive analytics for some time with its Google Prediction API, and Microsoft also provides an Azure machine learning service.

Most machine learning technologies, however, aren't cloud-based. You can find many open source offerings, as well as commercial versions of machine learning software. But keep in mind that machine learning technology is relatively new, and features and functions vary a great deal.

Using containers for machine learning

By combining the advanced technology of machine learning systems with the deployment capabilities of containers, you can make machine learning systems much more useful and shareable.

I've talked about the purpose and function of containers before, so I won't cover them here. However, the ability to deploy machine learning applications as containers, and to cluster those containers, has several advantages, including:

  • The ability to make machine learning applications self-contained. They can be mixed and matched on any number of platforms, with virtually no porting or testing required. Because they exist in containers, they can operate in a highly distributed environment, and you can place those containers close to the data the applications are analyzing.
  • The ability to expose the services of machine learning systems that exist inside of containers as services or microservices. This allows external applications, container-based or not, to leverage those services at any time, without having to embed the machine learning code inside the calling application.
  • The ability to cluster and schedule container processing so the machine learning applications that live in containers can scale. You can place those applications on more efficient cloud-based systems, and it's best to manage them with container orchestration systems such as Google's Kubernetes or Docker's Swarm.
  • The ability to access data using well-defined interfaces that deal with complex data using simplified abstraction layers. Containers have mechanisms built in for external and distributed data access, so you can leverage common data-oriented interfaces that support many data models.
  • The ability to create machine learning systems made up of containers functioning as loosely coupled subsystems. This approach makes it easier to create an effective application architecture, where you can do things such as isolate volatility in its own domain by using containers.
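As a sketch of the microservices point above, here is a minimal prediction endpoint built on Python's standard library that a container could run as its entrypoint. The model, route, port, and feature names are all placeholders invented for illustration; real deployments would typically use a proper model-serving framework behind the container boundary.

```python
import json
from http.server import BaseHTTPRequestHandler, HTTPServer

# Stand-in for a trained model loaded at container start; a real image
# would bake the serialized model into its filesystem.
MODEL = {"heart_rate_threshold": 100.0}

class PredictHandler(BaseHTTPRequestHandler):
    """Minimal endpoint: POST a JSON feature vector, get a JSON score back."""

    def do_POST(self):
        length = int(self.headers.get("Content-Length", 0))
        features = json.loads(self.rfile.read(length))
        at_risk = features.get("heart_rate", 0) > MODEL["heart_rate_threshold"]
        body = json.dumps({"at_risk": at_risk}).encode()
        self.send_response(200)
        self.send_header("Content-Type", "application/json")
        self.send_header("Content-Length", str(len(body)))
        self.end_headers()
        self.wfile.write(body)

    def log_message(self, *args):  # keep demo output quiet
        pass

def serve(port=8080):
    """Entry point inside the container (what a Dockerfile CMD would run)."""
    HTTPServer(("0.0.0.0", port), PredictHandler).serve_forever()
```

Because the model and its serving code travel together inside one container image, any external application can call the endpoint over HTTP without ever linking against the machine learning code itself.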

The biggest issue with this approach is the newness of the technology. Containers, at least the Docker variety, are relatively new, and so is machine learning. On the other hand, both are based upon past technology patterns, so there's nothing too scary about either of them.

Nonetheless, if you work on the leading edge of any technology, you'll run into many parts of it that still have some evolving to do. If you push into machine learning and containers, you'll find that many of the integrations are still do-it-yourself, but they're doable nonetheless for skilled software engineers.

Getting started

If you're still not sold on the synergy of the two technologies, it may be a matter of building a proof-of-concept prototype around a well-defined but simple use case. I recommend using open source systems, but there are dozens of machine learning platforms from which to choose, and some already work with containers.

Before getting started, review Paco Nathan's presentation on "Microservices, containers, and machine learning." It provides some great examples, including things you can try in your own shop.

As far as turnkey machine learning that exists within containers, solutions are still evolving. Most of the open source and commercial machine learning technologies have containers somewhere on their roadmap. So if you're already married to a machine learning technology, it's a good idea to find out its container strategy and begin to test it as soon as you can.

But if you want to combine machine learning systems with containers on your own, chances are it won't be that difficult. Containers are self-contained by definition, and placing any number of workload types in them, including machine learning applications, is straightforward.

The core lesson here is the ability to get more value out of data and to deploy these systems in more portable and efficient ways using containers. The combination of the technologies is clearly a 1+1 = 3 kind of thing, if you're willing to put the time into your initial implementations.

As you get better at gathering and analyzing data, you're going to need systems that go well beyond presenting information in different ways. While humans can make sense of data through experience, the better path is to create systems that do that job for them and learn as they gather information. This lets you automate processes end to end while removing human error from the equation.

By using machine learning effectively, you might be able to save lives, or just reduce the costs associated with excess inventory. As the amount of data your organization collects continues to grow, the business demand to place automated processes around that data, and to actually make sense of it, will only accelerate.
