When computing gets ambient: Scott Amyx on the potential of wearables and the IoT

Mike Perrow Technology Evangelist, Vertica
 

Speculating on the impact of the Apple Watch in The Atlantic magazine recently, associate editor Robinson Meyer noted that Apple "will directly influence only one narrow part of our attire." He adds, "Still, the new watch heralds a broader convergence between the things we use and the things we wear." Maybe this convergence in wearables is loosely related to The Quantified Self movement, which centers on collecting personal bio-data during daily activity (think Fitbit). But does it have to be? Maybe it's simply cool to wear new tech visibly on our wrists, instead of carrying it around in our pocket or purse as we've done for years with our phones. And why not? For some of us, a commitment to the latest in tech devices can also be a kind of fashion statement.

The wearable tech agency Amyx McKinsey understands the fashion angle in wearables—and a whole lot more. The firm spotted a megatrend in IoT and wearables about 18 months ago and decided to spin off a separate team to get in on the action. With its core competency in market research, consulting, and implementation, Amyx McKinsey has been diving into the field of "affective computing": the development of systems that can recognize and even emulate human emotions.

Naturally, they won't say exactly what they're developing, just that it involves algorithms that interpret signals from a variety of wearable systems and IoT devices, and they are attracting larger, established companies as early customers. However, founder and CEO Scott Amyx will talk about the amazing possibilities in this emerging space. I had a chance to speak with Scott last week, and he was very generous in answering my questions. What follows is a transcript of our discussion.

TechBeacon: You've said that the basic technologies around wearables have been around for several decades, but the market awareness has only taken off very recently. What has created this shift in the marketplace?

Scott Amyx: Yes, why is this so interesting, other than the cool factor of, say, the Apple Watch? Why should a user care about these wearables?

For the general public, the emergence of an ecosystem of wearables and IoT can start to create something we call "ambient intelligent computing" (AIC). Ambient computing has been around for a little while, but AIC is our term for what happens within this new wearable and IoT ecosystem. It means that from the time you wake up, move to your kitchen to start the day, and drive to work, you have a system on your body—along with the systems at home, at work, and in public spaces—that understands your context, your external stimuli, what you're trying to accomplish, and your reactions to things, both physiological and emotional. That's huge.

Here's where we're going. Computers that once lived on a PC, laptop, or smartphone are now scattered, miniaturized like dust, into building materials, your socks, and your furnishings. This ubiquitous understanding of who you are, with you at the center, becomes powerful because so much of our day is really mundane. This collective intelligence can do things for you. For example, I'm on email all the time, trying to schedule and reschedule appointments. How many times do we have to go back and forth to schedule a meeting? What we need are intelligent agents that can figure out multiple parties' availability and schedule meetings, so you can excuse yourself from this tedious process. x.ai does exactly that using artificial intelligence.

So we began our effort to become a leader in this space. Based on our early research, we created a startup in stealth mode, and now we have a team of eight people. Some are AI, machine learning experts; others are experts in design and development, as well as advertising. It's a great team, and right now we're somewhat under the radar, working with large, established companies to develop partnerships, including customer service, retail, and consumer electronics companies. Our goal is to provide our brand engagement algorithm on a platform-as-a-service model to brands and companies to add tremendous value to the end customer. We'll be formally announcing our plans in late fall.

My personal background is in wearables and IoT; I get into everything from human psychology to system protocols and standards—both the soft side as well as the hard science. I am leading this effort by pushing the technology to our pilot customers.

TechBeacon: What are you seeing that's of greatest interest to your prospective customers and to the partners you hope to attract in this space?

Scott Amyx: I see a wide and growing spectrum of interest. Wearables can include everything from luxury fashion to smartwatches, to athletic wear, to wellness and medical applications. There's crossover between wearables and IoT with robots and drones, and in the IoT space it gets fairly broad. There's the smart home, the connected car, smart cities, industrial things, and it's complex. On any given day we're dealing with everything from the nuances of nanofiber in fashion technology to advanced sensor technologies for medical and health applications.

One of the things driving current interest in the connected home is personal security. From AT&T to startups just wrapping up their Kickstarter campaigns, we are seeing that IoT and miniaturization (enabled by the effects of Moore's Law) are improving microcontrollers, battery life, in-memory processing, and so on, which allows ecosystems to be built around home security that in the past would have been cumbersome to install, expensive, and difficult to maintain. Companies like People Power let you use your old smartphone to create video surveillance for security purposes. There are even companies making smart, camera-enabled doorbells that let you unlock the door remotely.

I was recently at a conference with the CEO of People Power, who had initially thought the key benefit to consumers was lowering utility bills by reducing consumption of electricity, gas, and water. But peace of mind from security turned out to be more important to the consumer. A good example is elder care for those living independently, with family members who want confidence that their loved ones are safe.

TechBeacon: With so many manufacturers out there, all competing, how can buyers ensure that their "things," such as wearables or devices for the smart home, will connect?

Scott Amyx: At an industry level, you have alliances and consortiums such as AllSeen Alliance, Eclipse Foundation, and the Open Interconnect Consortium (OIC). These are creating the standards for interoperability, so regardless of the manufacturer or type of appliance or device, they can find and communicate with each other and with the smart home automation hub and the cloud.

But it's also important to be able to work through legacy issues. For example, an aging water heater that has no means of communication can now be equipped with sensors to detect things like leaks, heat, and vibration. So even though the water heater may be a legacy appliance, the sensor placed on it makes it smarter. There's a water-level sensor you can attach to your dog's water bowl that notifies you when it's time to refill. These are just small examples of new capabilities enabled by connected-home technologies. There's also a lot going on with the connected car, the environment, industrial applications, and applications that involve human psychology.
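
To make that retrofit pattern concrete, here is a minimal sketch of how readings from add-on sensors could be turned into alerts. It is illustrative only; the sensor names, thresholds, and data shapes are hypothetical, not taken from any product Amyx mentions.

```python
# Illustrative sketch only: turning raw readings from add-on sensors on a
# legacy appliance into alerts. Sensor names, thresholds, and data shapes
# are hypothetical, not from any product mentioned in this article.
from dataclasses import dataclass


@dataclass
class Reading:
    sensor: str    # e.g., "leak", "temperature_c", "vibration_g"
    value: float


# Simple per-sensor rules for a retrofitted water heater.
ALERT_RULES = {
    "leak": lambda v: v > 0,            # any detected moisture
    "temperature_c": lambda v: v > 80,  # unusually hot tank surface
    "vibration_g": lambda v: v > 1.5,   # abnormal vibration
}


def check(readings):
    """Return a human-readable alert for each reading that breaks its rule."""
    alerts = []
    for r in readings:
        rule = ALERT_RULES.get(r.sensor)
        if rule and rule(r.value):
            alerts.append(f"ALERT: {r.sensor} reading {r.value} is out of range")
    return alerts


if __name__ == "__main__":
    for line in check([Reading("leak", 1.0), Reading("temperature_c", 62.0)]):
        print(line)
```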

Let's take an extreme situation: a military engagement. Say you're deployed overseas at a battle scene. It's important that commanders in the field understand the "cognitive load" of their soldiers. What is their attention level? Their focus? Are they distracted or confused because they just got shot at? Or consider a less extreme situation: everyday drivers on the road. Truck drivers, or car drivers like us...what is their cognitive load? What if a system could be optimized by understanding your level of focus at any moment in time? How tired you are, how exhausted you are because you've forgotten to eat or to fill up on coffee, for example. The system could advise you to take the appropriate actions for the right context.
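
As a rough illustration of what "advise you to take the appropriate actions" might look like in software, here is a toy rules sketch. The signal names, their 0.0-1.0 scales, and the thresholds are all invented for the example; a real system would estimate them upstream from wearable and vehicle data.

```python
# Toy sketch of "advise the driver" logic. The signals (focus, fatigue,
# hours_since_meal), their 0.0-1.0 scales, and the thresholds are invented;
# a real system would estimate them from wearable and vehicle data upstream.
def recommend(focus: float, fatigue: float, hours_since_meal: float) -> str:
    """Map rough cognitive-load signals to a context-appropriate suggestion."""
    if fatigue > 0.8 or focus < 0.3:
        return "Pull over and rest before continuing."
    if hours_since_meal > 6:
        return "Consider stopping for food or coffee."
    if fatigue > 0.5:
        return "Take a short break at the next opportunity."
    return "No action needed."


print(recommend(focus=0.25, fatigue=0.6, hours_since_meal=7))
# -> "Pull over and rest before continuing."
```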

In a medical context, for the time being, it's difficult to monitor neurochemicals such as dopamine and oxytocin at work in the brain without an invasive technique. However, at some point, if we understand that the consumer is feeling less than optimal, the intelligent system could recommend a doctor-approved dosage of neurochemical supplements to restore the right balance in brain chemistry.

TechBeacon: To what extent has the evolution of implanted devices—pacemakers, catheters, defibrillators, etc.—informed the design and capability of wearables today?

Scott Amyx: The current generation of wearables has played more on the fitness side of health than the medical device side. But this year, and definitely next year, we're going to see a transition to medical-grade wearables. As a matter of fact, I am speaking on this very topic at an upcoming medical device manufacturers conference. This is important, because the ecosystem of doctors and medical professionals needs depth and accuracy beyond the imprecise monitoring that fitness bands provide. Also, while for some time we've had devices that are implantable into our bodies, the mechanism that does the reporting and alerting has been a cumbersome device we've had to wheel around. Soon, the thing implanted into our bodies will work remotely with our wearables, our smartphones, and other systems.

Now, project this very far out. Google is working on nanoparticles—tiny microcomputers that can be injected into the bloodstream to understand blood chemistry, including cancer detection, and other things that are happening in your body. A doctor always wants samples of your bodily fluids, because that's where the information lies. With something like this, you can imagine 24/7 monitoring of a critical patient, for example. Imagine this system recognizes anomalies, say cancerous cells. By using different sets of technologies, from radio frequency, to sonic, to heat, physicians could create a highly localized treatment for this cancer as opposed to requiring large dosages of chemotherapy with all the side effects.

In a more everyday context, we can envision new approaches to treating anxiety and stress. Studies have shown that when we are at work, we experience a higher degree of emotional vicissitudes than when we are at home or at rest. No surprise, right? As much as we might want to appear at ease with our colleagues and our bosses, they can be trying at times. What a wearable system might do is record physiological information, all in a noninvasive way: your heartbeat, your breathing, your sweat and its composition, your bloodstream, your calorie intake and burn, and so forth. If the system understands that you're in a stressful environment, that you're anxious or afraid, which affects productivity or at the very least your sense of well-being, it can recommend some remedy. Take a walk, or have a drink of water. Take some deep breaths or talk to a friend. WellBe is doing essentially that, with a focus on meditation and on the causes and events behind those irritations. We can imagine such a system, very far into the future, that might even release a certain neurochemical or neurotransmitter to bring an internal system up to a more optimal level. More controversial but certainly plausible.

TechBeacon: You mentioned psychology. How can the biometrics used to measure stress and so forth play into social interactions?

Scott Amyx: OK. There's another piece that we're working on that has to do with emotion sensing. When, say, my wife and I make a decision, whether that's about a house, a car, clothing, any sort of purchase, or even a job, as much as we like to believe our decisions are based on rationality, they're actually based on emotions. But how do we know what others are feeling? When I go to a conference and interact with people, I may be told, "Yes, we're interested in your powerful, cool solution." But in fact, they're just trying to be friendly and diplomatic. What if we could deploy facial recognition? Or speech analysis? Body language detection? What we want to know is whether what people are saying is in line with what they actually feel. The most rudimentary form of this is the old polygraph test—the lie detector invented nearly a century ago.

TechBeacon: For several years now, sentiment analysis has been used to gauge mood and predict behavior of a population. It's mostly based on text. But if you could have 1,000 people monitored for facial recognition and body language, you could take sentiment analysis way beyond where we are today.

Scott Amyx: In fact, I'm speaking at a conference coming up in New York called the Sentiment Analysis Symposium [Scott's presentation is titled "Wearables Gushing With Emotions: New Brand Engagement Architecture"]. Online, sentiment analysis is based on emoticons and chatter, but what we're talking about is getting down to both primary and advanced emotions, including confusion, frustration, and so forth. We're not interested in the individual but in aggregate crowd emotion. For example, say you're a real estate manager for a shopping mall. You want to know which stores are doing well, by location and by brand. When a shopper walks by Louis Vuitton, there is a certain emotional response, measured on average at an 80 percent confidence level, and it's positive. Whereas when they walk past another store that's obscure or relatively new, the same measures are neutral or relatively negative. What that means from a property management perspective—when you're thinking about lease renewal, for example—is that you can decide whether a location is working for a particular business or not. Or you can see that when the same business is located next to a clothing store, the sentiment analysis looks better.
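
As a rough sketch of what that aggregation could look like in code, here is a toy example that averages anonymous per-shopper readings by store and discards anything below a confidence cutoff. The data, field names, and the 0.8 threshold are invented for illustration and are not from Amyx's platform.

```python
# Toy aggregation sketch: average anonymous per-shopper emotion estimates by
# store, keeping only readings above a confidence cutoff. All data, field
# names, and the 0.8 threshold are invented for illustration.
from collections import defaultdict
from statistics import mean

readings = [
    # (store, emotional valence in -1..+1, classifier confidence in 0..1)
    ("Louis Vuitton", +0.6, 0.85),
    ("Louis Vuitton", +0.4, 0.90),
    ("New Store", -0.1, 0.82),
    ("New Store", 0.0, 0.55),  # dropped: below the confidence cutoff
]

CONFIDENCE_CUTOFF = 0.8

by_store = defaultdict(list)
for store, valence, confidence in readings:
    if confidence >= CONFIDENCE_CUTOFF:
        by_store[store].append(valence)

for store, values in by_store.items():
    print(f"{store}: mean valence {mean(values):+.2f} over {len(values)} shoppers")
```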

Now imagine this on a larger scale: a smart city. Sometimes a municipality's response to a threat is only as quick as someone's ability to call it in. The Department of Homeland Security is very interested in arming first responders with wearables and a metro area with a distributed IoT grid that can sense chemical and biological threats.

TechBeacon: But conventional sentiment analysis is based on the fact that many people own and operate smartphones. How can wearable technology become so desirable and ubiquitous that any plans around things like affective computing can become real?

Scott Amyx: There will be a transitional period, but the technology I'm talking about works with the lowest common denominator, which as you suggest is the smartphone. What that means is that even with just the capabilities of the smartphone (text, the mic, the camera, the gyroscope, the compass, and GPS), we can make sense of those signals for affective computing. For example, you're on your phone and are suddenly excited; you may not realize that you've just jerked your phone in a certain way, which indicates your level of excitement (just as the compass feature recognizes true north based on how you're holding the device). As you add more capabilities, our detection and classification become stronger, through a combination of deterministic and probabilistic approaches. The more signals, the higher the confidence and the accuracy. Any current device can be part of the mix—a fitness band, a computer-vision camera, microphones, and environmental sensors that can pick up on speech amplitude, frequency, and movement.
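
"The more signals, the higher the confidence" can be illustrated with a simple probabilistic fusion sketch. The code below is a toy naive-Bayes-style combination, not Amyx's algorithm; the prior and the per-signal likelihood ratios are made-up numbers chosen only to show the effect of adding signals.

```python
# Toy signal-fusion sketch (not Amyx's algorithm): fuse a prior probability
# of "excited" with per-signal likelihood ratios, assuming the signals are
# independent. The prior and ratios below are made-up numbers chosen only to
# show that adding signals shifts the confidence.
import math


def combine(prior: float, likelihood_ratios) -> float:
    """Naive Bayes-style fusion in log-odds space."""
    log_odds = math.log(prior / (1 - prior))
    for lr in likelihood_ratios:
        log_odds += math.log(lr)
    return 1 / (1 + math.exp(-log_odds))


prior = 0.2  # assumed base rate of "excited"
text_only = combine(prior, [2.0])              # text sentiment alone
all_signals = combine(prior, [2.0, 1.8, 2.5])  # + mic tone + phone-jerk motion

print(f"text only:   {text_only:.2f}")    # ~0.33
print(f"all signals: {all_signals:.2f}")  # ~0.69
```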

Things get more precise in a traditional enterprise setting, where a store can be fitted with smart sensing capabilities. Imagine that when you walk into an IKEA store you're presented with free Wi-Fi and given a chance to opt in. Once a customer opts in, IKEA can tell how customers are interacting with different products. If our system senses that a shopper has a favorability toward a product, a message is triggered to their smartphone—perhaps a promotion, product reviews, personalized messaging, etc. That information also helps the retailer figure out the right product, the right pricing, the right type of display, etc., all based on the fact that the system recognized which product the customer was most excited about. That's just the beginning. Online capabilities such as cart abandonment metrics can now also be tracked offline. Huge opportunity for retargeting.
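
A minimal sketch of that opt-in trigger, under stated assumptions: the shopper IDs, the favorability threshold, and the send_offer function are placeholders, not a real retail or IKEA API.

```python
# Hypothetical sketch of the opt-in retail trigger described above. The
# shopper IDs, threshold, and send_offer function are placeholders, not a
# real retail or IKEA API.
FAVORABILITY_THRESHOLD = 0.7


def send_offer(shopper_id: str, product: str) -> None:
    # Stand-in for whatever messaging channel the retailer actually uses.
    print(f"Offer sent to {shopper_id}: promotion and reviews for {product}")


def on_favorability_event(shopper_id: str, opted_in: bool,
                          product: str, favorability: float) -> None:
    """Only opted-in shoppers are ever considered; others are ignored."""
    if not opted_in:
        return
    if favorability >= FAVORABILITY_THRESHOLD:
        send_offer(shopper_id, product)


on_favorability_event("anon-123", opted_in=True,
                      product="BILLY bookcase", favorability=0.82)
```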

In a connected home scenario, say I'm headed home after a day's work. My smartphone or watch can send a signal to the house that I'm on my way. The garage door opens, I get out of my car, the door to my house authenticates me via heart-rate identity, and it unlocks. Inside, the lights automatically come on, as does the music. Perhaps my meal even began cooking in the smart oven while I was driving, which is really convenient if I happened to be working late that evening. We suspect that in the near future common household appliances will run on open source operating systems and interoperable standards, including Google Brillo, but what you don't want, years down the road, is to have to go through 50 different connected apps to control things. So a company like OORT is providing a single GUI, a universal remote control for the connected home. Whether you've got a Samsung appliance, a Mitsubishi iron, or a Panasonic toaster, all can be connected through their system.
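
Here is an illustrative sketch of that arrival scenario as a simple event handler. The device names and the heart-rate identity check are hypothetical stand-ins; a real setup would go through a home-automation hub speaking an interoperability standard rather than direct function calls.

```python
# Illustrative sketch of the arrival scenario as a simple event handler.
# Device names and the heart-rate identity check are hypothetical stand-ins;
# a real setup would go through a home-automation hub speaking an
# interoperability standard rather than direct function calls.
OWNER_HR_PROFILE = "owner-hr-profile"


def heart_rate_matches_owner(signature: str) -> bool:
    return signature == OWNER_HR_PROFILE  # placeholder identity check


def on_arrival(hr_signature: str):
    actions = ["garage: open"]
    if heart_rate_matches_owner(hr_signature):
        actions += ["front door: unlock", "lights: on", "music: play",
                    "oven: resume preset meal"]
    else:
        actions += ["front door: stay locked", "notify: unrecognized arrival"]
    return actions


for step in on_arrival(OWNER_HR_PROFILE):
    print(step)
```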

TechBeacon: So how do you explain the relationship between wearable devices and the IoT?

Scott Amyx: In the strictest sense, you can't just say that wearables are an extension of the IoT, because wearables are not always a "thing," per se. They could be part of your body. But think about this in concentric circles. We've taken the capabilities of the mainframe, the PC, the mobile computer, and put all this in the smallest form, sprinkled all around us.

Closest to us is our body area network: computing on our clothing or inside our body. Communication can be through Bluetooth Low Energy (BLE) or short-range radio frequency. As you go farther out, you have the near environment—the connected home, the car with its security system and diagnostics, all those things working in conjunction. The company Jasper was able to handle a recall issue for auto manufacturer Tesla by downloading a fix to all the affected vehicles without requiring customers to drive them in for repair. Imagine the savings, both financially to the business and in time and effort to the customer base.

Take this out to the farther concentric circle. This is the industrial IoT, where you can measure the water level of a river, the chemical composition of soil, or the post-earthquake vibrations still active in a building. What's going on in this circle is exciting; it's the equivalent of the industrial revolution. In many of the sectors that previously were not technologically oriented—farming, manufacturing, municipalities, etc.—you now have the capability to collect information on climate, the operation of physical assets, and the movement of goods. This extends even to demand-driven supply networks. When you purchase a jug of bleach, eventually there will be a tiny, throwaway commodity sensor that detects how far along you are in your use of that product. When you get to a certain weight level, a signal will be sent back to the bleach factory to create another jug and restock the store nearest to you.
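
The replenishment loop Amyx describes reduces to a threshold check, sketched below. The weights, the 20 percent threshold, and the notify_factory call are illustrative assumptions, not a real supply-chain API.

```python
# Hypothetical sketch of the demand-driven replenishment idea: a throwaway
# weight sensor reports how much product remains, and a restock signal fires
# once usage crosses a threshold. Weights, the 20% threshold, and the
# notify_factory call are illustrative only.
REORDER_FRACTION = 0.2  # reorder when 20% of the product remains


def notify_factory(product_id: str, store_id: str) -> None:
    print(f"Restock request: {product_id} -> {store_id}")


def on_weight_report(product_id: str, store_id: str,
                     full_weight_g: float, current_weight_g: float) -> None:
    if current_weight_g / full_weight_g <= REORDER_FRACTION:
        notify_factory(product_id, store_id)


on_weight_report("bleach-1gal", "store-42",
                 full_weight_g=4500, current_weight_g=800)  # 800/4500 ≈ 18%
```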

TechBeacon: What are some of the legal implications around wearables? In the medical field, we have privacy covered by HIPAA, for example. What challenges are there in your field?

Scott Amyx: From a purely legal perspective, this is a new frontier. Attorneys are bracing for the worst. There is an article I am publishing on InformationWeek that presents a privacy playbook for wearables and IoT. I'll be talking about research ranging from users' privacy concerns about wearables, to privacy strategy, to specific tactical means by which companies can do this right. Here are a few bullet points:

  • Lead with privacy. Many times, companies involved in this ecosystem are technologists first. Security and privacy come second or third in their concerns. We can't afford to do that. When you lead with privacy, it becomes a competitive differentiator for your business.
  • Establish a Chief Privacy Officer in your company. At the C-level, this position communicates to your consumers and shareholders that you take privacy matters seriously.
  • Use a framework (such as Privacy by Design) that ensures privacy is tightly integrated into the entire product development and management process. There are seven principles around that. Consortiums provide guidance around these things. Forrester is active in this space with their "Personal Identity and Data Management Playbook," which they continue to update.

At the industry standards level, the AllSeen Alliance and others are really taking privacy and security to heart, trying to solidify it in terms of operability and standards. We are encouraging firms to adopt best practices around privacy policies and disclosures.

Image source: Flickr
