What excites us in artificial intelligence for 2021? As a small mental wellness startup, we’re focused on some basic steps forward with AI:
- Few-shot learning in addition to text generation
- Emotional engagement before emotion recognition
- Artificial domain intelligence not artificial general intelligence
- Augmenting humans instead of trying to replace them
- Feature stores with our data warehouses and data lakes
1. Few-shot learning in addition to text generation
In mid-2020, AI research lab OpenAI announced the third version of its language model, GPT-3. Much excitement, and not a little hype, ensued. We saw various examples of how it could be used: generating text that became a news article about AI, question-and-answer with historical figures, and the ever-popular AI recipe generation. Many of the examples focused on text generation, but what's more interesting and useful to us is applying pretrained language models to narrower natural language processing tasks, such as annotating a particular statement with a label. GPT-3 and similar systems allow us to use something called few-shot learning: training classification (and other) models with only a handful of labeled examples.
In the future, we expect that the heavy lifting of training huge models like GPT-3 will be handled by organizations like OpenAI that have lots of money and compute at their command. This lets small startups focused on a particular vertical domain use targeted examples to quickly develop useful AI language capabilities with little effort.
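To make the idea concrete, here is a minimal sketch of few-shot classification by prompting. Only the prompt construction is shown; the labels, example statements, and the idea of sending the prompt to a hosted model like GPT-3 are illustrative assumptions, not our actual pipeline.

```python
# Few-shot classification sketch: a handful of labeled examples are
# folded into a prompt, and a large language model (e.g., GPT-3) would
# be asked to continue it with a label. Examples here are hypothetical.

EXAMPLES = [
    ("I feel hopeful about tomorrow.", "positive"),
    ("Nothing seems to go right lately.", "negative"),
    ("I had lunch at noon.", "neutral"),
]

def build_few_shot_prompt(statement: str) -> str:
    """Turn a few labeled examples plus a new statement into a prompt."""
    lines = ["Label each statement as positive, negative, or neutral.", ""]
    for text, label in EXAMPLES:
        lines.append(f"Statement: {text}")
        lines.append(f"Label: {label}")
        lines.append("")
    # The model's completion after the final "Label:" is the prediction.
    lines.append(f"Statement: {statement}")
    lines.append("Label:")
    return "\n".join(lines)

print(build_few_shot_prompt("I am worried about work."))
```

The point is that no model weights are updated at all: the "training set" lives entirely inside the prompt, which is why a small team can stand up a new classifier in minutes.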
2. Emotional engagement before emotion recognition
There are plenty of startups building AI for emotion recognition, whether based on photos of people’s facial expressions, text or spoken communications, or even physiology measured by a wearable. Think of it as Lie to Me turned into software.
Our focus right now isn’t on emotion recognition. The latest research casts doubt on simplistic views of emotion. One might assume there are universal human facial expressions indicating emotion, but a relatively new theory, the theory of constructed emotion, suggests that emotions are not hard-wired and cannot be reliably detected through facial expressions or other physiological measurements, no matter what Dr. Cal Lightman says.
Instead, we’re interested in how AI might build emotional engagement, so as to draw a person into an enriching relationship with a human coach or counselor, assisted by that AI.
How do you build an emotionally engaging AI? Microsoft’s chatbot Xiaoice suggests the way. She is built on an “Empathic Computing Framework” that prioritized EQ over IQ and optimized for the number of conversational turns rather than successful responses to requests. Our initial AI-enabled coaching platform will not include a conversational capability; that will be provided by the human coach or counselor. But in the future, we expect to build AI capabilities that emotionally engage clients, so that the on-demand help they receive via AI keeps them connected to the process and to their human coach.
3. Artificial domain intelligence not artificial general intelligence
The big AI research centers–OpenAI, DeepMind, and FAIR–are garnering so much funding because some of their investors think they might achieve the holy grail of artificial general intelligence (AGI). AGI is artificial intelligence that can learn any task a human can learn. Half of OpenAI’s employees think that AGI will arrive within 15 years!
We see more opportunity in artificial domain intelligence (ADI): AI built for a particular domain of human action or understanding. This is similar to the concept of vertical AI, artificial intelligence built to support a particular industry vertical, but a domain is not the same as a vertical. For example, automated driver assistance systems (ADAS) address the domain of driving. Driving isn’t an industry; it’s an activity that cuts across many industries.
Like driving, the domain of human emotional and mental functioning can be the target of a suite of AI capabilities. That’s what we are building. The components in our mental wellness AI can be put together in different ways by a human expert (a coach, counselor, or psychologist), allowing the computer to take on the tasks at which it is more proficient than a human.
4. Augmenting humans instead of trying to replace them
Humans don’t scale well. Perhaps that’s why there is so much attention on entirely replacing them with AI (see: the dream of self-driving cars). Helping humans do what they do better is much more interesting! This is the promise of augmented intelligence, what Gartner defines as “a human-centered partnership model of people and artificial intelligence (AI) working together to enhance cognitive performance, including learning, decision making and new experiences.”
We are designing the incantata.ai coaching platform to give coaches and counselors superpowers, not to disrupt them out of a job. We believe that keeping humans not only in the loop but in charge of the loop will lead to better outcomes for both the coaches and the people they seek to serve.
5. Feature stores with our data warehouses and data lakes
Data management may seem boring compared with other topics in artificial intelligence. Does it even count as an AI trend? Because the development of artificial intelligence is impossible without sound data management, we think it does.
There have been many debates about whether warehouses, lakes, or some combination of the two will best suit the enterprise of the future. No matter what architecture an organization chooses, if you want to build AI systems you need to be able to define and compute features. Features are transformations of data that make it more meaningful and useful to machine learning and other AI models. In December 2020, Amazon announced a feature store for its machine learning development platform, SageMaker.
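What does "a transformation of data" mean in practice? Here is a minimal sketch, using invented check-in records and field names (not our actual schema), of turning raw data into two features a model could consume. A feature store's job is to compute, version, and serve values like these consistently at both training and inference time.

```python
# Sketch of feature computation: raw check-in records become model-ready
# features. Record fields and feature names are hypothetical examples.
from datetime import date

check_ins = [
    {"day": date(2021, 1, 4), "mood": 3},
    {"day": date(2021, 1, 5), "mood": 4},
    {"day": date(2021, 1, 8), "mood": 2},
]

def mood_features(records, today):
    """Derive two features from raw check-in records."""
    moods = [r["mood"] for r in records]
    avg_mood = sum(moods) / len(moods)  # average self-reported mood
    # Days since the most recent check-in (a staleness signal).
    days_since = (today - max(r["day"] for r in records)).days
    return {"avg_mood": avg_mood, "days_since_last_check_in": days_since}

print(mood_features(check_ins, date(2021, 1, 10)))
# → {'avg_mood': 3.0, 'days_since_last_check_in': 2}
```

Computing this once in a notebook is easy; the hard part, and the reason feature stores exist, is making sure the training pipeline and the live service compute exactly the same values from the same definitions.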
Why these trends and advances matter
It’s not artificial intelligence that matters to us; it’s using artificial intelligence to empower people as they help other people. Successful therapy and coaching require a lot from their participants: being emotionally present, creating shared understanding, recognizing and interpreting patterns, identifying and considering options, and evaluating the results of actions. We hope our tools will supercharge these activities with the targeted assistance an AI can provide, directed by the humans involved in the coaching relationship.