Building a Better Bot | Danny Tomsett at Uneeq

Interacting with artificial intelligence is no longer a technology of tomorrow—it’s a way of life. If you’ve ever adjusted your car insurance, upgraded your cable service, or sought help installing a new printer, you’ve almost definitely found yourself chatting with an AI bot. And it probably wasn’t your most engaging conversation of the day.

Though AI bots can often deliver the solutions you need, they lack the emotional responsiveness we crave.

Danny Tomsett, the founder and CEO of Uneeq, is trying to change that with a product called digital humans, which aims to make AI personal. On this edition of UpTech Report, Danny tells us how the technology works, and what it could mean for you the next time you need help.


DISCLAIMER: Below is an AI-generated transcript. There may be a few typos, but it should be at least 90% accurate. Watch the video or listen to the podcast for the full experience!

Danny Tomsett 0:00
We’ve been so good at taking human out of the equation. We’re now trying to figure out how do we bring some humanity, some emotional connection back into it?

Alexander Ferguson 0:15
Danny, I’m excited to be able to chat with you today and hear more about UneeQ’s digital humans. To begin, in a nutshell, five seconds, what would you say UneeQ is?

Danny Tomsett 0:25
In five seconds, I would say UneeQ is a company that solves for bringing an emotional connection into a digital world.

Alexander Ferguson 0:33
I like it. I imagine it’s changed over the years, but what’s the problem that you set out to solve? What was the problem you initially saw?

Danny Tomsett 0:43
Yeah, and the problem is the same; over the 10 years we’ve been in existence, the solution has changed. As you highlighted, the problem really is that as we drive more and more of our interactions via digital, accelerated even further by current times and the climate we’re in, you start creating more transactional interactions versus relational, emotional interactions. And when you think about the organizations that depend on emotional connection, there’s an incredible study by the Harvard Business School which actually quantified that this is like a 4x impact on conversion, or on net promoter score, which is a measurement for the quality of experience. That connection is the human condition; that’s the experience factor, the experience that we value and that we want. And something like 82% of people, when surveyed through some of the major analyst research, whether it be Gartner or some of the others, and it’s a very consistent result in the 80s, say that we want that interaction. We actually like human touch, because it creates something that makes us feel good in the interaction.

Alexander Ferguson 2:00
But as we’re going into a digital world, we don’t really have that same human interaction the way it was.

Danny Tomsett 2:06
Yeah, we’ve been so good at this, right? We’ve been so good at taking the human out of the equation. We’re now trying to figure out how do we bring some humanity, some emotional connection back into it. And I think that’s really what we saw 10 years ago. Our focus originally started on how do we do that with humans, and we realized that when digital is applied, it brings significant scale, accessibility, and ease, and that’s why we’re attracted to it. But there is this tension in trying to align that with an experience that’s more human: people are not very scalable, they are limited in their time availability, and a lot of the tasks that we’re trying to scale are very hard to train for in a consistent manner as well. So that’s why, over the 10 years, we’ve pivoted towards digital humans, and how that now plugs directly into that same value proposition of digital, but with the human touch.

Alexander Ferguson 3:16
So give me a use case, an analogy, of one particular market with your product in play.

Danny Tomsett 3:25
Yeah. As you’d expect, human touch is ubiquitous, so I’ve got lots of these, but here’s one of my favorites. We have such a passion in our company around how this technology will be used to improve human life, and one particular area where I think this technology is going to be a game changer, and in the early days we’ve already seen this, is in healthcare. If you think about the healthcare system, we have incredible people, doctors and nurses, on the frontline supporting people like you and me when we need help. The issue is that when we go in for, say, surgery, or for some level of treatment, they have very limited time to spend with us pre and post the operation or the treatment process. So in the healthcare system, you basically get a pamphlet, or you get some complex digital website or patient portal, and you have to figure out how to look after yourself, either leading up to that moment or after it. And we have a huge adherence problem in the US. It’s just massive, like an over $200 billion problem, just because people are unable to access good information in a way they can consume easily, without judgment. So digital humans enter the scene. Now, instead of a complex website or a patient portal, you put a digital human on top of that, and you come out of your surgery and you say, well, I’ve got 16 different medications, which one should I be taking today? And I’m not feeling so good, should I be calling my doctor now, or what should I be doing? These kinds of conversations are so powerful when you put a human interface in front of it and make it accessible 24/7.

Alexander Ferguson 5:20
There was a word that you said, I’m trying to remember. Not fear, but that feeling of, I’m not wanting to interact with this because of the judgment. Judgment, you said, no judgment. What’s that, then? Why would someone not feel judgment talking to a digital human versus a human? Talk to me about that behavioral mindset.

Danny Tomsett 5:44
It’s a really interesting psychological insight that we’ve discovered, which is also aligned with some of our views, and it’s that we’re not trying to trick people into thinking digital humans are humans. In fact, that works to our disadvantage. The advantage is that they’re not human, so they remove a lot of judgment from the interaction. We’ve found time and time again, across various use cases, that people will be more open to talk about their financial situation, or healthcare, or to ask questions about pharmaceutical needs, and all sorts of those kinds of areas where judgment often impacts how comfortable we are asking the question, right?

Alexander Ferguson 6:28
If you feel like it’s a stupid question, you don’t want to ask it, which hurts you; you should ask stupid questions, because you don’t know. So interesting that that could remove it. And I appreciate your point that you’re not trying to hide the fact that it is a digital human, because we’ve all seen these futuristic movies where it’s passed off as a real person. You’re not trying to fake people out.

Danny Tomsett 6:52
No, no, not at all. And I think that’s really been part of the journey: understanding both how digital humans play a role in a digital world, and also how they augment humans really well. What is it that they can do that we can’t do very easily as humans? And removing judgment is definitely one of those things.

Alexander Ferguson 7:14
Because you’re not trying to replace a human interaction on one side, because a human knows it’s not a human; it’s not that. It’s more of, as you said, emotional. It’s a different type of UI. Instead of trying to navigate a website, you’re having a conversation with something that can emote back to you and respond to your emotions. Talk to me more about the emotional response, and how that compares. I mean, there are tons of chatbots. Why not just use a chatbot?

Danny Tomsett 7:44
Yeah, 100%, right. So that’s the beauty of what we do, and you said it well: in many ways, we’re just an intelligent UI. It’s an evolution of the flat world we live in today. The interaction that we have is very much related to a 2D screen, and in many instances we’re interacting by keyboard or mouse. In some instances now, we’re moving into Alexa and Siri and conversation, which is, I guess, the predecessor to what we’re really talking about. And naturally, chatbots are doing a lot of work around building out that transactional experience: you’ve got a question, you need an answer, and the chatbot helps find that answer as quickly as possible. But you’re still interacting through those channels of text and keyboard. So when we try to bring feeling and experience into that, we need to think more about a persona, like you and I have. We have a persona, and we also communicate nonverbally. A lot of what we’re doing right now is looking at each other’s body language and listening to tone, and that’s all feeding into an overall experience of trust, along with a lot of other things. So with digital humans, when we think about that technology stack, it was really about the fact that the way interfaces evolve needs to incorporate more of what I call multimodal inputs: what can we see, what can we hear, and what can we express, in the same way that we can see and hear. Combined, that creates an experience that feels a lot more human, and it’s a lot more intuitive to us, because it’s how we’ve communicated our entire lives.

Alexander Ferguson 9:32
Let’s dive a little bit deeper into the technology for those who enjoy the specifics. For those who are watching, if you’re not that much of a technical person, you can skip to the next part. But if I understand correctly, you actually leverage a lot of the standard NLP stacks out there, like Google’s Dialogflow; you can plug into any one of those, but yours is the front end. The digital human reacts based on the sentiment that it senses. Did I get that right?

Danny Tomsett 10:02
Yeah, that’s correct. Think about the billions of dollars that have been invested to date, the incredible breakthroughs around natural language understanding, and the way that conversational AI is able to train on a set of data that then provides, what I would say is, accuracy and intent matching to what we expect when we talk to a chatbot, or a Siri, or an Alexa. That’s progressing at rapid speed now because of that investment. Our strategy was always going to be: that’s solving the content, the ability to understand the intent of the question, but who’s solving the human experience? Who’s creating the characters and the persona? It’s kind of like movie making or game development: just because you have a script doesn’t mean you have a movie, or a great game. You’ve still got to bring the character to life; you still have to have an interaction that people fall in love with. And that, in essence, is what we do. We are the intelligent interface, and the AI technology we build enables us to scale that, so that we can plug it directly into any chatbot, any NLP framework, whether it be Dialogflow, Watson, or Microsoft Bot Framework, and bring it to life without anyone having to program it. That’s the power of the digital human interface. That’s UneeQ.
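The division of labor Danny describes, where the NLP backend owns intent matching and returns plain reply text while the digital-human layer owns persona and expression, can be sketched roughly as below. This is a hypothetical illustration, not UneeQ’s actual API; every class and method name is invented, and the stub backend stands in for a real service like Dialogflow or Watson.

```python
from dataclasses import dataclass

@dataclass
class Utterance:
    text: str           # what the digital human will say
    expression: str     # nonverbal cue chosen by the persona layer

class EchoIntentBackend:
    """Stand-in for a real NLP service: maps a user message to reply text."""
    def reply(self, user_message: str) -> str:
        if "medication" in user_message.lower():
            return "You have one dose scheduled for this evening."
        return "Could you tell me a bit more about that?"

class PersonaLayer:
    """Turns flat chatbot text into an embodied, expressive utterance."""
    def render(self, reply_text: str) -> Utterance:
        # A real system would infer tone from the text and conversation
        # state; here we just pick a simple, reassuring default.
        expression = "reassuring_smile" if reply_text.endswith(".") else "attentive"
        return Utterance(text=reply_text, expression=expression)

backend = EchoIntentBackend()
persona = PersonaLayer()
utterance = persona.render(backend.reply("Which medication should I take today?"))
print(utterance.expression)
```

The point of the sketch is the seam between the two classes: swapping the backend for any other intent engine requires no change to the persona layer, which mirrors the "plug into any NLP framework" claim above.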

Alexander Ferguson 11:35
Digging a little bit further here on the emotional sentiment: your digital human can understand the emotional sentiment of the viewer as well. Is that picked up on your side? How is that figured out?

Danny Tomsett 11:51
Yeah, it’s very early days for this type of technology, so I don’t want to oversell it, right. But the digital human obviously has the ability to see you and hear you, and with that we can make some assumptions on how to respond more appropriately. Some of these things are actually really straightforward things that we take for granted. If I turn my head and talk to someone like this, you know, I’m now not talking to you, right? We can see that and decide, I don’t need to respond to that, because you’re obviously looking at someone else and talking to someone else. That plays a huge role, say, in physical environments where there are kiosks and things like that, where you walk up with friends and your kids and all those types of things. But when it comes to the emotional levels of understanding, we’re at the point where we can start to understand: is someone interested and engaged, or are they disconnected, are they getting frustrated and moving away? As for the kind of tech that claims to know happy, sad, and everything else, someone told me this the other day and I loved it: it’s kind of like emoji-level detection, you know, it has to be one of the extremes to actually be detected. So this has got a journey ahead, and it will definitely be solved, there’s no doubt about it. The technology is advancing at a rapid speed, and we’re making big investments, as are many other companies. But some of our best R&D and investment is actually on the other side, where the chatbot engine, whatever it is, provides text straight into our platform, and we’re able to determine how that will then be expressed, behaved, and spoken. And we’re able to take multiple inputs too, like audio and other things, that combine to create this expression.
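The "straightforward" attention case Danny mentions, not responding when the user has turned to talk to a friend at a kiosk, can be gated with something as simple as a head-yaw threshold. The function below is a minimal sketch; the field names and the 25-degree threshold are invented for illustration and are not from any real product.

```python
def should_respond(gaze_yaw_degrees: float, speech_detected: bool,
                   max_yaw: float = 25.0) -> bool:
    """Respond only if speech occurred while the user faced the screen.

    gaze_yaw_degrees: estimated head rotation away from the camera
    (0 = facing the screen directly).
    """
    facing_screen = abs(gaze_yaw_degrees) <= max_yaw
    return speech_detected and facing_screen

# Facing the kiosk while speaking: the digital human should reply.
print(should_respond(5.0, True))
# Head turned toward a friend while speaking: stay quiet.
print(should_respond(60.0, True))
```

Real systems would smooth this over time and combine it with other cues, but the shape of the decision, multimodal inputs gating a response, is the same.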

Alexander Ferguson 13:38
Because on your side, the AI is actually generating the speech itself; the chatbot provides the text, the script, as you stated. So your research and development has been into, okay, how is that sound delivered? Does it sound happy? Is the phrasing right? That’s where you spend a lot of your time, as well as on the facial reactions of your digital human.

Danny Tomsett 14:01
Yeah, especially around the animation systems and how the digital human behaves. We actually interchange the voice systems, because there are some great synthetic voice options out there now, from the big players, whether it be Amazon Polly or Google’s WaveNet, through to smaller companies like Voicery and others. So we can interchange all the voice options, and we really focus on that animation side. Yeah.
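The "interchangeable voices" idea above maps naturally onto an adapter interface: one narrow contract that a Polly, WaveNet, or any other TTS vendor adapter could implement. The sketch below is hypothetical; the interface and the fake backend are invented so it runs without cloud credentials, and a real Polly adapter would wrap the vendor SDK behind the same method.

```python
from abc import ABC, abstractmethod

class VoiceBackend(ABC):
    """Narrow contract every text-to-speech vendor adapter implements."""
    @abstractmethod
    def synthesize(self, text: str) -> bytes:
        """Return raw audio bytes for the given text."""

class FakeVoice(VoiceBackend):
    """Stand-in backend so the sketch runs offline."""
    def __init__(self, voice_id: str):
        self.voice_id = voice_id
    def synthesize(self, text: str) -> bytes:
        # Encode a tagged string instead of real audio.
        return f"[{self.voice_id}] {text}".encode()

def speak(backend: VoiceBackend, text: str) -> bytes:
    # The rest of the stack depends only on VoiceBackend, so swapping
    # vendors is a one-line change where the backend is constructed.
    return backend.synthesize(text)

audio = speak(FakeVoice("en-US-demo"), "Hello there")
print(audio)
```

Because callers only see `VoiceBackend`, the animation and persona layers never need to know which vendor produced the audio, which is what makes the voices interchangeable.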

Alexander Ferguson 14:26
Tell me more about the visual. Obviously, animation has improved, along with the ability to show such detail. I’m a big fan, of course, of what Nvidia does, and there’s Unreal and what they do with their engine. Are you using a physics engine, a game engine out there, or have you developed your own? How has that worked?

Danny Tomsett 14:45
Yeah, so we have a lot of proprietary tech around this, but when it came to rendering, we like the Unreal Engine. We’re a strategic partner of Epic, and we work very closely with them on developing it for producing high-quality renders. A lot of our proprietary tech, however, extends beyond just rendering, as you can imagine. Think about what they would originally build that rendering engine for: typically entertainment, which is quite a controlled environment. With movies and games, everything’s controlled: you control what platform people use, you control that they sit in a seat and watch a big screen, and you control the whole story from beginning to end. Once it’s complete, it works within those parameters. In our world, a digital human gets put into websites, mobile apps, you name it, and then on top of that it gets plugged into any type of chatbot system, and that can be changed within seconds. So nothing is ever the same at any time. The technology that extends from rendering is there to support both the hard challenge of real-time behavior, animation, and expression, and also the adaptability into the business world: enterprise systems, integrations, and things like that.

Alexander Ferguson 16:09
Correct me if I’m wrong, but you guys are one of the first to productize this concept, bringing together all the different elements, the NLP, the voice ability, and the visual animation, and providing it as a SaaS solution, so that someone can say, yeah, let me just embed it on my site and go.

Danny Tomsett 16:28
Yeah, I wouldn’t be aware of anyone else that’s progressed the R&D to that level where they’re doing it now.

Alexander Ferguson 16:35
I mean, when you look at all the pieces by themselves, you could say, well, I could get that here, I could get that there. But no one has yet combined all the pieces in such a way that someone who doesn’t want to be a master of any of it could just implement it right now.

Danny Tomsett 16:49
Yeah, that’s exactly right. I think on one side of the fence, you’ve got companies that have focused more on the game community, where they’ve built an artistry or animation system that developers need to really work with. And on the other side of the fence, you’ve got very high-end, complex, visual-effects-based technology stacks, but they’re hugely bespoke and can’t be used other than for the purpose they were designed for. We’re coming into this to commercialize it, and it really comes back to our vision. Our vision was: how do we empower creators, really easily, to make a difference? And when we defined "creator", we realized this isn’t just developers; it’s people with great ideas who understand their problem really well. If they could just have a human interface that they could plug in easily to a Dialogflow or whatever it might be, then they’ve got this magic that they’ve created, and it was quick, easy, and affordable. That’s really been a big part of who we are and where we’re going.

Alexander Ferguson 17:58
So as we wrap up this part of our conversation, I’m going to play devil’s advocate a little, just a little hard challenge here. Let’s see how you respond. I have a friend who was a little frustrated when our local DMV switched to a chatbot. Previously, he knew where to go, click the button, and get exactly to it, and now he has to go through this long discourse of conversing back and forth. Yes, people can be frustrated with digital things that aren’t responding the way they want them to, and that can frustrate them. What are your thoughts on that sentiment, which is there and growing?

Danny Tomsett 18:39
Yeah, look, that’s one of the key parts of this technology stack, even prior to putting a digital human into the mix, right. When you think about conversation as an experience, the biggest issue I’ve continued to see is that people think about it as a technology solution. So they recreate what they built in technology as a Q&A system that now, somehow, people should love. That’s why so many of us don’t love it: because they just fall short. The trick with conversation is that it’s obviously not going to be the best solution for every use case. In the particular example you gave me, maybe it’s just not the best solution, right. But if it were the best solution, it probably wouldn’t be implemented the way it is; it would be implemented in a way that makes it feel far more engaging and gives you something more than just the information, in my opinion. A good example: if it speeds up the process, great, fantastic, right? If it doesn’t speed up the process, I still want to feel more comfortable or more confident in my decision making, and maybe it helps me with that. It could be that I get similar information, but I get to ask a lot more questions; if I was able to fire through three or four more questions, I feel a lot more comfortable. Those are the things in conversational design that are fundamental. And I think companies are learning this; they kind of jumped in, it got a little bit messy, they love the ROI on this, but they haven’t quite figured out the experience factor. So over the next three to five years, just watch this space; watch how fast AI technology improves at adapting around existing datasets and becomes way more conversational.
And before you know it, we will have characters and personas that we interact with on a daily basis to help us with everything from life admin to banking, first mortgages, and healthcare. I think it’s going to be a much better world, because in many ways, if you think about the way kids are talking to machines these days, whether it be Siri or Alexa, that’s not the language we want to train them in. As we evolve conversational AI and digital humans and build relationship and trust, this is how we want to teach our kids to interact. Human language is about having more respect and building the kinds of connections that I think are long-lasting, great for brands, and great for solving big problems.

Alexander Ferguson 21:19
Speaking of the future, what can you share of your roadmap? Looking forward in the near term, the next year, what are you guys working on? And long term, where do you see yourselves going?

Danny Tomsett 21:31
Yes, so near term, we’ve got some really exciting releases coming out. One of the areas you’ve touched on is the analytics side. A lot of our customers are deploying digital humans, and we’re trying to give them more and more insight into how that conversation is working, to the example we just talked about, and how to improve it. Digital humans have the ability to also hear and see, and so to have more information on what’s going on to help direct that, versus just the text you get back from a chatbot. So that’s a really exciting part of our near term. We’re also going to expand the creator platform to improve on many different areas that our customers are telling us about. Voice is a big one, just humanizing that more and more, and enabling more and more integrations as well, so that customers, without even needing a developer, can stand up a full experience and interact with it. We’re seeing huge market interest from people trying to validate and test, so we’re going to make it really easy to do that, which I think is pretty cool. Long term, it’s really about the fact that this technology needs to continue to extend beyond some of the basic animation systems we have today. I believe we’re one of the best in the world at this, but it’s still a long way from where it could be and should be. That includes things like body language, hand movement, and positioning, and really moving beyond these 2D interfaces that feel very static today, like a video call, to fully interacting with content. And lastly, it’s about the immersive environments that we’re finally starting to see take shape. VR is going to have its moment, it’s coming, and this is where digital humans, again, I think will really excel, because they can obviously operate in that dimension very well.

Alexander Ferguson 23:33
Where can people go to learn more about digital humans and what’s a good first step for them to take?

Danny Tomsett 23:39
Yeah, so UneeQ, which is "unique" uniquely spelt; our website is a great place to start. If you click on the Creator link on the website, there are free trials; you can log in, set up an account, and start playing with a digital human, and I would highly recommend it. It’s a great way to just understand the tech better, and if you’ve got a problem you’re looking to solve, you can go from there.

Alexander Ferguson 24:09
That concludes the audio version of this episode. To see the original and more, visit our UpTech Report YouTube channel. If you know a tech company we should interview, you can nominate them at Or if you just prefer to listen, make sure you subscribe to this series on Apple Podcasts, Spotify, or your favorite podcasting app.



YouTube | LinkedIn | Twitter | Podcast
