Pamela Pavliscak: A future with feeling

Beyond AI lie artificial emotions. But can we teach machines about emotions if we don't really understand them ourselves? Pamela Pavliscak shows NEXT18 the way forward.

Liveblog

Pamela Pavliscak studies the future of feelings. Obsessed with our conflicted emotional relationship with technology, her work is part deep-dive research, part data science, part design.

We’re going to talk about feelings. What emotions come to mind when you use Instagram? Joy? Fun? Envy? That moment at 2am when you are stalking your ex – and you accidentally tap the picture, then you desperately try to untap it? These are new emotions we are still finding names for.

What’s the emotion you feel when the messaging dots (that indicate someone else typing) appear – then disappear? Why do we cringe when a robot dog is kicked, but laugh when it slips on a banana skin?

For five years, I’ve been studying emotion. I’ve had people chop their days up into chunks to mirror their emotions – all sorts of things. Our current technology is clearly inadequate for expressing the rich range of human emotions. It’s high IQ and low EQ.

The age of social machines

We need an age of social machines. We’re seeing robots come onto the scene – Jibo, for example, which looks childlike. We don’t need that. We attribute emotion to anything with these characteristics:

  • eyes
  • gestures
  • voice
  • movement
  • backstory

It starts young, with children attributing emotion to toys or objects, and we crave that interaction. Our hopes and dreams for voice assistants suggest we want them to be real people. That leads to people getting all Westworld on it. But maybe we’re jumping too far ahead. Emotional attachment is a gradual process. Could we have a progressive intimacy with our objects?

Do you want a robot to hold your hand when you die? Probably not – we can create robots that look and sound human, but they don’t feel emotions – or read them. But could your phone end up knowing more about your emotions than your family? It’s the first thing you pick up in the morning, and the last thing you put down at night. It’s well on the way already.

Levelling up emotional machines

Machines will level up – they will learn to read emotion from physical cues. The work is already being done. This could have a huge impact in cars. We hit many emotional milestones in our lives in and around cars – but even if that starts to go away, we’re still in this perfect little box filled with emotions. The car may be able to “see” that and react to it.

Customer service agents are already using it, with voice recognition picking up signals of frustration or anger.

Pplkpr is an app that tracks and assesses your relationships.

Pepper is a robot being trained in the most intense emotional environment known: high schools. SimSensei helps veterans with PTSD.

And yes, it’s being used for ads.

But right now, emotion recognition is not working very well. We’re only working with a few emotions; we’ll get more nuance. Estimates suggest it’ll be a $50bn industry by 2020. Apple is doing work on this – the basis of it is in your phone already.
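To make the “only a few emotions” limitation concrete, here is a deliberately crude sketch (mine, not the speaker’s, and not any real vendor’s API): like much commercial emotion recognition, it forces every input into a handful of basic categories and has nowhere to put anything subtler.

```python
# Toy sketch, not from the talk and not any real product's API: much of
# today's emotion recognition maps input onto a handful of "basic"
# categories, roughly Ekman's six. A crude keyword matcher makes the
# coarseness of that label set obvious.
BASIC_EMOTIONS = {
    "joy":      {"happy", "great", "love", "fun"},
    "sadness":  {"sad", "miss", "crying"},
    "anger":    {"angry", "furious", "hate"},
    "fear":     {"afraid", "scared", "worried"},
    "disgust":  {"gross", "disgusting"},
    "surprise": {"wow", "unexpected"},
}

def classify(text: str) -> str:
    """Return the basic emotion with the most keyword hits, else 'unknown'."""
    words = set(text.lower().split())
    scores = {label: len(words & cues) for label, cues in BASIC_EMOTIONS.items()}
    best = max(scores, key=scores.get)
    return best if scores[best] > 0 else "unknown"

# Nuanced states -- envy, cringe, the typing-dots feeling -- have no bucket:
print(classify("accidentally liked my ex's photo at 2am"))  # -> unknown
print(classify("wow I love this"))  # -> joy (ties break by dict order)
```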

The age of emotional intelligence

By 2050 we could be in the age of emotional intelligence. We still don’t really know what an emotion is. Is it physical? Is it social? Emotions appear to shift and change over time, and that’s being studied now. And if you’re studying internet emotions, you have to start with cats. Emotion detection works really well on cat faces.

But, basically, emotions are really, really complicated, and we can’t oversimplify them. We need to give people agency over them. We’re developing micro-emotions and macro-emotions globally. Emotions are expanding as we try to get our point across. And we’re having emotions about emotions.

It’s pretty easy for this to go dark. But it doesn’t have to.

If we understand emotions well enough to build emotional robots, maybe we’ll be able to understand our own emotions better. Maybe as we design emotional technology, it will design us too.