The AI/Human Interface: experiments in action

AI has come at us fast — faster than we’ve been able to figure out how to use it. Matt Webb and Petr Janda explain how playful experiments can fix that.

In the second episode of the sixth season of the NEXT–Show, Petr “Parkan” Janda, studio lead at Accenture Song, and Matt Webb of Acts Not Facts discuss how we can playfully experiment with AI to figure out how it could improve our lives.


Like everyone, Webb is very interested in how we’re going to interact with AI. “You can loosely divide it into two areas,” he said. “There’s programmatic AI, where we use it to make connected and computational systems faster, and then there’s the use of it by individuals.”

The way we currently use AI as individuals is unsatisfying to him. So he’s making software sketches of the sorts of uses he’d like to see. Each sketch tells you something, and you can start iterating from there. This is needed because AI differs greatly from the technologies we’ve been exploring over the last decade: with those, we were pretty sure of their capabilities.

“With AI, we don’t know what it’s capable of, so we can’t use the same sort of improvement processes we used in the past. The way we sort it out is that we roll our sleeves up, get our hands dirty, and think through our making. It’s my favourite point of the technology S-curve.”

Experimenting to understand AI

Janda made a comparison with the early days of the movie camera: people made small, quirky films, figuring out what film could be in the process. A lot of it happened in a magician’s theatre in Paris…

But these experiments are a necessity. It might take us 10 years to figure out what we can do with the AI capability we already have.

“At the moment we’re imagination-bottlenecked,” said Webb.

Janda again made a comparison, this time with the blockchain hype, which promised everything would change fast — and it didn’t. “We’re moving forward in small steps with micro delightful things,” he said.

The intimate experimental approach

Webb’s approach is shaped by his time at Berg, and working with colleagues who specialised in interaction design — the moment the human being touched the material form of the product. It’s explicitly not about the user journey or goal. You look at the moments of interaction, and assemble them like Lego bricks into experiences.

“If I have to choose between that approach and spending all my time at a whiteboard, I’ll choose to get my hands dirty every time,” said Webb.

One example from those days a decade ago is the Little Printer. As Janda points out, it was popular everywhere, including in Prague. Webb remembers that it was born of that moment when the internet was transitioning from being trapped behind glass to being in the world around us. We’re largely through that now.

Products that talk back

But the studio was also thinking about how we relate to products that talk back. How do we do that in a way that doesn’t confuse us? The way to do it, he suggested, is to design around our psychology.

“It got us into designing other machines that talk back to you. Some Nespresso machines are now Bluetooth-connected and can talk to an app, so you can order more coffee.”

They were also interested in how you manage group-use devices. You don’t want people buying coffee multiple times when it’s attached to someone else’s credit card, for instance.

Borrowing from far away

Webb also broadens his thinking by borrowing ideas from other fields. He draws inspiration from computing history, especially the paths not taken, as he touched on in his talk. He also looks at sociology, and how we exist in groups of six to 12 people. And he likes looking at the physical world, and how our cognition makes sense of it. All of this feeds into his software sketches.

Some of Webb’s current work is with PartyKit, which is building the infrastructure to improve real-time collaboration on the web. He wants to make the web feel more alive — to give you a sense of how many people are on a page, or an awareness of an upcoming event. He’s playing with this idea on his blog: if you visit the site while other people are there, you’ll see their cursors in the background, under the text. Each link tells you how many people are at the other end of it.

A screenshot from Matt Webb's blog, Interconnected, showing links annotated with the number of people viewing the page at the other end of each link.

Text highlighted by one visitor is shown to other visitors in real time.
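To make the mechanism concrete, here is a minimal sketch of the kind of presence layer this implies: a server that broadcasts how many people are connected and relays their cursor positions. It uses the plain `ws` WebSocket package rather than PartyKit’s own API (which packages up exactly this sort of plumbing); the port number and message shapes are illustrative assumptions, not details from the talk.

```typescript
// Minimal presence sketch with the `ws` package (not PartyKit's actual API).
import { WebSocketServer, WebSocket } from "ws";

type CursorMessage = { type: "cursor"; x: number; y: number };

const wss = new WebSocketServer({ port: 1999 }); // port chosen arbitrarily

// Send a JSON payload to every open connection, optionally skipping one.
function broadcast(payload: object, except?: WebSocket) {
  const data = JSON.stringify(payload);
  for (const client of wss.clients) {
    if (client !== except && client.readyState === WebSocket.OPEN) {
      client.send(data);
    }
  }
}

wss.on("connection", (socket) => {
  // Tell everyone how many people are "on the page" right now.
  broadcast({ type: "presence", count: wss.clients.size });

  socket.on("message", (raw) => {
    // Relay cursor positions so each visitor can render the others' cursors.
    const msg = JSON.parse(raw.toString()) as CursorMessage;
    if (msg.type === "cursor") broadcast(msg, socket);
  });

  socket.on("close", () => {
    broadcast({ type: "presence", count: wss.clients.size });
  });
});
```

A page script would then open a WebSocket to this server, send its own cursor position on mouse move, and draw faint cursors for everyone else — the “alive” feeling Webb describes.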

“Once you experience this, everything else feels inert and lifeless,” Webb said. “Why did we let the internet get so sterile?”

The sociable internet

Janda pointed out that Figma and Google Docs have paved the way to this — but Webb suggested that this will be an option, not the default. Sure, emerging technologies make it easier to experiment and then add features like this; they solve many of the computationally difficult problems involved. But you probably don’t want to see real-time indications of other people looking at HR records, for example.

“What’s fascinating is that artists are getting involved,” said Webb. When they join in a process, there’s much more experimentation, and a search for the meaning that emerges when you have multiple people in a communal digital space.

Janda pointed out that this is very much a humanisation of the web. And Webb is trying to do the same with AIs, by trying to make interacting with them more like interacting with your colleagues. “I call them NPCs, from the non-player characters in the gaming world,” he said.

AIs as non-human colleagues

Fundamentally, he wants AIs you can riff with: ones you can @mention to fact-check something for you, followed by another to copy-edit. “It doesn’t make sense to me to have just one ChatGPT to do all this,” said Webb. “To be honest, I find having an editor involved during my first draft is constraining.”

Putting different AI capabilities into different personas may be a more productive and more human way of interacting with AIs. It also makes it easier for them to become proactive — to offer help — which ChatGPT can’t do right now.
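A rough sketch of what that might look like in code: different @mention handles mapped to different system prompts, each acting as a distinct “colleague”. The persona names and the model id are assumptions for illustration; the call itself is the standard chat completions endpoint of the OpenAI Node SDK, standing in for whatever model you prefer.

```typescript
// Hypothetical "NPC" personas: separate system prompts behind @mention handles.
import OpenAI from "openai";

const client = new OpenAI(); // reads OPENAI_API_KEY from the environment

const personas: Record<string, string> = {
  factchecker: "You check claims for accuracy and flag anything you are unsure about.",
  copyeditor: "You tighten prose and fix grammar without changing the meaning.",
};

// Route a message like "@factchecker is this figure right?" to the matching persona.
async function mention(message: string): Promise<string> {
  const handle = message.match(/^@(\w+)/)?.[1] ?? "";
  const system = personas[handle];
  if (!system) return "No such colleague.";

  const response = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed model id; swap for whatever you use
    messages: [
      { role: "system", content: system },
      { role: "user", content: message.replace(/^@\w+\s*/, "") },
    ],
  });
  return response.choices[0].message.content ?? "";
}
```

The point of the split is less technical than social: each persona can have its own moment to speak up, which is what makes proactive, colleague-like behaviour plausible.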

Communicating AI attention

The big challenge is that your preconceptions get in the way. How do we stop ourselves from slotting a new tool into existing ways of thinking? Webb tries to solve that problem by dressing his AIs up as other things — non-human things we interact with, like dogs. Currently, he’s “dressing up” AIs as dolphins, a self-confessed obsession of his. “It bumps you out of thinking of AIs like human colleagues, and opens you up to thinking about how it actually feels,” said Webb.

One thing he discovered was that the proximity of cursors is really important. If the AI’s cursor gets near your own, it feels like they’re “leaning in” to your work. The cursor is a proxy for attention here, in the same way that the Apple Vision Pro highlights the icons you’re looking at to indicate focus. This doesn’t translate well to the smartphone, where equivalents don’t really exist. And it might work very differently in AR, where indications of other people’s focus might feel too intrusive and intimate.
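The signal itself is simple enough to sketch: treat the distance between the AI’s cursor and your own as the attention proxy, and switch the interface into a “leaning in” state below some threshold. The threshold value here is an arbitrary assumption, not a figure from Webb’s experiments.

```typescript
// Toy "leaning in" detector: cursor proximity as a proxy for attention.
type Cursor = { x: number; y: number };

const LEAN_IN_DISTANCE = 120; // pixels; an assumed threshold, tune to taste

function isLeaningIn(ai: Cursor, me: Cursor): boolean {
  const distance = Math.hypot(ai.x - me.x, ai.y - me.y);
  return distance < LEAN_IN_DISTANCE;
}
```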

The character of an AI

Another AI experiment he’s running is a clock with an E Ink screen that displays a new AI-generated poem every minute. He’s discovered something of the “character” of different AI models, thanks to months of living with it.
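The loop behind such a clock is easy to imagine, and a speculative sketch is below: once a minute, ask a model for a short poem that works the current time into its lines, then push it to the display. The prompt wording and model id are assumptions, and drawing to an actual E Ink panel is left out; here the “display” is just the console.

```typescript
// Speculative sketch of a poem clock's inner loop, using the OpenAI Node SDK.
import OpenAI from "openai";

const client = new OpenAI();

async function poemForNow(): Promise<string> {
  const time = new Date().toLocaleTimeString([], { hour: "2-digit", minute: "2-digit" });
  const response = await client.chat.completions.create({
    model: "gpt-4o-mini", // assumed model; Webb compared GPT-3 and ChatGPT-class models
    messages: [
      { role: "user", content: `Write a four-line poem that mentions the time ${time}.` },
    ],
  });
  return response.choices[0].message.content ?? "";
}

// Refresh the display every minute; printing stands in for the E Ink update.
setInterval(async () => console.log(await poemForNow()), 60_000);
```

Living with an artefact like this for months is what surfaces the differences between models that benchmarks miss.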

“GPT-3 uses better words, has a better vocabulary and makes better poems,” he said. “ChatGPT is a bit insipid — but it’s a tenth of the cost. And occasionally, it fibs — it tells me the wrong time.”

The poems tend to be like “hanging out with a 15 cm tall LinkedIn influencer, telling me to get on with my day. Where is that coming from?”

There are 100 million people using ChatGPT now, and it, along with all its competitors, has this “high vibes, go-getter” tone.

“What’s that doing to us?” asked Webb. “Maybe the right thing isn’t to go out and be gung-ho. Perhaps the right thing to do is to sit down and spend time with your family. We’re all exposed to this vibe without thinking about it. As a society, we should be thinking about this, and about how these models are trained.”


This post is based on the conversation between Petr Janda, studio lead at Accenture Song, and Matt Webb of Acts Not Facts, on the NEXT–Show in November 2023.