Experimenting with AI at the ABC
Introduction
Anna Dixon kicks off her talk by acknowledging the Gadigal people of the Eora Nation and addressing the current AI fatigue. She emphasizes her focus on practical applications of AI at the ABC, highlighting its potential beyond productivity gains.
The ABC: Australia's Creative Powerhouse
Anna provides context about the Australian Broadcasting Corporation (ABC) as a massive content creator, striving to cater to all Australians across radio, television, digital platforms, and social media. She positions the ABC as a leading creative force in Australia.
The Role of the Innovation Lab
Introducing the ABC's Innovation Lab, Anna explains their mission to explore future content experiences and embrace new technologies like AI. The team's focus is to empower content creators and amplify their impact, aligning with the ABC's values and maintaining public trust.
Addressing Cultural Narratives and Public Concerns
Anna delves into the dystopian portrayal of AI in popular culture and acknowledges the valid concerns surrounding it. She stresses the need for responsible AI implementation and the ABC's commitment to editorial integrity, transparency, and maintaining audience trust.
Practical AI Applications at the ABC
Anna transitions into case studies showcasing practical AI implementations at the ABC. These examples include: automatically generating transcripts for audio content, providing audiences the option to listen to news articles, and exploring AI-powered translation for wider content reach.
Empowering Artistic Creativity with AI
This section highlights how the ABC's creative team utilized generative AI tools for the marketing campaign of their show "Gruen." The team explored AI for dynamic background generation, character animation, and visual effects, demonstrating AI's potential in creative fields.
Revolutionizing Weather Reporting
Anna presents a case study on using AI to automate localized weather reporting. She uses relatable scenarios to illustrate the challenges faced by both producers and audiences with current weather updates. The proposed solution leverages AI to generate localized scripts and audio, enhancing efficiency and relevance.
Understanding Audience Perspectives on AI
To gauge public sentiment towards AI-generated content, Anna announces joint research with the BBC across the UK, US, and Australia. The research aims to understand public awareness, perception, and expectations regarding AI in content creation, informing the ABC's approach.
Key Considerations for AI Implementation
Anna wraps up her talk by providing key takeaways for implementing AI. She advises focusing on scalable tasks, leveraging existing data, prioritizing transparency, considering job impact, and ensuring responsible use. Most importantly, she emphasizes the need to prioritize user experience and audience needs above all else.
Thanks, everyone.
I'd also like to acknowledge that this conference is taking place on the land of the Gadigal people of the Eora Nation, and pay respect to all First Nations people.
Has anyone here heard of AI?
It's pretty hard to escape at the moment, and having conversations outside, I'm picking up on some AI fatigue for sure.
It's in the news, it's in ads, there's talks, there's panels, there's think pieces.
So I appreciate that you're here to hear more about it.
I think a lot of those things, or at least the ones I've been seeing, are either quite esoteric and very detailed, or they're big-picture theoretical and ethical discussions.
Hopefully my talk today will be a little bit different.
I'm talking about really pragmatic, practical applications: things we've been trying out at the ABC over the last year or so.
If you were in the last talk in this room, you've already heard about the ABC, and different things come to mind for everybody.
Everybody has a different relationship to the ABC.
It is a massive and complex organization and we try and make content for every Australian.
That's why there's so much content.
It would probably be a full-time job just to log all of the content that we make every day.
We've got 70 radio stations, five TV networks and iview, three audio apps, constant news coverage, and bespoke content for social media: specific things for TikTok, specific things for Instagram, and so on.
We've got websites.
If you think about that, it's just this massive hive of creativity.
And our new big boss (I work in the content division) has taken to saying that we are the biggest and best creative team in Australia.
And sometimes he'll chuck in 'in the Southern Hemisphere'.
Internally, I'm skeptical; I rolled my eyes.
But then I started thinking about it, and I grew up with the ABC.
My desk is around the corner from the Triple J team.
And that gives me a little thrill every time I pass them, because that's some of the content that I grew up with and that really connected with me.
I work frequently with the children's team, the people who commissioned the biggest TV show in the world at the moment: Bluey.
And that's incredible.
So there is something there, and that's the context to keep in mind as I talk about our AI experiments.
A lot of companies are talking about AI in terms of productivity gains or efficiencies, and that's part of it, for sure.
But what I'm trying to do, and what our team and I guess everyone at the ABC working in this space is trying to do, is support our content makers, support that creativity and amplify the work that they're able to do, to get more content to all Australians.
The reason I get to work on this exciting stuff is that it's the remit of my team.
I work in the Innovation Lab, and we've got strategists, journalists, designers and engineers who support the content teams.
Those teams are all in the now, working on BAU stuff: getting content out today, this hour, this week.
And we create a bit of space to ask: what's a compelling content experience going to be for audiences in 12 months, in two years?
What's coming up?
And we try to support them through experiments and pilots: how can we get ready for that future?
So it's our job to be brave when it comes to adopting new technology, and to help our content teams work out how to use these new tools.
So if that's our remit, we can't possibly ignore AI.
And I like to take a human-centered design approach to thinking about new projects.
If you zoom out a little bit and think about the cultural narratives we've been told about AI my whole life, in film and TV and media, it can tell us a little about how people might respond to applications of AI now and into the future.
These narratives are pretty bleak, so I'm not surprised by all the concern that's coming out in a lot of the conversations we're having.
One common, really dystopian plot is humans losing control, losing superiority: The Matrix, I, Robot.
Another one is the creepiness of AI mimicking human physicality, like Ex Machina, and then there's the perennial theme of robots going rogue and turning against us, like the Terminator.
And it doesn't really help when you get headlines like 'Schwarzenegger proclaims the Terminator has become a reality due to AI'.
It's not all bleak.
There's the cuddly AI-as-best-friend character: R2-D2, Big Hero 6, and M3GAN.
Let's just replace that one.
So if that's what we've always been told about AI, where does that leave our internal Innovation Lab?
We have been seeing endless possibilities for machine learning and AI for a few years, but there's also that concern and confusion around it, which is completely understandable.
When ChatGPT went wild a year or so ago, the possibilities really got the attention of all these other teams at the ABC, and they started turning to us for help to narrow down: what should we do?
What should we do now?
So our focus this year in particular has been trying to help everyone separate that crazy hype from what's practical and applicable and safe to experiment with now.
And on top of that, we need to make sure we're making choices that align with the ABC's values.
We're not just trying out all this cool stuff that AI can do, as fun as that sounds.
We really need to bake our editorial and curatorial values into what we're building.
And we need to keep our sources tight.
We need to keep transparency, because the number one thing that I care about, and that the ABC cares about, is trust: keeping the trust of Australians.
So that's the aim.
I don't have all the answers, but we have been working pretty hard on how AI can benefit different parts of the ABC.
So I'll talk through a handful of case studies that show the kinds of problems we've been testing AI solutions on.
I'll talk about how we've been training our ABC AI model while also benefiting audiences in the process.
We'll talk about giving audiences the option to choose how they consume our content.
We'll talk about making this content available to more Australians via translation, using tools to empower artistic creativity, and streamlining work to give content makers more time for the tasks that matter.
So, starting off with a pretty basic application of machine learning: it feels safe.
It doesn't feel like M3GAN coming to kill everyone.
So when I started this project, I honestly wasn't that psyched about it.
I was like, Oh yeah, okay.
This seems valuable, sure.
But with the right partners, it really quickly grew into something that I'm very proud of.
Podcasts are a really significant and always-growing part of what we make at the ABC, but 98 percent of our audio content had no transcript.
That means not only that those episodes are inaccessible to the one in six Australians with hearing disabilities, but it also affected our ability to recommend and personalise the content we make, which is absolutely necessary and expected.
And these standalone audio files are like a black box to search engines.
We were really limiting SEO and content discoverability.
So while the purpose of this pilot was initially to test how ML-based transcription could be implemented across a range of podcasts, we knew that was only part of what transcribing everything offered us.
So we ran a pilot with a handful of podcast producers: every time they uploaded their podcast to the CMS, they created a transcript and edited it, because machine transcripts are never going to be absolutely accurate.
And at the ABC, audiences expect us to be a hundred percent accurate, and it's been baked into our content makers to do things properly.
So we were doing that, uploading the transcripts to see what we could learn about that workflow.
But we were testing a whole bunch of external transcription tools, and that meant we were just giving away our data.
We also realized that the cost to scale that to all of the audio we make was completely unachievable.
So we worked with our amazing internal AI/ML team and this project grew into building our own ABC transcription tool.
So it's connected to our CMS and automatically transcribes audio as it's uploaded.
Better yet, once those producers edit the transcripts, the platform learns from those fixes.
That improves our dataset, which trains our ABC AI model, and that means the transcription tool will make fewer mistakes next time, taking less time to edit and giving those producers back that time to make the cool audio that they make.
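To make that loop a bit more concrete, here's a minimal sketch of what a transcribe-then-correct pipeline can look like. It is not the ABC's actual tool: it uses the open-source Whisper model as a stand-in for the in-house transcription model, and the file names and corrections store are invented for illustration.

# Sketch only: a transcribe-and-correct loop, not the ABC's production pipeline.
# Assumes the open-source `whisper` package (pip install openai-whisper) as a
# stand-in ASR model; file names and the corrections store are placeholders.
import json
from pathlib import Path

import whisper

model = whisper.load_model("base")
CORRECTIONS = Path("corrections.jsonl")  # grows into a fine-tuning dataset


def draft_transcript(audio_path: str) -> str:
    """Auto-transcribe a newly uploaded audio file (the producer's starting point)."""
    return model.transcribe(audio_path)["text"]


def record_correction(audio_path: str, machine_text: str, edited_text: str) -> None:
    """Store the producer's edit alongside the machine output.

    Each pair becomes training data, so a later model version makes fewer of
    the same mistakes (the fine-tuning step itself is out of scope here).
    """
    with CORRECTIONS.open("a", encoding="utf-8") as f:
        f.write(json.dumps({
            "audio": audio_path,
            "machine": machine_text,
            "edited": edited_text,
        }) + "\n")


if __name__ == "__main__":
    draft = draft_transcript("episode_001.mp3")       # hypothetical upload
    edited = draft.replace("triple jay", "Triple J")  # producer's fix in the CMS
    record_correction("episode_001.mp3", draft, edited)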
And maybe most importantly to me, the audience response was really passionate.
Quotes like: 'These transcripts are like jewels to me.'
'I'm delighted to have this availability.'
'I wouldn't have listened to the podcast; I find they don't work well for me.'
That sort of stuff really gives me joy and meaning in my work.
And in terms of analytics, we saw that once people find those transcripts are there, they come back.
So it's also possibly good for retention.
Next case study is about allowing audiences to choose how they want to interact with our content.
This pilot started with a pretty simple proposition.
What if we could use text-to-speech to offer news readers the option to listen to articles rather than read them?
Is that going to be helpful for people on the go?
Line up a few articles and listen to them on your drive in to work.
Is that good for accessibility?
Is it helpful for multitasking?
And to test that, we went through a number of steps.
So we wanted to learn about the technology.
We wanted to create our own voice.
What would a synthetic voice that represents all of the ABC's text content sound like?
And we wanted to test with audiences in a live trial.
So we started by learning how to create a synthetic voice.
We worked with David McDonald, a senior sound designer in our in-house creative team, ABC Made, and we set about making our custom neural voice.
The first step was to record 300 phrases that the system gave us: it said, record these and put them into the tool.
So this was David recording what the system told him to say.
[Synthetic male voice] He also admitted he may not be at the Broncos in 2019.
I really liked some of the weird things that it asked us to say, like this one.
[Synthetic male voice] My hoops were getting smaller.
Sure.
And then we fed that in as the training data, and the system created a custom neural voice that sounded exactly like David.
And this is the result.
So David never said this.
[Synthetic male voice] Second Half First is a stunning memoir from one of Australia's most highly acclaimed writers.
So I was pretty blown away by this. I was expecting kind of a Siri-ish sound, but it was really quite natural.
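As a rough illustration of the synthesis step, here's what generating audio for an article with a custom neural voice can look like using Azure's Speech SDK. The talk doesn't name the platform the team used, and the key, region, voice name and deployment ID below are placeholders.

# Illustrative only: one commercial service that offers custom neural voices.
# The credentials, voice name and deployment ID are placeholders, not real values.
import azure.cognitiveservices.speech as speechsdk

speech_config = speechsdk.SpeechConfig(
    subscription="YOUR_SPEECH_KEY",   # placeholder credential
    region="australiaeast",
)
speech_config.endpoint_id = "YOUR_CUSTOM_VOICE_DEPLOYMENT_ID"       # placeholder
speech_config.speech_synthesis_voice_name = "YourCustomVoiceName"   # placeholder

audio_config = speechsdk.audio.AudioOutputConfig(filename="article.wav")
synthesizer = speechsdk.SpeechSynthesizer(
    speech_config=speech_config, audio_config=audio_config
)

article_text = (
    "Second Half First is a stunning memoir from one of "
    "Australia's most highly acclaimed writers."
)
result = synthesizer.speak_text_async(article_text).get()
if result.reason == speechsdk.ResultReason.SynthesizingAudioCompleted:
    print("Wrote article.wav")
else:
    print("Synthesis did not complete:", result.reason)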
So we ran a 30-day trial on the news website with the help of the news product team, selecting a range of different types of articles to see what this works best for.
Creating the audio, we fixed up any errors (again, we don't like errors at the ABC), and sometimes it came out with weird pronunciations, like 'Albanees' instead of Albanese.
And then we made it available to the audience.
So we ran this trial over 30 days and it worked really well.
People who started it listened to at least three quarters of the story.
We had a survey attached to the feature, and 88 percent said yes, they would listen to more of these things.
They wanted more.
It's a little over a year since we ran this pilot, and there are lots of internal reasons why we haven't implemented it more widely.
But we learned a lot from it, and it unlocked what we could do next, which was to start exploring translation.
So, the team that runs the Chinese-language news page, I can show you, there we go.
[Woman speaking Mandarin] ABC... I didn't mean to start playing it.
So we've got a news page written in Chinese.
They saw that pilot we did with the virtual voice, and they saw the opportunity to use the same text-to-speech technology to create an audio version of their news bulletin.
So they have a bulletin of top stories.
And there's multiple reasons to do this.
Increasing the availability of our news content in other languages is a priority for us.
Audio consumption, as I already mentioned, is constantly growing.
And at present we don't have any Chinese-language news audio offering.
So this was a really quick, low resource way to leverage what we'd already done and test out this thing to see if it met any audience needs.
I don't have too many insights to share about this one at the moment, because it's currently live, and the survey results are written in Chinese, so I'm going to need the team to help me with that.
So that last example is a Chinese synthetic voice reading Chinese text, so it's not translation. But building on what we'd already done, we were able to take one of the voices we'd created, of one of our radio hosts, Sarah McDonald, and teach her voice Mandarin literally overnight: you can just click a button and teach her Mandarin.
So this is an example of that. [Woman speaking Mandarin]
So you can imagine the possibilities that this provides us as the national broadcaster: being able to tell stories for all Australians in their native languages.
And just to clarify, Sarah agreed to be part of this experiment, and she used it as a segment on her radio show.
But recreating the voices of our key talent is not something that we're interested in.
So, moving on to empowering new artistic options.
You might be familiar with Gruen.
It's the show that's always ready to dig into the world of advertising.
I wasn't involved in this project, but it was too interesting not to include.
Our in-house creative team that I mentioned already, ABC Made, has a set of incredibly skilled people who have been mastering these generative AI tools.
It's quite impressive.
They were already using these tools on about 20 percent of their projects, but nothing audience-facing.
They were using them for idea generation, storyboarding, artwork mock-ups, that kind of thing.
But for the most recent season of Gruen, the copy line was 'real intelligence, artificially enhanced'.
And they were like, all right, that gives us some license to play in this space.
They were able to make this really dynamic, standout campaign; the backgrounds here are all AI-generated, and it was a huge campaign across billboards, posters, social media, TV and radio.
So it was the first opportunity to use this kind of technology on a campaign of that scale that was actually going to market.
They used a combination of DALL·E 2, Midjourney, Runway, Topaz, EbSynth, and Kaiser.
They worked with prompts to design these sets at the same time as the talent was in the photo shoot.
So they were able to change the backgrounds to suit the personality of the person having their photo taken.
You can see in the second image that Russell spontaneously started disco dancing.
Lame, but immediately they were able to respond to that and make some rapid prompts: laser lights, futuristic, disco, humanoid figures.
And so they were able to use what was inherent to that person and build the set around them, which I think was really interesting.
I'll just show a bit of a video to give you an idea about some of the work that was involved in animating Will Anderson's face to convert him into other characters.
[Woman's voice over] There are a few test animations converting footage of Will Anderson into other characters.
We tried many programs: Runway Gen 1, EbSynth, and WarpFusion.
Eventually, we settled on a program called CABER.
This was the most efficient due to our three day deadline.
It seemed the best equipped to handle the lip sync.
We found that AI animation works best when the subject is close to screen.
Facial features are harder to define at a distance.
The animation was then enhanced with graphics from the ABC made motion design team.
The three-day deadline: it's pretty amazing how much they were able to generate and explore in that time.
They picked up a Gold Creative Excellence Award for that campaign, for the best use of technology, a couple of weeks ago at the Promax Industry Awards.
So let's move on to talking a bit about reducing monotony with everyone's favorite monotonous topic, the weather.
So picture a radio station in north-west Queensland.
Erica is there. She's a radio producer and she's run off her feet trying to get everything organized for her breakfast show.
She's got guests coming in that are running late, and she's trying to deal with that.
She's keeping in mind all the stuff that's going on in the news and in the local area.
What's happening with the weather?
And she's like, oh, the weather, all right, yep.
She's got the whole area around her to check, because it's not just her little town that she needs to talk about.
So she logs onto the Bureau of Meteorology website.
She checks what's happening.
She needs to keep in mind the gulf winds, the temperature ranges, whether there's any rain, and she needs to do an update every seven to ten minutes because people are constantly turning on the radio.
And that's one of the things people want when they're starting their day.
She writes a quick script to keep everything about the weather in order, and then has to get back to what she's doing to make the radio show happen.
Later on in the day, we've got David.
He's a 60-year-old painter in Coorumba, he's listening to the radio, and he's planning to go on a fishing trip later in the day.
He heard yesterday that there might be some sort of flood warning, so he just, he's keeping an ear out for the weather.
He wants to understand what's going on.
He hears the weather update start.
And he's really annoyed, because it's a statewide, Queensland-wide forecast and it doesn't give him the information that he needs.
He doesn't use the internet.
So he doesn't quite know what to do.
He's probably just going to risk it.
Maybe he'll ask a friend if there's one down there and he's really annoyed.
He might even just call the radio station and give them a piece of his mind.
So there are two problems here with the way this is currently done.
We've got local radio producers who have so much to do in their day, and prepping the weather is another task that nobody likes doing.
And sometimes, for example when Erica is running her show, David does get these local, relevant, topical updates that help him.
But at other times it's a statewide, pre-recorded forecast that isn't updated outside of staffed hours.
So you can see why his particular situation would not be solved by hearing what's happening across all of Queensland, which is a massive state with lots of different weather systems.
So maybe neither of these sounds like the biggest problem in the world, but it's a real pain point for both staff and audiences.
Add to that the fact that the risk is a little bit lower, because weather forecasts aren't generally regarded as the world's most reliable things.
And that makes it a really good place for us to experiment.
If we add the data inputs that we already get, already clean and organized, from the Bureau of Meteorology and other national services, we have what we need to test out AI-generated localised weather updates: updates that take away that annoying task and give audiences more relevant information.
And then possibly that'll also open us up to ideas for what else we could provide automated local updates for in the future.
So we worked out what we needed to do to make it happen.
What's the flow of data and systems we need to involve to get the weather automatically out on the airwaves?
Starting with generating the script: as I mentioned, we already have an internal system that collates all of that weather data from the BOM and other services.
We can add a prompt to that and feed it into an LLM to generate a weather bulletin script.
And ideally we can get that to land in the scripting system that all the presenters use.
Once we get that, it solves the first problem for Erica.
She doesn't have to do that extra task.
She can just take that script and read it out.
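Here's a minimal sketch of that script-generation step. The talk doesn't name a model or provider, so this uses the OpenAI Python SDK as a stand-in LLM, and the weather-feed structure and town data are invented for the example.

# Sketch only: structured weather data plus a prompt, turned into a bulletin
# script by an LLM. Model choice, feed format and town data are placeholders.
from openai import OpenAI

client = OpenAI()  # assumes OPENAI_API_KEY is set in the environment

# Hypothetical slice of an already-clean weather feed (e.g. sourced from the BOM).
weather_feed = {
    "region": "North West Queensland",
    "towns": [
        {"name": "Mount Isa", "min_c": 22, "max_c": 36, "rain_chance": "10%"},
        {"name": "Cloncurry", "min_c": 23, "max_c": 38, "rain_chance": "5%"},
    ],
    "warnings": ["Gusty gulf winds expected in the afternoon"],
}

prompt = (
    "Write a 30-second local radio weather bulletin for the region below, in "
    "plain, conversational Australian English, suitable for a presenter to "
    "read on air. Mention each town's temperatures and rain chance, plus any "
    f"warnings. Data: {weather_feed}"
)

response = client.chat.completions.create(
    model="gpt-4o-mini",  # placeholder model choice
    messages=[{"role": "user", "content": prompt}],
)
script = response.choices[0].message.content
print(script)  # in practice this would land in the presenters' scripting system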
And the audience will hear that, but it doesn't solve David's problem, the second problem: if Erica is not on staff for that show, it reverts back to that pre-recorded, statewide podcast.
Sorry, weather update.
This is currently where we're at with the experiment: getting feedback from presenters about whether this is useful and helpful.
Or is this something we should just scrap because it doesn't really change their day?
The second part we could do, thinking back to what we've already learned about synthetic voices, is use a synthetic voice to turn the script into audio.
We could then use dynamic insertion to put it in the right place.
So as I mentioned before, we've got 70 radio stations.
Think about how many programs we've got on all of those radio stations.
We don't want the far North Queensland weather to be accidentally read out in Hobart.
But you might have heard this before: say you're listening to an American podcast and then you hear a local ad.
That's dynamic insertion.
And we can use a similar internal system to make sure the audio goes to the right place.
Then it can be read out in the right program, and David can get his properly useful update.
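And as a toy illustration of that routing idea, here's a sketch of matching region-tagged bulletins to the right program. The program names, region tags and fallback file are invented; a real broadcast playout system would be far more involved.

# Toy sketch of dynamic insertion routing: pick the bulletin whose region
# matches the program's local area, otherwise fall back to the statewide update.
# Program names, region tags and file names are invented for the example.
from dataclasses import dataclass


@dataclass
class Bulletin:
    region: str      # which forecast this audio covers
    audio_path: str  # synthetic-voice audio generated from the script


# Hypothetical mapping of local programs to the forecast region they need.
PROGRAM_REGIONS = {
    "North West Queensland Breakfast": "North West Queensland",
    "Hobart Breakfast": "Tasmania",
}

STATEWIDE_FALLBACK = "qld_statewide.wav"  # placeholder pre-recorded update


def bulletin_for(program: str, bulletins: list[Bulletin]) -> str:
    """Return the audio file to slot into this program's weather break."""
    wanted = PROGRAM_REGIONS.get(program)
    for b in bulletins:
        if b.region == wanted:
            return b.audio_path
    return STATEWIDE_FALLBACK


bulletins = [Bulletin("North West Queensland", "nwq_0600.wav"),
             Bulletin("Tasmania", "tas_0600.wav")]
print(bulletin_for("Hobart Breakfast", bulletins))  # -> tas_0600.wav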
So there are two things that we think a lot about with all of these projects.
How is this going to affect our staff?
How is this going to help our audience?
Any of these projects might not continue, and that's fine, because they're experiments; we're trying to learn things rather than just have conversations that end up going in loops. If we can actually run an experiment, learn something and do something practical, it really focuses those conversations and we make more progress.
But sometimes you do need good, solid foundational research as well.
So to help us understand, from an audience perspective, what the vibe is on generative AI, we're partnering with the BBC on some research across the UK, US and Australia.
We want to know the baseline level of awareness and understanding of AI and its applications.
I think in this room we'd have a very high level of awareness, but more broadly we're not really sure whether people know when they're already interacting with AI and when they're not.
We want to know what role people think generative AI could have in the content that audiences consume.
We want to know what qualities they want when it comes to news, sport, music, video, the weather, education.
I'm really interested in that friction point.
When does something feel safe and fine? I would guess transcription does.
And when does it start to click over into Terminator territory and make people feel a little bit uneasy?
And the aim of all of this is to help us understand what good generative AI content looks like from a public broadcaster whose mission it is to inform, educate, and entertain all Australians.
So we've talked through a bunch of different applications, and maybe it's helped you identify some areas where you could start using AI in your work more than you currently are.
And there are a few more things to look for when trying to find those problems.
Think about ones where the scale of addressing the problem is not feasible for humans.
For example, transcription: we're not going to transcribe our enormous catalogue of audio with humans doing it.
Without data, there's no machine learning, so consider how you could explore automating tasks where there's already clean, organised data, like in that weather feed.
Play in spaces that are low risk if the information isn't correct or if the model hallucinates.
At this point, for the ABC it's not at all feasible not to have a human check the AI output before publishing.
And if you saw the keynote yesterday afternoon, I think she had some really good points around that as well.
So consider what your risk appetite is for that.
And think about accessibility and how you can make things easier for your users or customers or audience.
And a few more things to consider before you start experimenting.
Think about transparency and how you can communicate that you're using AI in that task.
I think that's something I underestimated when we did that virtual voice experiment over a year ago, but now I'm regretting that and really thinking about it a lot.
Think about the impact your idea will have on jobs if it's really successful, and whether you want to go there.
Ask if you are the right person: who should be doing this, and who should be using it?
And if you're not, that's fine.
I'm not the right person for a lot of the stuff that I talked about and that's why there were so many partnerships.
Can you verify the information, or do you need to work with someone who's an expert in that space?
It can also be really tempting to think about the efficiency gains and less about the audience or the users, but it's really important to keep them at the center of your thinking: how can you use this to improve the experience they're having?
That's it for me.
Thank you very much.