The chair of the ABC has warned that AI could become “dangerous and sinister”, given that some of those who finance it hold views that are “extremely autocratic”.
Kim Williams, who has been chair of the national broadcaster since March 2024, is a prolific user of various AI applications, including ChatGPT, Gemini and Perplexity, saying it is important for people to understand the technology.
“I make myself use it and try to understand it … I don’t pretend to be an expert, but I’m certainly actively, passionately interested because it’s the next major technology that is going to change our world,” he told Guardian Australia.
“As with all technology, this is an immensely useful tool. But it’s a tool and we should not treat it as so many others … have: in a rather undisciplined, romantic way.”
Williams warned that technology can embody the values of those who create or control it, and said he was concerned about what this might mean for AI.
“There are many value constructs that are reflected in AI that can be seen in some ways as being potentially dangerous and sinister,” he said.
“Many of the participants in the financing – and even in the origination and leadership of some of the AI [companies] – have unusually severe views of human organisation and politics, and in some instances … have views that are extremely autocratic, and believe in an anointed few being in charge of the many.”
Williams said that as a believer in democracy and the contest of ideas, it’s “clearly immensely socially dangerous” for people to limit and censor the views of those they disagree with.
“We should not underestimate the potency and power of these technologies – and we are seeing living examples of the technologies in the hands of some governments, where we have real life demonstrations of just how dangerous this can be.”
Asked whether he thought media organisations – including News Corp and the Guardian – should be signing deals with AI companies, Williams said everyone needed to carry the responsibility of using these technologies with a sense of the public and national interest.
“I speak about it openly with colleagues because I think these things are immensely important.”
AI companies failed to secure a text and data mining exemption in Australian copyright law, which would have allowed them to train AI on creative works without paying for them. Last month, the Albanese government ruled out introducing such an exemption.
Williams, who chaired the Copyright Agency for six years, said people have a right to derive income from their creative work.
“Anything that is going to compromise that is not cool, and it’s not acceptable, and that is, as far as I know, illegal,” he said.
“And it should get the full … defence and prosecution by government. You’ve got to pay people who have invested their lifetimes in creating works.”
Williams said he saw AI as potentially devastating for entry-level jobs in industries such as accounting or law, but he believed the effect on journalism jobs would not be as damaging.
“I think the impact on journalism will probably be a lot more benign. Well, actually, a lot more positive than people think because journalists are smart, perspicacious people, and they’ll figure out ways that AI will actually make them better and stronger and will improve journalism,” he said.
“But boy, in so many other areas, I just think it can be an employment destroyer.”
