Creative workers face job loss and exploitation as AI outpaces industrial reforms
Australia’s cultural workforce is facing an unprecedented upheaval as artificial intelligence (AI) accelerates structural inequality, undermines job security, and blurs the boundaries of authorship.
A new study published in the Journal of Industrial Relations, titled “Rethinking Industrial Relations: Policy Ecologies, Cultural Work and Artificial Intelligence,” examines how AI is transforming the creative economy and exposing long-standing weaknesses in the nation’s industrial and cultural policy frameworks.
Creative work at the edge of automation
Australia’s cultural and creative industries (CCIs) have always operated on fragile foundations: freelance contracts, project-based work, and minimal collective protections. While the federal government’s Revive cultural policy, launched in 2023, acknowledged artists as workers deserving fair pay and sustainable careers, the authors reveal that the industrial relations (IR) framework remains structurally incompatible with creative employment.
Nearly two-thirds of creative businesses in Australia are non-employing entities, meaning they rely on independent contractors rather than salaried staff. This leaves the majority of creative professionals (musicians, writers, designers, and actors) outside the reach of labor laws that protect traditional employees. Without union coverage or standard contracts, these workers remain vulnerable to income instability and intellectual property theft, challenges now magnified by the rapid adoption of AI tools.
The authors point to the Hollywood strikes of 2023–2024 as a global warning of what happens when creative labor confronts automation without protection. As AI systems begin generating scripts, soundtracks, and visual designs, human artists face displacement from production chains once reliant on human skill and originality. Beyond economic loss, the authors argue, this technological disruption destabilizes the cultural meaning of work itself, transforming art into data and authorship into algorithmic replication.
AI, they contend, is not just reshaping creative output but redefining ownership, credit, and consent. The collapse of traditional boundaries between human and machine creativity threatens to erase the very notion of the artist as a rights-bearing worker, a concern echoed across multiple submissions to parliamentary inquiries.
The three fronts of the AI crisis: Input, output and displacement
Based on submissions to the Senate Select Committee on Adopting Artificial Intelligence and the House of Representatives Inquiry into the Digital Transformation of Workplaces (both held in 2024), the study identifies three main dimensions of concern among artists, unions, and cultural institutions: input, output, and displacement.
Input: Consent and transparency
AI’s dependence on massive datasets has triggered alarm among creators whose works are used to train generative models without permission or compensation. The study documents widespread evidence of unauthorized scraping of copyrighted material, including voices, images, and songs, to feed commercial AI systems. Actors have reported unauthorized voice cloning, musicians have seen their melodies replicated, and visual artists have discovered their distinctive styles mimicked in AI-generated content.
The authors note that this practice violates both copyright law and moral rights, undermining the principle that artists retain control over how their works are reproduced or altered. The opacity of AI training processes compounds the problem: most creators have no way of knowing if or how their material has been used. Stakeholders are now calling for mandatory transparency registers that disclose the datasets behind generative systems.
Some industry bodies have proposed a statutory licensing scheme, similar to royalty collection systems in the music industry, to ensure fair compensation. Others, particularly tech-focused advocacy groups like the Australian Digital Alliance, oppose such measures, arguing that they could stifle innovation. The authors warn that this clash between creators’ rights and technological freedom encapsulates a broader struggle over cultural value in the digital age.
Output: Authorship, attribution, and market confusion
The second area of concern revolves around the ownership of AI-generated material. As algorithms produce works that replicate the style or voice of human creators, questions arise about who owns the output and whether such material can be protected under copyright law. The Australian Copyright Council maintains that only works involving “sufficient human creativity” qualify for protection, leaving hybrid human–AI collaborations in a legal grey zone.
Organizations such as the Australian Directors Guild and Australian Production Designers Guild warn that this uncertainty is destabilizing creative markets. Producers risk infringement when using AI tools trained on unlicensed data, while artists face competition from synthetic imitations of their own work. The study documents how voice actors and illustrators have lost significant income as clients opt for cheaper AI-generated alternatives.
The impact extends beyond economic loss. The authors argue that the erosion of authorship diminishes cultural diversity by privileging algorithmically generated content trained on existing, often Western-dominated datasets. This amplifies structural inequities, sidelining Indigenous, female, and culturally diverse creators whose work may be underrepresented in the digital corpus.
Displacement: The disappearing workforce
The third dimension concerns direct labor displacement. AI’s capacity to automate routine creative tasks (script drafting, voice recording, music scoring) has triggered widespread job insecurity. Voice actors report work reductions of up to 80%, and early-career writers and journalists struggle to find opportunities as AI-generated content floods digital media.
Most concerning, according to the study, is the impact on First Nations creators. The unregulated scraping of Indigenous music, imagery, and stories for AI training datasets is described as a continuation of cultural appropriation. Existing legal frameworks offer no protection for Indigenous Cultural and Intellectual Property (ICIP), leaving communities powerless to prevent their heritage from being commercialized or distorted by AI.
The authors describe this as a form of “digital colonization”: a process that reproduces historical injustices under the guise of technological progress.
Rethinking policy ecologies for the AI era
The governance of creative labor in Australia is fragmented across disconnected policy domains: industrial relations, copyright, cultural funding, and technology regulation. This “disorganized policy ecology” leaves workers navigating contradictory systems that fail to protect them from AI-driven exploitation.
The authors call for an integrated reform strategy that bridges these silos, aligning labor law, cultural policy, and AI governance to ensure fair and ethical treatment of creative workers. They point to promising developments abroad: the Writers Guild of America (WGA) and Screen Actors Guild–American Federation of Television and Radio Artists (SAG-AFTRA) secured landmark agreements in 2023–2024 that established consent requirements for AI use, credit for AI-assisted work, and residual payments for synthetic reproductions. Similar progress in Canada, through collective bargaining by the Alliance of Canadian Cinema, Television and Radio Artists (ACTRA), shows how unions can negotiate directly for digital rights.
By contrast, Australia’s current approach remains heavily policy-driven rather than enforcement-based. The government’s Creative Workplaces initiative, launched under the Creative Australia Act 2023, promotes fairness and safety in cultural employment but lacks the authority to enforce compliance or coordinate with copyright regulators. As a result, policy ambition far outpaces legal reality.
The authors propose expanding collective bargaining rights for freelancers and independent contractors, allowing them to negotiate AI-related protections. They also advocate embedding moral rights and cultural integrity clauses in AI governance frameworks, ensuring creators can consent to, control, and benefit from the use of their work in digital training systems.
They stress that the stakes extend beyond economics. The erosion of creative labor rights threatens cultural democracy itself, reducing artistic production to data extraction and weakening public trust in cultural institutions. Without comprehensive reform, the creative industries risk becoming laboratories for exploitative digital labor practices, where technology advances at the expense of human dignity.
- FIRST PUBLISHED IN:
- Devdiscourse

