Generative AI fuels job insecurity in the arts, researchers push for labour protections



CO-EDP, VisionRI | Updated: 11-11-2025 17:18 IST | Created: 11-11-2025 17:18 IST

Artificial intelligence (AI) is rapidly transforming creative work, reshaping the meaning of authorship, and amplifying long-standing labor inequalities in Australia's cultural industries. A new academic study published in the Journal of Industrial Relations finds that AI's rise has exposed structural flaws in the nation's industrial relations framework and shown how fragmented governance is failing creative workers in the digital age.

The study “Rethinking Industrial Relations: Policy Ecologies, Cultural Work and Artificial Intelligence” offers a critical evaluation of how existing labor, copyright, and cultural policies fail to keep pace with automation and the expanding use of generative technologies in film, music, media, and visual arts.

AI disruption deepens precarity in creative work

According to the study, AI is a catalyst intensifying pre-existing vulnerabilities in Australia’s cultural and creative industries (CCIs), which have long been characterized by freelance, project-based, and informal employment. While the federal government’s Revive cultural policy, launched in 2023, recognized artists as workers and promised stronger career sustainability, the authors argue that this recognition remains largely symbolic. Australia’s industrial relations laws, they note, remain rooted in an outdated model of standard employment that excludes most creative professionals from the protections it offers.

According to the research, around two-thirds of Australian creative businesses are non-employing entities, meaning the majority of artists operate as self-employed contractors. This status leaves them outside collective bargaining coverage and denies them access to social protections such as paid leave, minimum wage standards, and dispute resolution mechanisms.

As generative AI automates key creative processes, from scriptwriting and voice acting to design and music composition, these structural weaknesses are becoming more pronounced. The authors argue that while AI is often presented as an innovation driver, its rapid adoption in the creative sector has led to job displacement, erosion of authorship, and new forms of economic exploitation.

The study draws parallels with the 2023–2024 Hollywood strikes by the Writers Guild of America (WGA) and SAG-AFTRA, where artists demanded guarantees over AI usage, digital likeness rights, and residual payments. In Australia, similar anxieties are emerging but without clear legislative safeguards or enforceable agreements.

In short, AI's integration into production and creative workflows is reshaping not just the economics of cultural labor but also its cultural meaning, destabilizing traditional notions of ownership, credit, and creative control.

Policy ecologies in disarray: Fragmented regulation and lost protections

Using a policy ecology approach, the authors analyze how multiple governance systems (industrial relations, copyright law, cultural policy, and technology regulation) intersect and often contradict one another. They describe these overlapping frameworks as a "disorganized policy ecology," one that leaves creative workers unprotected at the convergence of artistic labor and automation.

The study examines 30 formal submissions made to two major Australian parliamentary inquiries: the Senate Select Committee on Adopting Artificial Intelligence (2024) and the House of Representatives Inquiry into the Digital Transformation of Workplaces (2024). These submissions, representing unions, professional guilds, media organizations, and advocacy bodies, reveal widespread concern about the exploitation of creative work through AI.

Several key themes emerged:

  1. Unlicensed Data Use: Artists report that their creative works (art, scripts, songs, and voice recordings) are being scraped from digital platforms to train AI models without consent or compensation. This practice not only breaches copyright but also violates moral rights, which protect attribution and integrity.

  2. Loss of Authorship and Attribution: AI-generated works increasingly mimic identifiable human creators, blurring artistic ownership and diminishing the cultural and economic value of authorship. This phenomenon threatens to devalue the distinctive creative identities that underpin cultural production.

  3. Market Displacement: Voice actors, illustrators, and musicians are among those most affected. Submissions reveal that voice-over work has declined by as much as 80%, while AI-generated journalism and stock imagery continue to undercut human labor.

  4. Cultural Exploitation of First Nations Artists: Indigenous creators face heightened risk as their art, stories, and music are reproduced or simulated by AI systems. The lack of protection for Indigenous Cultural and Intellectual Property (ICIP) perpetuates historical inequities, allowing cultural appropriation to persist in digital form.

As per the study, these challenges arise not simply from technology but from the absence of coordinated policy. While Australia’s copyright law still insists on human authorship, it offers little clarity for hybrid works created through human–AI collaboration. Meanwhile, industrial relations law fails to classify gig-based creators as workers, leaving them unrepresented in wage and contract negotiations.

This regulatory fragmentation, the authors warn, allows tech companies and production studios to exploit ambiguity, claiming ownership over AI-assisted works and bypassing the collective frameworks that once secured fair labor practices in the arts.

Lessons from abroad: Toward an integrated industrial relations model

The authors cite international examples to demonstrate how other jurisdictions are adapting more proactively. In the United States, new contracts negotiated by WGA and SAG-AFTRA established baseline rights for creative workers in an AI-saturated environment. These agreements require explicit consent before using an actor’s digital likeness, credit for AI-assisted scripts, and compensation for derivative works.

In Canada, the Alliance of Canadian Cinema, Television and Radio Artists (ACTRA) has pushed for similar protections against voice cloning and synthetic reproduction. These collective bargaining victories illustrate the potential of union-led negotiation as a model for protecting creative labor in the AI era.

By contrast, Australia's Creative Workplaces initiative, established under the Creative Australia Act 2023, lacks enforcement authority and remains disconnected from both copyright and AI governance. It encourages ethical practices but offers no binding legal mechanisms to ensure compliance.

The authors argue that Australia needs to move beyond fragmented reform toward a unified, cross-sectoral strategy. This includes:

  • Expanding collective bargaining rights to include independent contractors and gig-based creative professionals.
  • Embedding moral rights protections within AI governance frameworks to secure consent and attribution.
  • Mandating transparency in AI training datasets to ensure creators can verify when and how their work is used.
  • Integrating industrial relations, cultural policy, and copyright law into a single, enforceable system of creative labor governance.

The study also calls for stronger collaboration between regulators, unions, and technology developers to ensure that innovation does not come at the expense of artistic livelihoods.

Rebuilding cultural work for the AI future

Protecting creative labor in the AI age requires a shift in policy philosophy, from treating artists as isolated contractors to recognizing them as essential contributors within an interconnected cultural ecosystem. This transformation demands not only legal reform but also a cultural commitment to valuing creative labor as work, deserving of the same rights and security afforded to other sectors.

Practically, the authors envision a coordinated policy ecology where cultural, technological, and legal systems operate in tandem. AI governance would include ethical guidelines, transparent data management, and labor protections, while copyright law would clarify ownership boundaries in hybrid creative processes.

The study warns that failure to act risks normalizing a creative economy defined by exploitation, invisibility, and algorithmic appropriation. Conversely, a reimagined industrial relations system, one that embraces the realities of AI and digital work, could anchor a fairer and more resilient creative sector for decades to come.

First published in: Devdiscourse