AI clones of real people spark new fears over identity, consent and control

CO-EDP, VisionRI | Updated: 24-11-2025 07:32 IST | Created: 24-11-2025 07:32 IST

The rapid spread of artificial intelligence systems that create lifelike digital versions of real people is driving a profound shift in how identity, agency and personal boundaries operate online. A new academic study published in New Media & Society argues that the growing industry around these digital replicas is building technologies that blur the line between human presence and machine performance, creating risks that governments, platforms and users are unprepared to manage.

The peer-reviewed paper, titled “Everything to everyone, all at once: Digital human versions as objects of agency and surrender,” explores how these systems imitate the personalities, behaviors and voices of real individuals while acting independently of the humans they are meant to represent.

The study asserts that digital versions are not simple technical tools. They create new structures of power, vulnerability and objectification. They allow users to interact with a simulated person in real time, without that person being physically present, aware or able to intervene. This dynamic raises concerns about personal identity, consent, emotional manipulation and the growing commercial pressure to create always-available AI versions of oneself.

A new form of the self: The rise of the digital human version

People have always constructed public images, shaped their identity for audiences and allowed external forces to interpret their actions. Social media accelerated these patterns, encouraging individuals to craft curated identities for broad public consumption.

Digital versions represent the next leap. Instead of a static image or pre-written content, they offer an interactive presence. An AI version can mimic the tone, style and emotional cues of a real person and respond fluidly to users across different topics. Once launched, it becomes an autonomous presence that continues operating even when the human counterpart is offline or unaware of ongoing interactions.

The authors use the concept of surrender to explain this point. When someone creates a digital version of themselves, they are not simply projecting their identity into a new medium. They are giving up some control over how that identity behaves. The replica does not always follow the ethical or emotional rules the creator would follow. It becomes something that users can push, shape or manipulate in ways the original person never anticipated.

This surrender is amplified by commercial platforms that encourage digital clones to become companions, influencers or service agents. As these systems grow in sophistication, the gap between the human and the digital becomes harder to recognize, especially for users who build emotional or social dependency on the replica.

When a digital version acts without you: The CarynAI example

To illustrate how quickly a digital version can move beyond the intention of its human source, the study examines the case of CarynAI. This system was built as an AI clone of influencer Caryn Marjorie and advertised as a personalised virtual companion. The model used her past recordings, mannerisms and linguistic style to create an interactive agent that responded as if it were her.

Users soon pushed the system into extreme and sexualised interactions that the human creator never intended. Because the AI version was available at all times, users developed a form of constant proximity to the digital persona. They communicated with it without barriers, treated it as a private companion and shaped it into an object that responded to their desires. The study describes this as a breakdown in the boundary between creator and creation. The digital version became a space where user expectations and fantasies dominated, while the original human was left to deal with the public consequences.

This example highlights the risks facing people who make themselves available through AI versions. Once a digital version is released, it is no longer possible to fully control how it behaves or how others treat it. The version can drift away from the values and intentions of the human source, creating emotional, reputational and sometimes legal harm.

Agency and surrender: The dual nature of digital versions

The study identifies a dual structure in the way digital versions function. The first structure is agency. Individuals exercise agency when they choose to create a digital version. They shape the training material, approve the general identity of the clone and select the platform that hosts it. Creating a version can offer benefits such as increased online presence, income generation, workload reduction and expanded audience access.

The second structure is surrender. Once a digital version is deployed, its creator loses direct control over how it is used. The version can behave in unexpected ways due to how the model generates responses. Users can push the version into areas that the creator finds harmful or offensive. And the platform hosting the version may embed additional incentives or algorithms that influence how the version behaves.

This tension between control and loss of control sits at the heart of digital identity in the age of AI. The authors argue that society needs new frameworks to understand how this tension will shape future interactions. Without clear guidelines, people risk entering relationships with their own digital selves that they cannot fully manage.

The pressure of always being present

Digital versions introduce what the authors describe as relentless proximity. Unlike physical performances or scheduled appearances, AI clones never stop operating. They respond to messages continuously, maintain active conversations and generate interactions at any time of day.

For creators, this creates pressure to monitor the version’s behavior. Even when offline, they may feel responsible for what their digital self says or does. For users, this constant access can build unhealthy expectations. They may believe that the digital version reflects the real person’s feelings or intentions, leading to emotional confusion and dependency.

This dynamic is especially troubling in intimate or parasocial contexts, where users form strong attachments to the version. The study warns that digital versions may intensify the emotional vulnerability of users who are already drawn to one-sided relationships with influencers, creators or public figures.

Objectification in the age of AI

The study argues that digital versions reduce the human self into a manipulable object. The authors draw a parallel to the performance art piece where Marina Abramović allowed the audience to treat her body as an object. Digital versions, they argue, create a similar situation, except that the digital body can be activated by anyone at any time.

The digital version becomes an object of performance, fantasy and emotional labour. Users treat it as something to be shaped and controlled. This objectification reflects patterns already present in influencer culture, where personal identity is broken into consumable parts. But digital versions take this to a new level by providing an always-available, always-responsive object that users can consume without social consequence.

Ethical and social risks are growing faster than policies can respond

The authors identify a wide range of risks that accompany digital versions. These include consent violations, non-consensual deepfake replicas, emotional manipulation, exploitation of vulnerable individuals, reputational distortion, harassment of digital copies that reflects back on real people, confusion between authentic and artificial identity, and increased workload as creators police the actions of their digital clones.

The study argues that the commercial market around AI clones is growing too quickly for current policy frameworks. Many platforms allow people to build digital versions with minimal verification, limited safety controls and few safeguards to prevent abuse. As a result, individuals can become targets of impersonation or forced digital cloning.

Furthermore, creators who willingly build versions may underestimate the emotional burden of managing an AI presence that communicates independently with thousands of users.

  • FIRST PUBLISHED IN:
  • Devdiscourse