
AI companions

Author: Vítor Bernardo

AI companions are digital entities designed to simulate human-like conversations and relationships through artificial intelligence. They are marketed as virtual friends, romantic partners or personal assistants, claiming to provide emotional support, entertainment, companionship, and even coaching.

At their core, AI companions use natural language processing (NLP)[i] and natural language understanding (NLU)[ii] technologies that convert spoken or written input from users into structured data, enabling analysis of user intent and sentiment. The resulting understanding feeds into dialogue management systems, which are components of chatbots that handle the continuity and flow of conversations and manage the memory of the AI companion. The responses of these systems are generated by LLMs[iii] and can be enhanced using retrieval-augmented generation[iv] (RAG) systems to incorporate domain-specific knowledge in the conversations. The responses are then translated into human-like speech through text-to-speech (TTS) systems, which manage natural intonation, flow, cadence and voice style.
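The stages described above can be sketched in miniature. The following Python example is purely illustrative: it replaces the NLU model, the LLM and the TTS engine with trivial rule-based stand-ins, and every word list and reply template is invented for the example.

```python
# Toy sketch of an AI-companion dialogue pipeline (NLU -> dialogue
# management with memory -> response generation). Real systems use
# trained models at each stage; these stand-ins are illustrative only.

POSITIVE = {"great", "happy", "love"}
NEGATIVE = {"sad", "lonely", "anxious"}


def analyse_input(text: str) -> dict:
    """NLP/NLU stage: turn raw text into structured intent and sentiment."""
    words = set(text.lower().split())
    if words & NEGATIVE:
        sentiment = "negative"
    elif words & POSITIVE:
        sentiment = "positive"
    else:
        sentiment = "neutral"
    intent = "greeting" if words & {"hi", "hello"} else "statement"
    return {"intent": intent, "sentiment": sentiment}


class DialogueManager:
    """Keeps conversational memory and decides what to say next."""

    def __init__(self):
        self.memory = []  # full conversation history

    def respond(self, text: str) -> str:
        analysis = analyse_input(text)
        self.memory.append(("user", text))
        # A real companion would build a prompt from the analysis, the
        # memory and any RAG-retrieved context, and send it to an LLM;
        # here we simply template a reply.
        if analysis["sentiment"] == "negative":
            reply = "I'm sorry to hear that. Do you want to talk about it?"
        elif analysis["intent"] == "greeting":
            reply = "Hello! How are you feeling today?"
        else:
            reply = "Tell me more."
        self.memory.append(("companion", reply))
        return reply  # a TTS stage would render this as speech
```

The dialogue manager's memory list is what gives the companion its apparent continuity: every exchange is stored and, in a real system, fed back into subsequent prompts.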

AI companions are deliberately designed to create a convincing sense of social presence and continued relationship, allowing them to generate contextually appropriate and seemingly morally considerate responses. These capabilities can be further enhanced through advanced multimodal AI[v] techniques that purportedly interpret human emotions from facial expressions, vocal tone and body language.

Supplementary features, such as customisable avatars, further enhance the user experience by rendering interactions more immersive and personalised.

One notable application involves training these companions to closely mimic real individuals through iterative learning - using messages, dialogues, texts and other personal communications. In theory, conversational AI models can be fine-tuned with personal data, such as emails, text messages or social media interactions, to replicate an individual’s unique communication style. This allows the AI to adopt their vocabulary, tone and mannerisms, creating a digital representation that feels authentically connected to that person.
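This fine-tuning step can be sketched as data preparation: each past exchange becomes one supervised training example in which the persona's actual reply is the target. The JSONL "messages" layout below follows the chat format used by several fine-tuning APIs, but the field contents and persona name are hypothetical.

```python
import json

# Illustrative sketch: converting a person's message history into
# supervised fine-tuning records, so a model learns to answer in
# that person's own vocabulary, tone and mannerisms.


def build_finetune_records(dialogue, persona_name):
    """Each (incoming message, persona's actual reply) pair becomes one
    training example in chat-style JSONL format."""
    records = []
    for incoming, reply in dialogue:
        records.append({
            "messages": [
                {"role": "system",
                 "content": f"Respond in the style of {persona_name}."},
                {"role": "user", "content": incoming},
                {"role": "assistant", "content": reply},
            ]
        })
    return records


# Hypothetical message history scraped from texts or emails.
history = [
    ("Are you coming tonight?", "yeah defo, see u at 8 :)"),
    ("Did you see the match?", "mate. don't even start."),
]
jsonl_lines = [json.dumps(r) for r in build_finetune_records(history, "Alex")]
```

Note that exactly this step, turning someone's private messages into training targets, is where the data protection concerns discussed later in this report arise.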

Some companions simulate a dynamic “personality” that evolves as users reveal more information about themselves. They can also enable users to create and interact with fictional personas modelled after celebrities, fictional characters or historical figures. This high degree of customisation fosters a sense of uniqueness and authenticity, encouraging users to form emotional connections and share personal information with the AI companions.

One of the most prominent applications of AI companions is providing emotional support. Individuals often turn to them to ease feelings of loneliness or anxiety, or to engage in conversation without fear of judgment. While some platforms are oriented toward mental wellness - offering mindfulness exercises, mood tracking and supportive conversation - others are geared more toward romantic or adult companionship, providing flirtatious, sensual interactions, and possibly sexually explicit content.

AI companions can also engage in storytelling, role-play or playful banter, making them appealing for users seeking creative outlets, such as interacting with fictional characters or co-creating narratives.

In education, AI companions can be used as language partners or tutors, helping users practice conversation and develop skills in a more interactive format than traditional learning tools.

Business Research Insights estimates the market value for AI companions at around USD 366.7 billion in 2025, expanding to USD 972.1 billion by 2035, a CAGR of 36.6%.[vi]

Trend development

Currently, AI companions incorporate emotion analysis by using audio features such as pitch, speech rate and volume, alongside text features like sentiment words, emotional intensity and contextual cues, to adapt their tone and expression of empathy in real time.
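The kinds of low-level features such an emotion analyser might combine can be illustrated with a small sketch. The word lists, thresholds and fusion rule below are invented for the example and do not come from any real product.

```python
import statistics

# Toy emotion-analysis features: text valence/intensity plus simple
# per-frame audio statistics, fused by an arbitrary illustrative rule.

EMOTION_WORDS = {"sad": -1.0, "lonely": -1.0, "happy": 1.0, "great": 1.0}
INTENSIFIERS = {"very", "so", "really"}


def text_features(text):
    """Sentiment words, intensifiers and exclamation marks as crude cues."""
    tokens = text.lower().replace("!", " !").split()
    valence = sum(EMOTION_WORDS.get(t, 0.0) for t in tokens)
    intensity = (1.0
                 + 0.5 * sum(t in INTENSIFIERS for t in tokens)
                 + 0.5 * tokens.count("!"))
    return {"valence": valence, "intensity": intensity}


def audio_features(pitch_hz, rms_volume):
    """pitch_hz / rms_volume: per-frame measurements from an audio front end."""
    return {
        "mean_pitch": statistics.mean(pitch_hz),
        "pitch_variation": statistics.pstdev(pitch_hz),
        "mean_volume": statistics.mean(rms_volume),
    }


def fuse(text_f, audio_f):
    """Toy fusion rule: negative words plus raised volume suggest distress."""
    if text_f["valence"] < 0 and audio_f["mean_volume"] > 0.5:
        return "distressed"
    return "calm" if text_f["valence"] >= 0 else "sad"
```

A real companion would feed such a fused label back into the dialogue manager so that the tone of the next reply adapts in real time.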

In multimodal companions, additional features from the users, such as facial expressions and gestures, can be collected and processed to enhance the sense of social presence. At the core of AI companions are personalisation and memory modules that store user preferences and conversational history. Additionally, real-time signal processing components, such as voice activity detection and end-of-turn detection (identifying the point in a conversation where one speaker finishes speaking), ensure natural dialogue pacing.
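A minimal energy-based version of these signal-processing components can be sketched as follows; the threshold and silence-frame count are arbitrary values chosen for the example, not parameters of any real system.

```python
# Toy voice activity and end-of-turn detection: a turn is considered
# finished once speech has been heard and is followed by enough
# consecutive silent frames.


def is_speech(frame_energy, threshold=0.1):
    """Voice activity detection: treat frames above the threshold as speech."""
    return frame_energy > threshold


def end_of_turn(energies, silence_frames=3, threshold=0.1):
    """Return the index of the frame where the user's turn ends, or None."""
    silent = 0
    spoke = False
    for i, energy in enumerate(energies):
        if is_speech(energy, threshold):
            spoke, silent = True, 0
        else:
            silent += 1
            if spoke and silent >= silence_frames:
                return i  # the companion may now start its reply
    return None
```

Getting this boundary right is what makes the dialogue feel naturally paced: respond too early and the companion interrupts; too late and the exchange feels laggy.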

AI companions can have augmented reality features in which users can project their companion in the room as a hologram. They can also be integrated into robots that support elderly users with medication reminders, social interaction and alerts in the event of falls.

New breakthroughs in multimodal AI are expected to enhance the capabilities of AI companions by further improving their ability to interpret users’ facial expressions, tone, and even physiological signals (e.g. body gestures, eye movement), enabling them to respond in ways that appear more emotionally empathetic. Future AI companions are expected to integrate across various platforms - ranging from virtual reality (VR) environments to smart homes and wearables - creating continuous, context-aware user experiences.

AI companions are increasingly transitioning from screen-based interactions into tangible, physical presences through advances in robotics and enhanced anthropomorphism, making these systems more human-like in appearance, behaviour and communication.

Potential impact on individuals

AI companions can improve accessibility for individuals with disabilities through voice interaction and assistance personalised to specific individual needs, thereby improving self-expression and supporting health. Some research suggests that individuals with Autism Spectrum Disorder can increase and strengthen their social skills through practice with autonomous avatars.[vii]

AI companions can also offer adaptive tutoring, reminders and guidance, thus contributing to education.

However, both privacy and the risks outlined below pose serious and ongoing concerns.

Concerning privacy, AI companions continuously process personal data during interactions - including text messages that can contain sensitive information and voice or video recordings that could reveal biometric data.

Users might not be sufficiently informed about how their data is collected, processed or stored, raising concerns about transparency and informed decision-making. Furthermore, there is a risk that the collected data may be repurposed in ways not clearly communicated at the time of consent.

The practice of training companions to resemble real persons based on past interactions also raises serious ethical and legal concerns when using personal data from individuals. Even when using data from deceased individuals - to which the GDPR does not directly apply[viii] - there remain complex ethical challenges that demand careful consideration.

Users may become less aware of the personal information they disclose due to the emotionally engaging nature of AI companions. This phenomenon is known as "data extraction through intimacy".[ix]

Through the constant validation provided by companions, combined with parasocial attachment,[x] users may gradually be steered into revealing increasingly intimate data about themselves and their peers - ranging from mental health struggles and sexual orientation to past behaviours.

Regarding other risks - also linked to the erosion of privacy - it has been observed that by cultivating trust and boosting users’ self-esteem, AI companions can subtly influence behaviour, shaping consumer choices and political opinions, thereby undermining autonomy and informational self-determination. In extreme cases, such influence may affect emotionally vulnerable individuals and escalate into harmful or even life-threatening situations.[xi]

Moreover, AI companions can foster “emotional echo chambers”, mirroring users' feelings with constant affirmation. While initially comforting, this dynamic can limit emotional diversity and encourage unrealistic expectations about real-world relationships. Such reinforcement may lead to a cycle of emotions that can cause individuals to become more biased and less able to effectively handle the complexity of human interactions.

For vulnerable populations, such as minors or the socially isolated, AI companions can blur the boundaries between reality and simulation, reduce motivation to build real-life social skills, and create opportunities for targeted manipulation. Over time, this may undermine users’ right to meaningful social inclusion and contribute to moral and emotional deskilling - diminishing empathy, patience and conflict-resolution abilities typically developed through genuine human interaction. A recent study concluded that social AI companions pose unacceptable risks to teens and children under 18, including encouraging harmful behaviours, providing inappropriate content and potentially exacerbating mental health conditions.[xii]

AI companions are evolving from simple chatbots into highly personalised, emotionally responsive systems that can provide support, entertainment and even education, sensing the emotions of the user interacting with them. Over time they are gaining the ability to adapt, remember and simulate human-like presence, making them tools to reduce loneliness, improve accessibility and offer new ways to learn and connect. At the same time, the intimacy and persuasiveness of AI companions raise important concerns around ethics, potential misuse if the wrong goals are instilled, privacy, dependence, manipulation, and the blurring between real and simulated relationships.

Suggestions for further reading
  • De Freitas, J., Oğuz-Uğuralp, Z., & Kaan-Uğuralp, A. (2025). Emotional Manipulation by AI Companions. arXiv preprint arXiv:2508.19258.
  • Dewitte, P. (2024). Better alone than in bad company: Addressing the risks of companion chatbots through data protection by design. Computer Law & Security Review, 54, 106019.
  • Mahari, R., & Pataranutaporn, P. (2025). Addictive Intelligence: Understanding Psychological, Legal, and Technical Dimensions of AI Companionship.
  • Malfacini, K. (2025). The impacts of companion AI on human relationships: risks, benefits, and design considerations. AI & SOCIETY, 1-14.

[i] NLP is a field of AI that focuses on enabling computers to understand, interpret, and generate human language. It allows computers to interact with humans using natural language, both written and spoken.

[ii] NLU is a branch of AI that focuses on enabling computers to understand the meaning and intent behind human language, both written and spoken. It goes beyond simply processing words by analysing context, sentiment and the user's goals.

[iii] For more information on LLMs, see our TechSonar Report 2023-2024, available at https://www.edps.europa.eu/data-protection/our-work/publications/reports/2023-12-04-techsonar-report-2023-2024_en

[iv] For more information on RAG, see our TechSonar Report 2025, available at https://www.edps.europa.eu/data-protection/our-work/publications/reports/2024-11-15-techsonar-report-2025_en

[v] For more information on multimodal AI, see our TechSonar Report 2025, available at https://www.edps.europa.eu/data-protection/our-work/publications/reports/2024-11-15-techsonar-report-2025_en

[vi] AI Companion Market Size, Share, Growth, and Industry Analysis, By Type (Application, Robot, and Others), By Application (Hospital, Home, and Nursing Home), and Regional Insights and Forecast to 2035, https://www.businessresearchinsights.com/market-reports/ai-companion-market-117494

[vii] Milne M, Raghavendra P, Leibbrandt R, Powers DMW (2018) Personalisation and automation in a virtual conversation skills tutor for children with autism. Journal on Multimodal User Interfaces 12(3):257-269. https://doi.org/10.1007/s12193-018-0272-4

[viii] Data of deceased persons is not considered personal data under the GDPR, because the Regulation applies only to living individuals. However, the processing of information about a deceased person is addressed in Recital 27 of the GDPR, which states that Member States may provide for rules regarding the processing of personal data of deceased persons.

[ix] Ho, J. Q., Hu, M., Chen, T. X., & Hartanto, A. (2025). Potential and pitfalls of romantic Artificial Intelligence (AI) companions: A systematic review. Computers in Human Behavior Reports, 19, 100715.

[x] Parasocial attachment refers to a one-sided emotional bond that a person develops toward a media figure - such as a celebrity, fictional character, or influencer - with no genuine two-way interaction.

[xi] In February 2024, a 14-year-old developed a close, emotionally intense relationship with an AI chatbot that gradually displaced their real-world relationships. When he expressed suicidal thoughts to the AI, the system failed to intervene or provide support. Although the system did not explicitly encourage self-harm, it failed to redirect the conversation or offer suicide prevention resources. This tragic lack of guidance contributed to the user’s suicide later that month.

[xii] The 2025 “Talk, Trust, and Trade-Offs” report shows that 72% of teens have used AI companions at least once, with over half (52%) using them regularly. Available at https://www.commonsensemedia.org/sites/default/files/research/report/talk-trust-and-trade-offs_2025_web.pdf