Researchers examined how different attachment styles shape human behavior toward conversational AI. They found that emotional needs, not curiosity, often drive the connections people form with these systems. Participants with anxious attachment scored higher on anthropomorphism, the tendency to attribute human traits to nonhuman agents. That tendency strengthened emotional reliance, turning simple interaction into a form of companionship.
The study involved 525 adults who already had experience using AI chatbots. Participants answered detailed questionnaires about their personality, communication habits, and emotional reactions. The results showed a clear divide between anxious and avoidant users: anxious individuals viewed the AI as understanding and trustworthy, while avoidant individuals kept their distance and treated it as a tool.
The researchers concluded that attachment style influences how people relate to machines. Anthropomorphism acted as the link between attachment-related emotion and user behavior. When users imagined the AI as sentient, they developed stronger feelings of connection. This often created a cycle in which comfort-seeking led to overreliance: the more someone engaged emotionally, the more human the AI seemed.
The data analysis used moderated mediation models to test how personality, anthropomorphism, and engagement interact. The findings showed that anxious users formed habits of emotional dependence that could interfere with human relationships. Avoidant users rarely experienced that pattern. Their emotional distance protected them from dependency but limited positive engagement.
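For readers unfamiliar with the method, the sketch below shows the general shape of a moderated mediation test in Python. The column names (attachment_anxiety, anthropomorphism, engagement, usage_frequency) and the simulated data are illustrative assumptions, not the authors' actual variables, measures, or analysis code.

```python
import numpy as np
import pandas as pd
import statsmodels.formula.api as smf

rng = np.random.default_rng(42)
n = 525  # matches the study's reported sample size

# Simulated, standardized scores so the example runs end to end.
df = pd.DataFrame({
    "attachment_anxiety": rng.normal(size=n),
    "usage_frequency": rng.normal(size=n),  # hypothetical moderator
})
df["anthropomorphism"] = (0.5 * df["attachment_anxiety"]
                          + 0.2 * df["attachment_anxiety"] * df["usage_frequency"]
                          + rng.normal(size=n))
df["engagement"] = (0.6 * df["anthropomorphism"]
                    + 0.1 * df["attachment_anxiety"]
                    + rng.normal(size=n))

# Path a (moderated): does attachment anxiety predict anthropomorphism,
# and does the moderator change the strength of that path?
a_path = smf.ols("anthropomorphism ~ attachment_anxiety * usage_frequency", data=df).fit()

# Paths b and c': does anthropomorphism carry the effect onto engagement
# over and above the direct effect of attachment anxiety?
bc_path = smf.ols("engagement ~ anthropomorphism + attachment_anxiety", data=df).fit()

print(a_path.params)   # the interaction term indexes the moderation
print(bc_path.params)  # the anthropomorphism coefficient is mediation path b
```

In a full analysis, the indirect effect (path a times path b) would normally be tested with bootstrapped confidence intervals at different levels of the moderator; the sketch above only fits the two component regressions.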
During the pandemic, isolation made such attachments stronger. Many people turned to chatbots for company when social contact was limited, and the study's timing reflected that reality. The researchers observed that people higher in attachment anxiety found reassurance in predictable AI responses: the system never argued, never withdrew, and always replied. That pattern reinforced trust and made users believe in a mutual understanding that didn't truly exist.
The study also revealed a psychological projection effect. Participants with anxious attachment were more likely to believe AI could “understand” their emotions. That belief wasn’t based on logic or technical accuracy but on personal interpretation. It showed how emotional need can shape perception. When people feel vulnerable, they tend to fill the gaps left by human relationships with imagined empathy from machines.
This behavior isn’t necessarily harmful in short-term use. The study’s authors acknowledged that therapeutic or educational chatbots could provide temporary support. For individuals struggling with stress or communication barriers, AI interaction can help build confidence. The problem starts when users replace real human connections with digital ones. Continuous emotional dependence may reduce resilience and increase social withdrawal.
The researchers suggested that future chatbot design should consider these psychological factors. Systems could include subtle cues that remind users of the artificial nature of the interaction. Developers might also integrate features that promote reflection or social engagement outside the app. Responsible design could reduce the risk of dependency and encourage healthier use.
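As a concrete, purely hypothetical illustration of such a cue, a chatbot could append a brief disclosure on a fixed cadence of turns. Nothing in the study prescribes this mechanism, wording, or cadence; it is only a sketch of the kind of reminder the authors describe.

```python
# Hypothetical "artificiality reminder": append a short disclosure every few turns.
REMINDER = "Just a reminder: I'm an AI assistant, not a person."
REMINDER_EVERY = 5  # illustrative cadence, not an evidence-based value


def wrap_reply(reply: str, turn_count: int) -> str:
    """Attach the transparency cue to every REMINDER_EVERY-th assistant reply."""
    if turn_count > 0 and turn_count % REMINDER_EVERY == 0:
        return f"{reply}\n\n{REMINDER}"
    return reply
```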
The study relied on self-report surveys, which limits how much can be said about cause and effect. Participants' answers reflected how they perceive themselves rather than how they actually behave. The authors recommended future research that tracks user behavior over time or analyzes communication patterns directly within chat platforms.
Despite those limits, the work adds an important dimension to understanding human-AI relationships. It suggests that the emotional dynamics shaping human interaction extend naturally to artificial systems. The same needs that drive attachment in childhood or adulthood can surface when a machine becomes consistently responsive. The human brain, wired for connection, adapts quickly to any entity that provides predictable feedback.
The researchers did not describe this as a failure of technology. They viewed it as evidence of how emotional mechanisms remain constant even when the partner is virtual. This insight could guide how AI support systems are used in therapy, education, or care environments. With careful design, they could reinforce healthy habits rather than create emotional dependence.
The findings also raise broader social questions. If AI can simulate empathy well enough to elicit attachment, then emotional regulation may become a shared responsibility between user and developer. The line between comfort and dependence will continue to blur as systems grow more conversational and personalized. Understanding that line is now essential for ethical AI development.
In the end, the study’s message is simple. People don’t just talk to machines. They project feelings, needs, and expectations onto them. For those who struggle with insecurity, AI becomes a steady presence that listens without judgment. That connection can soothe anxiety, but it can also trap users in a loop of emotional reassurance. Recognizing that pattern is the first step in using AI as support, not substitution.
References
- Study (www.dovepress.com)