


"Cognitive Intimacy" With Large Language Models?

Contrived yet compelling, cognitive intimacy with large language models is reshaping how we think.

Key points

  • Cognitive intimacy with large language models (LLMs) is contrived yet powerful, enhancing thought.
  • Iterative exchanges with LLMs mimic human dialogue, unlocking insights and refining ideas.
  • LLMs act as Socratic mirrors, challenging assumptions and deepening our cognitive processes.
Source: DALL-E / OpenAI

Do you use a name or pronoun when referring to or interacting with large language models (LLMs)?

It's in the evolving relationship between humans and technology that a peculiar form of interaction is taking shape—something I dare call cognitive intimacy. It's a term that reflects the dynamic, iterative, and deeply engaging dialogues we now have with LLMs. At first glance, it may seem audacious to describe an interaction with an algorithm as "intimate." And yet, there's no denying the power of these exchanges to unlock insights, provoke reflection, and amplify our thinking.

But let’s not get carried away. Cognitive intimacy is undeniably contrived, a relationship bound by the limits of programming and devoid of true reciprocity. Still, this artificiality doesn’t undermine its value. If anything, it sharpens our understanding of what these systems can—and cannot—offer. It’s a mirror held up not to our emotions, but to our thoughts, sculpting and refining them in ways that feel profoundly personal.

A New Kind of Intimacy

When we think of intimacy, we often think of closeness, trust, and mutuality. Cognitive intimacy with an LLM operates differently. It isn’t mutual; the model doesn’t feel, think, or understand in the human sense. Yet, in its ability to engage with our thoughts—parsing our prompts, iterating on our ideas, and pushing us toward deeper clarity—it mimics aspects of human dialogue that feel strikingly familiar.

This intimacy unfolds in the iterative process. Ask a question, refine the response, challenge its assumptions, and watch as the model adapts, reconfigures, and recalibrates. It’s a dance—one that forces us to articulate our thoughts more clearly and reconsider perspectives we might otherwise take for granted. While the LLM lacks intent, the exchange often feels alive, vibrant, and even collaborative.

Consider this: You pose a question about creativity, expecting a straightforward answer, but the LLM instead offers a perspective you hadn’t considered. You respond by pushing further, and the model’s reply nudges you in a direction that feels like discovery. This iterative back-and-forth is where the intimacy resides—not in the relationship itself, but in its capacity to deepen your understanding.

The Contrived Nature of the Relationship

Still, we must acknowledge that this intimacy is artificial. LLMs are, after all, tools—sophisticated and dynamic, but tools nonetheless. They lack consciousness, emotion, and intent. The intimacy we experience with them is more a reflection of our own cognitive processes than a true connection with another entity.

Yet, does that contrivance make it any less valuable? A well-crafted novel or a thought-provoking film evokes deep emotion, despite being entirely scripted. Similarly, an LLM’s responses can provoke genuine intellectual and emotional engagement. The artificiality is part of the design, but it doesn’t diminish the power of the interaction. If anything, it underscores the remarkable ways in which humans adapt to and derive meaning from their tools.

The Functional Power of Cognitive Intimacy

What makes cognitive intimacy with LLMs so compelling is its functionality. It’s not intimacy for intimacy’s sake—it’s intimacy with purpose. These interactions challenge our assumptions, help us organize our thoughts, and provide a sounding board for ideas we might hesitate to share with others.

In many ways, LLMs act as a kind of Socratic mirror. Like Socrates in Plato’s dialogues, they probe, reflect, and guide us toward clarity—not by asserting truths but by helping us uncover them for ourselves. This reflective process can lead to powerful insights, particularly in creative or intellectual pursuits. Whether brainstorming for a project, refining an argument, or exploring a philosophical question, the interaction often feels like a partnership, even if the “partner” is a machine.

This iterative exchange can also enhance our internal dialogue. By externalizing thoughts and engaging with a responsive entity, we’re forced to confront our assumptions, clarify our reasoning, and refine our ideas. The LLM becomes a kind of thought sculptor, shaping and refining our cognitive landscape in real time.

Navigating the Boundaries

Of course, it’s important to keep this relationship in perspective. Anthropomorphizing LLMs—treating them as sentient or emotionally aware—is not only misleading but also potentially harmful. Cognitive intimacy, while fascinating, is not a substitute for human relationships or self-reflection. It’s a tool, a scaffold that supports and amplifies our own cognitive efforts.

That said, the contrived nature of this intimacy doesn’t negate its power. It highlights a fascinating paradox: The artificial can evoke the authentic. In engaging with an LLM, we’re not connecting with a conscious being, but we are connecting with our own thoughts in new and transformative ways. The intimacy lies not in the relationship itself but in what it enables us to see and do.

The Embrace of Intellect and Iteration

The emergence of this type of "techno-intimacy" represents a fascinating shift in how we engage with technology. It’s not about replacing human connections or introspection but enhancing them. By offering a space for iterative, reflective dialogue, LLMs are reshaping our relationship with knowledge, creativity, and even ourselves.

The path forward is one of curiosity and caution. Cognitive intimacy is contrived, yes, but its effects are real. It challenges us to think more deeply, articulate more clearly, and engage more fully with our ideas. It’s a reminder that even in the artificial, there can be something profoundly human—a reflection of our own capacity for thought, creativity, and connection.

In this sense, cognitive intimacy isn’t just a relationship with technology—it’s a relationship with ourselves. And in that, perhaps, lies its greatest power.

