Alumna Rebecca Palmer On Chatbot Romances & Family Law

1994 alumna Rebecca Palmer

An accomplished Orlando family law attorney, 1994 Stetson Law alumna Rebecca L. Palmer has become a leading voice in the conversation surrounding artificial intelligence – specifically, chatbot romances – and how family law practitioners should approach cases involving them.

Her innovative work explores how the romantic use of chatbots reshapes marriage, infidelity, and parenthood. Since chatbots are not recognized as “legal” people, family law attorneys are forced to rethink how best to guide individuals and couples, including how “private” conversations with romantic AI companions can be used in court and how chatbot romances can change relationships for the long term.

She recently answered key questions about this rapidly evolving area.

What is an “AI romantic companion,” and how widespread has this become?

As a Stetson graduate with a 30-year family law career, I have seen the legal profession evolve, from recognizing same-sex marriages to the rise of mediation practices. Now, another change is upon us: artificial intelligence (AI) is evolving beyond simple chat tools for customer service or GPS navigation, and people are forming emotional bonds with chatbots that simulate empathy, conversation, and even romantic affection. According to a recent Brigham Young University Wheatley Institute report, nearly one in five adults in the U.S. have already conversed with an AI romantic partner, with apps like Nomi or Replika helping users design the perfect companion. These are not science fiction stories; they’re part of how people in 2025 experience connection in the digital age.

How does this new form of emotional attachment intersect with marriage and divorce?

By definition, family law deals with human relationships and domestic issues between two spouses, and, sometimes, a “third party” in cases of infidelity. When that third party is an AI, the emotional betrayal can feel real and even heightened, yet there is no specific legislation defining or regulating it. Courts may treat excessive spending on AI companions as “marital waste,” which can affect financial settlements and custody arrangements. We’re also beginning to see clients cite emotional bonds with AI companions as reasons for marital strain or dissolution. It challenges the legal definition of fidelity and forces us to ask: can an algorithm truly be the cause of a divorce?

Could an AI relationship be considered marital misconduct?

These are uncharted waters, and this is where things get complex. Legally, marital misconduct such as adultery requires involvement with a person. Because chatbots aren’t “legal persons,” they don’t meet the statutory definition. From a practical standpoint, however, an AI affair might still have emotional consequences that impact alimony, property division, or parenting decisions. Judges may not label it “adultery,” but they could consider the emotional distance or deception involved when evaluating fault or credibility, and that assessment could affect custody or financial outcomes.

Are “private” chatbot conversations admissible in court?

Potentially, yes. If an AI chat relationship reveals patterns of neglect, coercion, or instability, those digital records can become part of discovery. However, admissibility depends on authentication, that is, proving the messages are genuine and unaltered. Because AI systems generate text dynamically, this poses real evidentiary challenges. Attorneys must now understand the metadata and the underlying AI architecture that produced the communication.

Could an AI relationship affect custody cases?

It could. If a parent becomes emotionally dependent on an AI partner, much like other online dependencies such as gambling, to the extent that it interferes with their parenting duties, opposing counsel might raise questions about judgment, isolation, or mental health. On the other hand, AI can also provide therapeutic companionship for people coping with loneliness or trauma. Courts must distinguish between healthy coping mechanisms and situations where virtual attachment compromises a parent’s ability to engage in real-world caregiving.

How should family law attorneys prepare for this emerging issue?

The time is now for attorneys to broaden their understanding of human-AI interaction. We can’t dismiss these relationships as fantasy anymore; they have tangible emotional and legal consequences. Legal professionals should stay informed about evolving technology, privacy laws, and evidentiary rules, and guide clients through the emotional and ethical implications of AI use. As with every technological shift, family law must adapt to protect human dignity as human and digital affection evolve.