DR. PAM | MEDIA PSYCHOLOGIST
THE PSYCHOLOGY OF DIGITAL BEHAVIORS
Your thoughts are becoming data for Meta targeting. Photo: Artem Sukhoroslov/Getty Images Pro

Meta Is Using Your AI Chats to “Personalize Your Experience”

  • November 3, 2025
  • Pamela Rutledge

Meta will begin using AI chat as data for ad targeting and content personalization across the Meta ecosystem. Here’s what we can do.

Recently, I got an email from Meta Privacy saying that they’ll start using my interactions with Meta’s AI tools to “personalize” my experience by suggesting content I may find interesting and showing ads that are more relevant to me. I’m generally positive about technology use (when accompanied by digital literacy training), but this struck me as beyond the pale. First, having “better” ads isn’t my idea of personalization. Second, yes, the email made this intention transparent, but in practice, it will be completely invisible to everyone who hasn’t read the user agreement. This is my take. What do you think?


Key Points

  • Meta’s AI chats now join our likes, follows, and clicks as inputs to the content and ad algorithms.
  • Our tendency to bond with AI leads to self-disclosure that can be recorded and repurposed.
  • AI-driven targeting reinforces our existing thoughts, narrowing exposure to new ideas and increasing perceived risks or threats.
  • Teens naturally use chat tools to explore, inadvertently seeding algorithms that reinforce tentative ideas and misinformation.

Starting December 16, 2025, Meta will begin adding AI chats to the behavioral data it gathers to further personalize its experiences and ad targeting. This move blurs the line between private communication and behavioral data, raising questions about how AI interactions can be used to shape what we see and how we see ourselves online.

Meta currently provides generative AI to users across its ecosystem of apps, including Facebook, Instagram, WhatsApp, and Messenger, and as a standalone app. Even though we don’t pay in cash, Meta’s apps aren’t free. We pay with our data. AI chats take this to a whole new level. Conversations that feel private will now be part of what you “pay” Meta for using its tools.

Behavior as Data

Every click, swipe, and chat tells a story. AI conversations are no different; if anything, they’re more revealing. Technology of all kinds gathers data about what you do. Online, it tracks your activity using cookies, device fingerprinting, IP addresses, and tracking codes. Offline, it tracks you through GPS, Bluetooth beacons, and Wi-Fi scanning.

But you are not powerless. Understanding these two things is the superpower that lets you take control.

  1. Learn how any technology you use works, such as how it handles your data and privacy, and what settings you can control.
  2. Understand how our built-in vulnerabilities to technology keep us engaged with notifications, the allure of social connection, and FOMO.


When Meta uses your AI chat interactions as additional input, your thoughts about hiking, cooking, parenting challenges, or mental health can shape the Reels, posts, suggested groups, and ads you see. Meta says it will not use chats about “sensitive topics” such as religion, sexual orientation, politics, and health for ad targeting. But to make that distinction, they still must collect and process your data to decide if it’s fair game.

Why AI Feels Personal

Humans are wired to connect through conversation. Chatting, whether with a person or a bot, activates social cognition networks that foster trust and self-disclosure. We anthropomorphize technology, projecting intention and empathy onto digital assistants.

That makes AI feel safe. When users treat AI as a confidant, they often share details, doubts, and ideas they’d never post publicly. This psychological illusion of intimacy makes AI chats a goldmine for platforms that trade on behavioral insights.

A New Kind of Targeting: Your Thoughts

Previously, algorithms could only infer your interests from your behavior. With AI chat, they can go straight to the source: your thoughts.

Algorithms will be able to match our thoughts with tracked behaviors. If the algorithm knows we asked about teen stress, it might serve us more posts about adolescent behavior. If it links that AI query with visiting a site for parental-control apps or teen mental health resources, it now has a whole new level of specificity from which to infer and shape our reality.

This can sometimes feel helpful, but it creates a tighter “feedback loop” that can narrow your worldview in ways you may not even realize. The more you engage with a topic, the more the system reinforces it. Over time, this can magnify anxieties or amplify perceived threats, fueled by our natural attentional bias toward emotionally charged content.
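The feedback loop can be illustrated with a toy simulation. This is my own sketch, not Meta's actual algorithm, and every topic name and number in it is an illustrative assumption: a feed that picks topics in proportion to past engagement will, over time, concentrate on whatever the user engages with most.

```python
import random

random.seed(1)  # fixed seed so the toy run is reproducible

# Hypothetical topic weights; all names and values are illustrative.
topics = {"teen stress": 1.0, "cooking": 1.0, "hiking": 1.0, "news": 1.0}

def serve_post(weights):
    """Pick a topic with probability proportional to its weight."""
    total = sum(weights.values())
    r = random.uniform(0, total)
    for topic, weight in weights.items():
        r -= weight
        if r <= 0:
            return topic
    return topic  # fallback for floating-point edge cases

def simulate(steps=200, boost=1.2):
    for _ in range(steps):
        topic = serve_post(topics)
        # Assume the user engages most with the emotionally charged topic;
        # that engagement feeds back into the weight driving the next pick.
        if topic == "teen stress":
            topics[topic] *= boost

simulate()
share = topics["teen stress"] / sum(topics.values())
print(f"'teen stress' now fills {share:.0%} of the simulated feed")
```

Even with a modest boost per engagement, the weight grows multiplicatively, so a topic that started as a quarter of the feed comes to dominate it. The narrowing happens gradually, without any single dramatic step the user would notice.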

The Illusion of Privacy

AI chat can be helpful and entertaining, and, because it feels private, it gives us a place to try things out. We can role-play social situations, explore identities, or ponder our wildest dreams. AI chats feel like a safe space to try on ideas, experiment with self-presentation, and ask questions that may reflect our vulnerabilities or curiosities. This kind of exploration is particularly common (and healthy) during adolescence.

However, those signals feed the recommendation system, and the algorithm can strengthen aspects of the self that were tentative or experimental rather than internalized. Narrow feeds can bias our views of what’s good and bad in ourselves and others, reflected in the stories we see, the communities we join, the ads we’re served, and our sense of self. This algorithmic reinforcement has been linked to polarization, anxiety, and distorted perceptions of social norms (e.g., Pariser, 2011; Ito et al., 2023).

What You Can Do

  1. Be mindful of what you ask your AI. When you ask Meta’s AI a question, it may influence what you see in your feed, so use it with awareness.
  2. Check your ad preferences. Navigate to Meta’s Settings to check what personal data the system is using and limit personalization.
  3. Educate teens/kids. Explain that AI chats in the Meta ecosystem are not isolated private spaces. What feels like a private conversation can feed algorithms.
  4. Use alternative tools. If you want to use AI chat for personal reflection or sensitive topics, look for tools outside the Meta ecosystem. Meta does not have access to conversations with other AI chatbots, such as ChatGPT or Claude. Each has its own user agreement, so check the privacy and security policies to confirm that your content will not be used for targeting or AI training.

Final Thoughts

Meta’s policy shift marks another step toward algorithmic intimacy, the blending of private cognition and public data. What we once considered internal, our curiosities, fears, and reflections, is now measurable and marketable.

The good news: Awareness brings choice. You might not be able to stop all personalization, but you can choose where and how you engage, what data you feed into these systems, and how you guide others, including your kids, to use AI thoughtfully.

References

Pariser, E. (2011). The filter bubble: What the Internet is hiding from you. Penguin Press.

Ito, M., Cross, R., Dinakar, K., & Odgers, C. L. (Eds.). (2023). Algorithmic rights and protections for children. MIT Press. https://doi.org/10.7551/mitpress/13654.001.0001


About The Author

Pamela Rutledge, PhD, MBA, is the Director of the Media Psychology Research Center. A consultant, author, speaker, and professor, she consults on a variety of media projects, developing audience engagement and brand storytelling strategies.


Content copyright Pamela Rutledge 2026.