
Anthropic’s Big Choice: Share Your Chat Data or Opt Out by September 28

Anthropic's new policy requires users to decide by September 28, 2025, whether to share their chat data for AI training or opt out.

Let’s say you use Claude, an AI helper that can chat with you, write code, or answer questions. Now, the company behind Claude—Anthropic—wants to use your words to make Claude even smarter. But there’s a catch: you have to choose whether that’s okay by September 28, 2025.

In this blog, we’ll explore:

  1. What’s changing at Anthropic
  2. What this means for you
  3. Why Anthropic is doing this
  4. How you can control your data
  5. Why it matters for privacy and AI

1. What’s Changing at Anthropic?

Starting now, Anthropic will use your new or continued chats and coding sessions to train Claude unless you opt out. This policy applies to all users, whether you're on the Free, Pro, or Max plan.

You'll see a pop-up titled "Updates to Consumer Terms and Privacy Policy." There's a big "Accept" button, plus a small toggle to opt out of sharing your data. But here's the catch: the toggle is preset to "On," meaning you're sharing unless you switch it off.

You have until September 28, 2025, to decide, or Claude will stop working for you afterward. Older chats aren't used for training unless you resume them.


2. What This Means for You

If you choose Yes (data sharing):

  • New/continued chats are used to train and improve Claude.
  • Your data may be kept for up to five years.
  • Anthropic says it filters sensitive info and doesn’t sell user data.
  • You can change your mind later, but data already used stays part of past training.

If you choose No (opt out):

  • You’ll stick with the old 30-day deletion policy.
  • Claude will not use your new chats for training.
  • But any data already used can't be removed from the training history.

You can revisit your choice anytime in Privacy Settings under "Help improve Claude." Just keep in mind that data already used remains part of model training.


3. Why Is Anthropic Doing This?

Anthropic says:

  • AI models like Claude learn better with real conversations and coding examples.
  • Increasing retention lets them refine safety tools and make smoother model upgrades over time.
  • Chat data helps improve Claude's ability to code, reason, and answer questions accurately.

But critics worry that this shift from opt-in to opt-out by default may burden users who might not notice the change or fully understand it.


4. How to Control Your Data

When the pop-up appears:

  • Toggle “Help improve Claude” to OFF to opt out.
  • Or click “Not now” to delay the decision (you must choose by Sept. 28).

You can also:

  • Go to Settings → Privacy → Help improve Claude and toggle it OFF.
  • Delete individual chats to prevent them from being used in future training rounds.
  • But remember: once your data has been used, it's part of the training history and can't be removed from that process.

5. Why This Matters for Privacy and AI

This policy highlights a big question: Who owns your chat data?
Even trusted AI tools collect and use your words behind the scenes.

  • In the broader AI world, many tools default to opt-out data collection. Other platforms like ChatGPT, Gemini, and Copilot let users disable data sharing in settings, but the option is often easy to miss.
  • Some research shows you can train powerful AI models using only publicly available or non-protected data, but specialized tasks may still rely on high-quality data that comes from user interactions.

Anthropic's change puts the decision in your hands, but only if you take action. It also underscores the importance of clear privacy options and staying informed.


Conclusion

Anthropic's new policy means Claude will only learn from your conversations if you say it's okay. You must choose by September 28, 2025, whether to share your chat data for model training or to opt out and keep the old 30-day retention policy.

This is your chat, and it's your choice. Maybe you'll help improve Claude, or maybe you'd rather keep your words private. Either way, make sure you know where the toggle is before the deadline.
