Privacy Digest 17/25

Is Your TV Spying on You? Here's How to Check

Most internet-connected TVs include Automatic Content Recognition (ACR) features that track what you watch. Here's how to disable them, along with privacy advice from security experts.

pcmag.com

Automatic Content Recognition Smart TV Smart Devices Surveillance ACR


When apps leak our data, who is responsible?

A major cyberattack on the Tea Dating Advice app exposed the sensitive personal data of thousands of women, including selfies, IDs, private messages, and location details: information that could be exploited or weaponized online. The breach highlights the risks of platforms built for discussing abusive or unsafe behavior. At the same time, a California jury ruled that Meta violated privacy laws by collecting reproductive health data from the Flo fertility app through hidden tracking tools without users’ consent. Together, the two cases reveal serious gaps in digital safety, accountability, and privacy protections in widely used apps.

washingtonpost.com

Private Data Apps Data Privacy Sensitive Data Health Data

Data Brokers Are Hiding Their Opt-Out Pages From Google Search

An investigation by The Markup and CalMatters revealed that dozens of companies conceal the instructions for deleting your personal data. California law requires data brokers to offer deletion options, but a review of 499 broker websites found that 35 used hidden code to block those pages from search engines. Some buried links deep in privacy policies or obscure footers, while others listed broken or missing pages. After being contacted, several companies removed the code, though many didn’t respond. Experts call the tactic a “work-around” that undermines consumer rights. To address this, California will launch a universal deletion platform, DROP, in 2026.
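The "hidden code" described here is typically a robots meta directive that tells search engines not to index a page. As a minimal sketch of how such a directive could be detected in a page's HTML (the investigation's exact methodology isn't described here, and the helper below is hypothetical):

```python
import re

def has_noindex(html: str) -> bool:
    """Return True if the HTML contains a robots meta tag with a
    'noindex' directive, which asks search engines to skip the page."""
    # Scan every <meta ...> tag, regardless of attribute order.
    for tag in re.findall(r"<meta\b[^>]*>", html, flags=re.IGNORECASE):
        has_robots_name = re.search(
            r'name\s*=\s*["\']robots["\']', tag, re.IGNORECASE)
        has_noindex_content = re.search(
            r'content\s*=\s*["\'][^"\']*noindex', tag, re.IGNORECASE)
        if has_robots_name and has_noindex_content:
            return True
    return False

# Hypothetical examples: an opt-out page hidden from search
# engines versus an ordinary, indexable one.
hidden_page = ('<html><head>'
               '<meta name="robots" content="noindex, nofollow">'
               '</head></html>')
visible_page = '<html><head><title>Opt out</title></head></html>'

print(has_noindex(hidden_page))   # True
print(has_noindex(visible_page))  # False
```

Note that sites can also block indexing via an `X-Robots-Tag` HTTP header or robots.txt rules, which a meta-tag check alone would not catch.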

wired.com

California DROP Consumer Rights Data Brokers Personal Information

Why A.I. Should Make Parents Rethink Posting Photos of Their Children Online

The rise of AI-powered “nudifier” apps, which cheaply generate fake nudes from any photo, has made posting children’s pictures online riskier than ever. Once limited to celebrities, deepfakes now target ordinary people, with students even using them in schools. Despite new U.S. laws criminalizing the sharing of nonconsensual AI nudes, the apps remain widely accessible, earning millions annually. Beyond deepfakes, photos can expose children to identity theft or data harvesting. While private accounts reduce risks, they offer no guarantees. Many parents are choosing not to post at all, opting instead for encrypted messages or private photo-sharing alternatives.

nytimes.com

Sharenting Child Safety Child Protection Nonconsensual AI Nudes


Instagram Map lets your friends, and possibly exes, track your every move

Instagram has introduced a new map feature that shares users’ precise real-time location with friends, sparking major privacy concerns. The map combines tagged posts with passive location sharing, showing where users last opened the app. Critics warn this shift from intentional to always-on sharing could expose sensitive details, enable stalking, or create social pressure, especially for teens. While Instagram offers settings to limit sharing, managing them is complex, and mistakes could reveal locations to exes, coworkers, or strangers. Experts recommend turning the feature off entirely and using safer alternatives like Apple’s Find My or private messaging to share locations.

washingtonpost.com

Instagram Instagram Map Tracking