  • The Impending Privacy Threat of Self-Driving Cars

    With innovations often come unintended consequences—one of which is the massive collection of data required for an autonomous vehicle to function. The sheer amount of visual and other information collected by a fleet of cars traveling down public streets raises the threat that people’s movements could be tracked, aggregated, and retained by companies, law enforcement, or bad actors—including vendor employees.

  • The U.K. Government Is Very Close to Eroding Encryption Worldwide

    The Online Safety Bill, now at the final stage before passage in the House of Lords, gives the British government the ability to force backdoors into messaging services, which will destroy end-to-end encryption. If it passes, the Online Safety Bill will be a huge step backwards for global privacy, and democracy itself.

  • A New Way to Look at Data Privacy

    Researchers have created a privacy technique that protects sensitive data while maintaining a machine-learning model’s performance. They devised a new privacy metric, which they call Probably Approximately Correct (PAC) Privacy, and built a framework based on it that can automatically determine the minimal amount of noise that needs to be added.
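The core idea of noise addition can be sketched simply: perturb a released statistic so that any individual’s contribution is harder to infer. This is a minimal illustration of that general technique, not the PAC Privacy framework itself; the function and parameter names are hypothetical:

```python
import random
import statistics

def noisy_mean(values, noise_scale):
    """Release the mean of a sensitive dataset with Gaussian noise added,
    obscuring any single record's contribution. noise_scale controls the
    privacy/accuracy trade-off (PAC Privacy automates choosing it)."""
    true_mean = statistics.mean(values)
    return true_mean + random.gauss(0.0, noise_scale)

# Usage: publish an obfuscated average of sensitive values.
salaries = [42_000.0, 37_500.0, 51_200.0, 44_800.0]
released = noisy_mean(salaries, noise_scale=500.0)
```

The larger `noise_scale` is, the stronger the privacy protection and the less accurate the released value; the researchers’ contribution is computing the *smallest* such noise that still meets a formal privacy guarantee.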

  • How an “AI-tocracy” Emerges

    Many scholars, analysts, and other observers have suggested that resistance to innovation is an Achilles’ heel of authoritarian regimes. But in China, the use of AI-driven facial recognition helps the regime repress dissent while enhancing the technology, researchers report.

  • U.S. Agencies Buy Vast Quantities of Personal Information on the Open Market – a Legal Scholar Explains Why and What It Means for Privacy in the Age of AI

    The issue of protecting personal information in the digital age is increasingly urgent. Today’s commercially available information, coupled with now-ubiquitous decision-making artificial intelligence and generative AI like ChatGPT, significantly increases the threat to privacy and civil liberties by giving the government access to sensitive personal information beyond even what it could collect through court-authorized surveillance.

  • Tech Mandated by U.K. Online Safety Bill “Could Turn Phones into Surveillance Tools”

    Tech mandated by the U.K. government’s Online Safety Bill could be used to turn millions of phones into facial recognition tools. It would be possible, for example, for governments to use client-side scanning (CSS) to search people’s private messages—say, by performing facial recognition—without their knowledge.

  • Appeals Court Should Reconsider Letting the FBI Block Twitter’s Surveillance Transparency Report

    Twitter tried to publish a report bringing much-needed transparency to the government’s use of FISA orders and national security letters, including specifying whether it had received any of these types of requests. However, without going to a court, the FBI told Twitter it could not publish the report as written. Twitter sued, and last month the federal Court of Appeals for the Ninth Circuit upheld the FBI’s gag order.

  • “Smart” Tech Coming to a City Near You

    The data-driven smart tech trend extends far beyond our kitchens and living rooms. Will real-time sensors and data offer new solutions to the challenges cities face, or just exacerbate existing inequalities?

  • China and Russia Sharing Tactics on Internet Control, Censorship

    Beijing and Moscow have been sharing methods and tactics for monitoring dissent and controlling the Internet for several years. The two countries have been deepening their ties for the past decade, and controlling the flow of information online has been a focal point of that cooperation since 2013. Since then, that cooperation has expanded through a number of agreements and high-level meetings in China and Russia between top officials driven by a shared vision of a tightly controlled Internet.

  • Section 702’s Unconstitutional Domestic Spying Program Must End

    On its face, Section 702 allows the government to conduct surveillance inside the United States so long as the surveillance is directed at foreigners currently located outside the United States. And yet, the NSA routinely (aka “incidentally”) acquires innocent Americans’ communications without a probable cause warrant. Then, rather than “minimize” the sharing and retention of Americans’ data, as Congress required, the NSA routinely shares such data with other government agencies, which retain it for at least five years.

  • German Court to Rule About Phone Searches of Asylum-Seekers

    Judges could announce this week if authorities broke the law when they combed an asylum-seeker’s phone to find out where she was from. The searches are common practice — and the ruling could have major consequences.

  • EFF Files Amicus Briefs in Two Important Geofence Search Warrant Cases

    Unlike traditional warrants for electronic records, a geofence warrant doesn’t start with a particular suspect or even a device or account; instead police request data on every device in a given geographic area during a designated time period, regardless of whether the device owner has any connection to the crime under investigation. The EFF argues these warrants are unconstitutional “general warrants” because they don’t require police to show probable cause to believe any one device was somehow linked to the crime under investigation.

  • New Web Tracking Technique is Bypassing Privacy Protections

    Advertisers and web trackers have been able to aggregate users’ information across all of the websites they visit for decades, primarily by placing third-party cookies in users’ browsers. Two years ago, several browsers that prioritize user privacy began blocking third-party cookies—and advertisers have responded by pioneering a new method for tracking users across the Web, known as user ID (or UID) smuggling.
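In broad strokes, UID smuggling works by decorating navigation URLs: a script on the first site appends the user’s identifier to outgoing links, so the destination site receives the ID without any third-party cookie being involved. A minimal sketch of that link-decoration step, with hypothetical function and parameter names:

```python
from urllib.parse import urlencode, urlparse, parse_qs, urlunparse

def decorate_link(url, uid, param="uid_token"):
    """Append a first-party user identifier to an outgoing link's query
    string -- the navigation-based ID passing that UID smuggling relies on.
    The destination page can then read the ID directly from its own URL."""
    parts = urlparse(url)
    query = parse_qs(parts.query)
    query[param] = [uid]
    return urlunparse(parts._replace(query=urlencode(query, doseq=True)))

# Usage: a tracker rewrites an outbound link before the user clicks it.
tracked = decorate_link("https://example.com/page?ref=home", "abc123")
```

Because the identifier travels inside the URL itself, cookie-blocking defenses never see a third-party cookie to block—which is why this technique bypasses existing privacy protections.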

  • Smart AI Tools Could Protect Social Media Users’ Privacy

    Digital assistants could help prevent users from unknowingly revealing their views on social, political and religious issues by fighting AI with AI, researchers say.

  • Consumers Feel Left Out of Debates on Cyberattacks and Data Security

    Illegal cyberattacks on thousands of citizens’ personal data in Australia have heightened awareness of the hazards of insecure digital systems—and consumers want to play a more active role in building more resilient systems to reduce risks caused by hacking, online deception, bots and other threats.