In the case of the Filter Bubble Transparency Act (FBTA), it’s not just spin; it’s an example of how badly defined buzzwords can make it impossible to address the internet’s problems. The bill is named after Eli Pariser’s 2011 book The Filter Bubble, which argues that companies like Facebook create digital echo chambers by optimizing content for what each person already engages with.
The FBTA aims to let people opt out of those echo chambers. Large companies would have to notify users if they’re delivering content — like search results or a news feed — based on personal information that the user didn’t explicitly provide.
However, the FBTA doesn’t make platforms explain how their algorithms actually work. It doesn’t prevent them from using arcane and manipulative rules, as long as those rules aren’t built around certain kinds of personal data. And removing or disclosing a few inputs to an algorithm doesn’t make the algorithm as a whole transparent.