Digital networked society needs friction-in-design regulation that targets the digital architectures, supposedly smart (data-driven, algorithmic) systems, and interfaces that shape human interactions, behavior, and will (beliefs, preferences, values, intentions). The relentless push to eliminate friction for the sake of efficiency has hidden social costs that affect basic human capabilities and society. A general course correction is needed.

In this article, we clear the First Amendment brush and reveal an open and mostly underappreciated regulatory territory to explore. We argue that friction-in-design regulation should be understood as twenty-first-century time, place, and manner restrictions, akin to laws that prohibit using megaphones in the middle of the night, require permits before marches, and prohibit adult theaters in residential neighborhoods. This does not mean that friction-in-design regulation would escape First Amendment scrutiny altogether, of course. But it would trigger intermediate rather than strict scrutiny, so long as the friction-in-design regulation remained content neutral.
Pennsylvania Consumer Data Privacy Act

An Act providing for consumer data privacy, for duties of controllers and for duties of processors; and imposing penalties.
To date, 20 U.S. states have passed data privacy laws, and other states have introduced bills of their own to keep pace in the data privacy race.
The CCPA (California Consumer Privacy Act) went into effect more than five years ago. I was in grad school when the legislation was signed into law, and I distinctly remember being hopeful that it would be a good model for adoption at the federal level. Fast forward to 2025, and I am not actually surprised that privacy regulation remains at the state level, with that classic patchwork-approach vibe.
Automated decision-making systems contain hidden discriminatory biases. We'll explain the causes, the possible consequences, and the reasons why existing laws do not provide sufficient protection against algorithmic discrimination.
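The claim that bias can hide inside an automated decision system is easiest to see in code. Below is a minimal, hypothetical Python sketch (all data, feature names, and numbers are invented for illustration and are not drawn from any of the laws or articles above): a toy credit model is trained without the protected attribute, yet still produces disparate approval rates, because a correlated proxy feature carries the historical bias baked into the training labels.

```python
# Hypothetical toy example of "proxy discrimination" (invented data):
# the protected attribute is never a model input, but a correlated feature
# (zip area) lets historical bias in the labels leak into the predictions.
import numpy as np
from sklearn.linear_model import LogisticRegression

rng = np.random.default_rng(0)
n = 10_000

# Protected group membership -- deliberately excluded from the features.
group = rng.integers(0, 2, n)                      # 0 = Group A, 1 = Group B
# Zip area correlates strongly with group (e.g., residential segregation).
zip_area = np.where(rng.random(n) < 0.8, group, 1 - group)

# Income drives the "true" outcome, but the historical labels also encode
# past discrimination against zip area 1 (e.g., redlining-era decisions).
income = rng.normal(50.0, 10.0, n)
biased_denial = (zip_area == 1) & (rng.random(n) < 0.3)
label = ((income > 50.0) & ~biased_denial).astype(int)

# Train without the protected attribute: only income and zip area.
X = np.column_stack([income, zip_area])
model = LogisticRegression().fit(X, label)
pred = model.predict(X)

for g, name in [(0, "Group A"), (1, "Group B")]:
    print(f"{name} approval rate: {pred[group == g].mean():.2f}")
# Group B's approval rate comes out lower even though `group` was never
# given to the model -- the bias is hidden behind `zip_area`.
```

This is why simply removing the protected attribute from the inputs ("fairness through unawareness") does not prevent algorithmic discrimination: so long as any remaining feature correlates with group membership, the model can reconstruct and reproduce the historical pattern.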