
Lifelogging under fascism
How self-tracking became self-incrimination
How to think about vendors, technology, and power
Designing a values-centered life from first principles.
Building community-driven public media for the post-federal funding era.
How to transform the internet's most toxic platform into essential infrastructure.
Why effective opposition requires more than just saying no
The case for moving AI down the stack
How we might rebuild journalism from the ground up by rethinking what a newsroom is.
Why the idealistic promise of collaborative software often falls short of its potential.
I relaunched my website on Ghost. Here's why.
Werd I/O explores the intersection of technology, democracy, and society. It's independently published by Ben Werdmuller, reader-supported, and always free to read.
The Trump administration is using AI as a way to shill fossil fuels. But even for tech companies that don't care about climate change, renewables are a far better option.
OpenAI claims a significant result: gold-level performance at the International Mathematical Olympiad. But they're scant on details, and the result needs to be independently verified.
When vendor promises meet government warrants, the warrants win every time. Microsoft's Senate testimony shows why "trust us" isn't a data protection strategy.
Without public media funding, local stations will close, creating news deserts and allowing political corruption to thrive.
Global Majority nations are building ways to store their citizens' data locally. But will they own the datacenters themselves?
Tony Stubblebine's account of saving Medium is remarkable in its transparency - and in its execution.
"I do not have the time or emotional energy to screen out regular attacks by Large Language Models, with the knowledge that making the wrong decision costs a real human being their connection to a niche community."
It's wild to me how many people are still engaging with X.
As more people look to AI to learn about the world, the people who control how it's trained and how it responds will control our prevailing narratives. That's wildly dangerous.
Forget San Francisco - I wish every single US city would do this. We're too car-dependent, too isolated, too unhealthy. (Honestly, European mixed-use development should be the model.)