The “social internet” is so called because of user-generated content (UGC). When, in 1999, web designer Darcy DiNucci observed the evolution of the internet from the static presentation of documents to what she named Web 2.0, she said its defining attribute would be interactivity: the web itself would become the ether through which interactivity occurs. She concluded her piece with an evocative image from the natural world to describe the flourishing, innumerable web-app designs to come: “The process will be long and unpredictable, through an organic system of mitosis, mutation, and natural selection that we can only regard with wonder.”

I’m a sucker for natural-world metaphors for human drama, and DiNucci’s prophecy transfixed me. After years of working in Trust & Safety in social tech, I have come to think of harmful UGC as something like a physical law. If you build it, they will come: where there be image-upload, there be swastikas, penises, marijuana leaves. Where there be free-form text fields, there be n-words, grooming, fraud. To the extent that social platforms address online harms, it has been through bolted-on technologies and ex post operational expenditures on content moderation. The phrase “Safety By Design” has gained some traction with platforms and regulators in recent years. For example, Australia’s eSafety Commissioner uses it as a regulatory standard. And for a few months the Discord Product Policy team was renamed “Safety By Design,” partly to meet regulators’ appetites. But to me it’s a bar that no social company has ever met. True “Safety by Design” exists in certain manufacturing contexts, such as the production of elevators and automobiles, and I think social tech companies should steer clear of that phrase unless they walk the walk. Read this 2011 piece by Cory Doctorow for more on “failing well,” an exceptionally rare requirement in our field today.

The field of Trust & Safety is preoccupied with the idea that harms are an inevitable consequence of free expression: not a side-effect, but rather an effect. The speed of digital communication, and the pseudonymity and distance of social networks, seem to draw out our demons. (Vocab: Anonymity describes a user being completely unknowable by platform administrators - the platform logs no client data or IP addresses, requires no log-in, and so forth - and thus also unknowable by other users. In contrast, Pseudonymity describes a system in which the platform admins can identify unique users, attribute distinct actions to them, and, importantly, ban them, but which lets users control their identities with respect to other users, e.g. through a display-name policy that lets Abe Katz identify as kazoodude89.)
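To make that distinction concrete, here is a minimal sketch of a pseudonymous identity model in Python. It is illustrative only: the names (PseudonymousDirectory, internal_id, display_name) are assumptions for the example, not any real platform’s schema.

```python
import uuid


class PseudonymousDirectory:
    """Toy model: admins can attribute actions (and bans) to a stable internal ID,
    while other users only ever see a chosen display name."""

    def __init__(self):
        self._accounts = {}   # internal_id -> {"legal_identity": ..., "display_name": ...}
        self._banned = set()  # internal_ids the platform has banned

    def register(self, legal_identity, display_name):
        internal_id = str(uuid.uuid4())
        self._accounts[internal_id] = {
            "legal_identity": legal_identity,  # known to platform admins only
            "display_name": display_name,      # what other users see
        }
        return internal_id

    def public_view(self, internal_id):
        # Other users see "kazoodude89," never "Abe Katz."
        return self._accounts[internal_id]["display_name"]

    def ban(self, internal_id):
        # Unlike anonymity, pseudonymity lets the platform attribute behavior and ban the account.
        self._banned.add(internal_id)


directory = PseudonymousDirectory()
abe = directory.register("Abe Katz", "kazoodude89")
print(directory.public_view(abe))  # -> kazoodude89
```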

And so if I work in social tech, I work in hydrology. Every crevice fills with water. The slightest surface out-of-level becomes a river. And water eventually shapes the land, sure as the US political right shapes platform policy. Renée DiResta is heroically documenting this, from her own experiences with jawboning by Jim Jordan to clear-eyed accounts like this: “The companies that rightly resisted Biden-era requests seem to be folding under Trump-era threats. That’s the difference between pressure and coercion, between persuasion and power.”

Of all the online harms, I think spam is the most helpful specimen for understanding the whither-tos and why-fors of Trust & Safety. Nobody likes spam. Spam is legal. Spammers (American Spammers, anyway) have a First Amendment right to spam. And platforms (American Platforms) have a First Amendment right to take down spam. If you want to be online and not encounter spam, then you want the operators of the internet to block spam. And “To block spam” is the metonymy of so much: Censorship! Trust & Safety! Content Moderation! My other favorite metonymy of this whole field is “Is that a good nipple?” but more on that later/elsewhere. Spam is unique among online harms because it’s always the most prevalent problem platforms face, by several orders of magnitude (over things like “nudity” or “copyright infringement” or “extremism,” and so forth). And so it’s the most useful image for hydrology: the “molecule” of spam is any ping of our server. More and more and more they accumulate, until they hit our rate limits; to hit our rate limits is to know our rate limits, and to know us is to optimize their assault. So we adapt our rate limits, hourly, randomly; we invent new, ever-more-contorted logics to differentiate “good ping!” from “bad ping!” and we keep them secret. And into any crack in our security system - a bad CAPTCHA implementation, say - will leak, will flow millions of weight-loss pills and penis enhancements. Some rain is good; too much rain is bad.
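That cat-and-mouse can be sketched in a few lines. Below is a toy sliding-window rate limiter with a jittered threshold - a hedged illustration of a secret, shifting limit, not any platform’s actual defense; the window size, base limit, and jitter range are invented for the example.

```python
import random
import time
from collections import defaultdict, deque

# Illustrative parameters only: a 60-second sliding window and a limit that
# wobbles around 100 pings, so probing never reveals a stable ceiling.
WINDOW_SECONDS = 60
BASE_LIMIT = 100

_pings = defaultdict(deque)  # client_id -> timestamps of recent pings


def allow(client_id, now=None):
    """Return True for a 'good ping' (admit) and False for a 'bad ping' (reject)."""
    now = time.monotonic() if now is None else now
    window = _pings[client_id]

    # Forget pings that have slid out of the window.
    while window and now - window[0] > WINDOW_SECONDS:
        window.popleft()

    # Re-draw the effective limit on every check - "adapt our rate limits, hourly, randomly" -
    # so the ceiling an attacker measures today is not the ceiling they hit tomorrow.
    effective_limit = BASE_LIMIT + random.randint(-10, 10)

    if len(window) >= effective_limit:
        return False

    window.append(now)
    return True
```

In practice the “good ping / bad ping” logic is far more contorted than a counter, but the design choice is the same: never let the other side observe a stable ceiling.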

I learned some “tenets of organic farming” from two farmers in 2021. Their axiom, which has stuck with me in many areas of life, is: feed your soil, not your plant. It means that if you tend more to the health of the microbiome in the soil than to the flowers and fruits of a specific plant, you will find fewer pests, fewer weeds, and healthier leaves on that very plant. In the parlance of organizational strategy, this relates to what’s called “systems thinking.” It is a frontier of social tech governance - and, ideally, of government regulation - if we can appreciate the social internet for the landscape and biome that it is. Researcher evelyn douek put forth the “systems thinking” argument in her 2022 article:

…looking past a post-by-post evaluation of platform decisionmaking reveals a complex and dynamic system that needs a more proactive and continuous form of governance than the vehicle of individual error correction allows. Lawmakers need to embrace a second wave of regulatory thinking about content moderation institutional design that eschews comforting but illusory First Amendment–style analogies and instead adopts a systems thinking approach. This approach focuses on the need to look to structural and procedural mechanisms that target the key ex ante and systemic decisionmaking that occurs upstream of any individual case.

In the fall of 2024, Hurricane Helene dumped the equivalent of Lake Tahoe on the Black Mountains of North Carolina. Those hollows now peer over new waterways and deleted roads. Effective policy responses to such events include things like “emergency” designations for funding; “relief” efforts for the immediate needs of those harmed (first-aid care and Campbell’s soup); “resilience” programs for economic recovery (plans for “surging” capital allocation); and “mitigation” investments for reducing harms the next time it happens. Did you know that the American Society of Civil Engineers’ 2025 Infrastructure Report Card gave the bridges of America a grade of C? Notably, prevention of disasters is not a policy tool - it cannot be. There Will Be Water.

On Web 2.0, There Will Be Speech. Unfathomable amounts of it. If social platforms, government regulators, and We The People begin to see the tides, currents, and meteorologies of online expression in a given piece of software as a force of nature, we can manage it more intentionally than we have so far.