Photo by Rob Curran on Unsplash
You don’t encounter a lot of Nazis on Pinterest. You have to go to X for that.
Strolling through the comments on a typical Pinterest post, you’ll see questions about where someone found such and such an item, positive responses about how sweet something is, and other such warm feedback. The comments in response to a viral X post are less uplifting. I probably can’t share here what’s often typed out.
It’s clear that users behave one way on Pinterest and a completely different way on X. What product decisions were made that led to such divergent user behavior? Or perhaps this is a corollary to Godwin’s law: gather enough people online in one place, and the Nazis show up.
Pinterest has 522 million monthly active users (MAU) and X has 650 million [include links to sourcing on MAU]. The two platforms share a lot of core functionality: both are designed for posting and sharing content (Pinterest is primarily image-based, while X is primarily text-based). Users can browse, search, like, comment, and share. Both rely on ad revenue for monetization, and so both want you to spend as much time on their sites as possible.
Pinterest states their mission is “to bring everyone the inspiration to create a life they love.” Their platform is designed to let people “search, save and shop the best ideas in the world for all of life’s moments.” They target a specific audience, and that audience shares a common purpose.
X (and Twitter before the name change) has famously stated that it wants to be the digital town square for the world. The first sentence of the official X Rules states: “X’s purpose is to serve the public conversation.” This is a broader audience and a much broader purpose.
So this is the challenge I see platforms face: the broader their intended audience, the more toxic they tend to become. Platforms succeed when they are designed for specific audiences who share a common purpose.
Design for community
The Stanford Encyclopedia of Philosophy defines social norms as:
…the rules of a group of people that mark out what is appropriate, allowed, required, or forbidden for various members in different situations. […] Once a person adopts a norm, it functions both as a rule that guides behavior and as a standard against which behavior is evaluated.
When there’s a sense of community, which arises from aligned purpose, norms evolve and become self-enforcing. I believe this is the force at work on Pinterest and absent from X. Platforms that try to attract everyone, that fail to provide a unifying purpose, are places where community norms struggle to form and take hold. As such, assholeish behavior crops up and there’s no norm to quell it. X becomes a hellscape without healthy social norms. Pinterest is a sunny place because social norms take root. I’m willing to wager that there are Nazis and other assholes on Pinterest, but social norms keep them from behaving that way on the platform. They play by the rules on Pinterest. They act brutishly on X.
Design as governance
So how does a platform design for community? It starts with the realization that design is governance:
“When we create designs, we’re basically defining what is possible or at least highly encouraged within the context of our products. We’re also defining what is discouraged.”
It’s easier to do this when you have a clear understanding of desired behavior, as well as an audience that shares a common goal or unifying interest. Leave things too wide open and it’s nearly impossible to craft the guardrails that will shape those norms.
There is an axiom that states “to design for everyone is to design for no one.” If your product aims to please everyone, it becomes so generic that it appeals to no specific audience. That hurts the formation of community, and thus of social norms, on your platform.
The design choices we make influence the behavior of our users. It’s easier to exert that influence when we have a clear audience and a set of behaviors we want to encourage. Conversely, if we design for the broadest set of users and purposes, that influence wanes. Chaos ensues.
LinkedIn vs. Facebook
Consider LinkedIn and Facebook. LinkedIn has about 1.15 billion users worldwide. Facebook has a little over 3 billion. Yes, Facebook is a lot bigger, but both are massive in size and user base, and both are global in their reach.
According to their official Professional Community Policies, LinkedIn wants to “reflect the best version of professional life. This is a community where we treat each other with respect and help each other succeed.” That is a specific purpose for a specific audience. The posts you find on LinkedIn — and the comments they generate — comport with this view. For the most part, people present a more professional version of themselves. You don’t find the vitriol and toxic behavior that shows up on other platforms.
Similar to X, Meta (Facebook’s parent company) wants to be the digital public square. They want the broadest possible audience and want to be the place where everyone is talking. Their community guidelines state: “Meta wants people to be able to talk openly about the issues that matter to them, whether through written comments, photos, music, or artistic mediums, even if some may disagree or find them objectionable.” The Facebook experience certainly lacks the professionalism you find on LinkedIn.
The toxic behavior frequently found on Facebook does not show up on LinkedIn. In fact, I’ve seen LinkedIn users chastised for posting even the positive kinds of posts you’d find on Facebook, when those posts aren’t professional in nature and topic. Social norms exerting themselves.
Content moderation and free speech
I should add that all four of the social media platforms discussed in this post have similar guidelines about what types of content are and are not allowed: no nudity, no harassing or threatening messages, no calls for violence. By the strictest reading of each platform’s guidelines, the same content should be allowed on all of the sites, and the same content forbidden on all of them. The wildly different experiences across the platforms are not the result of different content policies.
And this isn’t about free speech and the First Amendment, which is so often invoked when discussing behavior on social media platforms. The First Amendment prohibits the government from choosing what speech is acceptable. All of these platforms are private businesses; each decides for itself what it will and won’t allow on its platform. Our “free speech” on any platform exists at the discretion of that company’s governing body. Their platform, their rules.
Design as a community catalyst
The platforms we build are more than just digital spaces — they’re social architectures that shape human interaction. Pinterest and LinkedIn demonstrate that specificity isn’t a constraint, but a powerful design strategy. By creating platforms with clear intentions and well-defined audience purposes, these companies have proven that meaningful online communities aren’t accidents — they’re intentional designs.
The lesson for product designers and technologists is clear: every design choice is a governance choice. When we create social media platforms, we are not just building interfaces; we’re establishing social ecosystems. The guardrails we put in place, the behaviors we encourage or discourage, and the specific audience we target fundamentally determine the quality of human interaction that will emerge.
As we continue to build the next generation of social media platforms, we must eschew the idea of a universal audience. True connection doesn’t happen when we try to include everyone — it happens when we create spaces where specific communities can thrive, where shared purposes create natural, self-reinforcing social norms.