About 67 Stations

We are a post-social media platform dedicated to solving the problems of the corporatized internet. Influence peddling is a multi-billion dollar industry. But it doesn't have to be that way.

See What You Want To See

Have you heard of the Dead Internet Theory? I only recently heard the term. It's a fascinating mix of observable facts and dystopian conspiracy. The part I like is the claim that the vast majority of content on the internet is generated by AI and posted by bots. The conspiracy part is that some kind of centralized shadow cabal controls this trend to make the population docile. I don't advocate that part of the theory. The incentives for flooding the internet with AI-generated garbage are enough on their own, so we will ignore the claims about why this is happening.

Did you know you can create an article, blog post, podcast, or YouTube video entirely with AI? I can show you how to automate everything. Once a day, or once a minute, a new piece of content can be created and published without any human influence or intervention. I love the True Crime genre, and it has been overrun with fake AI-generated content. I'm not talking about clickbait titles. I'm not talking about low-quality analysis of real events. AI invents crimes that never happened, with perpetrators and victims that never existed. The revenue generated by True Crime content is not impacted by how fake it is. These content providers are called Content Farms. It's faster to make content without humans doing research or quality control. And this approach affects every genre.

This dead-internet content is created for one purpose: to make money. Not to enrich your life. Not to inform you. Just to get your attention long enough to generate ad revenue.

Is this what you want to see? Computer-generated content recommended to you by algorithms? It's not what I want to see, so we began working on a way to create an island of life in an ocean of dead content. I recently came across a video from Pursuit of Wonder. This is where I learned about the Dead Internet Theory, and why we use the word Spam to describe unsolicited crap. The video outlines the problems we are trying to solve better than I could on this page. If you want to take a deep dive into the problems of a for-profit internet, watch it.

Here are some highlights from the video:

  • Most websites are motivated to show you lots of low-quality crap.
  • Subscriber/Follower models no longer work, and online artists are separated from their audience.
  • Legitimate artists have to create art to please algorithms rather than pleasing themselves and their audience.
  • Most websites show you what they want you to see, rather than what you want.

How are we trying to bring the internet back to life?

Dedication to a few simple principles can solve some of the biggest problems we face.

  • Show users content from people they want to follow.
  • Don’t show users content from people they don't follow.
  • Don't add addictive or manipulative functionality.
  • Don't decide what people or ideas are better than others.

The power of these ideals becomes evident when we look at what happens in their absence on social media today. Algorithms are designed to show you content you didn't ask to see. Content from the channels, people, and pages you follow lightly trickles into your feed, surrounded by content you didn't ask for. Facebook asks content creators to pay for their followers to see their posts. Even if you follow my Facebook page, you won't see my posts unless I pay Facebook to show them to my followers. Facebook calls this Organic Reach: the ability of people to see content that creators didn't pay to share. Facebook says Organic Reach isn't what content creators want; creators actually want to pay to share their content. Here is their article on the topic, written over a decade ago: Organic Reach on Facebook. Of course, it's not just Facebook. And it's not just a nuisance. Artists who used to rely on social media to connect with their audience now have to pay an arm and a leg to reach them.

Is this what you want from social media? Me neither.

What about the problems with seeing content you didn't ask to see? When I look at social media, most of what I see is "recommended" content from people I don't follow. This is the engine of the dead internet. Algorithms choose low-quality content and bring it to you, whether you want it or not. But the problem is worse than seeing some low-quality AI slop. This is how misinformation spreads. Someone makes up a lie, usually a shocking and aggravating one, and the algorithm shows it to everyone. The more scandalous, the better. What 67 Stations does is simple: we don't show you random inflammatory or sensational content. Nobody can go viral here. Lying is not rewarded. Am I saying 67 Stations can get rid of misinformation? Of course not. But if you don't see content from bots and people you don't follow, the misinformation you see is what you choose to see. We all have that cousin or sibling or parent who posts the most ridiculous stuff. You know their posts are usually crap, and you know what to do with them. What you won't see is an aggravating post from some doctor you've never heard of, who doesn't really exist, because they are actually some chump in a basement in Indonesia.

There are a few fundamental rules and beliefs that we follow that guide the features and functionality of 67 Stations.

  • Filtering of content is transparent and controlled by you. No secret algorithms.
  • No addictive behaviors designed to suck you in. No likes or shares.
  • We value the recommendations of a few trusted individuals over statistics or aggregated polling.
  • We trust individuals to make their own decisions. We don't have to protect users from ideas they don't like.
  • Abuse isn't tolerated, but you cannot silence an idea you disagree with by calling it abuse.

These principles will prevent us from becoming a 500-billion-dollar media empire. But they can help us create millions of little communities where meaningful connections can be made and maintained.

What we Are and Are Not Trying to Accomplish

Another way to describe our intentions is to state what we think our responsibilities are.

  • We are not the place for privacy.  We share content with the people who want to see it. We do not offer end-to-end encryption. Our emphasis is on showing people only content they want to see; do not confuse that with data protection. We do not offer robust privacy controls. Don't post anything here you wouldn't want the world to see.
  • We are not the place for anonymity.  We require a valid email address to sign up. We put your name next to your posts and comments. Don't post anything here you wouldn't want the world to see.
  • We are not a safe haven for your beliefs.  We are not a political alternative to other web sites. Or a non-political alternative. We are dedicated to keeping artificial influence out of your discussions by keeping unsolicited content from your feed. If you want a political bubble, you can create one in a station. If you want to engage in debate, you can do it. We are not here to influence or protect you.
  • Illegal activity is not allowed.  While we are against censoring ideas, this isn't the wild west. You cannot harass people. You cannot break the law. We will work with law enforcement (even if we disagree with the law).
  • We are not trying to be the next Facebook, Twitter, TikTok, or Instagram.  Content cannot go viral here. You will probably want to use those other sites. Algorithms are not all bad. Viral content is not all bad.
  • We are not trying to be the next Patreon.  We don't charge for access to content. We don't take a cut of your earnings. We don't have a way for you to earn money from your content.
  • We are not trying to be the next YouTube.  We don't host videos. We don't have a way for you to earn money from your content.