TikTok tumbles into 2024
The social media giant quietly clarifies its political ad prohibitions, as creators continue to claim they’re being silenced
Social media companies used to “lean in” to politics. After 2016, tech giants like Facebook, Twitter, and Google built entire teams around elections and asked their staff tough questions about how to moderate content, police advertising, provide transparency, and stem the flow of misinformation. Those decisions and conversations were difficult and highly scrutinized, but they happened regularly and were well reported in the press.
As those companies learned hard lessons and pulled back from the politics game in recent years, TikTok, a relative newcomer, is seemingly still trying to figure out its approach.
Outlawing political ads
In 2019, the company announced that it would ban politicians and political groups from running election-related advertising. “We will not allow paid ads that promote or oppose a candidate, current leader, political party or group or issue at the federal, state or local level — including election-related ads, advocacy ads or issue ads,” a company spokesman said at the time. Ahead of the 2022 midterms, TikTok doubled down, releasing an update saying it would prohibit all political fundraising content, “disallowing” accounts from “directing people to a fundraising page on their website.”
Last week, the company quietly updated its help page to further outline its political ad prohibitions, adding a detailed list of examples of the types of ads that are not allowed. The list also clarifies that ads from “non-political advertisers expressing political views” are off the table, along with mere “references to an election,” “the promotion or criticizing of government policies or track records,” and “the display of real war scenes.”
Most notably, political ads “advocating for stopping wars and armed conflict, and raising awareness of war victims, may be allowed.” That carveout is particularly relevant given that the conflict in Gaza is a major election issue for some young voters. It’s unclear whether Gaza-related ads would fall into the bucket of “criticizing government policies” or “the display of real war scenes,” both of which are prohibited. How can you effectively advocate for stopping a war without criticizing government policies, or raise awareness of war victims without showing real war scenes? But those ads “may” be allowed? Who’s going to decide?
We’re already confused.
TikTok’s other policies, governing the organic promotion and prioritization of political content, have been even less clear to users.
Penalizing certain words and topics?
Political and non-political creators alike have long alleged that the platform censors, deprioritizes, or “shadow bans” certain words or topics without clearly telling users what’s not allowed. The company’s Community Guidelines, updated several weeks ago, clearly state that some content is indeed ineligible for the app’s algorithmic “For You” feed. Those categories predictably include things like health misinformation and violent content, but many creators assert that the penalized topics are much broader and more political than what the company has publicly outlined.
We spoke with several political power users of the app, and every single one shared anecdotes of experiencing a “shadowban” of some kind. When asked what got them banned, folks largely cited words like “vote,” “abortion,” “marijuana,” and “guns.”
For Sammy Kanter, who co-runs @girlandthegov, “Anything that says ‘vote’… is always the thing [that gets me banned].” She also noted that the only way her account can “bounce back” is to use a trending sound or make viral content about Donald Trump.
The New York Times reported this week that many creators who make content mentioning “abortion” see a steep decline in views and engagement. But when we discussed this with V Spehar of @underthedesknews and Amelia Montooth (@ameliamontooth), they both noted that their videos about abortion often perform really well. Instead, Spehar noted, “as a gay person using the word gay,” they have received strikes against their account and were unable to get those strikes appealed.
Much of this alleged suppression or content deprioritization has been attributed to the company’s complicated moderation process. Spehar, who often voices creator concerns to the TikTok team, explained that “there is [currently] a combination of moderated-by-machine rules that TikTok has and then moderated-by-people rules.” In other words, either a person or an AI could be analyzing any given piece of content. Both have their own biases and issues, and a user will probably never know which one is moderating them.
For the platform’s millions of users, the lack of clarity is the crux. “What’s interesting and potentially hazardous here is how the algorithm drives what we're able to see in general,” said Sami Sage, co-founder of Betches. “The fact that we're sitting here talking about how are we going to get around the algorithm’s very particular, mysterious, and not-posted-anywhere rules, actually affects the information that filters to us.”
Ashwath Narayanan, CEO of Social Currant, pointed to another harm: account strikes and shadow bans disproportionately affect smaller accounts, which are then unable to grow and add their voice to the space, a sentiment that V, Sami, and Amelia echoed. He noted, “Creators that talk about these issues all the time… often get deprioritized by the algorithm in reaching new audiences.”
Putting it in context
With other platforms deprioritizing political content writ large, TikTok is now one of the only major tools that political creators and campaigns have at their disposal to reach and organically grow massive new audiences. As the 2024 election heats up, transparent, clear policy guidelines on advertising and on the organic spread of political content are critical for those looking to reach voters online. They will also save companies like TikTok major headaches later on.
Digital ad spending, by the numbers:
FWIW, political advertisers spent just over $12.8 million on Facebook and Instagram ads last week. These were the top ten spenders nationwide:
Former Rep. Liz Cheney is back on the national political scene with new ads on Facebook + Instagram that call on voters to “save the republic” by refusing to support Donald Trump and instead supporting “a president of character” who “honors our troops.” The ads are being run through Cheney’s PAC, Our Great Task, and interestingly don’t mention Joe Biden.
Meanwhile, political campaigns spent $7.1 million on Google and YouTube ads last week. Here were the top ten spenders nationwide:
Pro-Biden Super PAC Future Forward USA Action launched its first large wave of ads across Google & YouTube (as well as other platforms) this year, targeting battleground states like Georgia, Pennsylvania, and Michigan. Many focus on job creation, the Inflation Reduction Act, and lowering drug costs.
Rep. Colin Allred, the Democrat challenging Ted Cruz for Texas’s U.S. Senate seat, spent a whopping $228,400 on Google + YouTube ads last week. The ads center on Rep. Allred’s bipartisanship and his focus on policy over politics.
On Snapchat, political advertisers in the U.S. have spent $2.1 million on ads year to date. Here are the top spenders:
…and on X (formerly Twitter), political advertisers have spent over $4.2 million in 2024. Here are the top spending accounts:
Your 2024 digital dispatch
FWIW, here’s how weekly digital ad spending (Facebook/Instagram, Google/YouTube) compares between the Trump and Biden campaigns year-to-date:
More from around the internet:
Edward “Jake” Lang has been in prison for over 1,200 days (after he swung a baseball bat at Capitol officers in the January 6th insurrection)... and now, according to WIRED, he’s using encrypted messaging apps like Telegram to create a network of armed militias in every state.
TikTok suspended an account for Hey Jane, an abortion pill service, and now a ton of groups – both pro-choice and anti-choice – are coming forward to complain that social media platforms are deprioritizing content about abortion.
While campaigns and political groups continue to ramp up their TikTok presence ahead of November’s elections, new research from Pew found that only 4 in 10 users are seeing political content on the app.
Pro-Trump influencers on social media are once again hyping a migrant invasion to win clicks and clout, according to Reuters.
The Washington Post published yet another look at how political and news content has receded on Facebook…
The vibes on TikTok last week:
FWIW, here’s a look at 10 of the most-liked videos mentioning Trump on TikTok in the past week:
The vibes for Donald Trump on TikTok this week were yet again incredibly polarized. The most-liked video (by a lot) was Trump’s own (@realdonaldtrump) video with Logan Paul. There was a lot of hype around Trump’s appearance on the Dr. Phil show, including viral videos from @clarkepayne and the official @drphil account. On the other hand, a video from @celebs.against.trump featuring Mac Miller (who passed away in 2018) roasting Trump also garnered a lot of attention.
Meanwhile, here were some of the most-liked videos mentioning Biden in the past week:
The vibes for President Biden on TikTok this week trended mostly negative. Several of the top videos, including those from @skynews and @thedailyshow, were about a moment at the White House Juneteenth concert where President Biden appeared to freeze up. On a more positive note, a video from @nbcnews of a nice moment between Biden and a WWII veteran also garnered a lot of attention this week.
That’s it for FWIW this week. This email was sent to 21,623 readers. If you enjoy reading this newsletter each week, would you mind sharing it on Twitter or Threads? Have a tip, idea, or feedback? Reply directly to this email.