#BitChute
frithwontdie · 5 months
Text
They Finally Admit the Truth About Weather Control
45 notes
Text
Mastodon and Bluesky
https://twitter.com/garth_nader
17 notes
blogparanormal · 11 months
Text
3 notes
ultramaga · 2 years
Video
[YouTube video]
Bitchute May Have To Censor More Due To New UK Law
Bitchute should just geoblock the UK and any other such country, and sane people should just use a VPN to bypass the violent fascist authorities that rule there. The Tories should just give up - they exist to be controlled opposition, never actually opposing anything.
2 notes
iamapeacefulpoet · 12 days
Text
0 notes
jamespoeartistry · 1 year
Text
Dear Perceptive Readers, as I've continued to fight for my stolen name and creative works: I am James Lynch Jr., namesake son of James Lynch Sr. Thank you for the interest. James PoeArtistry Productions
0 notes
thefantasticone21 · 2 years
Text
New BitChute Channel for Livestream VODs and Gameplay Videos
I now have a BitChute channel as an additional place where you can view my livestream VODs of my Twitch streams in case you missed me live, as well as another place you can watch gameplay videos that I may create when I’m not streaming. What is BitChute? It is another YouTube alternative, and another place you can go to watch videos that your favourite content creators make, if they happen to…
0 notes
frithwontdie · 7 months
Text
Professor Needs Armed Guard After Telling Truth About Race and Crime
7 notes
Text
We don’t stop
3 notes
blogparanormal · 11 months
Text
2 notes
merlinsbed · 6 months
Text
deeply concerned that my world health issues instructor linked to a penn and teller video on bitchute
like what was she doing on bitchute?? did she just google the clip and that was the first place it came up???? does she not know that bitchute is where right wing conspiracy theorists go when they get banned from youtube???????
ASKDJAKSD I JUST GOOGLED PENN AND TELLER BULLSHIT! AND ALL OF THE LINKS WERE FOR PARAMOUNT+ OR YOUTUBE HOW DID SHE END UP ON BITCHUTE????????????
0 notes
georgiarealestate1 · 2 years
Text
1 note
jamespoeartistry · 2 years
Text
0 notes
mariacallous · 4 days
Text
Neo-Nazis and white supremacists are sharing Hitler-related propaganda and trying to recruit new members on TikTok, according to a new report from the Institute for Strategic Dialogue (ISD) shared exclusively with WIRED. The TikTok algorithm is also promoting this content to new users, researchers found, as extremist communities are leveraging the huge popularity of TikTok among younger audiences to spread their message.
The report from ISD details how hundreds of extremist TikTok accounts are openly posting videos promoting Holocaust denial and the glorification of Hitler and Nazi-era Germany, and suggesting that Nazi ideology is a solution to modern-day issues such as the alleged migrant invasion of Western countries. The accounts also show support for white supremacist mass shooters and livestream-related footage or recreations of these massacres. Many of the accounts use Nazi symbols in their profile pictures or include white supremacist codes in their usernames.
Nathan Doctor, an ISD researcher who authored the report, says he began his investigation earlier this year when he came across one neo-Nazi account on TikTok while conducting research for another project.
He was quickly able to unmask a much broader network of accounts that appeared to be actively helping each other through liking, sharing, and commenting on each other’s accounts in order to increase their viewership and reach.
The groups promoting neo-Nazi narratives are typically siloed in more fringe platforms, like Telegram, the encrypted messaging app. But Telegram has become a place to discuss recruitment techniques for TikTok specifically: White supremacist groups there share videos, images, and audio tracks that members can use, explicitly telling other members to cross-post the content on TikTok.
“We posted stuff on our brand new tiktok account with 0 followers but had more views than you could ever have on bitchute or twitter,” one account in a neo-Nazi group posted on Telegram about their outreach on TikTok. “It just reaches much more people.”
Others have followed suit. One prominent neo-Nazi has often asked his thousands of Telegram followers to “juice,” or algorithmically boost, his TikTok videos to increase their viral potential.
An extremist Telegram channel with 12,000 followers urged members to promote the neo-Nazi documentary Europa: The Last Battle by blanketing TikTok with reaction videos in an effort to make the film go viral. Researchers from ISD found dozens of videos on TikTok featuring clips from the film, some with over 100,000 views. “One account posting such snippets has received nearly 900k views on their videos, which include claims that the Rothschild family control the media and handpick presidents, as well as other false or antisemitic claims,” the researchers wrote.
This is far from the first time the role that TikTok’s algorithm plays in promoting extremist content has been exposed. Earlier this month, the Global Network on Extremism and Technology reported that TikTok’s algorithm was promoting the “adoration of minor fascist ideologues.” The same researchers found last year that it was boosting Eurocentric supremacist narratives in Southeast Asia. Earlier this month, WIRED reported how TikTok’s search suggestions were pushing young voters in Germany towards the far-right Alternative for Germany party ahead of last month’s EU elections.
“Hateful behavior, organizations and their ideologies have no place on TikTok, and we remove more than 98 percent of this content before it is reported to us,” Jamie Favazza, a TikTok spokesperson tells WIRED. “We work with experts to keep ahead of evolving trends and continually strengthen our safeguards against hateful ideologies and groups.”
Part of the reason platforms like TikTok have in the past been unable to effectively clamp down on extremist content is due to the use of code language, emojis, acronyms, and numbers by these groups. For example, many of the neo-Nazi accounts used a juice box emoji to refer to Jewish people.
“At present, self-identified Nazis are discussing TikTok as an amenable platform to spread their ideology, especially when employing a series of countermeasures to evade moderation and amplify content as a network,” the researchers write in the report.
But Doctor points out that even when viewing non-English-language content, spotting these patterns should be possible. “Despite seeing content in other languages, you can still pretty quickly recognize what it means,” says Doctor. “The coded nature of it isn't an excuse, because if it's pretty easily recognizable to someone in another language, it should be recognizable to TikTok as well.”
TikTok says it has more than “40,000 trust and safety professionals” working on moderation around the globe, and the company says its Trust and Safety Team has specialists in violent extremism who constantly monitor developments in these communities, including the use of new coded language.
While many of the identified accounts are based in the US, Doctor found that the network was also international.
“It's definitely global, it's not even just the English language,” Doctor tells WIRED. “We found stuff in French, Hungarian, German. Some of these are in countries where Nazism is illegal. Russian is a big one. But we even found things that were a bit surprising, like groups of Mexican Nazis, or across Latin America. So, yeah, definitely a global phenomenon.”
Doctor did not find any evidence that the international groups were actively coordinating with each other, but they were certainly aware of each other's presence on TikTok: “These accounts are definitely engaging with each other's content. You can see, based on comment sections, European English-speaking pro-Nazi accounts reacting with praise toward Russian-language pro-Nazi content.”
The researchers also found that beyond individual accounts and groups promoting extremist content, some real-world fascist or far-right organizations were openly recruiting on the platform.
Accounts from these groups posted links in their TikTok videos to a website featuring antisemitic flyers and instructions on how to print and distribute them. They also boosted Telegram channels featuring more violent and explicitly extremist discourse.
In one example cited by ISD, an account whose username contains an antisemitic slur and whose bio calls for an armed revolution and the complete annihilation of Jewish people has shared incomplete instructions to build improvised explosive devices, 3D-printed guns, and “napalm on a budget.”
To receive the complete instructions, the account holder urged followers to join a “secure groupchat” on encrypted messaging platforms Element and Tox. Doctor says that comments under the account holder’s videos indicate that a number of his followers had joined these chat groups.
ISD reported this account, along with 49 other accounts, in June for breaching TikTok’s policies on hate speech, encouragement of violence against protected groups, promoting hateful ideologies, celebrating violent extremists, and Holocaust denial. In all cases, TikTok found no violations, and all accounts were initially allowed to remain active.
A month later, 23 of the accounts had been banned by TikTok, indicating that the platform is at least removing some violative content and channels over time. Prior to being taken down, the 23 banned accounts had racked up at least 2 million views.
The researchers also created new TikTok accounts to understand how Nazi content is promoted to new users by TikTok’s powerful algorithm.
Using an account created at the end of May, researchers watched 10 videos from the network of pro-Nazi users, occasionally clicking on comment sections but stopping short of any form of real engagement such as liking, commenting, or bookmarking. The researchers also viewed 10 pro-Nazi accounts. When the researchers then flipped to the For You feed within the app, it took just three videos for the algorithm to suggest a video featuring a World War II-era Nazi soldier overlaid with a chart of US murder rates, with perpetrators broken down by race. Later, a video appeared of an AI-translated speech from Hitler overlaid with a recruitment poster for a white nationalist group.
Another account created by ISD researchers saw even more extremist content promoted in its main feed, with 70 percent of videos coming from self-identified Nazis or featuring Nazi propaganda. After the account followed a number of pro-Nazi accounts in order to access content on channels set to private, the TikTok algorithm also promoted other Nazi accounts to follow. All 10 of the first accounts recommended by TikTok to this account used Nazi symbology or keywords in their usernames or profile photos, or featured Nazi propaganda in their videos.
“In no way is this particularly surprising,” says Abbie Richards, a disinformation researcher specializing in TikTok. “These are things that we found time and time again. I have certainly found them in my research.”
Richards wrote about white supremacist and militant accelerationist content on the platform in 2022, including the case of neo-Nazi Paul Miller, who, while serving a 41-month sentence for firearm charges, featured in a TikTok video that racked up more than 5 million views and 700,000 likes during the three months it was on the platform before being removed.
Marcus Bösch, a researcher based at Hamburg University who monitors TikTok, tells WIRED that the report’s findings “do not come as a big surprise,” and he’s not hopeful there is anything TikTok can do to fix the problem.
“I’m not sure exactly where the problem is,” Bösch says. “TikTok says it has around 40,000 content moderators, and it should be easy to understand such obvious policy violations. Yet due to the sheer volume [of content], and the ability by bad actors to quickly adapt, I am convinced that the entire disinformation problem cannot be finally solved, neither with AI nor with more moderators.”
TikTok says it has completed a mentorship program with Tech Against Terrorism, a group that seeks to disrupt terrorists’ online activity and helps TikTok identify online threats.
“Despite proactive steps taken, TikTok remains a target for exploitation by extremist groups as its popularity grows,” Adam Hadley, executive director of Tech Against Terrorism, tells WIRED. “The ISD study shows that a small number of violent extremists can wreak havoc on large platforms due to adversarial asymmetry. This report therefore underscores the need for cross-platform threat intelligence supported by improved AI-powered content moderation. The report also reminds us that Telegram should also be held accountable for its role in the online extremist ecosystem.”
As Hadley outlines, the report’s findings show that there are significant loopholes in the company’s current policies.
“I've always described TikTok, when it comes to far-right usage, as a messaging platform,” Richards said. “More than anything, it's just about repetition. It's about being exposed to the same hateful narrative over and over and over again, because at a certain point you start to believe things after you just see them enough, and they start to really influence your worldview.”
207 notes
sjbattleangel · 7 months
Text
Li*C*nv*y is dangerous. Stay away from him.
TW: N*zism. Homophobic, ableist hate speech. Su*cide baiting. Targeted harassment and cyberbullying.
So when I first heard about this guy, he was throwing a tantrum over a certain cartoon reboot not being a "serious, chest-pumping epic". On his YouTube page, I noticed he had a link to BitChute*, a literal website dedicated to hosting white supremacist material. When confronted over this, he kept shifting the goalposts, all while using thought-terminating "Et tu" excuses.
[screenshots]
When I wrote a post on this guy a while back, someone told me that he is a user of ANOTHER hate site, KiwiF*rms, under the name "*ldManB**mer". For my own safety, I didn't visit the site, but by searching the username and using the Internet Archive, I found these:
[screenshots]
But most damning of all:
[screenshot]
Yes. It is him. No matter how much he claims to care for victims of sexual abuse, no matter how nice he is to others, Li*C*nv*y is a member of white supremacist hate sites where he casually throws around homophobic, ableist slurs and baits other users into k*lling themselves. He just uses abuse victims and his own family members as shields to cover up his true nature.
Please, stay away from him. Don't support him. He doesn't deserve an audience.
One of the reasons I was forced to delete my old post is that one of his lifeless simps decided to lead a hate campaign against me, trying to bait me into unaliving myself.
This is the true audience he attracts: (Report and block this turd-stain)
[screenshot]
No. The only person here who's effing pathetic is you: You simp for a sad little manbaby, so scared of the world no longer revolving around him that he joins literal hate movements to freeze his childhood in amber, police what children watch and harass women and minorities out of nerd culture. No wonder you adore him. You're perfectly alike! Keep the waterworks coming, asshole. I need them for my tea.
[screenshot]
Mmmm! Now that's delicious tea.
*And before you come at me with your neoliberal idealism that "having a BitChute account doesn't make you a N*zi": I'm going to stop you right there, because yes, it freaking does.
27 notes
frithwontdie · 3 months
Text
Dan Bilzerian Finally Comes Out and Tells the Truth About THEM
3 notes