#algorithms of oppression
Text
Image: Wangechi Mutu, 'You are my sunshine,' collage painting on paper, 24 x 36", 2015.
“Marginalized and oppressed people are linked to the status of their group and are less likely to be afforded individual status and insulation from the experiences of the groups with which they are identified. The political nature of [search engines] demonstrates how algorithms are a fundamental invention of computer scientists who are human beings—and code is a language full of meaning and applied in varying ways to different types of information” -Safiya Umoja Noble, "Algorithms of Oppression"
Before reading Safiya Umoja Noble’s “Algorithms of Oppression,” I was among the many who truly believed Google was a neutral public resource. Maybe you’ve heard the saying “a library at your fingertips.”
However, despite popular belief, its algorithms serve paid advertising, which is dressed up in search results as “the most relevant and useful information” (Noble 37). Search results are produced through “rendering,” a procedure Noble calls “expressly social, economic, and human,” implemented through series of steps called “algorithms” (37). The common misconception that Search is a mirror of human beliefs is countered by Noble’s observation that “current algorithmic constraints [do] not provide appropriate social, historical, and contextual meaning to already overracialized and hypersexualized people who materially suffer along multiple axes” (36).
This means that when white programmers are uncomfortable talking about race, accurate depictions of Black, Indigenous, and People of Color slip out of our everyday algorithms altogether.
Furthermore, search engine optimization, or SEO, is the practice of pushing ads or websites to the “top of a results list for a query,” generating profit for various companies, or for Google itself, based on website clicks (Noble 40). Because Google is what Noble calls a “multinational advertising company,” it can prioritize search results for, say, “Black women” under a multitude of porn hyperlinks over the “eleven and a half billion documents that could have been indexed” (49).
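Noble’s point about advertising-driven ranking can be made concrete with a toy sketch. The snippet below assumes nothing about Google’s actual (proprietary) ranking system; every name, weight, and page in it is hypothetical. It only shows how blending even a modest paid-promotion signal into a relevance score can let an ad-funded page outrank a more relevant one.

```python
# Illustrative only: a made-up ranking function that blends naive
# term-match relevance with a paid-promotion signal. Real search
# ranking is proprietary and far more complex.

def score(page, query_terms, ad_weight=0.6):
    """Blend term-frequency relevance with normalized ad spend."""
    words = page["text"].lower().split()
    relevance = sum(words.count(t) for t in query_terms) / max(len(words), 1)
    return (1 - ad_weight) * relevance + ad_weight * page["ad_spend_norm"]

pages = [
    {"url": "community-history.example", "text": "history culture community girls",
     "ad_spend_norm": 0.0},
    {"url": "ad-funded-site.example", "text": "girls girls click now",
     "ad_spend_norm": 0.9},
]

for page in sorted(pages, key=lambda p: score(p, ["girls"]), reverse=True):
    print(page["url"], round(score(page, ["girls"]), 3))
# ad-funded-site.example 0.74  <- outranks the more relevant page on spend alone
# community-history.example 0.1
```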
Even someone like Noble, who writes from a Black feminist perspective, a framework that analyzes the intersections of racism, sexism, and other systems of oppression, finds that search results do not always reflect her interests.
One quote that pulls away the smokescreen of “naturalized” algorithms comes early in the chapter: the epigraph quoted above, in which Noble observes that marginalized and oppressed people “are less likely to be afforded individual status and insulation from the experiences of the groups with which they are identified,” and that code is “a language full of meaning and applied in varying ways to different types of information” (26).
Edited: 5/2/2023
7 notes
Text
I love good audiobooks on new tech.
#Accessibility#AI#AI 2041#AI and Global Power#AI Ethics#AI hidden costs#AI history#AI risk#AI successes and setbacks#AI systems#Ajay Agrawal#Alexa#Algorithms of Oppression#Artificial Intelligence: A Guide for Thinking Humans#Atlas of AI#Audible#Audiobooks#Brian Christian#Caroline Criado Perez#Data bias#Ethical Machines#Future of artificial intelligence#Google's AI#Inclusivity#Invisible Women#Kai-Fu Lee#Kate Crawford#Literature consumption#Mark Coeckelbergh#Melanie Mitchell
2 notes
Link
“It would take more than a year for a federal judge to conclude the insurer’s decision was “at best, speculative” and that Walter was owed thousands of dollars for more than three weeks of treatment. While she fought the denial, she had to spend down her life savings and enroll in Medicaid just to progress to the point of putting on her shoes, her arm still in a sling.”
0 notes
Photo
Coded Bias and the Algorithm Justice League Algorithms reflect bias: that of the programmers and the data sets, a new kind of virus infecting machine learning. Machine learning can, through algorithmic oppression, replicate social prejudices and structures of inequality. These are conclusions that creative researchers have emphasized when studying algorithms and social life. — Read the rest https://boingboing.net/2023/02/16/coded-bias-and-the-algorithm-justice-league.html
0 notes
Text
Goodreads
“The master algorithm” seems to have acquired an inglorious connotation.
WEIRD comes to mind: Western, Educated, Industrialized, Rich, and Democratic. This long-standing problem in the humanities and social sciences has also made it into the coding of algorithms. Historically, much foundational research drew on a pool of subjects that was too small and too homogeneous, and most of the studies were conducted by white, male, wealthy researchers, so the results were not representative of the whole population.
This happened in two ways. First, prejudices, back when they were not yet politically unacceptable, flowed directly into pseudo-research. As emancipation and equality spread, the bias became indirect and unvoiced: research leaders, professors, and chief executives still incorporated their conscious and unconscious worldviews into the design of questions, research subjects, experimental methods, and so on.
Second, this research was presented to an equally biased audience and run on equally biased test subjects. For a long time, much of science has rested on these two foundations, and it still does indirectly today, because new research builds on older results. It is like trying to finally remove a bug from source code that evolved over the years with the bug as a central element of the whole system. The bug alone is not the severest problem; it is the thousands of ramifications it left behind in all later updates and versions. Like a cancer, it has spread everywhere, which makes removing all these mistakes expensive to impossible. A revision would be laborious and would meet resistance from many academics who see dangers to their reputation or even their life's work. They would see their field and their specialties under attack, because nobody likes criticism, and this would be a hard one to swallow.
What does this have to do with algorithms? Software is programmed by the same hands as in the example above. Certainly not on the same scale, but unvoiced opinions can flow into it subconsciously, and the way search engines generate results is even more problematic. Especially in the last few years, deep learning, AI, big data, and GANs (generative adversarial networks) have been integrated so deeply into development that the old prejudices can keep evolving inside the machines themselves, without further human influence.
This means that, in principle, no one can say decidedly anymore how these AIs reach their conclusions. The complexity is so high that even teams of specialists can only attempt tentative reverse engineering. Precisely how an AI became racist, sexist, or homophobic can no longer be determined, and worse, it cannot be quickly repaired after the fact, because the response patterns to a search input cannot be selected in advance.
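For a feel of how a system "learns" prejudice without anyone coding it in, here is a minimal, hypothetical sketch; the data and names are invented, and real systems are vastly more complex. The point is only that a model fit to skewed behavioral data reproduces the skew.

```python
# Invented data: a "model" fit to skewed click logs reproduces the
# skew with no explicitly biased rule anywhere in the code.
from collections import Counter

# Hypothetical engagement logs: what users historically clicked.
click_log = ["sensational"] * 80 + ["informative"] * 20

model = Counter(click_log)  # "training" here is just counting clicks

def rank(candidates):
    """Order candidates purely by learned click frequency."""
    return sorted(candidates, key=lambda c: model[c], reverse=True)

print(rank(["informative", "sensational"]))
# ['sensational', 'informative'] -- the data did the biasing, not the coder
```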
There is a sad explanation: people are often driven by base instincts and false, destructive mentalities. When millions of people have focused their internet activity on aggressive hostility for decades, the algorithm learns to recognize their desires. There is a lot of money to be earned, and the AI is supposed to give users what they want. Market forces determine the actions of the internet giants, and these give the public what it craves. Ethics and individualized advertising can hardly pursue the same goals. This is even more the case for news outlets, media, and publishers, which suffer from the same problems as the algorithms, with the difference that they play irrelevant, rhetorical games to distract from the dysfunctions inherent in the system.
The same problem exists with the automatic suggestion features of online retail, which can inadvertently promote dangerous or aggressive behavior, and with the spread of extremist videos through automatically proposed "similar" content, which lets people radicalize faster. The AI does its job, no matter what is searched for.
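The feedback loop described here can also be sketched in a few hypothetical lines. Nothing below reflects any real platform's recommender; the items and "intensity" scores are invented to show how naive similarity-chasing drifts toward the extreme.

```python
# Invented items and "intensity" scores: a naive more-like-this rule
# that nudges each recommendation slightly past the last item watched.

catalog = {"mild": 0.2, "edgy": 0.5, "extreme": 0.9}  # hypothetical

def recommend(last, catalog):
    """Pick the unwatched item closest to 'a bit more intense than last'."""
    target = catalog[last] + 0.1
    return min((n for n in catalog if n != last),
               key=lambda n: abs(catalog[n] - target))

current = "mild"
for _ in range(2):
    current = recommend(current, catalog)
    print(current)
# edgy, then extreme: similarity-chasing escalates step by step
```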
On a small scale, the dilemma has already been visible with voice assistants and artificial intelligences degenerating in free interaction with humans. Microsoft's self-learning chatbot Tay, for example, was transformed into a hate-filled misanthrope by trolls within days of its 2016 launch. It is not difficult to imagine the dimensions of the problem as these new technologies spread.
One way to repair these malfunctions is time: people could reset the AIs by making neutral, ordinary searches. That is too optimistic, though, so it is more likely that we will have to find a technical solution before people become more rational. Under either alternative, the search queries and the results would have to change significantly before any positive development could begin.
Academic search results should not be underestimated either: the often false, unscientific foundations on which many established sciences stand. Even if users became reasonable, there would still be millions of nonsensical results and publications. These fake, antiquated edifices of thought, on which many foundations of modern society rest, must be the primary objective. The results are the symptoms; the dangerous, wrong thinking is the disease. The long-unresolved history behind it, with all its injustices, has to be reappraised, because it is the reason for the widespread poverty and ignorance, rooted in wrong social models, that delude innocent AIs through their search requests.
And the effects of search input and search results reinforce each other. The mirror they hold up to society reveals its hidden prejudices, which feel safer in their secret corners because they are supposedly unrecognized, and which populists additionally stoke, subtly and for their own benefit. When wrong thinking has buried itself so deeply into a society, it also becomes part of all the products of that culture.
So you read So You Want to Talk About Race and now you have more questions. Specifically, you’re wondering how privilege affects your life online. Surely the Internet is the libertarian cyber-utopia we were all promised, right? It’s totally free of bias and discrimina—sorry, I can’t even write that with a straight face.
Of course the Internet is a flaming cesspool of racism and misogyny. We can’t have good things.
What Safiya Umoja Noble sets out to do in Algorithms of Oppression: How Search Engines Reinforce Racism is explore exactly what it is that Google and related companies are doing that does or does not reinforce discriminatory attitudes and perspectives in our society. Thanks to NetGalley and New York UP for the eARC (although the formatting was a bit messed up, argh). Noble eloquently lays out the argument for why technology, and in this case, the algorithms that determine what websites show up in your search results, is not a neutral force.
This is a topic that has interested me for quite some time. I took a Philosophy of the Internet course in university even—because I liked philosophy and I liked the Internet, so it seemed like a no-brainer. We are encouraged, especially those of us with white and/or male privilege, to view the Internet as this neutral, free, public space. But it’s not, really. It’s carved up by corporations. Think about how often you’re accessing the Internet mediated through a company: you read your email courtesy of Microsoft or Google or maybe Apple, and ditto for your device; your connection is controlled by an ISP, which is not a neutral player; the website you visit is perhaps owned by a corporation or serves ads from corporations trying to make money … this is a dirty, mucky pond we are playing around in, folks. The least we can do as a start is to recognize this.
Noble points out that the truly insidious thing, however, is how we’ve normalized Google as this public search tool. It is a generic search term—just google it—and, yes, Google is my default search engine. I use it in Firefox, in Chrome, on my Android phone … I am really hooked into Google’s ecosystem—or should I say, it’s hooked into me. But Google’s search algorithms did not spring forth fully coded from the head of Zeus. They were designed (mostly by men), moderated (again by men), tweaked, on occasion, for the interests of the companies and shareholders who pay Google’s way. They can have biases. And that is the problem.
Noble, as a Black feminist and scholar, writes with a particular interest in how this affects Black women and girls. Her paradigm case is the search results she turned up, in 2010 and 2011, for “black girls”—mostly pornography or other sex-related hits, on the first page, for what should have been an innocuous term. Noble’s point is that the algorithms were influenced by society’s perceptions of black girls, but that in turn, our perceptions will be influenced by the results we see in search engines. It is a vicious cycle of racism, and it is no one person’s fault—there is no Chief Racist Officer at Google, cackling with glee as they rig the search results (James Damore got fired, remember). It’s a systemic problem and must therefore be addressed systemically, first by acknowledging it (see above) and now by acting on it.
It’s this last part that really makes Algorithms of Oppression a good read. I found parts of this book dry and somewhat repetitive. For example, Noble keeps returning to the “black girls” search example—returning to it is not a problem, mind you, but she keeps re-explaining it, as if we hadn’t already read the first chapter of the book. Aside from these stylistic quibbles, though, I love the message she lays out here. She is not just trying to educate us about the perils of algorithms of oppression: she is advocating that we actively design algorithms with restorative and social justice frameworks in mind.
Let me say it louder for those in the back: there is no such thing as a neutral algorithm. If you read this book and walk away from it persuaded that we need to do better at designing so-called “objective” search algorithms, then you’ve read it wrong. Algorithms are products of human engineering, as much as science or medicine, and therefore they will always be biased. Hence, the question is not if the algorithm will be biased, but how can we bias it for the better? How can we put pressure on companies like Google to take responsibility for what their algorithms produce and ensure that they reflect the society we want, not the society we currently have? That’s what I took away from this book.
I’m having trouble critiquing or discussing more specific, salient parts of this book, simply because a lot of what Noble says is stuff I’ve already read, in slightly different ways, elsewhere—just because I’ve been reading and learning about this for a while. For a newcomer to this topic, I think this book is going to be an eye-opening boon. In particular, Noble just writes about it so well, and so clearly, and she has grounded her work in research and work of other feminists (and in particular, Black feminists). This book is so clearly a labour of academic love and research, built upon the work of other Black women, and that is something worth pointing out and celebrating. We shouldn’t point to books by Black women as if they are these rare unicorns, because Black women have always been here, writing science fiction and non-fiction, science and culture and prose and poetry, and it’s worthwhile considering why we aren’t constantly aware of this fact.
The old oppressors have been transformed, rebranded for the 21st century. They are no longer monsters under the bed or slave-owners on the plantation or schoolteachers; they are the assumptions we build into the algorithms and services and products that power every part of our digital lives. Just as we have for centuries before this, we continue to encode racism into the very structures of our society. Online is no different from offline in this respect. Noble demonstrates this emphatically, beyond the shadow of a doubt, and I encourage you to check out her work to understand how deep this goes and what we need to do to change it.
0 notes
Text
Take action and stand in solidarity with Palestinian people
#signal boost#stop apartheid in palestine#free gaza#free palestine#save gaza#palestinian liberation#save the people of gaza#social justice#palestine#speak up now#social matters#adding tags to your posts helps people find them#tumblr post#tumblr tags#human rights#settler colonialism#oppression#solidarity#solidarity with palestine#algorithm#fyp#fypage#blog
23 notes
Text
Hi this is a vent post! Continue scrolling if you'd rather not see that
#Giving time...#Still more time...#Wouldn't want to plague any previews#Maybe another filler. Just for some fun#Is this enough?#It certainly is now#Alright start:#I'm so bored. I am so incredibly; intrinsically; entirely bored. I have been taught the same thing for four years straight#'It's only four years!' that's literally a quarter of my lifetime right there. My formative years are being spent stressed and in a state /#/of constant self-loathing#I was watching a YT video and the phrase 'attention-starved STEM major' came up and I was like. Yea#What am I even working towards? The hope that my version of capitalist hell isn't as bad as everyone else's? I'm just so sick of not /#/having a stable future what with politics and normal working people becoming more and more oppressed#I don't want to work and that's not because I'm lazy. It's because my brain is recognising that there is no reward anymore#I used to have such a little spark in Yr7. I remember having things to say and wanting to share everything I've done#I still do that now; sure I do. I don't enjoy it though#I thought I liked drawing but I'm realising that all I really like is the attention. I COULD draw things I like drawing... but then I /#/ don't get attention which my mind then classifies as zero reward#I'm very tired of doing things for no credit; reward; or validation. This is becoming a theme#Then I wonder what I'm doing wrong. What part of the algorithm am I not hitting. Then I realise that I'm just not marketable in a way#God. I'm seriously breaking rn. It's not even only because of GCSEs#It's just a culmination of doing all these things to be told that I am unworthy of Having as a result. It doesn't matter if I'm smart; my /#/ parents still don't own their house and can't afford to pay for heating most days#Literally what am I doing this for#And then I realise that all of this is ALSO attention-seeking behaviour! I'm my own worst problem; I recognise exactly what's wrong with /#/ myself but the body wants what it wants. And what it wants is validation that I'm not going to get in this life#Hi guys! Maybe don't interact. That could fix me#Wean me off of needing virtual numbers just to feel something. Jesus#I can't even be happy with the things that I make for myself. Because I make nothing for myself anymore#It's just a whole sad existence of an expected 12hr+ of school every day until I get a job I guess. Then it's 12hr+ of job every day until
5 notes
Text
tumblr needs to stop sending me G!le defence posts, i stg, I write at least one hate-filled love letter to that boy every month.
#Im pretty sure its a neverending cycle#this cycle of gale-hate that I perpetuate myself thanks to whatever algorithm tumblr may or may not have#yeah he sucks as a love interest#but by Mj he's a morally shitfucked character as well#all wrapped in a sympathetic backstory with the oppressed childhood friend trope thrown in
5 notes
Text
thinking about the controversy around mag 185 again because (?) it was so weird. setting aside the fact that mag 185 wasn’t even that revolutionary in its critique of police brutality, it’s still crazy that the people who believed it would be got so upset.
i know that a lot of people were angry w/ the idea of real world systematic trauma being used as material for the show, but um… uh. that’s the point that’s the point that’s literally the whole point. in s5 everyone’s deepest fears and traumas have been made manifest in the most literal ways and. ? obviously that includes real world fears and not-cartoony, easy to digest ones too,,
it’s also odd to me that this ep was controversial, considering the fact that we’d previously had episodes focusing on topics like eating disorders and addiction before. by that point in the show, it’s assumed that the audience would be comfortable with mag 185 because (imo) we had handled much darker and visceral depictions of real world trauma prior. so ? did ppl online suddenly get scared that jonny wouldn’t handle a topic like that with sensitivity?? after 185 episodes ???
i guess the follow up argument to this would be whether jonny had any right to include these kinds of things in the show which.. um. he did actually sorry :P
colourblind character casting and writing doesn’t do as much to critique the actual systemic issues that kept minorities away from the spotlight for so long. everyone cheers for writers when they do the bare minimum of having good representation in their work, but then gets angry when writers use their platform to critique real world systematic issues
like u know they’re supposed to do that right,, that’s the whole reason why people believe independent media has the potential to be world changing. writers can actually talk about these issues without having to deal with networks censoring them. why wouldn’t he be allowed to do that :/
#ed mention#addiction mention#sliding my shiny gold “i am black i can talk about this card” across the table rn#actually not only can i talk about this… my opinion on this is actually higher than the law on this through lived experience so L + ratio#broader topic on how neoliberalism and idealism has ruined our ability to think critically about systematic oppression but idgaf#bro goes to one marxist reading group and thinks he’s engels#no fandom tags you’ll see this if the algorithm deems u worthy
16 notes
Text
OK I do not like tumblr recommending me r•dfem posts (block pillarsalt) just because I liked a post about feminist topics (op of the first post seems to be supportive of transfemmes), especially with staff’s flippant policy on active hate groups like these. I am horrified to think about how many people algorithms have led into these forms of violence.
#I don’t think there’s some ulterior motive here tbh. it’s just inexcusable policy#unless you’re being actively threatened they won’t ban unless they get MULTIPLE complaints#REGARDLESS these policies and algorithms are designed by ppl who don’t have the best interests of oppressed groups in mind
15 notes
Text
instagram
From the comments
So was her argument really just, "I tried to do research, but the evidence doesn't seem to back up my beliefs, so I'm going to rely on common sense, which is actually just reiterating what I already believe."
More projection from her than a cinema.
So when all your research points to the opposite of the point you’re trying to prove, it doesn’t mean the research is biased; it could possibly, just maybe, mean YOU ARE WRONG
I love when conservatives are like "it's so hard to find data that supports our views!!!" Um honey... that's when you're supposed to change your views...
Why is it that the algorithms push conservative content? I've been far left my entire life, yet every time I've logged on for the past eight years I've been flooded with Jordan Peterson and Joe Rogan content! What's going on??
#Emily Wilson but not the cool Iliad one#the other one#the one who says she talks about things four hours a day with no research#(what like it's hard)#and then complains when she can't find any data to support her claims#because she is being oppressed... not cause the facts show she is wrong#lmao#were podcasts a mistake?#algorithms#2025: listen less to podcasts#Instagram
1 note
Text
Synthesis Report
Technology, simply defined, is information with a goal and a way to be reproduced. Writing is technology. So are telephone poles, hammers, knives, and AI. Technology has often been cited as the driving force of the United States empire, and for Western ways of living, technology has been seen as a gift to foreign regions. For the purposes of this essay, I will connect the concepts of racism that have led to what we consider today as objective technology—that is, tech that is unbiased and bears science-backed ideas of truth. And thus, we will discover how the intersection of tribal sovereignty, (un)naturalized algorithms, and colorblind ideology helps us understand how technology has historically been used as a tool to boost colonial motives. Let’s first look at tribal sovereignty, what it means, and how it works in our modern world.
Tribal sovereignty, as defined by Marisa Elena Duarte, is the right of Indigenous and Native tribes to self-govern. But because white leaders have a penchant for manifest destiny, the Indigenous peoples of what is now known as “America” are forced into difficult situations like losing land and access to the internet. To understand how to move forward, Duarte asks us in her book “Network Sovereignty” to look at Indigenous spiritual practices and appreciation of the land in contrast to how colonial technology is developed under racist guidelines. She offers these considerations for the digital generation: colonial technologies have historically been weaponized against Native and Indigenous peoples; digital technology can support Native youth; and we must stay aware of digital technology’s ability to limit traditional Native notions of grace and peace.
On that note, the “Natives in Tech” conference, held in 2020, showed that there are major movements by Indigenous and Native tribes to cut out third-party vendors and big tech companies, and that work is moving forward on a network of Native software engineers. These engineers work within the parameters of the sacred connection to land that is integral to Indigenous ways of living. Such movements teach us that it is possible to create systems of technology that fully encapsulate non-white and non-Western existences, and that doing so actually enriches the spread of information within the complexities of globalization.
Ever since the birth of search engines and the internet at large, we’ve been told that algorithms tell us more about ourselves than they do about the people who make them. What our screens show us is supposedly a product of what we’ve searched previously and what we’ve clicked on the most. The reality is that algorithms are not naturalized. Algorithms are, actually, highly simplified systems that encode the coder’s bias. Safiya Umoja Noble, author of “Algorithms of Oppression,” contends that they currently “[do] not provide appropriate social, historical, and contextual meaning to already overracialized and hypersexualized people who materially suffer along multiple axes” (36). A big hint as to why this is such an issue comes from the lack of Black, Indigenous, and People of Color employed in Silicon Valley. In “Silicon Valley Pretends That Algorithmic Bias Is Accidental,” Amber Hamilton discusses this tech culture, which has a history of racist and sexist hiring discrimination.
What’s more, Hamilton continues, tech companies like Google have a habit of dissuading employees from holding political discussions in the workplace. Yet again, we see how easily white tech comes to reflect white interests; we return to the overarching idea that white tech seeks white power. Even if these instances seem unintentional, they tell us a story. Technologies built on the ensuing colorblind ideology, as Ruha Benjamin says in “Race After Technology,” “are sold as morally superior because they purport to rise above human bias” (38). It is almost impossible to challenge tech when we are brought up to believe it is an entity all of its own, totally void of its creators’ morality. MIT’s data scientists worked hard to construct robots without gender, class, or race. While the robots were indeed “servants” and “workers,” MIT scientists referred to them as “friends” and “children,” addressing them in “class-avoidant” terms (Benjamin 42). Programmers felt so uncomfortable inputting the varying histories of racism, transphobia, and misogyny that they simply left them out altogether. Unfortunately, acting as if these things don’t exist doesn’t make technology better; it only makes it worse. So how do tribal sovereignty, naturalized algorithms, and colorblind ideology all tie together?
Colonial tech is so focused on reaching the biggest audience it can that there is no space left to care about the repercussions of its products, and when concern does surface, it generally comes at the expense of Black, Indigenous, and People of Color. Decolonizing tech looks like building technology according to accurate histories and with values that empower people. It doesn’t look like plowing through sacred land. It doesn’t look like perpetuating racism or claiming racism doesn’t exist. Tech has the power to be something more: to create better lives not just for white people but for Black, Indigenous, and People of Color.
0 notes
Text
Who fucking narced on the robotgirl dicksucking pic. God damn it.
0 notes
Text
i think the point is that it is especially fucked up for the family and friends of this 18 year old who was killed to watch videos of her face being puppeted around, which i guess isn't unique to AI but is made much more accessible by it, as we have seen with famous women literally being edited into nudity.
ai is not going to leave the uncanny valley if it continues to walk on the dead and make "perfection" from nothing :/ wish her loved ones peace and healing
#ai is often used as a new form of existing oppression bc its kind of a fancy algorithm and it is built off of ppls biases!#and as such needs to be regulated to take those biases into account! bc the ppl making the ai arent going to lol
238 notes
Text
A religious extremist terrorist group deciding that today is the day to do a murder-suicide situation with an entire population of oppressed people is not the #progressive revolution that you guys think it is. Despite what your tiktok algorithm is showing you, actual news reports and ground footage show that 90% of Palestinians are freaking the fuck out because Hamas just gave Netanyahu the excuse he's been waiting for to commit genocide.
This is not going to end well for the Palestinians and Hamas has no right to decide for everyone that their physical safety isn't important anymore. Anyone who sees this as anything but a tragedy needs to realize that real world geopolitics do not fucking work like a YA novel revolution.
2K notes
Text
sorry to like, keep going on about this but as a (former?) media professional: seeing so many comments on @wearewatcher's video saying "I would fire people so I didn't have to charge for content" and accusing WATCHER of being capitalist about it is actually insane.
you used to have to pay for content. you had to pay like £16 to watch one whole movie on DVD. this is because it cost money to make and people were supposed to be paid for it. remember the writers' strike, because streaming now only benefits executives, with no equitable income distribution model for the people who create the stuff you love?
you are meant to root for the independent creators. you are not meant to want all content to be beholden to an algorithmic nightmare that ransoms what creators can make or monetise for its own arbitrary censorship. you are not meant to root for a platform that works to silence and limit people under oppressive regimes. you are not meant to demand that you get content only on something that platforms and promotes right wing prejudice.
employing people is good!!!!!!! you are meant to want people to be fairly paid for their work even and especially when that work goes into creating things you love and get enrichment from
yes you are also meant to share your subscription with your friends. OUR login details. THAT'S fucking socialism.
1K notes