#rationalists
dawnlightmelody · 7 months ago
Text
hello everyone! tumblr seems like a plausible successor site for posters as twitter becomes less fun, so i'm moving part of my exodus here.
where on here do the rationalists hang out? or the zoomer transfems? or the zoomer transfem rationalists, even?
16 notes · View notes
sewer-swan · 10 months ago
Text
I have to consciously dodge sounding like a rationalist sometimes. We probably share some ground, we both worship the effective and love analysis. But they have this certain bright-eyed optimism, a total lack of edge with a slight note of corniness. You hear them pipe up from 6 feet deep in the weeds, all nuance and pet theory, and you just think "god, I should step on that thing."
Tumblr people can be corny too, but we at least like to think we're outrageous outsiders. Rationalists seem to allow no concept of the outrageous, and they're so damn sincere about it: you need a patina of cynicism, a hint of jadedness, to pull it off.
7 notes · View notes
stackslip · 2 years ago
Text
i know i have mutuals and followers who are/were close to rationalist circles, or rationalists themselves, or who are simply anarchist trans girl hackers who seem to be broadly familiar with the subject. and tbc it really REALLY isn't for me (i'm vaguely familiar with the arguments and have read about roko's basilisk and lesswrong etc and it's befuddling to me). but do you guys have any good writings or posts on how much it's influencing tech giants, and in which ways? there's been a lot of sensational writing on the ftx polycule etc and how elon and co are claiming to embrace longtermism, but i'd love to see stuff on what they specifically get out of it, how close their interpretation is to the og ideas and circles, what movements and groups DO exist today, and how much they oppose or embrace recent tech industry developments (from ai chat and art to crypto crashing to elon and others indicating they're fans of the whole thing)
tbc this isn't about drama or sensationalist stuff, and i'm not seeking intellectual arguments on why it's the right way or utter bullshit. rather i'm trying to understand how these ideas developed, where the movement is today, and what kind of cultural and intellectual influence they actually DO have on tech giants and tech billionaires as a whole
15 notes · View notes
wisdomfish · 2 years ago
Quote
This promise in Jeremiah [29:13] eliminates two types of people who claim to be seekers. There is the rationalist (who only allows reason – but the true seeker needs to be a whole person not just an 'autonomous' mind). And there is the cynic (whose start point assumes there is not enough evidence available and they seem committed only to seeking confirmation for their start-point). Hence both are defective seekers.
Andrew Fellows 
4 notes · View notes
philotreat · 2 years ago
Text
Rationalist Philosophers Tutorial - YouTube
In this video, I explain the Rationalist philosophers Descartes, Spinoza, and Leibniz, and their theories of Substance, God, and Rationalism.
0 notes
w-ht-w · 2 years ago
Text
Eliezer Yudkowsky in 2009 on the coordination problem in effective altruist / rationalist communities.
Our culture puts all the emphasis on heroic disagreement and heroic defiance, and none on heroic agreement or heroic group consensus.  We signal our superior intelligence and our membership in the nonconformist community by inventing clever objections to others' arguments.  Perhaps that is why the atheist/libertarian/technophile/sf-fan/Silicon-Valley/programmer/early-adopter crowd stays marginalized, losing battles with less nonconformist factions in larger society.  No, we're not losing because we're so superior, we're losing because our exclusively individualist traditions sabotage our ability to cooperate.
The other major component that I think sabotages group efforts in the atheist/libertarian/technophile/etcetera community, is being ashamed of strong feelings. We still have the Spock archetype of rationality stuck in our heads, rationality as dispassion.  Or perhaps a related mistake, rationality as cynicism—trying to signal your superior world-weary sophistication by showing that you care less than others.  Being careful to ostentatiously, publicly look down on those so naive as to show they care strongly about anything.
We should aspire to feel the emotions that fit the facts, not aspire to feel no emotion.  If an emotion can be destroyed by truth, we should relinquish it.  But if a cause is worth striving for, then let us by all means feel fully its importance.
I've heard it argued that the taboo against emotional language in, say, science papers, is an important part of letting the facts fight it out without distraction.  That doesn't mean the taboo should apply everywhere.  I think that there are parts of life where we should learn to applaud strong emotional language, eloquence, and poetry.  When there's something that needs doing, poetic appeals help get it done, and, therefore, are themselves to be applauded.
You need both sides of it—
the willingness to turn away from counterproductive causes, and the willingness to praise productive ones; 
the strength to be unswayed by ungrounded appeals, and the strength to be swayed by grounded ones.
1. https://www.lesswrong.com/posts/7FzD7pNm9X68Gp5ZC/why-our-kind-can-t-cooperate
1 note · View note
tenth-sentence · 2 years ago
Text
And the only way to rein in corruptible politicians, these rationalists believed, was by balancing them against other politicians.
"Humankind: A Hopeful History" - Rutger Bregman
0 notes
pageofheartdj · 1 year ago
Text
There is something about this dynamic and I want more XD
2K notes · View notes
centrally-unplanned · 6 months ago
Text
Very much enjoyed Tracing Woodgrains' foray into the internet life of jilted ex-rationalist and Wikipedia editor David Gerard. It is of course "on brand" for me - the social history of the internet, as a place of communities and individual lives lived, is one of my own passion projects, and this slots neatly into that domain in more ways than one. At the object-level it is of course about one such specific community & person; but more broadly it is an entry into the "death of the internet-as-alternate-reality" genre; the 1990's & 2000's internet as a place separate from and perhaps superior to the analog world, that died away in the face of the internet's normalization and the cruel hand of the real.
Here that broad story is made specific; early Wikipedia very much was "better than the real", the ethos of the early rationalist community did seem to a lot of people like "Yeah, this is a new way of thinking! We are gonna become better people this way!" - and it wasn't total bullshit, logical fallacies are real enough. And the decline is equally specific: the Rationalist project was never going to Escape Politics because it was composed of human beings, Wikipedia was low-hanging fruit that became a job of grubby maintenance, the suicide of hacktivist Aaron Swartz was a wake-up call that the internet was not, in any way, exempt from the reach of the powers-that-be. TW's allusion to Gamergate was particularly amusing for me, as while it wasn't prominent in Gerard's life it was truly the death knell for the illusion of the internet as a unified culture.
But anyway, the meat of the essay is also just extremely amusing; someone spending over a decade on a hate crusade using rules-lawyering spoiling tactics for the most petty stakes (unflattering wikipedia articles & other press). The internet is built by weirdos, and that is going to be a mixed bag! It is beautiful to see someone's soul laid bare like this.
It can be tempting to get involved in the object-level topics - how important was Lesswrong in the growth of Neoreaction, one of the topics of Gerard's fixations? It was certainly, obviously not born there, never had any numbers on the site, and soon left it to grow elsewhere. But on the flip side, for a few crucial years Lesswrong was one of the biggest sites that hosted any level of discussion around it, and exposed other people to it as a concept. This is common for user-generated content platforms; they aggregate people who find commonalities and then splinter off. Lesswrong's vaunted "politics is the mindkiller" masked a strong aversion to a lot of what would become left social justice, and it was a place for those people to meet. I don't think neoreaction deserves any mention on Lesswrong's wikipedia page, beyond maybe a footnote. But Lesswrong deserves a place on Neoreaction's wikipedia page. There are very interesting arguments to explore here.
You must, however, ignore that temptation, because Gerard explored fucking none of that. No curiosity, no context, just endless appeals to "Reliable Source!" and other wikipedia rules to freeze the wikipedia entries into maximally unflattering shapes. Any individual edit is perhaps defensible; in their totality they are damning. My "favourite" is that on the Slate Star Codex wikipedia page, he inserted and fought a half-dozen times to include a link to an academic publication Scott Alexander wrote, that no one ever read and was never discussed on SSC beyond a passing mention, solely because it had his real name on it. He was just doxxing him because he knew it would piss Scott off, and anyone pointing that out was told "Springer Press is RS, read the rules please :)". It is levels of petty I can't imagine motivating me for a decade, it is honestly impressive!
He was eventually banned from editing the page as some other just-as-senior wikipedia editor finally noticed and realized, no, the guy who openly calls Scott a neo-nazi is not an "unbiased source" for editing this page wtf is wrong with you all. I think you could come away from this article thinking Wikipedia is ~broken~ or w/e, but you shouldn't - how hard Gerard had to work to do something as small as he did is a testament to the strength of the platform. No one thinks it is perfect of course, but nothing ever will be - and in particular getting motivated contributors now that the sex appeal has faded is a very hard problem. The best solution sometimes is just noticing the abusers over time.
Though wikipedia should loosen up its sourcing standards a bit. I get why it is the way it is, but still, come on.
221 notes · View notes
loki-zen · 7 months ago
Text
I feel like the AI risk rats got it almost exactly backwards, actually: there was the worry that you'd make a computer for something specific, like making paperclips, and it'd develop general intelligence and take over the world, when in fact they developed a chatbot and people are falling over themselves to use it like a general intelligence.
45 notes · View notes
thefriendoforatioisdead · 2 months ago
Text
Anin is very angry at Pin for agreeing to marry Kuea, and I understand her, BUT I'm afraid I'm siding with Pin on this one.
Because, besides what it would do to her reputation and her family's name, the fact that they don't belong to the same social class, that she's following her duty, that she's being loyal to her aunt, and all the other reasons that motivate Pin's decision, the thing that in my opinion really decided her is Anin saying she'd give up her title.
Because giving up her title is giving up her family. And Anin, with all the love she has for Pin and her family as well, is still a bit self-centered, and selfish, and probably not as mature as Pin, because it had never been asked of her to be. So, when in the middle of her righteous fight for her love she throws out that she's fine with being disowned, abandoning everything and eloping to another country, Pin has to stop it. She can't bear it. Because SHE knows what it is to lose your family. She lost her parents as a child, and all she has left is her aunt who, despite her genuine love for Pin, is not always the gentle parental figure someone like Pin might have needed. She can't watch Anin throw away something so important to her. She could not stand to be the reason the love of her life lost her family, and felt the same pain that she did. Because unlike Anin, she knows what it means to be all alone in the world.
So she had to put an end to this and say she'd marry Kuea. To protect Anin from the pain of losing her family, because she knows how much she loves them and how much they love her.
91 notes · View notes
anghraine · 5 months ago
Text
ngl I always find it wild to see Star Wars stuff that's like "if you think about it in terms of realistic statistics/science then..." about almost any aspect of it.
I mean, what about the Star Wars films gives the impression that this universe abides by realistic statistics, or realistic anything else? SW is broadly a fantasy epic projected onto an IMAX screen with a space background painted on it. Yeah, the planets and moons in the films almost always have improbably limited biomes and two major locations max, because narratively these locations are usually just fantasy city-states with space aesthetics.
Starships travel at the speed of plot and we simply jump past the amount of time that presumably is passing, and sort of imply the passage of that time through shifts in the character dynamics. But this passage of time cannot be analyzed with any kind of consistency because the only logic governing it is the pace of the story.
Just how long did it take the Empire to send a full contingent of forces to Dantooine, search the entire planet, find the Rebel base, and then report back to Tarkin between one scene and another? No one says and no one appears to care. How long did it take Han and Leia to reach Bespin and what exactly went on between them while Luke was, in the same time frame, going through a protracted training over multiple days at an absolute minimum? ¯\_(ツ)_/¯
How do giant space worms survive inside asteroids that somehow have an Earth-approximate gravitational field and I guess an atmosphere? Shhhh don't think about it. The point of the sequence is not "how does the giant space worm subsist off this random asteroid and how does it breathe and how does gravity work in this context, seriously" but that the giant worm sequence is fucking sick.
There's probably some after-the-fact EU justification invented by people who had nothing to do with the original writing of the space worm (or perhaps there are several mutually incompatible explanations), and I am profoundly disinterested in them. Nothing could make this even slightly realistic, and it was never intended to be. Star Wars sings space shanties at scientific/mathematical realism as it sails past on a completely different ship going in the exact opposite direction.
And I do mean "sails" because while astronomy might tell us that space is unfamiliar and wild on a level we as Earthbound lifeforms can barely comprehend, Star Wars understands that space is basically an ocean, yet with stars and cool but survivable planets in it, or sometimes it's air but combined with a super cool space background so you can have early 20th century aerial combat that would make no sense in actual space conditions and doesn't need to.
"If you consider relativity, then just running the Empire would be..." General relativity does not govern the galaxy far, far away. Space magic does. I'm not sure there are even time zones.
113 notes · View notes
annies-scrapbook · 1 year ago
Text
i'm so sick of how ~biohacking~ has become this tech-bro-ass, supplement-grifter-ass, joe-rogan-ass, GOOP-for-cishet-men-ass, buttered-coffee-ass bullshit. back in my day, a biohacker was a weird goth enby who did terrifying diy surgery on themself on their kitchen counter so they could experience shrimp senses. as the gods intended!
140 notes · View notes
profound-yet-trivial · 1 year ago
Text
When discussing the degree to which rationalists have an improved win rate, it's important to remember that the control group for LessWrong isn't "startup founders" or something like that, it's r/atheism (or perhaps a more neurodivergent version of it).
Rationalists were well ahead of academia + industry on AI risk, we were well ahead of the "nerdy Bay Area" curve on taking COVID seriously, and (although I'm not endorsing the externalities, and I personally missed the boat) many rationalists were well ahead of the curve on cryptocurrency.
The only real area I can think of where rationalists are plausibly doing worse than control (and here we run into massive selection effects) is mental health and vulnerability to cults. Rationalist techniques haven't paid off as substitutes for therapy and medication, and it's possible that indulging one's disposition to take weird ideas seriously is a risk factor for mental illness and cult susceptibility, above the risk factor inherent in simply having that disposition.
(The number of people attempting to build cults in the vicinity is a curse of success in my opinion; charlatans are drawn to communities with money and status where people will hear out a weird outsider.)
Anything obvious I'm missing?
I appreciate the existence of the Rationalist Community, because the whole point of The Sequences is that if you got it right, You Should Be Winning. And these people are actually out there, testing all these ideas. And on the whole, the thing we learn is that no, this does not really improve your win percentage by a significant amount. But occasionally they find something that does work, and I suddenly have a hundred relatively solid anecdotes to reason off of. They are very useful canaries, even if I would not want to be a canary.
128 notes · View notes
katherinakaina · 2 months ago
Text
Every once in a while I see people pondering an apparent contradiction in Daniil's character: he is said to be a rationalist, but he is evidently extremely emotional. Those things do not go together, right? People notice their confusion. They find all sorts of interesting explanations, from him being manipulative and performative, using his displays of emotion like tools to control people, to him not being rational at all actually, lying to himself and others, not even knowing who he is, pretending and failing.
Every time, I get over it and completely forget, and then another one of these hits me in the face. What I forget is that in the common understanding, rationality is opposed to being emotional, while in the community it is a basic-level understanding that there are rational emotions and irrational ones, the same way there are rational beliefs and irrational beliefs (which is to say, true and false, basically).
From here:
A popular belief about “rationality” is that rationality opposes all emotion—that all our sadness and all our joy are automatically anti-logical by virtue of being feelings. …
For my part, I label an emotion as “not rational” if it rests on mistaken beliefs, or rather, on mistake-producing epistemic conduct. “If the iron approaches your face, and you believe it is hot, and it is cool, the Way opposes your fear. If the iron approaches your face, and you believe it is cool, and it is hot, the Way opposes your calm.” Conversely, an emotion that is evoked by correct beliefs or truth-conducive thinking is a “rational emotion”; and this has the advantage of letting us regard calm as an emotional state, rather than a privileged default. …
Becoming more rational—arriving at better estimates of how-the-world-is—can diminish feelings or intensify them. Sometimes we run away from strong feelings by denying the facts, by flinching away from the view of the world that gave rise to the powerful emotion. If so, then as you study the skills of rationality and train yourself not to deny facts, your feelings will become stronger. …
I visualize the past and future of humankind, the tens of billions of deaths over our history, the misery and fear, the search for answers, the trembling hands reaching upward out of so much blood, what we could become someday when we make the stars our cities, all that darkness and all that light—I know that I can never truly understand it, and I haven’t the words to say. Despite all my philosophy I am still embarrassed to confess strong emotions, and you’re probably uncomfortable hearing them. But I know, now, that it is rational to feel.
Daniil probably suppresses some of his emotions to be taken seriously. But this is masking. And he is bad at it. He has strong emotions and strong convictions and they spill out of him regardless. He also values truth and honesty and that’s another reason why he can’t fully suppress his authenticity.
But all of it is about how to behave in polite society. How not to freak out neurotypicals. It has nothing to do with his thinking process, his beliefs and his goals. His rationality.
Now you can argue that his sincerity and his openness are irrational instrumentally, which is to say they lead to his downfall. He should have masked better and become more cynical if he wanted to succeed. Maybe? But that would also have its downsides, I’m pretty sure. (we’ll see what apathy meter does to his decision making soon enough)
Anyway, that is not the point I see people make. And I just really want people to stop making it. Strong emotions, strong ideals, passionate belief in a better future for humanity – those are all perfectly rational if they align with truth. And he does fail as a rationalist quite a lot as well, but this is purely an epistemological issue that has nothing to do with him being emotional.
41 notes · View notes