#THIS PROF HAS LOW REVIEWS FOR A REASON
Text
MY PROF JUST GAVE US BACK OUR MIDTERM AND EVERYONE GOT LOWASS MARKS, THE HIGHEST MARK WAS A 60%.
#huh#I GOT A 54#im happy i passed but I didn't think he was going to mark our midterms so ruthlessly that the whole class barely passed#THIS PROF HAS LOW REVIEWS FOR A REASON#man wtf#idk why he's getting mad at us when he doesn't even teach properly in the first place#i mean like this is supposed to be an easy class IT IS AN EASY CLASS#ITS ABOUT CANADIAN CITIZENSHIP AND IMMIGRATION AND THE PROF IS BEING DIFFICULT BRUH WTF#i can't believe the whole class failed man wtf
6 notes
Text
Published: Jul 13, 2023
As experienced professionals involved in direct care for the rapidly growing numbers of gender-diverse youth, the evaluation of medical evidence, or both, we were surprised by the Endocrine Society’s claims about the state of evidence for gender-affirming care for youth (Letters, July 5). Stephen Hammes, president of the Endocrine Society, writes, “More than 2,000 studies published since 1975 form a clear picture: Gender-affirming care improves the well-being of transgender and gender-diverse people and reduces the risk of suicide.” This claim is not supported by the best available evidence.
Every systematic review of evidence to date, including one published in the Journal of the Endocrine Society, has found the evidence for mental-health benefits of hormonal interventions for minors to be of low or very low certainty. By contrast, the risks are significant and include sterility, lifelong dependence on medication and the anguish of regret. For this reason, more and more European countries and international professional organizations now recommend psychotherapy rather than hormones and surgeries as the first line of treatment for gender-dysphoric youth.
Dr. Hammes’s claim that gender transition reduces suicides is contradicted by every systematic review, including the review published by the Endocrine Society, which states, “We could not draw any conclusions about death by suicide.” There is no reliable evidence to suggest that hormonal transition is an effective suicide-prevention measure.
The politicization of transgender healthcare in the U.S. is unfortunate. The way to combat it is for medical societies to align their recommendations with the best available evidence—rather than exaggerating the benefits and minimizing the risks.
This letter is signed by 21 clinicians and researchers from nine countries.
FINLAND
Prof. Riittakerttu Kaltiala, M.D., Ph.D., Tampere University
Laura Takala, M.D., Ph.D., Chief Psychiatrist, Alkupsykiatria Clinic
UNITED KINGDOM
Prof. Richard Byng, M.B.B.Ch., Ph.D., University of Plymouth
Anna Hutchinson, D.Clin.Psych., Clinical psychologist, The Integrated Psychology Clinic
Anastassis Spiliadis, Ph.D.(c), Director, ICF Consultations
SWEDEN
Angela Sämfjord, M.D., Senior consultant, Sahlgrenska University Hospital
Sven Román, M.D., Child and Adolescent Psychiatrist
NORWAY
Anne Wæhre, M.D., Ph.D., Senior consultant, Oslo University Hospital
BELGIUM
Em. Prof. Patrik Vankrunkelsven, M.D., Ph.D., Katholieke Universiteit Leuven; Honorary senator
Sophie Dechêne, M.R.C.Psych., Child and adolescent psychiatrist
Beryl Koener, M.D., Ph.D., Child and adolescent psychiatrist
FRANCE
Prof. Celine Masson, Ph.D., Picardy Jules Verne University; Psychologist, Oeuvre de Secours aux Enfants; Co-director, Observatory La Petite Sirène
Caroline Eliacheff, M.D., Child and adolescent psychiatrist; Co-director, Observatory La Petite Sirène
Em. Prof. Maurice Berger, M.D., Ph.D., Child psychiatrist
SWITZERLAND
Daniel Halpérin, M.D., Pediatrician
SOUTH AFRICA
Prof. Reitze Rodseth, Ph.D., University of Kwazulu-Natal
Janet Giddy, M.B.Ch.B., M.P.H., Family physician and public-health expert
Allan Donkin, M.B.Ch.B., Family physician
UNITED STATES
Clin. Prof. Stephen B. Levine, M.D., Case Western Reserve University
Clin. Prof. William Malone, M.D., Idaho College of Osteopathic Medicine; Director, Society for Evidence Based Gender Medicine
Prof. Patrick K. Hunter, M.D., Florida State University; Pediatrician and bioethicist
Transgenderism has been highly politicized—on both sides. There are those who will justify any hormonal-replacement intervention for any young person who may have been identified as possibly having gender dysphoria. This is dangerous, as probably only a minority of those so identified truly qualify for this diagnosis. On the other hand, there are those who wouldn’t accept any hormonal intervention, regardless of the specifics of the individual patients.
Endocrinologists aren’t psychiatrists. We aren’t the ones who can identify gender-dysphoric individuals. The point isn’t to open the floodgates and offer an often-irreversible treatment to all people who may have issues with their sexuality, but to determine who would truly benefit from it.
Jesus L. Penabad, M.D. Tarpon Springs, Fla.
[ Via: https://archive.today/IRShy ]
==
The gender lobotomists just got called out.
#Leor Sapir#Colin Wright#Endocrine Society#gender lobotomy#genderwang#gender ideology#queer theory#sex trait modification#ideological capture#medical malpractice#medical transition#medical scandal#gender affirming care#affirmation model#gender affirming#ideological corruption#religion is a mental illness
288 notes
Text
Re-reading The Fellowship of the Ring for the First Time in Fifteen Years
Hi, Hello, Welcome! The conceit of these posts is pretty self-explanatory. I read the Lord of the Rings for the first time at age 17, in the middle of my parents' divorce (it was messy, we're not going into any details). Needless to say, I remember pretty much nothing about that read, and I would like to give the books a fair shake of a re-read. That's what this is, and there will be spoilers throughout!
I usually do full-book reviews, but if ever I was going to do a chapter-by-chapter re-read, it would be for LotR. The rules are that I'm going in as blind as I possibly can (I have watched the movies and have absorbed like...a reasonable amount of lore from existing on the internet as a millennial) and I'm not doing any research beyond like, defining words for myself as I read. So here we go, and I hope you enjoy rereading with me! Let's talk "The Shadow of the Past."
Good LORD JRR Tolkien can lore dump when he wants to. This chapter was mainly lore dump, which is fine because it was at least interesting lore dump. I'm not a lore girly though, I'm a character girly, so let's go with "we got the One Ring's backstory, now let me talk about other characters because the Ring isn't one just yet."
This is going to sound initially harsh, but it is said with affection: Gandalf is 1000% the pedantic asshole professor who is way too into the Socratic method who you absolutely detest in undergrad but somehow his classes still end up sticking with you more than any other. You then get to understand this prof better as a master's student, and deeply love this prof as a PhD. That's literally the vibe I'm getting from his lecture to Frodo about finding some goddamn pity and compassion for the tragedy that is Smeagol and Gollum. Because it is VERY easy to judge and be critical in the abstract, which Frodo very much is, having never encountered Gollum, and Gandalf has spent time and effort tracking down Gollum with way more background knowledge with which to contextualize the layers of tragedy that Gollum personifies and affects. It's a big ask, to get people to abstract compassion (and do not come in here and argue with me about this, I live in 20-goddam-24, I know what I'm talking about), but Gandalf kind of doesn't let it go with Frodo until Frodo at least softens his position and is open to, if not at, compassion. I've been a student and I've been a teacher, and these conversations are hard from both directions, so kudos to Gandalf for sticking with it, and to Frodo for getting to a place where he was truly listening.
Especially after Gandalf just CASUALLY DROPS that Gollum literally ATE BABIES. I'm not even kidding, he just casually, in the midst of an infodump on Gollum's time tracking Bilbo after losing the Ring, says,
The woodsmen said that there was some new terror abroad, a ghost that drank blood. It climbed trees to find nests, it crept into holes to find young, it slipped through windows to find cradles.
AND THEN WE JUST CASUALLY MOVE ON LIKE BABY EATING ISN'T SOMETHING WE NEED TO ADDRESS HERE. I would like to address the baby eating, Gandalf!!!
Despite not addressing the baby eating though, there was some interesting new information in the Gollum infodump that I understand why it got cut from the movies, but I was low-key fascinated. Smeagol was specifically noted to be interested in roots. Gandalf framed that like literal tree and mountain roots, but this is Tolkien we're talking about. Roots have a metric ton of metaphorical meanings too, and the fact that Smeagol was interested in the origins of things, in where they came from, in what made them as they are, is both deeply ironic and deeply interesting. I kind of hope we do more with that, since becoming Gollum is like ouroborosing roots; Smeagol's interest in Gollum is deeply self-reflexive, which might also be how we end up with that bifurcated personality thing. I dunno, but that would be really cool to follow up on.
I also deeply appreciated Frodo's "WHAT THE ACTUAL FUCK" reaction to realizing that Gandalf had let him keep the One Ring for so long. Notably, Gandalf kind of doesn't explicitly apologize for putting Frodo at risk, but he does acknowledge that yes, yes he made a choice, took a risk, and put Frodo in some level of danger. I suppose we'll take it, even as we acknowledge that yes, Gandalf was working with imperfect, incomplete information. We do the best we can with what we know at the time, or something. And if it took 20-odd years to figure all of this out (which makes sense for the kind of field and archival work required here), then y'know what, better late than never.
That said, Gandalf also kind of...LIGHTLY SKATES OVER the fact that even just possessing the Ring and doing nothing with it for 20 years has affected Frodo. He's not aging. He can't cast it away. He's already caught. Right at the beginning, in CHAPTER TWO of this massive trilogy, it's not a matter of preventing Frodo from being caught by the ring. It's a matter of how long Frodo can resist. He was doomed before anyone knew, concretely, that there was a problem. And jaysus, if that isn't how you tee up a tragedy, I don't even know how you do that. Maybe there wasn't a good reason for Gandalf to say that to Frodo, maybe it would have hurt more than it helped, but I do kind of think PERHAPS YOU MIGHT POINT THIS OUT???
I get the sense that I'm going to be very back-and-forth on book Gandalf...this is going to be an interesting thing to watch develop as I keep reading.
In addition to Gandalf's "Backstory Via The Socratic Method 101" course, we also get some additional Samwise Gamgee in this chapter. Saying "I adore this hobbit and he should be protected at all costs" is not new or even interesting, so let's take a different tack. In the films, Sam's excitement for going to see the elves is...ungrounded. It's a thing about him that we just accept. I deeply relate to and adore the sense we get of why and how the elves thing comes about in the book:
He believed he had once seen an Elf in the woods, and still hoped to see more one day. Of all the legends that he had heard in his early years such fragments of tales and half-remembered stories about the Elves as the hobbits knew, had always moved him most deeply.
This might seem ungrounded, but it's deeply aware of how stories work. Sam knows that the hobbits don't have the extent of Elven lore that exists, but he knows that there is a magic and a power in even the fragments they have, and that captured his imagination to such an extent that a yearning to see, to understand, to know that magic, was born in his heart. That grounds Sam in stories just as much as Frodo is grounded in stories, and more than that, Sam WANTS the magic to be real in a way that I don't actually think Frodo, primed on all the tragedy by Gandalf, does. Frodo is "I wish it need not have happened in my time," but Sam is "Me go and see the Elves and all."
That "and all" at the end is particularly poignant, because if Sam knows some of the stories of the elves, I have to imagine a few tragic tales survived along with the magical ones, so Sam isn't going starry-eyed into this as a bumblefuck gardener from nowhere. There's an acceptance there of the magic that encompasses all that magic offers, both good and bad. Yeah, I'm probably over-reading into this, but I support it at least a little with the fact that at the beginning of the chapter, we're with Sam when the hobbits down the pub are talking about strange beings and creatures and *foreshadowing the ents*. Sam knows that the stories tell of more than just elves, but for him, that wonder is enough to warrant everything else. No, I am not taking criticism (constructive or otherwise) at this time.
Other than a wee shoutout to the legendary "Mad Baggins"--and let's be real, if history must become myth and myth must become legend, I want Mad Baggins to stay alive and not be forgotten--that's about all I have for this chapter. Professor Gandalf shows up to school Frodo and kick his ass out the door, and Sam gets to go see the elves. We'll pick up again next time with chapter 3.
#the fellowship of the ring#the lord of the rings#chapter 2#shadows of the past#reread#HOW DID WE JUST SKATE OVER GOLLUM EATING BABIES????
8 notes
Text
Level 5 is interesting cause like, they did a Konami but good. Came out swinging with *multiple* hits. Inazuma Eleven boom, Professor Layton boom, Ni No Kuni boom, YoKai Watch BOOM, and a bunch of other smaller but generally liked games (Fantasy Life, Megaton Musashi, Snack World), and then they…dipped for a while during the generation change. And that dip *feels* weird, right? I looked at the reviews of the games! People still think the games feel pretty damn good, yet there hasn't been a new YoKai Watch since 2019. Prof Layton is coming back when it feels like it should've been a bi-yearly thing. Is it that they really struggled with higher development costs versus developing just for the DS?
Even if that's only part of the reason, it kinda proves to me how unique the DS was as a popular option to develop games for tbh. Before the DS, at least on Nintendo handheld consoles, a lot of games were Lesser Ports™ or Tetris. The DS had this combination of power, gimmick, install base, and low dev cost with which companies like Level 5 and others could just afford to experiment. I think that's the platonic ideal of the gaming handheld: smaller than the Deck, with *just* enough power for a game that feels "full" but also a central and strong enough gimmick to make *not* taking advantage of it a little sad.
2 notes
Text
MY REVIEW OF PSYCHO PASS PROVIDENCE - part 2
SOME ISSUES (and THOUGHTS) THAT I NEED TO ADDRESS:
I still don't get why Prof. Saiga and Prof. Stronskaya wrote the paper. What were they aiming for with it? Did I miss something? And the conflict coefficient is too much like the crime coefficient. This is a personal opinion, but honestly the conflict coefficient is much more reasonable than Sybil. At least it doesn't judge people based on how cloudy they are.
I like Saiga and Kougami's relationship, and for once we get to see Kougami use keigo.
No hate, but idk, some people debate over Kougami calling Frederica by her name. Akane also calls her that, except she always uses -san, like she does with Shion-san. Kougami also addresses Shion by her name and nothing happens. He's just comfortable with both of them, I think. (idk how close they were, need the novel)
Why does Azusawa never appear in this movie? I thought that since we don't learn anything about him in PP3, we'd somehow get an explanation in PPP, especially with Shindou. Well, I despise PP3 anyway, pls don't ask me why. I appreciate the PP staff's work tho.
Sugo is cute, a loyal hound to his inspector. He makes quick calls and reports anything suspicious to Akane.
I find it funny that Kougami acts childish in these scenes:
He clearly keeps his distance from Akane and Division 1 after his late-night call with Akane, to the point that even Saiga asked him about it. He only dares to look at her face again after they both nearly got blown up by C4, while Akane, from the start, acts as if nothing happened. Nice writing, I guess.
Up until PPP, so many characters have died, and their deaths carry almost no meaning and add no progress to the story (even in Saiga's case). PP is a dystopian anime, I get it, the story is dark and tragic, I get it, but why show so many deaths after S1? No, they're not meaningless, they do create situations or drama (for example, Sugo killing Aoyanagi), but not with the audience. They don't make us feel fully related or attached to the character, and then suddenly they're gone (like the entire Division 3 in PP2). This is just a personal opinion, but Saiga's death is kind of a shame; they shouldn't have killed him as mere bait for the Peacebreaker, it's too bad. There was so much potential for his character in a future instalment, if they ever make one :'(
Akane running to the fallen Saiga kinda threw me off. Why not help Kougami or Frederica first? This is out of character. The old Akane would never.
I like that Akane bursts out in front of Kougami in the elevator scene. When it's just the two of them, she actually lowers her walls, because in my case, I never want to cry in public or in front of strangers. Well, she did cry near the fountain holo before, but you get the point.
Kougami's sympathy for Akira.
Akira is precious, that's it. Everyone who watches it already knows why.
Just for a brief second, I can see Akane's eyes shaking when she hears Kougami's tone on the call. She must be aware he's in pain.
In the scene where Tonami had possessed Akira, Akane could have done so much more, or at least TRIED something. Where's the badass Akane we always adore?
Akane always has puppy eyes for one man and one man only. Guess who?
Those puppy eyes can kill me.
Also, I find this bit of the script doesn't add up. In the hospital, Akane is first relieved that Kougami is unhurt, and then he suddenly says 'my mistake'? Why is he suddenly apologizing, lmao.
Sybil is a bitch. After Yabuki's and Shindo's tasks were done, they told Shindo to just die. Sigh.
I HATE that Tonami has a low CC. In S1, we only learned that people with low CCs can't even commit crimes unless they're asymptomatic, and suddenly this new villain, who is basically a terrorist, has a clear hue just because he feels sympathy for the downtrodden and wants fairness for them (cmiiw). He kills people, for God's sake! This shows inconsistency in the writing and will create plot holes in the future. As for Akane's case, I tolerate it, probably because Kasei isn't actually human.
Akane is so strong: she's been shot twice by Tonami, yet she can speak clearly and even sit up. Meanwhile, we can hear a waver in Kougami's voice on that call (and he only took one bullet, in the thigh).
I find this interesting.
Akane is NOT aware that Tonami is about to shoot her. If she had known, would she see Kougami's sin differently? After much thought, I think it still wouldn't change a thing, because she wants to arrest Tonami. But dude, Kougami shoots because he can't lose her; if he'd shot a leg or a hand or whatever, anything could have happened and Akane could have died. If I were him, I'd shoot him in the head without hesitation too. I wonder if she understands that her life was at stake.
Who transferred Sugino to SAD? Akane?
Imagine Kougami's feelings when he can't do anything to stop Akane. Wow.
Akane's decision to do that is actually admirable; she honours Shindo's and Akira's deaths.
This is the end of my chattering. I hope I don't come across as a pathetic shipper. I love Psycho-Pass as an anime.
#psycho pass#mypsychopassthought#shinkane#kouaka#kougami shinya#akane tsunemori#ginoza nobuchika#ppp#psycho pass providence
14 notes
Note
surprise! your OCs are required to attend Professor Barbara Allen's forensics lectures for a semester! what's the ratemyprofessor review they give her and how do they do?
oooh bestie you have NO idea what's coming, I literally WAS a forensics major before I switched to trade school :D
some long answers here, so they'll be under the cut. thank you so much for the ask!
Ophelia: 5 stars. Loves the lecture, takes exceptionally detailed notes and ends up with curve-breaking high grades. Her only problem would be missing class due to late night hero work making her exhausted in the morning, but... well, she feels like Professor Allen's very familiar with that.
Jasper: 4.4 stars. They're a nursing student, so they don't mind gruesome pictures or high-intensity anatomy discussions, but they feel like Professor Allen just goes a bit too fast in the lectures (lol). They'd pull good grades, and probably end up with a bit of unintentional extra credit when Professor Allen lands in the ER with some unexplained cuts and bruises and Jasper's the one to stitch her up and send her on her way.
Kestrel: No rating. Probably didn't even show up to the first lecture. To be fair, pretty much all of their higher education has come from exploring and adventuring in the wilderness. They probably weren't even aware they'd been signed up for the lecture. (but don't take it personally, if they were there, they'd be sure to pay attention)
Rae: Originally rated 3.5 stars, but bumped it up to 4.5 after Prof. Allen was kind and understanding in offering her an extension. She did well enough in college the first time around (Masters in Foreign Language), but she also struggles with severe insomnia and some anxiety issues. At first, she probably saw Prof. Allen as somewhat distant, but once she actually went up and talked to her and explained everything, she ended up with a renewed respect and finished the rest of the semester very well.
Robin: Hm... 3 stars. Not a bad class, but she's much more music-inclined than science-inclined, and having an ASL interpreter would've made the class a hell of a lot easier for her (that's not Professor Allen's fault, of course, but still wasn't the best experience for Robin). She'd probably end the semester decidedly neutral - not her favorite experience, but would also understand that it's probably a very enjoyable class for someone who enjoys the material a little more.
Madison: 4.5 stars. She likes psychology a lot, so she'd probably really enjoy a forensics class, and I feel like she'd like Professor Allen as an instructor. And I feel like she'd be able to keep up with the lectures pretty well! She's a quick thinker. If anything, she'd be a little irritated by the amount of studying she needs to do, but that's the only thing keeping her from a full 5 stars.
Quinn: It really depends. Either she'd get fed up with all the stifling structure and order of the class and end up dropping out (and rating Prof. Allen pretty low), or she'd actually find the material interesting and would end up at the top of the class. It's not really a matter of capability or intelligence, she's got plenty of both, but she'll only bother with the class if she enjoys the subject being taught. So... it's a coin flip.
Katherine: She wouldn't enjoy the class, but she'd rate Professor Allen highly anyway (4 or 5 stars). After all, she's a good professor, Katherine just isn't really a fan of that sort of subject. She's plenty smart, but she's much more artistic-minded than forensics would imply. The only reason she'd take the class to begin with is for the promise of forensic sketch artistry, but even that's only a week and a half out of the full semester.
#my friends!!!#answered asks#ask game#my ocs#ophelia octavius#jasper wilson#oc kestrel#rae mckinney#robin cassidy#madison douglas#oc quinn/aces#oc katherine johnson
3 notes
Text
Week 3, Mon 28/8, Day 1/5
0149:
After tutoring, I went straight home since S was sick. Fell asleep on the couch right after dinner. We didn't get much done, but that's ok; what's more important now is managing our energy levels to avoid burnout.
Later in the day, we have our violin lesson, a trip to the salon, and a presentation meeting, which means we need to get as much done as possible before noon and squeeze in some practice time at a more reasonable hour.
1315-1415: violin lesson
1530-1700: salon
1900-2000: ethno meeting
Goals:
- complete preparation for ethno (tues), multiculturalism (wed), literature (wed)
- review violin lesson
- groceries and household needs
- budget tracking
1248:
Since preparations for ethno class will be completed during the ethno meeting, I'm left with multiculturalism and literature prep. Multiculturalism takes more cognitive energy and requires a transcript of the lecture and highlighted points from the reading. All of this will let me complete my week 3 notes for multiculturalism before I attempt the tutorial questions. As for the literature readings, the prof has not been using the assigned reading materials for the past 2 weeks, so that reading will be a low priority. From 1415-1700, I can use travel time and time at the salon to review my violin lesson, come up with a practice plan, and then work on my multiculturalism readings. If I take a direct bus home from the salon, I should be able to beat the rush hour by a margin and reach home by 1735. I can then have dinner and, from 6pm-7pm, practice the violin. If everything goes well and the ethno meeting ends at 8pm, I can do my groceries and household shopping and be back by 9pm. The remaining time can be used to complete the prep for multiculturalism.
1 note
Text
Happy back-to-school y’all
I've attended and worked at a couple of super liberal universities. I avoid the gender studies departments for obvious reasons, and I still had a lecture in which the female prof gave a brief overview of TERFs and proclaimed her hatred of JKR. Being openly critical of gender ideology, the porn industry, kinks, and 'sex work' is the kind of thing that can ruin your future in academia. Not to mention the fact that any speech or actions that could be labelled transphobic (ie. defining woman as adult human female) can get you a suspension under many universities' anti-hate-speech policies.
So, here’s a list of small and smallish (small in terms of overt TERFery, some may require more effort than others) radical feminist actions you can take as a university student:
(this is a liberal arts perspective so if you’re a stem gal this may not apply. but also if you’re in stem maybe you can actually acknowledge that women are oppressed as a sex class without getting kicked out of school. idk)
(Note for TRAs hate reading this: One of the core actions of radical feminism is creating female networks. This is not so that we can brainwash people into being anti-trans. This is because female solidarity is necessary for creating class consciousness and overturning patriarchy. It is harder to subjugate the female sex when we stand together.)
Take classes with female profs. Multiple sections of a class? Pick the one taught by a woman. Have to choose an elective? Only look at electives offered by women. When classes have low numbers they get cancelled. When classes are super popular, universities are forced to consider promoting the faculty that teach them
Make relationships with these female profs. Go to office hours. Chat after class. Ask them about their research. Building female networks is sooooo important!
Actually fill in your end of year course feedback forms. Profs often need these when applying for tenure or applying for a job at another university so it is very important (especially with young and/or new profs) that you fill out these forms and give specific examples of how great these women are. Go off about what you love about them! Give her a brilliant review because you know the idiot boy in that class who won’t shut up even though he knows nothing is going to give her only negative feedback because he thinks any woman who leaves the house is a feminazi b*tch.
(note: obviously don’t go praising any prof - female or male - who is blatantly racist, homophobic, etc.)
(Also if you have shitty male profs write down all the horrible things they have done and said and put it in these forms because once a shitty man gets tenure they are virtually untouchable)
(also also, leave a good review on rate my profs or whatever other thing students use to figure out if they want to take classes. idc if you copy paste your feedback from the formal review. rave about the class to your friends. do what you can to get good enrolment for that prof for reasons above.)
Participate in class. Talk over the male students. Say what you mean and mean it. Call out the boys when they say dumb shit
Write about women. If you have the option to make a text written by a woman your primary text in an essay, do it. Pick the female-centred option if you’re writing an exam-essay with multiple prompts. (Profs often look at what works on their syllabus are being written about/engaged with as a marker of whether to keep those texts the next time they teach the class. If there are badass women on your syllabus, write about them to keep them on the syllabus) Use female-written secondary sources whenever possible.
(pro tip: many women in academia are more than happy to talk to you about their papers. expand your female networks by reaching out to article authors through email and asking them about their cool shit)
Get your essays published! Many departments have undergrad journals you can publish in. This will ensure more people read about the women you write about and will demonstrate to the department that people like learning about women
Consider trying to publish your undergrad essay with a legit peer-reviewed journal. If you can do it, your use of female-written secondary sources boosts the reputations of the women who wrote those secondary sources. Also this helps generally to increase scholarship about women’s writing!
Present your papers at conferences! Many schools have their own undergraduate/departmental conferences that you can present at. Push yourself by submitting to outside conferences. Bring attention to women’s works by presenting your papers. Take a space at a conference that would otherwise be reserved for mediocre men
Talk to your profs and/or your department and/or your university about mandating the inclusion of female works in classes if this isn’t something they do already
Sit next to other women in your classes. Talk to them. Make friends. Form study groups. Proofread each other’s essays. Give each other knowing looks when the boys are being dumb. Just interact with other women! Build those female networks!
Be generous with your compliments. A female classmate and I were talking to a prof after class and the classmate told me (out of the blue) that I always have such interesting things to say. I think about that whenever I’m lacking confidence about my academic skills. Compliment the women in your classes for speaking up, for sharing their opinions, for challenging your classmates/profs, for doing cool presentations, etc.
Talk to other women about sexist things going on on campus. Make everyone aware of the sexist profs. Complain about how there are many more tenured men than tenured women. Go on rate my professor and be explicit about how the sexist profs are sexist
Be active on campus and in societies. If a society has an all male executive or is male-dominated, any women who join that society make it less intimidating for more women to join. Run for executive positions! Bring in more women!
(Pro tip: Many societies’ elections are super gameable. You can be eligible to vote in a society election sometimes just by being a student at that university — even without having done anything with the society before. Other societies might just require that you’ve taken a class in a particular department or attended a society event. (Check the society’s governing documents.) Use those female networks you’ve been building. If you can bring three or four random people to vote for you, that might be enough for you to win. Societies have trouble meeting quorum (the minimum number of people in attendance to do votes) so it is really super achievable to rig an election with a few friends. And don’t feel bad about this. The system is rigged against women so you have every right to exploit loopholes!)
(Also feel free to go vote “non-confidence”/“re-open election” if only shitty men are running. Too often people see that only candidates they don’t like are running and so they give up. But you can actually stop them getting elected)
Your campus may have a LGBTQIA+alphabetsoup society. That society definitely needs more L and B women representation. It may be tedious to argue with the nb straight dudes who insist that it’s fine to use “q***r” in the society’s posters and that attraction has nothing to do with genitals, but just imagine what could happen if we could make these sorts of societies actually safe spaces for same-sex attracted women and advocated for our concerns
Attend random societies’ election meetings. Get women elected and peace out. (or actually get involved but I’m trying to emphasize the lowest commitment option with this one)
Write for the campus newspaper. Write about what women are doing - women’s sports, cool society activities, whatever. Review female movies, books, tv shows, local theatre productions. Write about sexism on campus. We need more female by-lines and more stories about women
Get involved with your campus's sexual assault & r*pe hotline/sexual assault survivors' centre/whatever similar organization your campus has if you can. This is hard work and definitely not for everyone (pls take care of yourself first, especially if you are a survivor)
(If your campus doesn't have an organization for supporting survivors of sexualized violence, start one! This is probably going to be a lot of hard work though, so don't do it alone)
Talk to your student council about providing free menstrual hygiene products on campus if your campus doesn’t already do this. If your campus provides free condoms (which they probs do), use that as leverage (ie. ‘sex is optional, menstruation is not. so why do we have free condoms and no free pads?’)
If you're an older student, get involved with younger students (orientation week and such activities are good for this). Show the freshmen that you can be a successful and well-liked woman without shaving your legs, wearing heels, wearing make-up, etc. Mentor these young women. Offer to go for coffee or proofread essays.
Come to class looking like a human being. Be visibly make-up less, unshaven, unfeminine, etc. to show off the many different ways of being a woman
Talk to the custodial staff and learn their names. (I know there are men who work in this profession, but it is dominated by low-income women) Say hi in the hallways, ask them about their lives, show them they’re appreciated
Be explicit with your language. When you are talking about sex-based oppression, say it. Don’t say ‘sex worker’ when you mean survivor of human trafficking. This tip is obviously a bit tricky in terms of overt TERFyness, so use your best judgement
That’s all from me for now! Feel free to add your suggestions and remember that feminism is about action
829 notes
Text
The great glacial meltdown
A titanic piece of Greenland's ice cap, estimated at 110 square kilometers, has split off and started to float away towards the far northeastern Arctic, flagging the grave risks that are bound to follow now that the glacial disintegration has begun. The section that broke away is at the end of the Northeast Greenland Ice Stream. It covers 42.3 square miles (110 square kilometers), many times the size of Central Park in New York. This expanse of ice split away from a fjord called Nioghalvfjerdsfjorden, which is roughly 50 miles (80 kilometers) long and 12 miles (20 kilometers) wide, as reported by the National Geological Survey of Denmark and Greenland. Even though it sits in one of the coldest parts of the world's atmosphere, this region has recorded an increase of a full 3 degrees Celsius since 1980, according to Dr. Jenny Turton, a polar researcher at Friedrich-Alexander University in Germany. What's more, the European continent recorded its highest temperatures ever during the summers of 2019 and 2020.
The previous few months have seen a heap of headlines about glacial melting – especially in Greenland – and ice-sheet collapse. According to a report published in the journal Nature Communications Earth and Environment, Greenland's ice sheet has shrunk so much that even if global warming were to stop right now, the ice sheet would keep shrinking. The same publication, citing satellite data, reported that the Greenland ice sheet lost a record amount of ice in 2019, equivalent to a million tons every minute across the year. Another paper, published in The Cryosphere, found that this staggering ice loss wasn't caused by warm temperatures alone but also by unusual, non-seasonal atmospheric circulation patterns, which contributed enormously to how quickly the ice sheet shed its mass. Because the climate models that project the future melting of the Greenland ice sheet don't account for these shifting atmospheric patterns, there is a real possibility that they have underestimated future melting by as much as half.
According to a report published in September 2020, the last fully intact ice shelf in the Canadian Arctic – the Milne Ice Shelf, which is bigger than Manhattan – collapsed, shedding more than 40% of its area in just two days at the end of July. That alarmed scientists, who warned that a large piece of a Mont Blanc glacier – about the size of Milan's cathedral – was at risk of collapse, and residents of Italy's Aosta valley were ordered to evacuate their homes. The worst was still to come. A British Antarctic Survey team, together with a group from the USA, mapped cavities measuring half the size of the Grand Canyon that are allowing warm sea water to erode the immense Thwaites glacier in the Antarctic, speeding up the rise of sea levels across the world. According to the International Thwaites Glacier Collaboration, the glacier is bigger than England, Wales, and Northern Ireland put together, and if it were to collapse completely, global sea levels would rise by 65 cm (25 in). This isn't the end of the story. Nature designed glaciers to act as a bridge, or a buffer, between the warming ocean and other ice sheets. A collapse is certain to bring neighbouring ice sheets in western Antarctica down along with it. That opens the door to a cataclysmic scenario in which sea levels rise by a stunning 10 feet, forever sinking low-lying coastal regions including parts of Miami, New York City, and the Netherlands – a recipe for implosion.
Global warming, as the name itself conveys, marches ahead unabated. While the Paris declaration on climate change pledged to limit global warming to 1.5℃ during this century, a report by the World Meteorological Organization cautions that this limit could be breached as early as 2024. According to Prof Anders Levermann from the Potsdam Institute for Climate Impact Research in Germany, it would be prudent to anticipate a rise in sea level of more than five meters, even if the targets set in Paris are achieved in full. Hence it is the obligation of every person to take responsibility for their own actions and to do everything possible under the sun, rather than expecting others to act. Each country has published standard guidelines to be adhered to, which, if followed, will lower the chance of disaster striking us early. The potentially dangerous Thwaites glacier is bigger than England, Wales and Northern Ireland put together, and if the inevitable happens, there is a high likelihood of much of England and Wales being swallowed by the Atlantic.
In August 2020, the last fully intact ice shelf in the Canadian Arctic – the Milne Ice Shelf, which is bigger than Manhattan – collapsed, losing in excess of 40% of its area in just two days toward the end of July. Scientists warned that an enormous piece of a Mont Blanc glacier – about the size of Milan's cathedral – was at risk of collapse, and inhabitants of Italy's Aosta valley were told to evacuate their homes. Further adding to the anguish, a British-American Antarctic survey team mapped cavities measuring half the size of the Grand Canyon that are allowing warm sea water to dissolve the tremendous Thwaites glacier in the Antarctic, speeding up the rise of sea levels across the world. A report by the International Thwaites Glacier Collaboration has cautioned that the glacier is bigger than England, Wales and Northern Ireland put together, and that if it were to collapse completely, global sea levels would rise by 65 cm (25 in).
There is no sign that sea levels won't rise further. The glacier acts as a saviour of sorts, serving as a buffer between the warming ocean and other glaciers. Its impending collapse has the power to drag neighbouring ice sheets in western Antarctica down with it. The most dire outcome imaginable would see sea levels rise by nearly 10 feet, permanently submerging some low-lying coastal regions, including parts of Miami, New York City, and the Netherlands. Land once viewed as resilient would meet the fate of the Titanic, and it is ironic that it would be laid to rest on the same floor. Global warming is now a truly worldwide phenomenon, proceeding unabated. The Paris declaration aims to restrict global warming to 1.5℃ by the end of this century; worryingly, however, a report by the World Meteorological Organization cautions this limit may be exceeded as early as 2024. According to Prof Anders Levermann from the Potsdam Institute for Climate Impact Research in Germany, there is a high prospect of sea levels rising more than five meters, even if the goals of the Paris Climate Agreement are met. Over the years, each government has come to understand the degree of destruction that climate change would inflict on its country and is taking every conceivable measure against even the slightest risk afflicting the nation. The aggregate exer
1 note
Text
PSA FOR THE BESTIES STARTING OUT COLLEGE/UNIVERSITY:
IF THE PROF HAS LOW REVIEWS, IT'S FOR A REASON. JUST DON'T TAKE THE CLASS IF THE PROF HAS LOW REVIEWS
2 notes
Text
Low Effort in Their Own Way
"All happy families are alike; each unhappy family is unhappy in its own way." - Leo Tolstoy, "Anna Karenina"
I've been watching a fair amount of D&D content on YouTube of late, for varying reasons, and if I may paraphrase Tolstoy's famous quote above, I've learned that all good D&D channels make high-effort content, while each bad D&D channel makes low-effort content in its own way.
Low-effort content tends to be:
Content that is or can be created quickly; it doesn't require a lot of prep time (and the presentation usually allows this limited prep time to show)
Content that copies current trends; while a certain amount of response to significant events in the gaming world is to be expected, low-effort channels regularly feature content that basically boils down to 'here's my reaction to whatever rumor or scandal is currently being talked about among the community'
Content that does not spark or contribute to a discussion; when such channels go beyond simply recapitulating a recent event, they frequently spend very little time explaining their own reaction and seldom spend any time at all explaining or exploring contrary opinions except to make jokes or elicit emotional reactions from an over-simplified or straw-man version of the contrary opinion
Now let's start off by saying that I'm not knocking low-effort content per se; anybody who knows anything about online marketing can tell you that low-effort content has a role to play in any marketing strategy. Ideally, though, your low-effort content, the stuff that you can get out the door quickly and easily and get in front of your potential customers, exists to guide those customers to your higher-quality content that convinces them to buy your product, order your service, or otherwise become someone who believes that you have something of value to say. Because it's cheap and easy to produce, low-effort content can be cast far and wide to serve as a net to capture many potential viewers and guide them to the gold mine of the really important stuff you have to say. Unfortunately, when your low-effort content is what you have to say, it very much begs the question of what exactly it is people should be coming to your channel for.
Here are a few of the YouTube channels -- by no means an exhaustive list -- that to me seem to feature way too much low-effort content.
The Dungeon Dudes
The Dungeon Dudes are two guys (Kelly McLaughlin and Monty Martin) who mainly do scripted back-and-forth style discussions of D&D-related topics. I've talked about the Dungeon Dudes before, when taking apart one of their recent videos, but they also stream a D&D game they play in on Twitch (and frequently post recordings of those sessions on their channel), do product reviews, and generally do whatever they can to maintain a consistent pace of content output, generally a minimum of twice weekly. They've been around for nearly four years now, and have amassed about 273 thousand subscribers on their channel, with over 44 million views for their content, which seem like decent numbers for a niche content channel. (Contrast that with CinemaSins, which exists as a viral content manufacturer, and has amassed over 9 million subscribers and over 3.3 billion views. I'm not trying to say the Dungeon Dudes are the CinemaSins of D&D; if they were, their numbers would probably look a lot more like those of CinemaSins.)
The big problem with the Dudes as content creators is that, despite running a niche content channel, they are clearly in it to try to eke out some kind of income or living from the work they put into the channel: they've got a Patreon, they use affiliate links in the descriptions of their product review videos to gain some additional referrer income, and they do sponsored content when they can get a sponsor. They started back in the summer of 2017 with a very 2016-era plan on how to succeed at YouTube: put together a bunch of short (5-10 minutes, occasionally longer, but go over 15 minutes at your peril) videos and release them on an iron-clad schedule to get people used to coming back to your channel and looking over your new content, and to their credit, they've kept up their content production schedule very consistently over the past four years.
They've also learned a few things during that time and have adapted the channel in response: their videos explaining rules and reviewing new products tend to be more popular, so they work those topics in on a more regular basis. They've learned that the YouTube algorithm has subtly changed over the past few years to reward channels that can provide longer 'engagement' (which gives YouTube more opportunities to run ads), and have expanded their video length to an average of about a half-hour, with their re-broadcasts from Twitch being extra-long videos (between two and two-and-a-half hours) which, while drawing fewer total views, probably draw as much or more 'engagement' from the algorithm for the views they have.
But the need to spit out so much content on such a rigid, unforgiving schedule means that they have to aim for quick-creation and easy digestion: putting subclasses into a bog-standard tier ranking, making 'top five' and 'top ten' lists that seem like they're being cribbed from a more thoughtful resource, and generally getting stuff out the door (like their 'Powerful Spell Combos Using Teamwork' video) without spending too much time thinking about how valuable or even accurate their advice happens to be. More to the point, it seems to be taking its toll on the guys who serve as the hosts of the show: Kelly McLaughlin has a fairly dour expression in general, but lately he seems to have the countenance of a man who's about to post a 'very special episode' discussing the dangers of YouTuber burnout.
The Dungeon Dudes feature low-effort content because they have to in order to support the publishing frequency they've chosen; if they were to take the time to put together a truly high-effort piece regarding one of their traditional topics, their Patreon subscribers would likely be asking why their release schedule had slowed down before their work was even half-done.
Dungeon Craft
The Dungeon Craft channel is run by a fellow who refers to himself as 'Professor Dungeon Master'; I have not yet found any reference in his channel or elsewhere that identifies who he actually is, so I'll just refer to him as Prof. Prof has been on YouTube a bit longer than the Dungeon Dudes, having launched his channel in October of 2016, and has put out 185 'episodes' (as of the time of this writing), thus averaging between three and four episodes per month. Prof's own 'trailer' video explicitly states his channel's concept: "Some channels focus on running the game, others on building terrain, others on painting minis. I do it all!" You might think, then, that this would be a place to find quite high-quality content, especially related to terrain and miniatures painting tips, but it seems like the main effect of Prof making his channel be about multiple topics (and there are plenty of topics he discusses that don't fit into any of those three categories above) is that he can't successfully communicate what his channel is actually about, other than about his specific opinions. Maybe that's the reason he's sitting at about 65 thousand subscribers and just under 5 million views.
However, being at a slightly lower 'tier' of content production than the Dungeon Dudes is not itself any kind of crime or even indicative of poor quality -- after all, one of my favorite D&D lore channels on YouTube is RavenloftTravelAgent, and she's got just over a thousand subscribers and only about 50 thousand views on her videos. No, Prof could have a very high-quality, high-content channel with the subscriber numbers and views he has, but he doesn't.
Prof's issue is almost exactly the opposite of that of the Dungeon Dudes: instead of cranking out a rapid-fire, breakneck volume of content to keep up with an arbitrary content production schedule because that's how you make a living producing content for YouTube and you have to keep feeding the hungry algorithm, Prof cranks out content that's very easy for him to write because he's been involved in the game for a long time and already knows that the way he learned to play the game is the best way. Any topic that comes up related to D&D, he's got an opinion and can spit out a script explaining his opinion quickly because it's the same opinion he's held for decades. Classic D&D didn't have skills, so the next edition of D&D shouldn't have them either. Classic D&D had slow advancement, so slow advancement is better than fast advancement. This becomes even more obvious in the videos that have very little or nothing to do with running a D&D game, such as where Prof explains why he thought Avengers: Endgame sucked, or why he thought Season 8 of Game of Thrones was 'nearly perfect'.
Some of the oddest episodes of Dungeon Craft come when Prof makes admissions that make him out to be, well, the D&D channel for 'that kind' of old-school gamer: the ones who can make comments to each other that they can't make in front of their wives or significant others because the latter find the comments sexist, the kind of guys you can complain to about not being able to tell a Polack joke at work, the guys who treated D&D in the 1980s and 1990s the way that guys in the 1950s and 1960s treated golf: as a place where they could build a wall between the world as it existed and the world as they wanted to believe it was (and, if we're being honest, the way that they believed it should actually be). Nowhere is this more evident than in the video where Prof starts by discussing the hot, rich girlfriend he had once who tried but never got into D&D and who he just had to break up with, and which by the 3 minute mark has him "calling bullshit" on the idea that relationships are built on compromise and negotiation. (I mean, you saw this coming, right? Right there at the end of the last paragraph about how the ending of Game of Thrones was so good? You knew that's where this was going, right?)
And, of course, he's not immune to just jumping on the latest bandwagon to contribute his drone to the chorus of voices talking about things just to be talking about things. It shouldn't be surprising that Prof jumped on the bandwagon of the lawsuit brought by Hickman and Weis against Wizards of the Coast over the upcoming Dragonlance trilogy, which turned out to be a nothing-burger. Even weirder is the tag in the description of that video which says "Analysis you can't get anywhere else", even though the video doesn't contain anything that hadn't already been discussed over the three weeks between the lawsuit and Prof's video other than Prof's own opinions about it. My favorite howler that Prof makes in this video is his assertion that, because Hickman and Weis got a lawyer to file a lawsuit, that means there's definitely fire under that smoke, because "big law firms do not accept cases they don't think they can win", which ignores both the existence of SLAPP suits and the existence of authors who seem to take perverse glee in suing rival authors just to drive them out of the industry. He's also posted multiple videos in response to Cody at Taking20's controversial 'illusion of choice' essay, and his response to Ginny Di's essay on making online D&D suck less didn't include any of Ginny's solid advice on making online play more compatible with an in-person mentality (recognizing interruptive behavior, or using text chat to maintain side-conversations that would otherwise not be distracting in person), but instead gave these recommendations to players:
Keep your camera turned on
Mute yourself when not talking
Don't distract yourself with technology during the game
Nothing specific on recognizing how online play differs from tabletop play and suggesting ways to bring those two styles closer together, just commands because he's the DM and he says so. Or, in other words, low-effort, opinion-based content.
Nerd Immersion
Nerd Immersion is a channel by Ted that started in May of 2014 and has amassed over 70 thousand subscribers. Ted opens his "channel trailer" video by leafing through a book, then looking up and saying, "Oh, hello," as if he'd just noticed that there was a camera pointing at him while he's sitting in his orange-trimmed gaming chair. That, sadly, is roughly the level of thought that goes into the actual content contained on this long-tenured but seemingly still super-niche channel.
The weird thing is that at some point, it was obvious that Ted put some real effort into this channel. There are defined sections of the channel that focus on particular things, avoiding the Dungeon Craft problem of 'what topic is our channel about this week?' On Tuesdays, Ted posts a top-10 list. Ted comes up with an idea for a series, like 'Fixing 5E' or 'Reviewing Unearthed Arcana', posts regular articles until he's said what he means to say, then ends the series. (There hasn't been a new Fixing 5E video in roughly a year, meaning that Ted isn't wasting his own time and that of the viewer continually beating horses he's long since killed.) And he comes up with some great ideas for series, such as his series reviewing products on the DMs Guild; that particular series comes out somewhat irregularly, but not so irregularly that you think he may have stopped doing the series without telling you.
Nerd Immersion's big problem can be summed up by simply looking at the list of videos on his channel and noticing, whenever he puts his own face on the thumbnail of a video, the startling frequency with which he's shrugging, making a puzzled face, or just presenting himself as if he's not sure what's happening in his own video. I mean, I get it -- that's his image, the personality he wants to present to his audience. He doesn't have all the answers (a refreshing change from Dungeon Craft, honestly), but has some things to share if you're interested, so go ahead and take a peek. But then you take a look at those different sections we spoke about earlier and see that the 'Fixing' series all have the word Fixing at the top of the screen, the Nerd Immersion logo in the top left, two images underneath the text, one on the right side of the page and one on the left, separated right down the middle, and they all have Fix-It Felix on the far right. The Top 10 videos always have Top 10 at the top of the thumbnail. The Unearthed Arcana reviews all have 'Unearthed Arcana' at the top, then 'Review' in an odd off-set to the right beneath 'Unearthed Arcana'.
In other words, Ted has a formula, and he's damn well going to follow it.
Now it's not a bad thing to have a workflow -- if you're going to be cranking out videos at the volume that Ted does (not to mention the others on this list), you'd better have some kind of process for making the video, getting the thumbnail on it, etc.; otherwise each new video is a horrible nightmare of effort as you re-invent the wheel for every project. Nobody wants to do that, and the results would likely be unwatchable. Having a process is a good thing. But the Dungeon Dudes clearly also have a process -- they've put out at least two videos a week for three and a half years, so they damn well have a process or they wouldn't have been able to get out that much content. Looking at their channel, though, shows you that while they have a brand, and one that's evolving over time to boot, they're not just making the same video over and over again, or at least you wouldn't think that from looking at the thumbnails.
Ted's most interesting videos are where he's interviewing another person or even just having another person in the video, because having another person around clearly takes him at least a bit outside his rigid formulaic comfort zone. The problem is that those videos are few and far between -- the review of the infernal tiefling is about eight months separated from his interview with Celeste Conowitch about her Venture Maidens campaign guide. Also interesting are his unboxing videos, because Ted clearly likes minis and takes some degree of joy in cracking open and looking at new minis. His unboxing videos aren't as irregular as his interview videos, but they are fairly recent, with the first appearing just a few months ago, so it's still not clear if this is going to be a new regular part of the channel, or just another series that goes until he says what he wants to say about minis and then stops.
Most of the stuff on the channel, though, is just, well, stuff, cranked out on a formula and thrown out into the digital void with the same soft-spoken volume regardless of whether it's major news or a press release. As an example, while pretty much everybody had an opinion on the Dragonlance lawsuit, Ted covered when the suit was announced, when it was dismissed by Weis and Hickman, when the actual trilogy that was the subject of the lawsuit was announced, and the official release date of the first book in the new trilogy. When it came time to get ready to announce the newest campaign book, Ted was on the job, posting a video preparing for the announcement, another video later the same day when his original prediction of a Feywild adventure book seemed to be contradicted by other rumors that the book would be a Ravenloft book, then yet another video when the actual book was leaked on Amazon at 11:24pm later that same day confirming Van Richten's Guide to Ravenloft, followed by the video discussing the official announcement of Van Richten's Guide to Ravenloft the next day, and then the day after that a follow-up with more details on Van Richten's Guide to Ravenloft revealed in Dragon+. That's five videos in three days, for a grand total of just over 100 thousand views combined. The intention seems to be that Ted wants to be the CNN of the D&D news scene, but with those kinds of distribution numbers, the result is more like your hometown's shopping circular that occasionally also features stories about the latest project to fix the potholes on Main Street. Just like nobody's doing 24/7 news coverage of your local town council, nobody is doing (or probably should strive to do) 24/7 coverage of the gaming industry and Wizards of the Coast. At some point it just becomes running a script, pressing a button to upload the next video, because it's news, and while you don't have to think about news to quite the same degree you have to think about more opinion-based topics, once you stop thinking about the process and what it is you're making, all you have left is executing the formula, over and over again, and both the input and the output become repetitive.
Repetitive videos, in repetitive formats, with repetitive text, to keep the monster fed for another day. I can admire the effort that goes into it, but the overwhelming presence of the formula involved in cranking out this content keeps me from feeling that it's worth engaging with. It's low-effort, because the effort has been meticulously removed from the process.
I could go on, but I think I'll stop here. There's not really any constructive criticism I could provide to these channels because, as I hope I've pointed out, it seems like low-effort content is pretty much the only thing these channels have to offer or in truth can offer, and anything that might cause their owners to re-consider their channels to improve their content would almost certainly lead to a very different if not wholly different channel. With things being as they are online, there's no guarantee that any new, higher-effort channel would be any more successful than the old low-effort one (remember the RavenloftTravelAgent channel with absolutely minuscule numbers; effort doesn't automatically equate with success). I can't even claim that being low-effort channels necessarily makes these channels bad (despite what I said in the intro); after all, they all have at least some good ideas, especially Nerd Immersion, and they each have subscribers and a following. I guess this is just my way of putting some small amount of effort into explaining why I don't feel like doing more to help these channels succeed, because I'd rather put my support toward channels making higher-quality, higher-effort content, especially because it's not the content itself, but people engaging with that content, that really drives a channel's success.
Link
Those who plan to improve their learning skills must be alert to the volley of false claims that are rife in books and materials devoted to accelerated learning. This short and concise list should help you avoid books or websites that do not stick to the basics of science. In addition to memory myths, you will find, at the bottom, a summary of other myths described extensively in other places on this website.
Remember to remain skeptical. Hone your skepticism and treat this list with skepticism too. Consult reputable sources.
Contents:
Memory myths
Genius and creativity myths
Sleep myths
SuperMemo myths
Language learning myths
Skepticism (links to skeptic websites)
Memory myths
Myth: It is possible to produce everlasting memories. Even reputable researchers use the term permastore (see: Prof. Harry Bahrick). It is a widely-held belief that it is possible to learn things well enough to protect them permanently from forgetting. Fact: It is possible to learn things well enough to make it nearly impossible to forget them in a lifetime. Every long-term memory, depending on its strength, has an expected lifetime. When the memory strength is very high, the expected lifetime may be longer than our own lease on life. However, if we happened to get an extra 200 years to live, no memory built in the present life would remain safe without repetition
Myth: We never forget. Some accelerated-learning programs claim that we never forget what we learn. Knowledge simply gets "misplaced" and the key to good memory is to figure out how to dig it out. Fact: All knowledge is subject to gradual decay. Even your own name is vulnerable. It is only a matter of probability. Strong memories are very unlikely to be forgotten. The probability of forgetting one's name is like the probability of getting hit by an asteroid: possible but not considered on a daily basis
Myth: Memory is infinite. Fact: Anyone with basic computational understanding of memory knows this claim is absurd. However, this is just one of a million living claims that are incongruent with primary school level science. After all, half of Americans still believe the earth was created by God less than 10,000 years ago (apology). We cannot even hope to memorize Encyclopedia Britannica in lifetime. Memories are stored in a finite number of states of finite receptors in finite synapses in a finite volume of the human central nervous system. Even worse, storing information long-term is not easy. Most people will find it hard to go beyond 300,000 facts memorized in a lifetime. For the other extreme of this myth see: Memory overload may cause Alzheimer's
Myth: Mnemonics is a panacea for poor memory. Some memory programs focus 100% on mnemonic techniques. They claim that once you represent knowledge in an appropriate way, it can be memorized in a nearly-permanent way. Fact: Mnemonic techniques dramatically reduce the difficulty of retaining things in memory. However, they still do not produce everlasting memories. Repetition is still needed, even though it can be less frequent. If you compare your learning tools to a car, mnemonics is like a tire. You can go on without it, but it makes for a smooth ride
Myth: The more you repeat the better. Many books tell you to review your materials as often as possible (Repetitio mater studiorum est). Fact: Not only is frequent repetition a waste of your precious time, it may also prevent you from effectively forming strong memories. The fastest way to build long-lasting memories is to review your material at precisely determined moments in time. For long memories with minimum effort use spaced repetition (see SuperMemo)
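To make "precisely determined moments" concrete, here is a minimal Python sketch of the early SM-2 scheduling rule published by the author of SuperMemo. It is only an illustration of how review intervals stretch out after successful recalls; current versions of SuperMemo use more elaborate algorithms, and the grade-4 example run below is my own.

```python
# Minimal sketch of the published SM-2 scheduling rule (grades run 0-5).
# Intervals grow roughly geometrically, which is why reviewing "as often
# as possible" mostly wastes time.

def sm2_update(repetition, interval_days, easiness, grade):
    """Return (repetition, interval_days, easiness) after one review."""
    if grade < 3:
        # Failed recall: restart the repetition cycle, keep the easiness factor.
        return 1, 1, easiness
    # Adjust the easiness factor, never letting it drop below 1.3.
    easiness = max(1.3, easiness + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
    if repetition == 1:
        interval_days = 1
    elif repetition == 2:
        interval_days = 6
    else:
        interval_days = round(interval_days * easiness)
    return repetition + 1, interval_days, easiness

# Example: an item recalled with grade 4 at every review spaces out quickly.
rep, interval, ef = 1, 0, 2.5
for _ in range(6):
    rep, interval, ef = sm2_update(rep, interval, ef, grade=4)
    print(interval)   # 1, 6, 15, 38, 95, 238 days
```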
Myth: You should always use mnemonic techniques. Some enthusiasts of mnemonic techniques claim that you should use them in all situations and for all sorts of knowledge. They claim that learning without mnemonic techniques is always less effective. Fact: Mnemonic techniques also carry some costs. Sometimes it is easier to commit things to memory straight away. The pair of words teacher=instruisto in Esperanto is mnemonic on its own (assuming you know the rules of Esperanto grammar, basic roots and suffixes). Using mnemonic techniques may be overkill in some circumstances. The rule of thumb is: evoke mnemonic techniques only when you detect a problem with remembering a given thing. For example, you will nearly always want to use a peg-system to memorize phone numbers. Best of all, mnemonic tricks should become part of your automatically and subconsciously employed learning arsenal. You will develop it over the long run through massive learning
Myth: We cannot improve memory by training. Infinite memory is a popular optimist's myth. A pessimist's myth is that we cannot improve our memory via training. Even William James in his genius book The Principles of Psychology (1890) wrote with certainty that memory does not change unless for the worse (e.g. as a result of disease). Fact: If considered at a very low synaptic level, memory is indeed quite resilient to improvement. Not only does it seem to change little in the course of life. It is also very similar in its action across the human population. At the very basic level, synapses of a low-IQ individual are as trainable as that of a genius. They are also not much different from those of a mollusk Aplysia or a fly Drosophila. However, there is more to memory and learning than just a single synapse. The main difference between poor students and geniuses is in their skill to represent information for learning. A genius quickly dismembers information and forms simple models that make life easy. Simple models of reality help understand it, process it and remember it. What William James failed to mention is that a week-long course in mnemonic techniques dramatically increases learning skills for many people. Their molecular or synaptic memory may not improve. What improves is their skill to handle knowledge. Consequently, they can remember more and longer. Learning is a self-accelerating and self-amplifying process. As such it often leads to miraculous results.
Myth: Encoding variability theory. Many researchers used to believe that presenting material in longer intervals is effective because of varying contexts in which the same information is presented. Fact: Methodical research indicates that the opposite is true. If you repeat your learning material in exactly the same context, your recall will be easier. Naturally, knowledge acquired in one context may be difficult to recover in another context. For this reason, your learning should focus on producing a very precise memory trace that will be universally recoverable in varying contexts. For example, if you want to learn the word informavore, you should not ask: What can I call John? He eats knowledge for breakfast. This definition is too context-dependent. Even if it is easy to remember, it may later appear useless. Better ask: What do I call a person who devours information? Now, even if you always ask the same question in the same context, you are likely to correctly use the word informavore when it is needed. For more on encoding variability and spacing effect see: Spaced repetition in the practice of learning
Myth: Mind maps are always better than pictures. A picture is worth a thousand words. It is true that we remember pictures far better than words. It is true that mind maps are one of the best pictorial representations of knowledge. Some mnemonists claim that all we learn should be in the form of a picture or even a mind map. Fact: It all depends on the material we learn. One of the greatest advantages of text is its compactness and ease at which we can produce it. To memorize your grandma's birthday, you do not really need her picture. A simple verbal mnemonic will be fast to type and should suffice. In word-pair learning, 80% of your material may be textual and still be as good or even better than pictorials. If you ask about the date of the Battle of Trafalgar, you do not need a picture of Napoleon as an illustration. As long as you recall his face at the sound of his name, you have established all links needed to deduce relevant pieces of knowledge. If you add a picture of the actual battle, you will increase the quality and extent of memorized information, but you will need to invest extra minutes into finding the appropriate illustration. Sometimes a simple text formula is all you need
Myth: Review your material on the first day several times. Many authors suggest repeated drills on the day of the first contact with the new learning material. Others propose microspacing (i.e. using spaced repetition for intervals lasting minutes and hours). These are supposed to consolidate the newly learned knowledge. Fact: A single effective repetition on the first day of learning is all you need. Naturally, it may happen that you cannot recall a piece of information upon a single exposure. In such cases you may need to repeat the drill. It may also happen that you cannot effectively put together related pieces of information and you need some review to build the big picture. However, in the ideal case, on day #1 you should (1) understand and (2) execute a single successful active recall (such as answering the question "When did Pangea start breaking up?"). One exposure should then suffice to begin the process of consolidating the memory trace
Myth: Review your material the next day after a good night's sleep. Many authors believe that sleep consolidates memories and you need to strike while the iron is hot to ensure good recall. In other words, they suggest a good review on the day after the first exposure. Fact: Although sleep is vital for learning and review is vital for remembering, the optimal timing of the first review is usually closer to 3-7 days. This number comes from the calculations that underlie spaced repetition. If we aim to maximize the speed of learning at a steady 95% recall rate, most well-formulated knowledge for a well-trained student will call for the first review in 3-7 days. Some pieces must indeed be reviewed on the next day. Some can wait as long as a month. SuperMemo and other computer programs based on spaced repetition will optimize the length of the first interval before the first review
Myth: Learn new things before sleep. Because of the research showing the importance of sleep in learning, there is a widespread myth claiming that the best time for learning is right before sleep. This is supposed to ensure that newly learned knowledge gets quickly consolidated overnight. Fact: The opposite is true. The best time for learning in a healthy individual is early morning. Many students suffer from DSPS (see: Good sleep for good learning) and simply cannot learn in the morning. They are too drowsy. Their mind seems most clear in the quiet of the late night. They may indeed get better results by learning in the night, but they should rather try to resolve their sleep disorder (e.g. with free running sleep). Late learning may reduce memory interference, i.e. obliteration of the learned material by the new knowledge acquired during the day. However, a far more important factor is the neurohormonal state of the brain in the learning process. In a hormonal sense, the brain is best suited for learning in the morning. It shows highest alertness and the best balance between attention and creativity. The gains in knowledge structure and the speed of processing greatly outweigh all minor advantages of late-night learning
Myth: Long sleep is good for memory. Association of sleep and learning made many believe that the longer we sleep the healthier we are. In addition, long sleep improves memory consolidation. Fact: All we need for effective learning is well-structured sleep at the right time and of the optimum length. Many individuals sleep less than 5 hours and wake up refreshed. Many geniuses sleep little and practice catnaps. Long sleep may correlate with disease. This is why mortality studies show that those who sleep 7 hours live longer than 9-hour-sleepers. The best formula for good sleep: listen to your body. Go to sleep when you are sleepy and sleep as long as you need. When you catch a good rhythm without an alarm clock, your sleep may ultimately last less but produce far better results in learning. It is the natural healthy structure of sleep cycles that makes for good learning (esp. in non-declarative problem solving, creativity, procedural learning, etc.). It is not true that if your sleep is short, so is your memory
Myth: Alpha-waves are best for learning. Zillions of speed-learning programs propose learning in a "relaxed state". Consequently, gazillions of dollars are misinvested by customers seeking instant relief to their educational pains. Fact: It is true that relaxed state is vital for learning. "Relaxed" here means stress-free, distraction-free, and fatigue-free. However, a red light should blink when you hear of fast learning through inducing alpha states. Alpha waves are better known from showing up when you are about to fall asleep. They are better correlated with lack of visual processing than with the absence of distracting stress. You do not need "alpha-wave machinery" to enter the "relaxed state". You can do far better by investing your time and money in ensuring good peaceful environment for learning, as well as in skills related to time-management, conflict-resolution, and stress-management. Neurofeedback devices may play a role in hard to crack stress cases. However, good health, peaceful environment, loving family, etc. are your simple bets for the "relaxed state"
Myth: Memory gets worse as we age. Aging universally affects all organs. 50% of 80-year-olds show symptoms of Alzheimer's disease. Hence the overwhelming belief that memory unavoidably gets rusty at an older age. Fact: It is true we lose neurons with age. It is true that the risk of Alzheimer's increases with age. However, a well-trained memory is quite resilient and shows comparatively fewer functional signs of aging than the joints, the heart, the vascular system, etc. Moreover, training increases the scope of your knowledge, and paradoxically, your mental abilities may actually increase well into a very advanced age
Myth: You can boost your learning with memory pills. Countless companies try to market various drugs and supplements with claims of improved memory. Fact: There are no memory pills out there (August 2003). Many drugs and supplements indirectly help your memory by simply making you healthier. Many substances can help the learning process itself (e.g. small doses of caffeine, sugar, etc.), but these should not be central to your concerns. It is like running a marathon. There are foods and drugs that can help you run, but if you are a lousy runner, no magic pill can make you finish in less than 3 hours. Do not bank on pharmiracles. The genius memory researcher Prof. Tim Tully believes that his CREB research will ultimately lead to a memory pill. However, his memory pill is not likely to specifically affect desired memories while leaving other memories to inevitable forgetting. As such, each application of the pill will likely produce a side effect of enhanced memory traces for all things learned in the affected period. Neural network researchers know the problem as the stability-vs.-plasticity dilemma. Evolution solved this problem in a way that will be hard to change. Admittedly though, a combination of a short-lasting memory enhancement with sharply-focused spaced repetition (as with SuperMemo) could indeed bring further enhancement to learning
Myth: Learning by doing is the best. Everyone must have experienced the value of learning by doing. This form of learning often leads to memories that last for years. No wonder some educators believe that learning by doing should monopolize educational practice. Fact: Learning by doing is very effective in terms of the quality of produced memories, but it is also very expensive in its expenditure of time, materials, organization, etc. The experience of a dead frog's leg coming to life upon touching a wire may stay with one for life (perhaps as murderous nightmares resulting from the guilt of killing). However, a single picture or mpeg of the same experiment can be downloaded from the net in seconds and retained for life with spaced repetition at the cost of 60-100 seconds. This is incomparably cheaper than hunting for frogs in a pond. When you learn to program your VCR, you do not try all functions listed in the manual as this could take a lifetime. You skim the highlights and practice only those clicks that are useful for you. We should practice learning by doing only when it pays. Naturally, in the area of procedural learning (e.g. swimming, touch typing, playing instruments, etc.), learning by doing is the right way to go. That comes from the definition of procedural learning
Myth: It is possible to memorize Encyclopedia Britannica. Anecdotal evidence points to historical and legendary figures capable of incredible feats of memory such as learning 56 languages by the age of 17, memorizing 100,000 hadiths, showing photographic memory lasting for years, etc. No wonder that it leads to the conviction that it is possible to memorize Britannica word for word. It is supposed to be only a question of the right talent or the right technique. Fact: A healthy, intelligent and non-mutant mind shows a surprisingly constant learning rate. If Britannica is presented as a set of well-formulated questions and answers, it is easy to provide a rough estimate of the total time needed to memorize it. If there are 44 million words in Britannica, we will generate 6-15 million cloze deletions, these will require 50-300 million repetitions by the time the job is done (see spaced repetition theory), and that translates to 25-700 years of work assuming 6 hours of unflagging daily effort. All that assuming that the material is ready-to-memorize. Preparing appropriate questions and answers may take 2-5 times longer than the mere memorization. If language fluency is set at 20,000 items (this is what you need to pass TOEFL with flying colors or comfortably read Shakespeare), a lifetime limit of around 50 languages might not be impossible (assuming total dawn-to-dusk dedication to the learning task). Naturally, those who claim fluency in 50 languages are more likely to show an arsenal closer to 2000 words per language and still impress many
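For readers who want to check the arithmetic behind the 25-700 year figure, here is a short back-of-the-envelope calculation. The repetition counts come from the text above; the 4 and 18 seconds per repetition are my own assumed values, chosen only to bracket the quoted range.

```python
# Back-of-the-envelope check of the Britannica estimate above.
# Assumed: 4-18 seconds per repetition (not a figure from the original text).

HOURS_PER_DAY = 6
SECONDS_PER_YEAR = HOURS_PER_DAY * 3600 * 365

for repetitions, seconds_per_rep in [(50e6, 4), (300e6, 18)]:
    years = repetitions * seconds_per_rep / SECONDS_PER_YEAR
    print(f"{repetitions:.0e} repetitions at {seconds_per_rep}s each: about {years:.0f} years")
# -> about 25 years in the optimistic case, about 685 years in the pessimistic one
```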
Myth: Hypertext can substitute for memory. An amazingly large proportion of the population holds memorization in contempt. Terms like "rote memorization", "recitatory rehearsal" and "mindless repetition" are used to label any form of memorization or repetition as unintelligent. Seeing the "big picture", "reasoning" and leaving the job of remembering to external hypertext sources are supposed to be viable substitutes. Fact: Associative memory underlies the power of the human mind. Hypertext references are a poor substitute for associative memory. Two facts stored in human memory can instantly be put together and bring a new idea to life. The same facts stored on the Internet will remain useless until they are pieced together inside a creative mind. A mind rich in knowledge can produce rich associations upon encountering new information. An empty mind is as useful as a toddler given the power of the Internet in search of a solution. Biological neural networks work in such a way that knowledge is retained in memory only if it is refreshed/reviewed. Learning and repetition are therefore still vital for the progress of mankind.
Myth: People differ in the speed of learning, but they all forget at the same speed. Fact: Although there are mutations that might affect the forgetting rate, at the very lowest biological level, i.e. the synaptic level, the rate of forgetting is indeed basically the same, independent of how smart you are. However, the same thing that makes people learn faster helps them forget slower. The key to learning and slow forgetting is representation (i.e. the way knowledge is formulated). If you learn with SuperMemo, you will know that items can range from being very difficult to being very easy. The difficult ones are forgotten much faster and require shorter intervals between repetitions. The key to making items easy is to formulate them well. Moreover, good students will show better performance on exactly the same material. This is because the ultimate test of the formulation of knowledge is not in how it is structured in your learning material, but in the way it is stored in your mind. With massive learning effort, you will gradually improve the way you absorb and represent knowledge in your mind. The fastest student is the one who can instinctively visualize and store knowledge in his mind using minimum-information maximum-connectivity imagery
Myth: Learning while sleeping. An untold number of learning programs promise to save you years of life by letting you learn during sleep. Fact: It is possible to store selected memories generated during sleep by: external stimuli, dreams, hypnagogic and hypnopompic hallucinations (i.e. hallucinations experienced while falling asleep and while waking up). However, it is nearly impossible to harness this process into productive learning. The volume of knowledge that can be gained during sleep is negligible. Learning in sleep may be disruptive to sleep itself. Learning while sleeping should not be confused with the natural process of memory consolidation and optimization that occurs during sleep. This process occurs during a complete sensory cut-off, i.e. there are no known methods of influencing its course to the benefit of learning. Learning while sleeping is not only a complete waste of time. It may simply be unhealthy
Myth: High fluency reflects high memory strength. Our daily observations seem to indicate that if we recall things easily, if we show high fluency, we are likely to remember things for long. Fact: Fluency is not related to memory strength! The two-component model of long-term memory shows that fluency is related to the memory variable called retrievability, while the length of the period in which we can retain memories is related to another variable called stability. These two variables are independent. This means that we cannot derive memory stability from the current fluency (retrievability). The misconception comes from the fact that in traditional learning, i.e. learning that is not based on spaced repetition, we tend to remember only memories that are relatively easy to remember. Those memories will usually show high fluency (retrievability). They will also last long for reasons of importance, repetition, emotional attachment, etc. No wonder that we tend to believe that high fluency is correlated with memory strength. Users of SuperMemo can testify that despite excellent fluency that follows a repetition, the actual length of the interval in which we recall an item will rather depend on the history of previous repetitions, i.e. we remember better those items that have been repeated many times. See also: automaticity vs. probability of forgetting
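One common simplified way to write the two-component model down is to let retrievability decay exponentially at a rate set by stability. The exact functional form used inside SuperMemo may differ; the sketch below is only meant to show how two items with identical fluency today can have very different futures.

```python
import math

def retrievability(t_days, stability_days):
    """Simplified two-component model: R decays exponentially, S sets the rate."""
    return math.exp(-t_days / stability_days)

# Two items feel equally "fluent" right after a repetition (R = 1.0 for both),
# yet they diverge quickly because their stabilities differ.
for s in (5, 50):
    print(s, [round(retrievability(t, s), 2) for t in (0, 5, 20)])
# stability  5 days -> [1.0, 0.37, 0.02]
# stability 50 days -> [1.0, 0.9, 0.67]
```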
The list of myths is by no means complete. I included only the most damaging distortions of the truth, i.e. the ones that can affect even a well-informed person. I did not include myths that are an offence to our intelligence. I did not ponder over repressed memories, subliminal learning, psychic learning, or remote viewing (unlike the CIA). The list is simply too long.
See also: Memory FAQ
Sleep myths (see: Good sleep for good learning for a more comprehensive list)
Myth: Since we feel rested after sleep, sleep must be for resting. Ask anyone, even a student of medicine: What is the role of sleep? Nearly everyone will tell you: Sleep is for rest. Fact: Sleep is for optimizing the structure of memories. If it were for rest or energy saving, we would cover the saving by consuming just one apple per night. To effectively encode memories, mammals, birds and even reptiles need to turn off the thinking and do some housekeeping in their brains. This is vital for survival. This is why evolution produced a defense mechanism against skipping sleep. If we do not get sleep, we feel miserable. We are not actually as wasted as we feel; the damage can be quickly repaired by getting a good night's sleep. Our health may not suffer as much as our learning and intelligence. Feeling wasted under sleep deprivation is the result of our brain dishing out punishment for not sticking to the rules of an intelligent form of life. Let memory do its restructuring at its programmed time
Myth: Sleeping little makes you more competitive. Many people are so busy with their lives that they sleep only 3-4 hours per night. Moreover, they believe that sleeping little makes them more competitive. Many try to train themselves for minimum sleep. Donald Trump, in his newest book, tells you: "If you want to be a billionaire, sleep as little as possible". Fact: It is true that many geniuses slept little. Many business sharks slept even less. However, the only good formula for maximum long-term competitiveness is via maximum health and maximum creativity. If Trump sleeps 3 hours per night and enjoys his work, he is likely to run it on alertness hormones (ACTH, cortisol, adrenaline, etc.). His sleep is probably structured very well and he may extract more neural benefit per hour of sleep than an average 8-hours-per-night sleeper. Yet that should not make you try to beat yourself to action with an alarm clock. You will get the shortest and highest-quality sleep only when you perfectly hit your circadian low time, i.e. when your body tells you "now it is time to sleep". Sleep at the wrong hours, or sleep interrupted with an alarm clock, is bound to undermine your intellectual performance and creativity. Occasionally, you may think that a loss on the intellectual side will be counterbalanced by a gain on the action side (e.g. clinching this vital deal). Remember, though, that you also need to factor in the long-term health consequences. Unless, of course, you think a heart attack at 45 is a good price to pay for becoming a billionaire
Myth: Sleeping pills will help you sleep better. Fact: Benzodiazepines can help you sleep, but this sleep is of far less quality than naturally induced sleep (the term "sleeping pill" here does not apply to sleep-inducing supplements such as melatonin, minerals, or herbal preparations). Not only are benzodiazepines disruptive to the natural sleep stage sequence. They are also addictive and subject to tachyphylaxis (the more you take the more you need to take). Sleeping pills can be useful in circumstances where sleep is medically vital, and cannot be achieved by other means. Otherwise, avoid sleeping pills whenever possible
Myth: Silence and darkness are vital for sleep. This may be the number one advice for insomniacs: use your sleeping room for sleep only, keep it dark and quiet. Fact: Silence and darkness indeed make it easier to fall asleep. They may also help maintain sleep when it is superficial. However, they are not vital. Moreover, for millions of insomniacs, focusing on peaceful sleeping place obscures the big picture: the most important factor that makes us sleep well, assuming good health, is the adherence to one's natural circadian rhythm! People who go to sleep along their natural rhythm can often sleep well in bright sunshine. They can also show remarkable tolerance to a variety of noises (e.g. loud TV, family chatter, the outside window noise, etc.). This is all possible thanks to the sensory gating that occurs during sleep executed "in phase". Absence of sensory gating in "wrong phase" sleep can easily be demonstrated by lesser changes to AEPs (auditory evoked potentials) registered at various parts of the auditory pathway in the brain. Noises will wake you up if you fail to enter deeper stages of sleep, and this failure nearly always comes from sleeping at the wrong circadian phase (e.g. going to sleep too early). If you suffer from insomnia, focus on understanding your natural sleep rhythm. Peaceful sleeping place is secondary (except in cases of impaired sensory gating as in some elderly). Insomniacs running their daily ritual of perfect darkness, quiet, stresslessness and sheep-counting are like a stranded driver hoping for fair winds instead of looking for the nearest gas station. Even worse, if you keep your place peaceful, you run the risk of falling asleep early enough to be reawakened by the quick elimination of the homeostatic component of sleep. Learn the principles of healthy sleep that will make you sleep in all conditions. Only then focus on making your sleeping place as peaceful as possible. For more see: Good sleep, good learning
Myth: People are of morning or evening type. Fact: This is more of a misnomer than a myth. Evening type people, with chronotherapy, can easily be made to wake up with the sun. What people really differ in is the period of their body clock, as well as the sensitivity to and availability of stimuli that reset that rhythm (e.g. light, activity, stress, etc.). People with an unusually long natural day and low sensitivity to resetting stimuli will tend to work late and wake up late. Hence the tendency to call them "evening type". Those people do not actually prefer evenings, they simply prefer longer working days. The lifestyle affects the body clock as well. A transition from a farmer's lifestyle to a student's lifestyle will result in a slight change to the sleeping rhythm. This is why so many students feel as if they were of the evening type
Myth: Avoid naps. Fact: Naps may indeed worsen insomnia in people suffering from DSPS, esp. if taken too late in the day. Otherwise, naps are highly beneficial to intellectual performance. It is possible to take naps early in the day without affecting one's sleeping rhythm. Those naps must fall before or inside the so-called dead zone where a nap does not produce a phase response (i.e. shift in the circadian rhythm)
Myth: Night shifts are unhealthy. Fact: People working night shifts are often forced out of work by various ailments such as a heart condition. However, it is not night shifts that are harmful. It is the constant switching of the sleep rhythm from day to night and vice versa. It would be far healthier to let night shift people develop their own regular rhythm in which they would stay awake throughout the night. It is not night wakefulness that is harmful. It is the way we force our body to do things it does not want to do
Myth: Going to bed at the same time is good for you. Fact: Many sleep experts recommend going to sleep at the same time every day. Regular rhythm is indeed a form of chronotherapy recommended in many circadian rhythm problems. However, people with severe DSPS may simply find it impossible to go to sleep at the same time everyday. Such forced attempts will only result in a self-feeding cycle of stress and insomnia. In such cases, the struggle with one's own rhythm is simply unhealthy. Unfortunately, people suffering from DSPS are often forced into a "natural" rhythm by their professional and family obligations
Myth: People who sleep less live longer. In 2002, Dr Kripke compared the length of sleep with longevity (1982 data from a cancer risk survey). He figured out that those who sleep 6-7 hours live longer than those who sleep 8 hours and more. No wonder that a message started spreading that those who sleep less live longer. Fact: The best longevity prognosis is ensured by sleeping in compliance with one's natural body rhythm. Those who stick to their own good rhythm often sleep less because their sleep is better structured (and thus more refreshing). "Naturally sleeping" people live longer. Those who sleep against their body's call often need to clock more hours and still do not feel refreshed. Moreover, disease is often correlated with increased demand for sleep. Infectious diseases are renowned for a dramatic change in sleep patterns. When in a coma, you are not likely to be adding years to your life. Correlation is not causation
Myth: A nap is a sign of weakness. Fact: A nap is not a sign of weakness, ill-health, laziness or lack of vigor. It is a phylogenetic remnant of a biphasic sleeping rhythm. Not all people experience a significant mid-day slump in mental performance. It may be well masked by activity, stress, contact with people, sport, etc. However, if you experience a slump around the 6th to 8th hour of your day, taking a nap can dramatically boost your performance in the second half of the day
Myth: Alarm clock can help you regulate the sleep rhythm. Fact: An alarm clock can help you push your sleep rhythm into the desired framework, but it will rarely help you accomplish a healthy sleep rhythm. The only tried-and-true way to accomplish a healthy sleep and a healthy sleep rhythm is to go to sleep only when you are truly sleepy, and to wake up naturally without external intervention
Myth: Being late for school is bad. Fact: Kids who persistently cannot wake up for school should be left alone. Their fresh mind and health are far more important. 60% of kids under 18 complain of daytime tiredness and 15% fall asleep at school (US, 1998). Parents who regularly punish their kids for being late for school should immediately consult a sleep expert as well as seek help in attenuating the psychological effects of the trauma resulting from the never ending cycle of stress, sleepiness and punishment
Myth: Being late for school is a sign of laziness. Fact: If young people suffer from DSPS, they may have perpetual problems with getting up for school in time. Those kids are often actually brighter than average and are by no means lazy. However, their optimum circadian time for intellectual work comes after school or even late into the evening. At school they are drowsy and slow and simply waste their time. If chronotherapy does not help, parents should consider later school hours or even home-schooling
Myth: We can sleep 3 hours per day. Many people enviously read about Tesla's or Edison's sleeping habits and hope they could train themselves to sleep only 3 hours per day having far more time for other activities. Fact: This might work if you plan to party all the time. And if your health is not a consideration. And if your intellectual capacity is not at stake. You can sleep 3 hours and survive. However, if your aspirations go beyond that, you should rather sleep exactly as much as your body wants. That is an intelligent man's optimum. With your improved health and intellectual performance, your lifetime gains will be immense
Myth: We can adapt to polyphasic sleep. Looking at the life of lone sailors, many people believe they can adopt polyphasic sleep and save many hours per day. In polyphasic sleep, you take only 4-5 short naps during the day totaling less than 4 hours. There are many "systems" differing in the arrangement of naps. There are also many young people ready to suffer the pains to see it work. Although the vast majority will drop out, a small circle of the most stubborn ones will survive a few months and will perpetuate the myth, to the detriment of public health. Fact: We are basically biphasic and all attempts to change the inbuilt rhythm will result in loss of health, time, and mental capacity. Polyphasic sleep has not been designed for maximum alertness (let alone maximum creativity). It has been designed for maximum alertness in conditions of sleep deprivation (as in solo yachting). A simple rule is: when sleepy, go to sleep; while asleep, continue uninterrupted. See: The myth of polyphasic sleep
Myth: Sleep before midnight is more valuable. Fact: Sleep is most valuable if it comes at the time planned by your own body clock mechanism. If you are not sleepy before midnight, forcing yourself can actually ruin your night if you wake up early
Myth: The body will always crave excess sleep as it craves excess food. Some people draw a parallel between our tendency to overeat and our tendency to oversleep. They believe that if we let the body dictate the amount of sleep, it will always ask for more than needed. As a result, they prefer to cut sleep short with an alarm clock to "optimize" the amount of sleep they get. Fact: Unlike the storage of fat, there seems to be little evolutionary benefit to extra sleep. Probably, our typical 6-8 hour sleep is just enough to do all the "neural housekeeping". People with a sleep deficit may indeed tend to sleep obscenely long. However, once they catch up and get into the rhythm, the length of their sleep is actually likely to decrease!
Myth: Magnesium, folates, and other supplements can help you sleep better. Fact: Nutrients needed for good health are also good for sleep. However, supplementation is not likely to play a significant role in resolving your sleep problems. Vitamins may help if you are in deficit, but a vast majority of sleep disorders in society come from the lack of respect or understanding of the circadian rhythm. Only wisely administered melatonin is known to have a beneficial effect on the advancement of sleep phase. If you are having problems with sleep, read Good sleep for good life. As for supplements, stick to a standard healthy diet. That should suffice
Myth: It is best to wake up with the sun. Fact: You should wake up at the time when your body decides it got enough of sleep. If this happens to be midday, a curtain over the window will prevent you from being woken up by the sun. At the same time sun may help you reset your body clock and help you wake up earlier. People who wake up naturally with the sun are indeed among the healthiest creatures on the planet. However, if you do not wake up naturally before 4 am, trying to do so with the help of an alarm clock will only add misery to your life
Myth: You cannot change the inherent period length of your body clock. Fact: With various chronotherapeutic tricks it is possible to change the period of the clock slightly. It can be reset or advanced harmlessly by means of melatonin, bright light, exercise, meal timing, etc. It can also be reset in a less healthy way: with an alarm clock. However, significant lifestyle changes may be needed to resolve severe cases of DSPS or ASPS. The therapy may be stressful, and the slightest deviation from the therapeutic regimen may result in the relapse to an undesirable rhythm. Those who employ free-running sleep may take the easiest way out of the period length problem: stick to the period that is the natural outcome of your current lifestyle
See also:
Sleep FAQ
Creativity myths (see: Genius and Creativity for a more comprehensive list)
Myth: You must be born with a creative mind! Fact: Some kids indeed show an incredible curiosity and rage to master. However, there are many techniques that can help you multiply your creativity. Creativity is trainable. See Genius and Creativity for some hints
Myth: If you miss childhood, your genius is lost! Fact: Human brain is plastic by definition. In many fields of learning, childhood neglect makes later progress harder; however, training can always produce miracles. Childhood is very important for growth, but if you lost it, you can still catch up in many areas with intense training
Myth: Do not memorize! Fact: This fallacy comes from the fact that many sources fail to delineate the full spectrum of knowledge applicability from dry useless facts to highly abstract reasoning rules. Understanding, thinking, problem solving, creativity, etc. are all based on knowledge. This rule should rather be formulated as: Knowledge selection is critical for success in learning. The correct and non pejorative definition of the word memorize is to: "commit knowledge to memory". Along this definition, you can say: Do memorize! Just make a smart selection of things to learn. See: Smart and dumb learning for a discussion and examples
Myth: Proliferation of geniuses is a threat to humanity. Fact: Most of the good things that surround us are a product of nature, love, or human genius. It is true that the output of genius minds is often used for evil purposes; however, halting genius would be equivalent to halting or reversing the global progress
Myth: If you do something stupid, so are you! Fact: The human brain is an imperfectly programmed machine. It never stops learning and verifying its errors. Its knowledge base is painfully limited. The same brain may be able to disentangle the complexities of string theory and then slip on simple sums. Notes left by Newton, Leibniz or Babbage show that they erred on their way to great discovery or meandered in an entirely wrong direction. We measure genius by its top accomplishments, not by the lack of failures
Myth: Geniuses do not forget things! Fact: Genius brains are made of the same substance as average ones. Consequently, their memories are subject to the exactly same laws of forgetting. All knowledge in the human brain declines along a negatively exponential curve. Forgetting is as massive in a genius mind as it is in any other. The best tools against forgetting are (1) good knowledge representation (e.g. mnemonic techniques) and (2) review (based on active recall and spaced repetition). Geniuses may hold an advantage by developing powerful representation skills that make learning much easier. They often develop those skills early and without a conscious effort. However, the science of mnemonics is well developed and you can see a dramatic difference in your knowledge representation skills after a week-long course
Myth: Geniuses sleep little! Fact: When looking at Edison, Tesla, or Churchill it is easy to believe that cutting down on sleep does not seem to pose a problem in creative achievement. Those who try to work creatively in conditions of sleep deprivation will quickly discover though that fresh mind is by far more important than those 2-3 hours one can save by sleeping less. A less visible side effect of sleep deprivation is the effect on memory consolidation and creativity in the long term. Lack of sleep hampers remembering. It also prevents creative associations built during sleep. It is not true that geniuses sleep less. Einstein would work best if he got a solid nine hours of sleep
Myth: Early to ripe, early to rot! Fact: Terman Study contradicts this claim. A majority of precocious kids go on to do great things in life
Myth: You need a degree! Fact: Edison got only 3 months of formal schooling. Lincoln spent less than a year at school. Benjamin Franklin's formal education ended when he was 10. Graham Bell was mostly family trained and self-taught. Steve Wozniak, Steve Jobs, Dean Kamen, and Bill Gates were all college drop-outs. Isaac Newton found school boring and was considered by many a mediocre student. However, there is one thing they all had in common: they loved books and could spend whole days reading and studying
Myth: Genius can be evil! Fact: Evil, by definition, is foolish. One can show genius skills in a narrow field and still be an evil person, but an evil human being does not deserve a title of a genius. True wisdom can reach far beyond a narrow field of specialization. It will inevitably encompass the matters of ethics. This is why all true geniuses are deeply concerned with the future of humanity. See: Goodness of knowledge
Myth: Be unique! This boosts creativity! Fact: The relationship between uniqueness and creativity runs the other way. It is true that many creative people are unique or strange in behavior. This comes from their creative way of looking at things and their unwillingness to stick to those forms of tradition that defy reason. By no means will an effort toward uniqueness boost creativity. It is true that Einstein smoked a pipe, but it does not mean that you will be more of a genius if you take up smoking a pipe
Myth: TV makes you stupid! Fact: TV or radio can be harmful if you are unable to control what you watch or listen, or if you are unable to optimize the proportion of your time spent on broadcasts. Otherwise, TV is still hard to match in its ability to present to you a pre-selected and emphatically graphic video material for the purposes of education or getting informed. Video education based on the material from reputable channels may be the most efficient form of tutor-less education. Swap MTV for Discovery, and make a good selection. Although you cannot employ incremental video watching yet (cf. incremental reading), a dose of daily DVDR viewing will help you stay up to date with the news and brush up your general education
Myth: Curiosity killed the cat! Fact: As long as you stay within the boundaries of politeness, live by a better proverb: Curiosity is your pass to the kingdom of knowledge
Myth: We use only 0.1% of our brain power. Some reputable researchers derived the 0.1% figure from a simple calculation involving the number of neurons and the numbers of synapses residing in the human brain. The resulting figure seemed to imply an astounding computational capacity. Fact: The brain is energetically a very expensive organ. Only major improvements in human diet in the course of human evolution made it possible to provide for a substantial gain in the brain mass. If the 0.1% or even the 10% claim was to be true, the unused portions of the brain would quickly fall prey to natural selection resulting in energy-saving shrinkage of the brain. A living brain even prunes those circuits that are of little use and sprouts new connections there where they are needed. Portions of brain are programmed to execute highly specialized functions, other portions can easily be used to store vast expanses of declarative knowledge. The process of forgetting has been fine tuned to maximize the use of the existing storage in the reproductive lifetime. Nevertheless, it is not likely we ever run out of memory space when using the trick of spaced repetition to maximize the inflow of new information to memory
Myth: Gifted kids become genius adults. Fact: It is the personality and the training that determine the final outcome. Most of gifted kids are lucky to do well; however, giftedness should not be taken for granted
Myth: Mozart effect. Listening to Mozart increases intelligence. Fact: Mozart was one of the greatest musical geniuses in history. His music might be used in musicality training and produce far better neural effects than, say, today's pop music. However, Mozart's impact on neural growth cannot be verifiably judged better than that of solving cross-word puzzles, singing, playing soccer or learning chemistry. To a philistine, Mozart may do as much good as a recitation of Goethe's poems to a baboon. Neither is listening to Mozart superior to listening to your favorite pieces of music for the sake of boosting "happy brain messengers". Mozart has been cannibalized by the accelerated learning industry as a simple way towards a quick buck. Few gimmicks are as simple as packaging a Mozart CD with a label "Learn 10 times faster". Mozart Effect powerfully illustrates the myth-making power of money. This power has also spawned other cheap "learning solutions" such as learning while sleeping, learning while relaxing, or memory-boosting supplements. Regrettably, even highly respected and reputable websites, journals or TV program fall prey to these catchy memes. Your vigilance needs to triple in these areas
See also: Genius and creativity FAQ
SuperMemo myths
Ever since it was conceived, SuperMemo has had to struggle with myths slowing down its popularization. Preventing the reappearance of myths appears to be a never-ending battle. The knowledge about SuperMemo has grown to a substantial volume. Not all users can afford to read dozens of articles. Many are prone to arrive at the same wrong conclusions independently of others. Some of these myths are rooted in general myths about memory (as above). Others seem to spring from common-sense thinking about learning. Here are some of the most damaging myths related to spaced repetition and SuperMemo:
Myth: SuperMemo can only be used for learning languages. SuperMemo gained most popularity by its effectiveness in learning vocabulary of foreign languages. Hence the myth that SuperMemo is a program for learning languages. A related myth is that it is a program that can only be used for cramming facts, while it cannot effectively be used for complex sciences, rules, modeling, problem solving, creativity, etc. Fact: SuperMemo can be used in any form of declarative learning (i.e. learning of things you can find in textbooks as opposed to learning to ride a bike, etc.). Word-pair learning appears to be the simplest application, while learning complex facts and rules of science may require far more skills in formulating the learning material. This is why many users are indeed unsuccessful when trying to learn, for example, astronomy. If you read 20 rules of formulating knowledge you will realize the number of snags that have to be overcome. Those snags contribute to Myth #1 on the limited applicability of SuperMemo
Myth: SuperMemo is a great tool for cramming. Many first-time users hear it by word of mouth that SuperMemo is a great tool for cramming. They are ready to buy the program only for the purpose of an exam coming in a week. Fact: SuperMemo is nearly useless for cramming knowledge that is supposed to last less than a week. For fast cramming to an exam, use traditional review, recall, repeat approach known to crammers for ages. The power of SuperMemo increases in proportion to the expected lifetime of knowledge in your memory. SuperMemo is useful if you need to remember things for a year (e.g. legal code). It is more useful if you learn for a decade (e.g. a programming language). But it is unsurpassed in gathering lifetime knowledge (e.g. anatomy, geography, history, etc.)
Myth: SuperMemo is hard to use. Several thousand FAQs and the 5 MB help file make many think SuperMemo is complex. It may appear like a program dedicated to heavyweight professionals. This makes it seem like a program of little use to mere mortals. Fact: It is true that some users start from the "wrong end" or wrong pre-conceived assumptions. They may indeed get lost or frustrated. However, a well-tested and certified fact is that SuperMemo can be used effectively after a 3 minute introduction! A great part of its power (perhaps a half) can be harnessed by learning just two operations: Add new (adding new questions and answers) and Learn (making repetitions). Naturally, things get gradually more complex when you start adding multimedia, foreign language support, templates, categories, etc. At the other end, incremental reading, a powerful reading and learning technique, may require months of training before bringing quality results. You can easily start using SuperMemo today, and gradually build skills needed to expand its power
Myth: SuperMemo is useless. Some people truly believe that the natural mechanisms of building long-term memories are superior to spaced repetition. Fact: Our brain prefers "easy" over "important". We excel at remembering celebrity trivia. We are dismal at recalling mathematical formulas learned in high school. In addition, those who deny the value of spaced repetition usually fail to appreciate the value of associative memory, or fail to delineate the distinction between cramming facts and learning universal inference rules. There are many traps of ignorance that prevent people from ever trying SuperMemo. See: SuperMemo is Useless and No force in the world can convince me to SuperMemo
Myth: As you add more material to SuperMemo, your repetition loads mount beyond being manageable. No item added to SuperMemo is considered "memorized for good". For that reason, all items are subject to review sooner or later. This makes many believe that there is an inevitable increase in the cost of repetitions. Fact: It is true that a large number of outstanding repetitions is the primary excuse for SuperMemo drop-outs. However, computer simulations as well as real-life measurements show that, with a constant daily learning time, the acquisition of new knowledge does not visibly slow down in time (except in the very first couple of months). In other words, from a long-term perspective, the acquisition of new knowledge is nearly linear. Older items are repeated less and less frequently, leaving room for new material. The exponential nature of this "fading" explains why we can continue with a heavy inflow of new material for decades. A simple simulation of this effect is sketched below
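The following is a minimal sketch, not SuperMemo's real algorithm: it only assumes that each item's interval roughly doubles after every successful repetition and that a fixed number of new items is added per day. The numbers (items per day, growth factor, horizon) are illustrative assumptions. Under them, the daily review load grows only slowly (roughly logarithmically) while the collection keeps growing linearly, which is the point made above.

from collections import defaultdict

DAYS = 3650           # simulate ten years of daily learning (assumption)
NEW_PER_DAY = 10      # assumed number of new items added each day
FIRST_INTERVAL = 1    # days until an item's first review (assumption)
GROWTH = 2.0          # assumed interval multiplier after each repetition

due = defaultdict(list)                       # day -> intervals of items due that day
for day in range(DAYS):
    reviews_today = due.pop(day, [])
    for interval in reviews_today:            # reschedule every item reviewed today
        next_interval = interval * GROWTH
        due[day + round(next_interval)].append(next_interval)
    due[day + FIRST_INTERVAL].extend([FIRST_INTERVAL] * NEW_PER_DAY)
    if day % 365 == 0:
        print(f"year {day // 365:2}: {len(reviews_today):4} reviews on this day")

Running this shows the per-day review count creeping up only slowly from year to year even though thousands of new items have been added, consistent with the claim that long-term acquisition stays nearly linear.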
Myth: SuperMemo repetitions take too much time. Many users struggle with an increasing load of repetitions and may conclude that the effort is not worth the outcome. Fact: Just 3 well-selected items memorized per day may produce a better effect than a hundred crammed facts. This means that even a minute per day will make a world of difference, as long as you pay attention to what you learn. Not all knowledge is worth the effort of 99% retention. High retention should be reserved only for mission-critical facts and rules. Last but not least: knowledge formulating skills may cut the learning time in beginners by more than 90%
Myth: SuperMemo is expensive. At prices approaching $40 for the newest Windows version, SuperMemo may seem too expensive for users in poorer countries of Africa, Asia or even Eastern Europe. Fact: Older versions of SuperMemo for DOS and Windows are free. Its on-line version is still free. Even the newest version of SuperMemo is available free for contributors to SuperMemo Library
Myth: SuperMemo requires a computer. Fact: See: paper and pencil SuperMemo
Myth: We do not need SuperMemo, all we need is to build an index to knowledge sources. With multiple on-line sources of knowledge, some people are tempted to believe that memorizing things is no longer needed. All we supposedly need to learn is how to access and use these external sources of knowledge. Fact: Knowledge stored in human memory is associative in nature. In other words, we are able to suddenly combine two known ideas to produce a new quality: an invention. We cannot (yet) effectively associate ideas that live on the Internet or in an encyclopedia. All creative geniuses need knowledge to form new concepts. The extent of this knowledge will vary, but the creative output does depend on the volume of knowledge, its associative nature, and its abstractness (i.e. its relevance in building models). Lastly, even "index to knowledge" is subject to forgetting and needs to be maintained via repetition or review. See: SuperMemo is Useless
Myth: Many people are successful without using SuperMemo, hence its importance is secondary. Fact: Neither Darwin nor Newton had access to computers, yet computer illiteracy may make today's scientist entirely impotent. Similarly, with the growing importance of knowledge, neglecting the competitive advantage of a wide and stable body of knowledge will increasingly limit your chances of a successful career in science, engineering, medicine, politics, etc. You can live without SuperMemo, but it can definitely raise your learning to a new level
Myth: The natural mechanism of selecting important memories is good enough. We do not need a crutch. Evolution produced an effective forgetting mechanism that frees our memory from space-consuming and perhaps irrelevant garbage. This mechanism proved efficient enough to build the amazing human civilization. Consequently, many believe that there cannot be much room for improvement. Fact: The forgetting mechanism was built in abstraction from our wishes and decisions. It only spares memories that are used frequently enough. Today, however, we are smart enough to decide on our own which knowledge is vital and which is not. A single peek into a dictionary may often take more time than the lifetime cost of refreshing the same word in SuperMemo (see the back-of-envelope calculation below). And that is the least spectacular example. Human history is rich in monumental errors coming from ignorance. NASA's confusion of imperial and metric units cost it a Mars probe. A comma confused with a period in Fortran code cost a Venus probe. Errors in English communication have caused many aerial and maritime catastrophes. A piece of knowledge in a surgeon's mind may be worth the life of his patient. Forgetting is too precarious to leave mission-critical knowledge in its hands. SuperMemo puts you in command
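A rough illustration of the dictionary claim. Every number below is an assumption chosen for the sake of the example, not a measurement:

SECONDS_PER_REPETITION = 6    # assumed time to answer one flashcard
LIFETIME_REPETITIONS = 20     # assumed repetitions needed to retain a word for decades
SECONDS_PER_LOOKUP = 40       # assumed time to locate a word in a dictionary

supermemo_cost = SECONDS_PER_REPETITION * LIFETIME_REPETITIONS   # total seconds spent on repetitions
print(f"Lifetime SuperMemo cost for one word: {supermemo_cost} s")
print(f"Equivalent number of dictionary lookups: {supermemo_cost / SECONDS_PER_LOOKUP:.1f}")

Under these assumptions, three or four dictionary lookups already cost more time than a lifetime of repetitions of that word.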
Myth: Developing photographic memory is a better investment. Fact: A great many claims related to photographic memory are vastly exaggerated or plain false. Mnemonic tools are vital for efficient learning, but they are no substitute for SuperMemo; they are complementary. Techniques such as Photoreading use the same catchy photo-scanner concept. Unlike SuperMemo, they are easy to publicize and comprehend. However, SuperMemo's superiority in the arsenal of a student's tools is easily demonstrable with plain facts of science, as well as in the practice of learning. For more see: articles at supermemo.com
Myth: Memorizing the multiplication table only deprives one of computing skills. Like kids using calculators, those who memorize the multiplication table with SuperMemo are supposed to become less numerate (i.e. less fluent in their calculation skills). Fact: Memorizing the basic 9x9 multiplication table is the cornerstone of all calculation on paper and in the mind. Memorizing the 20x20 multiplication table is also a good way of training basic multiplication skills. In practice, it is hardly possible to memorize the whole 20x20 table by rote. Intuitively, most students do it the right way by combining their familiar 9x9 table with their adding skills. For example, 14*16 is remembered as 10*14 + 6*14 = 140 + 6*10 + 6*4 = 140 + 60 + 24 = 224. This means that the student uses (1) a simple decomposition, (2) the zero-shifting rule, (3) the 9x9 table once (to figure out that 4*6=24), and then (4) addition (to add the resulting three numbers); a sketch of this decomposition follows below. In contrast to the myth, all students who learned the 20x20 multiplication table report a dramatic increase in their multiplication skills. Alas, there is relatively little carry-over to division skills. These require additional learning material and slightly more complex skills (see: Division Table)
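A minimal sketch of the decomposition described above, for illustration only; the function name and the restriction to numbers between 11 and 19 are my own assumptions, not anything prescribed by SuperMemo:

def multiply_teens(a: int, b: int) -> int:
    """Multiply two numbers from the 11-19 range using only the 9x9 table,
    the zero-shifting rule, and addition."""
    a_tens, a_ones = divmod(a, 10)   # 14 -> (1, 4)
    b_tens, b_ones = divmod(b, 10)   # 16 -> (1, 6)
    step1 = b_tens * 10 * a          # zero-shifting: 10*14 = 140
    step2 = b_ones * a_tens * 10     # zero-shifting: 6*10 = 60
    step3 = b_ones * a_ones          # the single 9x9-table lookup: 6*4 = 24
    return step1 + step2 + step3     # 140 + 60 + 24 = 224

print(multiply_teens(14, 16))        # 224, matching the worked example above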
Myth: SuperMemo is so simple that it is not needed (a PalmGear user's comment). Fact: Simplicity of an idea usually enhances its usefulness. The underlying idea of SuperMemo (increasing intervals) is indeed very simple. However, doing all the computations by hand makes little sense, and not employing spaced repetition is bound to negatively affect learning. Consequently, SuperMemo is necessary wherever retention levels are to reach above 80%; otherwise, any disorganized system of repetitions becomes very wasteful. Ironically, many users of SuperMemo for Windows complain that the program is too complex (see Myth: SuperMemo is hard to use)
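To make the "increasing intervals" idea concrete, here is a sketch of the classic SM-2 algorithm as it was published for early SuperMemo; it is not the algorithm used by current SuperMemo versions, and the sample grades at the bottom are my own assumptions. It shows why tracking intervals and easiness factors by hand quickly becomes impractical.

from dataclasses import dataclass

@dataclass
class Item:
    interval: float = 0.0   # days until the next repetition
    easiness: float = 2.5   # E-factor: how fast intervals grow
    repetition: int = 0     # successful repetitions in a row

def review(item: Item, grade: int) -> Item:
    """Update scheduling after a repetition graded from 0 (blackout) to 5 (perfect)."""
    if grade >= 3:
        if item.repetition == 0:
            item.interval = 1
        elif item.repetition == 1:
            item.interval = 6
        else:
            item.interval = round(item.interval * item.easiness)
        item.repetition += 1
    else:
        item.repetition = 0   # forgotten: restart the repetition cycle
        item.interval = 1
    # adjust the E-factor; clamp so intervals never grow more slowly than 1.3x
    item.easiness = max(1.3, item.easiness + 0.1 - (5 - grade) * (0.08 + (5 - grade) * 0.02))
    return item

card = Item()
for grade in (5, 4, 5, 3):   # assumed sequence of grades for one flashcard
    card = review(card, grade)
    print(f"next repetition in {card.interval:.0f} days (EF = {card.easiness:.2f})")

Even this toy version stretches intervals from one day to six days to weeks and then months after a few good grades, which is exactly the simple-in-principle, tedious-on-paper bookkeeping the paragraph above refers to.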
Myth: The main learning bottleneck is short-term memory, hence SuperMemo is not needed. Some educators live by the wrong conviction that short-term memory is the bottleneck of learning. This comes from the common daily observation of how much leaks out of sensory memory: we retain only a fraction of what we perceive. Fact: The opposite is true. Short-term memory is indeed very leaky. However, we can retain far more in short-term memory than we can retain over the long term. The myth is partly derived from the conviction that long-term memory is virtually limitless. The error comes from noticing the huge long-term storage while neglecting the difficulty with which we retain knowledge in that storage. An advanced student will quickly learn all the mnemonic tricks necessary to retain far more in his or her short-term memory than (s)he is able to convert into lasting knowledge
Myth: Drilling for fluency is more important than drilling for retention. Some students and educators believe that they need to train for quick retrieval, which often determines performance (e.g. as in IQ tests). They believe that clocking repetitions improves retention. Fact: The myth originates from the research of B.F. Skinner's student Ogden Lindsley in the 1960s, which shows how fluency training can demonstrably enhance learning (e.g. in classroom conditions). Lindsley's fluency research does not translate directly to spaced repetition methodology, though, because of the spacing effect (see also: Memory myth: Fluency reflects memory strength). A procedure that may enhance recall after a single session is not necessarily optimal for repeated active recall in spaced repetition. A clocked drill is more likely to fall foul of the spacing effect, because retrieval difficulty enhances memory consolidation and a timed drill minimizes that difficulty. Consequently, a timed drill will actually increase the frequency of repetitions and the overall repetition workload per item. In SuperMemo terms, the effect is similar to an attempt to reduce the forgetting index below 3%. Assuming maximum attention, slow, deliberate repetition is likely to leave more durable memory traces than a clocked fluency drill. Fluency training makes sense for knowledge whose retrieval is time-critical. This may refer to procedural learning, training before tests based on fluency, foreign language training, reading fluency, etc. However, in fields where creativity is more important than speed, or where solving the problem is more important than solving it fast, "slow" (i.e. meticulous and deliberate) learning is recommended. Independently, in SuperMemo it is the user who determines the grading criteria in learning. Fluency may, but does not have to, be included in self-assessment. In other words, although speedy drills are not recommended, SuperMemo does not prevent the user from employing them
See also: SuperMemo FAQ
Language learning myths
Antimoon has compiled another myth list related to language learning: Language learning: Myths and facts.
I personally disagree with classifying the tolerance for language errors as a strategic mistake (myth: "It's OK to make mistakes"). Antimoon's approach assumes that the student's goal is to reach a perfect command of the language, while most students are rather interested in maximum communication fluency in minimum time. When learning English myself, I was primarily interested in communication while accepting a large margin of tolerance for non-semantic errors. This left me with a legacy of wrong habits that are hard to root out. Yet my communication goals have been accomplished on target. Given the choice, I would choose the same strategy again. This is why I would cut Antimoon's myth list by one position
Skepticism
Remain skeptical. Read more about the myths listed above. Drop me an e-mail if you disagree, or if you believe I missed a dangerous myth that should be included. You can rant about this article here.
Some websites devote all their energy to dispelling myths that propagate throughout the population. Myths are friends of ignorance. They do damage to individuals and societies. They are also food for the ruthless scams that currently grow rich on the net. Here are a couple of links to websites that I would like to praise for their commendable efforts in the struggle against ignorance, superstition, and plain deception:
Skeptic's Dictionary - Prof. T. Carroll's monumental effort listing the most dangerous, most deceptive, most bizarre, as well as the most amusing beliefs, myths and "theories" such as: astrology, clairvoyance, creationism, dianetics, divination, dowsing, homeopathy, NLP, psychokinesis, reincarnation, Silva method, telepathy, teleportation, UFO, etc.
James Randi Educational Foundation - best known for his Million Dollar Challenge, James Randi tirelessly fights against anything paranormal. Anyone able to demonstrate paranormal, supernatural, or occult phenomena via a scientifically controlled experiment can claim Randi's $1 million reward
Quackwatch - Dr Stephen Barrett's equally impressive struggle against harmful diets and medical procedures deceptively employed for profit. Dr Barrett discloses companies, individuals, websites, and products that ascribe miraculous properties to acupuncture, chiropractic healing, super-DHEA, Calorad, gingko, herbal weight-loss tea, iridology, macrobiotics, magnetotherapy, super-melatonin, orthomolecular therapy, psychic practices, etc.
Skeptic Friends Network
Skeptic Planet - skeptic sites search engine. A search for homeopathy yields 500 articles, astrology 900, while creationism 2000 (Aug 3, 2003)
Anti-quackery - collection of anti-quackery links
Stephen Lower on Pseudoscience
CSICOP - Committee for the Scientific Investigation of Claims of the Paranormal
BBC Horizon takes on homeopathy - BBC Horizon fails to win James Randi's Million Dollar Challenge with a scientific experiment that could not demonstrate that homeopathy actually works
Talk Origins - a collection of articles contesting intelligent design theories in response to a related Talk Origins Usenet newsgroup with unrestricted discussion forum
Truth or Fiction - anti-rumor website
Logical Fallacies - definitions and examples of logical fallacies that underlie most myths, rumors, and superstitions
More links from Randi's JREF
Skeptical Information Links - 532 links to skeptical websites (Aug 3, 2003)
What is not a myth?
Sometimes I receive requests for the evaluation of legitimate learning methods. I will only briefly list here the keywords that are worth studying and that are legitimate! You will find plenty of information about these on the net.
Legitimate concepts and authors that are often misunderstood at best and dismissed at worst include: mind maps, Mega Memory (never mind Kevin Trudeau's reputation), mnemonic techniques, the peg-list system, the loci method, Mind Manager, ThinkFAST, Tony Buzan, Sebastian Leitner, expanded rehearsal, reactivation theory, SAFMEDS, bright-light therapy, chronotherapy, melatonin, neurogenesis in adulthood, brain growth through training, neural compensation (e.g. in brain damage), and physical exercise as a brain booster. See also:
Discuss it and add more at SuperMemopedia
Apology (March 2005)
I have received mail saying that the passage about American beliefs on the age of the Earth may be considered offensive. It is not my intent to offend anyone. I believe that stating the facts of science resolutely is an obligation of anyone involved in myth-busting. Unlike the far blunter James Randi, whom I admire immensely, I try to use gentler language. If submitted, I am ready to accept a less offensive rewording of the said passage, as long as it plainly expresses my belief that the most rudimentary scientific consensus leaves no place for infinite memory or a young Earth. In addition, the passage must include the most telling data on how poorly basic science is understood by a vast proportion of the population in industrialized nations, of which the US is probably the starkest example. You can leave your comments at SuperMemo Wiki
2 notes
·
View notes
Text
ok so this is mostly for my own records but i need to write this down somewhere at least. about my exam review this sem (don’t worry im still in school, my grades were just really disappointing) :
i reviewed my absolutely garbage exam with my torts prof today. she was equally shocked at how poorly i did, because i always kept up in class and asked good questions and overall showed a solid understanding of tort law. we kind of concluded that the exam was not “real” enough for me to really go through it the way she wanted, but she suggested that based on how i write and speak that i might excel in moot court, negotiation, trial advocacy, or litigation. which is really encouraging because that’s what i want to do! we talked about how i wrote my exam like i was still an english major in grad school, and b/c i was so focused on subtle things i missed the low-hanging fruit. which is fair; you don’t get an MA in english literature and not do this.
so it seems that the bar exam might be a challenge for me (no new news there), but my crap grade in a class i genuinely enjoyed isn’t the end of the world. i just don’t think the way that prof’s exams want me to, which isn’t a real-world skill by a long shot, she admitted.
she asked if i was ok, because i’ve lost almost 10 pounds over break and the last couple days my insomnia has been bad, so i’m sure i look tired. after thinking about it, i realized that the reason this made me want to cry was because she noticed and cared enough to ask me. and she also emphasized that once i get a job, my grades won’t be that important - what will be important is that i’m a good attorney and that i can pass the bar.
my takeaway was that: 1) i’m not “but an idiot law student”, i’m just way better at writing papers/ making arguments than taking exams that have “obvious” answers, and 2) i need to get better insurance so i can get medicated properly. i’ve got at least 3 major issues that i should be taking meds for but i just can’t afford the diagnostics.
#the life and times of dani#long post#personal#i'm only on antidepressants but i also have anxiety insomnia and adhd#i just hate going to doctors and asking for refills on those meds#because i also have chronic pain and i'm afraid ill be labeled a drug seeker#anyway#thats my post for today
2 notes
·
View notes
Text
I just got a perfect score on an annotated bibliography worth 20% of my grade, so I thought this would be the perfect subject for my first studyblr tips post!
Formatting
One can either say very little on this subject, or write hundreds of pages. I’m going for the former.
I personally suggest avoiding auto-citation sites such as EasyBib. Citation generators require so much revising that it’s usually faster to write them yourself.
“Cite this source” links are often also automated. They’re usually better than the sites mentioned above, so you might save some time by copy/pasting them and then checking for errors.
Oddly enough, I’ve had professors that either have preferences beyond style rules, or actually don’t know some of the more obscure rules. Send a draft to them if possible.
Purdue Owl is all right, but not that comprehensive. If your school has access to the full manual, use it.
Pre-RADAR
My first sentence is usually “This is a [primary/secondary/tertiary] [scholarly/popular/trade] [journal article/encyclopedia entry/book/etc.].” Before I get into RADAR, I explain why a source is scholarly/popular/trade, and why it’s primary/secondary/tertiary.
There are some sources which are a bit ambiguous, in which case I would cite more than one justification for your classification. Otherwise, you can basically write the same two sentences for every citation of its kind, as long as it’s always true. Ex: “This is a scholarly source because it was published in a peer-reviewed academic journal.” “This source is secondary, because it analyzes other sources.”
If it is ambiguous, or your classification could reasonably be contested, consider the publisher, intended audience, overall goal, tone, and language (accessible vs. esoteric.)
RADAR
Your prof may have told you to use RADAR to argue for the credibility of your source. I’ll focus on where to look for your assertions.
I recommend going through every letter in order, even when it feels more natural to combine/reorder them. That way, your professor can clearly see that you’ve hit everything.
Rationale: Journal articles, theses: usually stated explicitly after a brief introduction. Books: Introduction. Table of contents. Online encyclopedias: About page. Anything else: probably self-explanatory, but feel free to send me an ask.
Authority: Publisher: Peer-reviewed. Reputation. Affiliations/memberships. Awards. Longevity. Author: Bio. Other books and articles they’ve published on the subject. Teaching credentials. Prominent mentors.
Date: If it’s very recent, emphasize that. If it’s a field with little emerging research, say that. Additionally: Citations in recent articles. (Google Scholar is perfect for this.) Inclusion in more recent editions/anthologies. If the source is old, but the author is considered to be foundational, say that-- possibly with a note that you’ll take particular care to corroborate information you use. If the author is somewhat low-profile, but recent sources corroborate their info, state that.
Accuracy: Logical arguments. Reliable and plentiful citations. Peer-reviewed. Corroboration in other sources.
Relevancy: Often repetitive. “This source is about X, and my research question is about X.”
Summary
Usually self-explanatory. If it’s part of a long work, and you aren’t reading all of it, use the table of contents. If I mention the table of contents in Rationale, I mention that, and explain that I’ll only highlight the sections most relevant to my project.
#studyblr#annotated bibliographies#annotated bibliography#study blr#study tips#study#studying#theology major#international relations major#ir major
52 notes
·
View notes
Text
Kat's Dream - Analysis of a Scene
(Mildish spoilers, mainly vague references to the final case. I use LMJ to refer to the new series as a whole, MC for the game itself, Katrielle and the Millionaires' Conspiracy.)
I want to talk about the opening of MC, because not only is it one of my favorite cutscenes in the game, it serves as a striking contrast to MC at large. I find it an intriguing addition that hints at the unfolding of a larger story, one that, if the title is any indication, could shape up to be the new series' driving force. Buckle up for a long post that often segues into a review of the game as a whole.
For the most part, MC is comical, light-hearted, and laid-back, offering a much more relaxed tone than previous entries in the series. There's nothing wrong with this. The whole "darker always equals better" mindset is, frankly, ridiculous. I appreciate and enjoy when dark elements are well-handled and well-placed in any story, but a comedy should never be deemed inferior to a more serious work on the basis of genre alone. The two have different goals and, often, different approaches to characters and story-lines. That said, MC is very comfortable as a character-based comedy where the majority of the humor stems from the heightening of characters' flaws, clever wordplay and banter, and playing around with stereotypes (with a pinch of physical and scatological humor thrown in there, too). The game draws from a wide variety of comedic genres and it's really fun to see different cases reflect different kinds of humor. "Ghost Busted" delights in Scooby-Doo-esque hijinks, while "Ratman Returns" pokes fun at the current superhero craze and its obsession with franchising.
Of course, MC isn't entirely estranged from its roots. Each case ends with a decidedly emotional resolution, culminating in the final case that, while not entirely original, succeeds in fleshing out a certain character in a genuine and heartfelt manner that is very much in keeping with the spirit of the original Layton games.
I personally loved MC's blend of comedy and emotion, even as I found myself longing for a more coherent over-arching story. The millionaires' conspiracy mentioned in the title does ultimately tie the cases together in a loose fashion, but for the most part these cases can be played in any order and each has its own set-up and resolution, acting as a standalone "episode". I would argue that Mystery Room handled its episodic structure much better than MC, but this has less to do with the structure itself and more to do with the nature of the cases. MC is just very small-scale compared to previous Layton titles and that's a rough adjustment. Yet examining the game as a whole reveals that it is, indeed, setting up an even larger, presumably game-spanning story, one that figures only faintly into MC, but will no doubt continue to grow and take precedence as the series unfolds. This larger story is, of course, related to the main title, Layton's Mystery Journey, and the opening cutscene is our most candid look at what this larger story entails.
We open on an overhead shot of London, shrouded in fog, before cutting to a young girl, Katrielle, racing through the streets in her pajamas. She stops when she catches sight of a man in the distance, Professor Layton. She calls out to him, but the Professor merely touches his hat with an implacable smile and turns. As he walks away, Kat begins to chase after him again, continuing to call after him, asking where he is going. She finally stops, out of breath, as the fog closes in around her. Older Kat suddenly awakens in bed with a gasp, tears in her eyes.
This scene effectively establishes several important things for the player in a manner that allows the player to see for themselves, and to feel, instead of simply being told the necessary information:
Kat is the Professor's daughter.
The Professor left Kat when she was young.
Kat doesn't know the Professor's whereabouts or why he left.
Kat has been deeply affected by her father's disappearance.
While this information has little direct bearing on the main story, it is still essential. After all, the Professor's disappearance is a large part of the reason why Katrielle decides to pursue a career as a private detective in the first place. Her relationship to the Professor shapes her as a character and it also allows her to play a pivotal role in the game's final case. Yet there is no resolution to the questions brought up in this opening dream sequence. In fact, we have even more questions to ponder by the time the game ends. The implications are clear: the mystery of the missing Prof is only beginning.
I would love to see the series delve deeper into this mystery, broadening its scope, storytelling, and character development, while remaining rooted in the character-based comedy established in MC. Honestly, this is one of my favorite forms of story-telling. In fact, MC with its focus on humor and character dynamics while simultaneously offering fleeting, tantalizing hints at a darker, deeper, over-arching story reminds me of the beginning of one of my favorite comic series, Jeff Smith's BONE. Long story short: three cousins are run out of their hometown and find themselves lost in a medieval, fantasy world. While the series begins by focusing almost exclusively on all manner of comic shenanigans involving the three Bone cousins as they adjust to their new surroundings, a larger story begins to unfold in the background, until it finally takes center stage and plunges the characters into the middle of a war with incredibly high stakes. All the while the comic elements and focus on character relationships are kept intact, serving as an amazing foil and complement to the more serious elements. I could see LMJ doing something similar and the idea has me really excited.
Of course, this isn't to say MC can't still be enjoyed on its own. I hope no one thinks I'm implying the game only finds its worth when connected to a larger story. Not at all. The game is enjoyable in and of itself without figuring the "mystery journey" into the equation. Comparisons are inevitable, however, and the fact that MC strives to so fully emulate the gameplay mechanics of its predecessors makes the comparisons even more likely. MC is a lot of fun and sometimes emotionally candid, but the sprawling, rich mysteries of previous titles that tie everything together are sorely missed. There's nothing wrong with MC's structure by itself, but when placed next to its legacy the game feels oddly lacking. That's always a problem when trying to continue an old series in a new direction: changes must be made to keep the series fresh, but those changes will always be under the critical eye of comparison.
MC's lack of a clear over-arching plot is part of the reason why the dream sequence and the greater mystery implied excite me so much. They add a whole new layer to the game, a depth that is rife with potential for future entries. I feel a bit self-conscious saying this, because of course there is marketing on the mind with these tantalizing hints that link MC to the original series without giving us anything substantial. I suppose it's my optimism and respect for the series that leads me to believe the "mystery journey" of the title isn't just a gimmick, but a story worth building up to and exploring. Time will tell.
So, anyway, we've talked about the "what" of the dream scene, but I also want to discuss the "how". How the dream sequence gets its information across. Because there's so many noticeable contrasts from the rest of the game that are worth noting.
The music. This is the only part in MC where we hear Professor Layton's theme. Even in the original series, the theme was used sparingly, usually saved for moments when the Professor was at his best---inventing a contraption to help him escape a dire situation or exposing the true mastermind. It makes sense that the theme would be utilized in a dream centered on the Prof. The version of the theme used in MC is slower, more contemplative and mysterious. The original series was largely in Luke's perspective (the opening letter framing device would support this), so I wonder if the Professor's theme is in part shaped from Luke's perspective of his mentor. If so, this version of the theme could be shaped from Katrielle's perspective. Same Professor, different perspective. Despite Kat's close relation to the Professor, he has become an enigma.
The atmosphere. The London of MC is a warm and inviting place. Even the seedy alleyways of Bowlyn Hill are home to low-lifes who actually harbor hearts of gold. In contrast to UF's focus on political corruption, London in MC is run by a competent and passionate mayor. Most of the cases end not in unremorseful criminals being arrested, but sincere mistakes or confessions that lead to personal growth. This honey-colored optimism has always been present in the PL series, but it seems especially heightened in MC, probably due to the tone decided on from the beginning: the game is a comedy and character's short-comings are treated with both laughter and sympathy. This gold-tinged glow spills over to the setting. The London in Kat's dream, however, is far different. The dream portrays an empty city, one blanketed in thick fog, so thick it swallows Katrielle at the end. The buildings are gray and serve as a claustrophobic framing device. Notice how the road appears to stretch as Katrielle chases after her father. The city itself seems to scheme against her, all the while hosting an indifferent facade. It is an impersonal, desolate city.
Katrielle. In the dream, Kat appears to be around 6-8 years of age, in contrast to her current age in MC, which is twenty-one. The obvious reason for this is that her dream reflects the actual circumstances of her father leaving her. It's fairly safe to assume Kat was a young girl at the time; thus, the dream serves as a dramatic distillation of her memories, sort of a recap boiled down to its emotional essence. I can't help but think, however, that her young age in the dream is also indicative of her vulnerability regarding her father's disappearance and perhaps even her emotional immaturity. I've mentioned in a previous post that one of Kat's most prominent flaws is her childishness. While often played for laughs, this trait could point to something deeper. Kat hasn't completely matured and this connects in some way to her father leaving her behind.
Another interesting contrast is Kat's reaction to the dream and how she treats her father's disappearance when discussing it with others. Kat wakes up in tears after the dream and there's a moment right afterwards where she slowly sits up and gazes forlornly at her lap in the middle of her darkened room in silence. A small, but surprisingly powerful moment. His disappearance has deeply hurt her, yet when talking about his disappearance to Lucy, Sherl, and Ernest on different occasions she displays a decidedly nonchalant attitude, denying she is a "daddy's girl" and joking about the matter, calling the Professor a "silly old fool", even suggesting he is enjoying himself wherever he is and has simply lost track of the time. All of this points to Kat concealing her darker emotions regarding the Professor, in favor of making light of the situation and seeing it with an optimistic bent. I think this says loads about her character, but that's a post for another time.
Finally, the dream scene is bereft of any comedic elements. Even the final case in the game manages to slip in a bit of humor, but the opening is solemn and gray. Let me rephrase this: the beginning sequence of the most light-hearted and comical entry in the PL series is perhaps its most serious, troubling, and darkest moment. Yes, it's only a dream. But the implications...The Professor, the paragon of gentlemanly conduct and solid rock for his friends and family, is shown silent, faceless, turning his back on not just someone in need, but his own daughter. The one who proclaimed that every puzzle has an answer has now become a seemingly unsolvable puzzle himself. Of course, there is more to the story, but what a way to open a game that delights in dog puns, collecting outfits, and tidy resolutions. Such an intriguing contrast.
There's a lot more I want to say about MC, but for now I'll close by saying I'm cautiously excited for the series' future and how this contrast between comedy and drama will play out. My hope is that LMJ will ultimately carve out its own unique identity while making insightful and meaningful connections to the previous series, instead of merely piggy-backing on its predecessor via indulgent cameos and throwaway references (I’d like to clarify there is nothing inherently wrong with cameos or references; they only become a problem when they are used in place of genuine story-telling and character development). MC is a flawed game, and the fact that a scattered collection of hints related to a larger story is one of its most interesting elements underscores the game's weaknesses while also pointing to many future possibilities.
So. Do we really want LMJ or do we just want the original series but new and different, yet somehow still the same? Does MC ultimately succeed in being original? Questions for another post. Personally, my own feelings are mixed. I genuinely loved the game and its new cast of characters while also recognizing its many flaws and shortcomings. For now, share your thoughts if you'd like. What did you think of MC? Agree or disagree with anything I've said here? Optimistic or cynical about the series' future? Another perspective on the dream scene? Let's discuss.
127 notes
·
View notes
Text
The great glacial meltdown
A titanic piece of Greenland's ice cap, estimated at about 110 square kilometers, has broken off and begun drifting into the far northeastern Arctic, signalling the grave risks that are bound to follow now that the glacial disintegration has begun. The section that broke away sits at the end of the Northeast Greenland Ice Stream. At 42.3 square miles (110 square kilometers), it is many times the size of Central Park in New York. It split away from a fjord called Nioghalvfjerdsfjorden, which is roughly 50 miles (80 kilometers) long and 12 miles (20 kilometers) wide, as reported by the National Geological Survey of Denmark and Greenland. Despite being one of the coldest places on the Earth's surface, the region has warmed by a remarkable 3 degrees Celsius since 1980, according to Dr. Jenny Turton, a polar researcher at Friedrich-Alexander University in Germany, while the European continent recorded its highest temperatures ever during the summers of 2019 and 2020.
The past few months have brought a stream of headlines about glacial melting, particularly in Greenland, and ice shelf collapse. According to a report published in the journal Nature Communications Earth and Environment, Greenland's ice sheet has shrunk so much that even if global warming were to stop right now, the ice sheet would keep shrinking. The same publication, citing satellite data, noted that the Greenland ice sheet lost a record amount of ice in 2019, equivalent to a million tons every minute across the year. Another paper, published in The Cryosphere, reported that this staggering ice loss was not caused by warm temperatures alone; unusual, non-seasonal atmospheric circulation patterns were a major reason the ice sheet shed its mass so quickly. Because the climate models that project the future melting of the Greenland ice sheet do not account for these shifting atmospheric patterns, there is a real possibility that they underestimate the melting by as much as half.
According to a report published in September 2020, the last fully intact ice shelf in the Canadian Arctic, the Milne Ice Shelf, which is larger than Manhattan, collapsed, shedding more than 40% of its area in just two days at the end of July. Around the same time, alarmed researchers observed that a section of a Mont Blanc glacier, about the size of Milan's cathedral, was at risk of collapse, and residents of Italy's Aosta valley were ordered to evacuate their homes. The worst was yet to come. A British Antarctic Survey team, together with a group from the USA, mapped cavities measuring up to half the size of the Grand Canyon that are allowing warm ocean water to erode the immense Thwaites glacier in Antarctica, accelerating the rise of sea levels across the world. According to the International Thwaites Glacier Collaboration, the glacier is larger than England, Wales, and Northern Ireland combined, and if it were to collapse completely, global sea levels would rise by 65 cm (25 in). That is not the end of the story. The glacier acts as a scaffold, a buffer between the warming ocean and other ice sheets, so its collapse is sure to drag neighboring glaciers in western Antarctica down with it. That would usher in a cataclysmic scenario in which sea levels rise by a stunning 10 feet, permanently sinking low-lying coastal areas that include parts of Miami, New York City, and the Netherlands.
Global warming, as the name itself conveys, marches on unabated. While the Paris declaration on climate change pledged to limit global warming to 1.5℃ during this century, a report by the World Meteorological Organization warns that this limit could be breached as early as 2024. According to Prof Anders Levermann from the Potsdam Institute for Climate Impact Research in Germany, it would be prudent to expect sea levels to rise by more than five meters, even if the targets set in Paris are met in full. It is therefore the obligation of every individual to take responsibility for their own actions and to do everything within their power, rather than waiting for others to act. Every country has published guidelines which, if followed, will lower the chance of disaster striking us early. The potentially dangerous Thwaites glacier alone is larger than England, Wales, and Northern Ireland combined, and if the inevitable occurs, there is a high likelihood of a significant part of England and Wales being swallowed by the Atlantic.
Reports in August 2020 confirmed the same pattern: the Milne Ice Shelf's two-day loss of more than 40% of its area, the threatened Mont Blanc glacier above the evacuated Aosta valley, and the Grand Canyon-scale cavities through which warm ocean water is eroding Thwaites, a glacier whose complete collapse would by itself raise global sea levels by 65 cm (25 in).

There is no sign that sea levels will stop rising. The glacier acts as a saviour, a buffer between the warming ocean and other glaciers, and its impending collapse has the power to drag neighboring ice sheets in western Antarctica down with it. In the most dire scenario, sea levels rise by nearly 10 feet, permanently submerging low-lying coastal areas including parts of Miami, New York City, and the Netherlands; like the Titanic, once viewed as unsinkable, the land would ironically be laid to rest on the same ocean floor. Global warming is now a truly worldwide process, proceeding unabated. The Paris declaration aims to limit warming to 1.5℃ by the end of this century, yet worryingly, a report by the World Meteorological Organization warns this limit may be exceeded as early as 2024. According to Prof Anders Levermann from the Potsdam Institute for Climate Impact Research in Germany, there is a high prospect of sea levels rising by more than five meters, even if the goals of the Paris Climate Agreement are met. Over the years, each government has realized the extent of the destruction that climate change would inflict on its country and is taking every possible measure to ward off even the slightest danger afflicting the nation. The aggregate exer
0 notes