Understanding fallacies and cognitive biases in order to consciously seek out rationality.
Freestyle Post: The Fundamental Attribution Error
For this last freestyle post, I am going to discuss one of my favorite cognitive biases: the Fundamental Attribution Error (FAE). The FAE occurs when we attribute someone's behavior to their character or some internal attribute while ignoring the external circumstances that could have caused it.
I found an article on psychologytoday.com by Christopher Dwyer, Ph.D. He discusses several cognitive biases, including the Fundamental Attribution Error. One of the most alarming things about the FAE is that we look for contextual excuses for our own failures, but generally blame other people or their characteristics for their failures. Dr. Dwyer uses a great example to illustrate the FAE: Imagine you are driving behind another car. The other driver is swerving a bit and unpredictably speeding up and slowing down. You decide to overtake them (so as to no longer be stuck behind such a dangerous driver), and as you look over, you see a woman behind the wheel. The Fundamental Attribution Error kicks in when you judge that her driving is poor because she is a woman (also tying into an unfounded stereotype). But what you probably don’t know is that the other driver has three children yelling and goofing around in the backseat while she’s trying to get one to soccer, one to dance, and the other to a piano lesson. She’s had a particularly tough day, and now she’s running late with all of the kids because she couldn’t leave work at the normal time. If we were that driver, we’d judge ourselves as driving poorly because of these reasons, not because of who we are.
I think it is highly important to judge others as accurately as possible, or at least to try, no matter how elusive that accuracy is. More importantly, what I want readers to take away from this is not to jump to conclusions about others. It is (obviously) easy to observe someone's behavior, but explaining it is not, and behavior often has less to do with internal character than we realize. This concept is relevant to each one of us. In our daily lives, we constantly interact with others, meet new people, and judge whether to befriend them, associate with them, or avoid them. The FAE could cause us to judge someone as rude or unfriendly when in fact they may be going through a stressful period that is weighing on them. The external circumstances that influence behavior are often hidden from observers, which is all the more reason to be cautious about jumping to conclusions. When observing others, we must take into account nearly limitless factors: the context, the setting, the time of day, and so on. What is important to remember is that we never have all the information we need; much of what influences behavior is hidden from us. I think it is fair to say we should simply be cautious about evaluating others, because we never know the whole story.

I also think it is important to recognize that external circumstances often have a bigger influence on our behavior than our internal character does--by necessity. Think of your own daily life and behavior, and how often you adjust how you present yourself according to the setting, the context, or the presence of others. Or think about the last time you had a bad day at work. Maybe you did not provide the friendliest customer service, or you yelled at someone on a business call. Maybe it was because you were going through a rough time, or your boss was pressuring you to get your job done quickly.
Our external world has a significant effect on all of us, and we should be more aware of that, and be more understanding/tolerant of others.
We so often draw conclusions that are inaccurate or incomplete because our brains are hard-wired for efficiency. It is a fact of human nature. I have had a lot of fun writing this post and this blog altogether. I think we all need to be more aware of our own cognitive biases and those of others. For our safety and well-being, it is important for all of us to explore the real-world implications of cognitive biases and the details associated with them. If we could all better understand our own thought processes, we could better avoid conflicts, disasters, and our own internal states of confusion and helplessness.
Freestyle post: The Dunning-Kruger Effect
Have you ever noticed that the most ignorant or inept people are often the most confident? This is a psychological phenomenon known as the Dunning-Kruger Effect. I found an article on forbes.com addressing the Dunning-Kruger Effect, written by Mark Murphy.
The term was coined in 1999 by then-Cornell psychologists David Dunning and Justin Kruger. The Dunning-Kruger Effect is a cognitive bias whereby people who are incompetent at something are unable to recognize their own incompetence. And not only do they fail to recognize their incompetence, they’re also likely to feel confident that they actually are competent. The 1999 paper that launched the Dunning-Kruger Effect was called “Unskilled and unaware of it: how difficulties in recognizing one's own incompetence lead to inflated self-assessments.” Across four studies, Professor Dunning and his team administered tests of humor, grammar, and logic, and they found that participants scoring in the bottom quartile grossly overestimated their test performance and ability.
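As an illustration only (this is my own toy model, not the study's method), the "bottom quartile grossly overestimates" pattern falls out of a very simple simulation: if each person's self-estimate is a noisy blend of their true skill percentile and the population average, the least skilled end up rating themselves far above where they actually stand. All the parameters below are arbitrary.

```python
import random

random.seed(42)  # reproducible toy data

def simulate(n=10_000, anchor=50.0, weight=0.6, noise_sd=10.0):
    """Hypothetical model: self-estimate is a weighted blend of one's
    true skill percentile and the population average (anchor), plus noise."""
    people = []
    for _ in range(n):
        true_pct = random.uniform(0, 100)                # actual skill percentile
        est = weight * true_pct + (1 - weight) * anchor  # blend toward the average
        est += random.gauss(0, noise_sd)                 # self-assessment noise
        people.append((true_pct, max(0.0, min(100.0, est))))
    return people

people = simulate()
bottom = [(t, e) for t, e in people if t < 25]           # bottom quartile by true skill
avg_true = sum(t for t, _ in bottom) / len(bottom)
avg_est = sum(e for _, e in bottom) / len(bottom)
print(f"bottom quartile: true skill ~{avg_true:.0f}th percentile, "
      f"self-estimate ~{avg_est:.0f}th percentile")
```

With these made-up parameters, the bottom quartile's average self-estimate lands well above its average true percentile, mirroring the paper's qualitative finding. The real studies, of course, measured actual test scores rather than simulating them.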
Mark Murphy points out that you can find examples of the Dunning-Kruger Effect everywhere. One study of high-tech firms discovered that 32-42% of software engineers rated their skills as being in the top 5% of their companies. A nationwide survey found that 21% of Americans believe it’s ‘very likely’ or ‘fairly likely’ that they’ll become millionaires within the next 10 years. Drivers consistently rate themselves above average. Medical technicians overestimate their knowledge of real-world lab procedures. In a classic study of faculty at the University of Nebraska, 68% rated themselves in the top 25% for teaching ability, and more than 90% rated themselves above average.

Do these numbers scare you at all? They sure scare me. This phenomenon is so prevalent that I am willing to say it is ingrained in human nature. So, why do the most ignorant or inept have the most confidence? How many dangerous outcomes could have been caused by this cognitive bias? These are the questions that interest me, and I think the implications of this bias's prevalence should concern everyone. If the most inept see themselves as the most competent, they are not likely to recognize their own faults and improve, and they are likely to make dangerous mistakes. Think, for example, of tractor-trailer truck drivers. Driving a tractor-trailer can be dangerous, and drivers who lack the skills and knowledge to do the job well probably also lack the ability to recognize how bad their driving is. Tractor-trailer drivers must constantly stay aware of changing traffic laws and remain hyper-alert while driving such a large vehicle, one that can cause massive damage. If they are overconfident in their driving ability, they are unlikely to keep up with changing traffic laws, and may even pay less attention to the road and to other drivers.
This carelessness/ineptitude could be the cause of some serious traffic accidents that hurt a lot of people. This tractor-trailer truck example is just one of countless possible examples of how the Dunning-Kruger Effect can be dangerous. People in countless positions around the world are susceptible to this effect, and it is especially concerning for people who hold positions that impact the safety of others.
I could go on and on about the Dunning-Kruger Effect and how it impacts the world, but I want to emphasize one major point: the Dunning-Kruger Effect is outright dangerous, and it ties into the overall danger of cognitive biases. We humans are all prone to cognitive bias. People in all kinds of positions make daily decisions and judgments that are irrational and harmful. Not only are we generally irrational, but the most irrational judge themselves the most rational and competent. Simply put, the less intelligent or skilled one is, the more susceptible one is to cognitive bias. The ability to understand and be aware of one's own thought processes is known as metacognition, and the Dunning-Kruger Effect illustrates that it takes high-level cognition to recognize one's own thought processes. So, now my question is: would a widespread movement to inform people of cognitive biases and change their thought processes work? What I have learned from writing this post is that cognitive biases are absolutely an issue for all humans, but even more of an issue for the most ignorant and inept (not particularly surprising). I think it is important that the public make a collective effort to better understand how cognitive biases harm humanity, and to minimize their effect. Unfortunately, however, I am beginning to doubt how effective such an effort would be--if the people most affected by cognitive biases are the least likely to make an effort to educate themselves about them. I do not like this pessimistic view, so if any of you have suggestions for how we could combat the issue of cognitive bias, I would be happy if you left a comment.
Primary Source Post: Confirmation bias in a medical malpractice case.
Carolina Carcerano was a 74-year-old woman with a back issue. She died due to a medical error caused by confirmation bias. She arrived at Tufts Medical Center for a brief procedure to relieve lingering symptoms resulting from a back injury. In the operating room, the neurosurgeon requested a special dye to test the location of tubing that had been threaded into her spine. But the pharmacy didn’t have it and provided a different one. The surgeon checked the label, hospital executives said, and then injected it, twice. After Carcerano awoke with severe pain and seizures, caregivers zeroed in on the substitute dye, whose label clearly warned against use inside the spine, according to a report from regulators who investigated the error. “A mistake was made,” the neurosurgeon, Dr. Steven Hwang, immediately told Carcerano’s two sons as they watched over their mother. “We gave her the wrong dye.”
The doctor thought he had the correct dye despite reading the label twice. Confirmation bias is the reason: he saw the dye he expected his team to give him, even though the label clearly said otherwise.
This relates to my research question (What is the source of cognitive bias?) because it is a specific instance of how influential, and how harmful, cognitive bias can be.
This is a primary source because it is an article reporting a specific event in which cognitive bias influenced someone.
What this article taught me is that cognitive bias can affect not only thought processes and reasoning, but also simple sensory perception--which is very unnerving to think about. The doctor looked at the dye label twice and somehow still saw the dye he was expecting, even though it read something else entirely. I would now like to investigate further how much impact cognitive bias has on sensory perception, and what other real-world problems are associated with that. Imagine how many problems could result from simply seeing what you expect rather than what is actually there. Perhaps you expect to see someone smile at a comment you made, but instead their facial expression indicates they were unenthused or even offended. In a case like this, you can imagine how problematic it would be if your cognitive or perceptual bias resulted in you seeing social or facial cues that were not actually present. There are probably more communication and transparency issues resulting from cognitive bias than we can imagine. The real-world scenarios that could involve such issues are nearly infinite: heated arguments between spouses, international conflicts, group projects, and so on. In my opinion, more research should be done to investigate how cognitive and perceptual biases affect different real-world scenarios. The more we understand how different biases affect us, the better we can minimize their effect on us and work together more cohesively in a vast number of areas.
Academic Article Post: Cognitive bias modification reduces social anxiety symptoms in socially anxious adolescents with mild intellectual disabilities: A randomized controlled trial.
My academic article post is on a research experiment called "Cognitive bias modification reduces social anxiety symptoms in socially anxious adolescents with mild intellectual disabilities: A randomized controlled trial," conducted by Anke M. Klein et al. and published in the "Journal of Autism and Developmental Disorders" on April 21, 2018. Their study included a sample of 740 adolescents recruited from seven secondary schools for students with mild to borderline intellectual disabilities in the Netherlands. These schools serve students with MID with IQ scores varying between 60 and 85 in combination with more than 3 years of learning deficits (Van Rijswijk and Kool 2002). The goal of the study was to find a way to clinically modify negative interpretation bias--in this case, the tendency to interpret ambiguous information in a way that reflects poorly on one's social performance. The audience is other clinical psychologists or any academic with an interest in clinical psychology. The thesis is that the treatment did have a significant effect on the subjects' social anxiety symptoms.
Each subject went through sessions of ambiguous social scenarios. Each session consisted of 40 ambiguous scenarios, and each scenario consisted of three short sentences, with a word in the last sentence missing. The researchers created two versions of the training task: a positive training and a neutral control training. In the positive training, all ambiguous scenarios were related to social situations, and the word fragment made the story end positively. In the control training, all ambiguous scenarios were non-emotional, and the word fragment made the story end in a neutral, non-emotional way that was irrelevant to anxiety. A set of 200 scenarios was used; across training and within each session, the scenarios were randomized for all adolescents. First, participants were asked to read each scenario and imagine themselves as the central character. Second, after reading a scenario, the participant pressed the space bar and the final word appeared on the screen with one letter missing; the participant's task was to fill in this missing letter, after which they received feedback by reading the correct response. Third, the participant answered 'yes' or 'no' to a question that measured comprehension of the story and reinforced either the positive or the neutral interpretation. Directly following the answer, participants received feedback on their response: if the answer was correct, the participant saw the feedback 'indeed' followed by the correct interpretation; if the answer was wrong, he or she simply received the correct interpretation.
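To make the procedure concrete, here is a minimal Python sketch of one training trial in the style described above. The scenario text, function names, and feedback strings are my own hypothetical stand-ins, not material from the paper; the actual study used 200 Dutch scenarios presented in randomized order.

```python
import random

# Hypothetical example scenario in the described format: three short
# sentences, a positive ending, and one missing letter in the final word.
SCENARIOS = [
    {
        "text": ("You give a short talk in class. Afterwards, a classmate "
                 "walks up to you. She tells you your talk was interes_ing."),
        "missing_letter": "t",
        "question": "Did your classmate enjoy your talk?",  # comprehension check
        "reinforced_answer": "yes",  # 'yes' reinforces the positive interpretation
    },
]

def run_trial(scenario, get_letter, get_answer):
    """One trial: read the scenario, fill in the missing letter, then answer
    a comprehension question that reinforces the intended interpretation."""
    print(scenario["text"])
    letter_ok = get_letter() == scenario["missing_letter"]
    # Feedback: participants always end up seeing the correct completion.
    print("Correct." if letter_ok
          else f"The missing letter was '{scenario['missing_letter']}'.")
    if get_answer(scenario["question"]) == scenario["reinforced_answer"]:
        print("Indeed.")  # confirming feedback, as in the paper
    else:
        print(f"The correct answer was '{scenario['reinforced_answer']}'.")
    return letter_ok

def run_session(scenarios, get_letter, get_answer):
    scenarios = list(scenarios)
    random.shuffle(scenarios)  # scenario order was randomized per session
    return [run_trial(s, get_letter, get_answer) for s in scenarios]
```

For example, `run_session(SCENARIOS, lambda: "t", lambda q: "yes")` walks one simulated participant through the single sample scenario, supplying the correct letter and the reinforced answer.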
In the positive training group, social anxiety symptoms did not decrease directly following treatment, but they did decrease significantly in the period from post-test to 10 weeks post-training. This significant reduction was absent in the control-training group--meaning the treatment did have a significant effect on reducing the subjects' social anxiety symptoms.
One of the sources they used was Beck, A. T., Emery, G., & Greenberg, R. L. (1985). Anxiety disorders and phobias: A cognitive perspective. New York: Basic Books. They used it to briefly explain the concepts of cognitive therapy. The book's author, Aaron T. Beck, is widely acclaimed as the "father of cognitive therapy." His book lends strong credibility and explains cognitive therapy in a way that is easy for laypersons to understand, summarizing cognitive therapy practice as well as the disorders it is used to treat.
The book "Anxiety Disorders and Phobias: A Cognitive Perspective" relates to the research article by giving readers a foundation in cognitive therapy, helping them better understand the specific cognitive therapy used in the study--cognitive bias modification.
What the article tells me about my research question (What is the source of cognitive bias?) is that it has an attainable answer. That answer may be extremely elusive, even nearly impossible to state concisely, but it is attainable, no matter how complex and sophisticated. Human beings are extremely complex, and psychological disorders are extremely hard to treat because a treatment that works for one individual may not work for another with the same disorder. The treatment effects in this study are therefore extremely encouraging: the researchers found a single treatment that worked for an entire group of adolescents with mild intellectual disabilities. If these researchers managed to modify the cognitive biases of developing adolescents with intellectual disabilities, then I think it is fair to infer that we can address some of the bigger issues involving how cognitive bias affects society at large, such as wrongful convictions or even international relations that require sensitivity to avoid conflict. Cognitive bias has a much bigger impact on the world than is apparent. The findings from this article are important because they show that we can change thought inclinations and, thus, change behavior. With our increasing understanding of how modifying cognitive bias can improve mental disorders, we can speculate that other types of cognitive treatments could benefit other issues, such as wrongful convictions and conflict resolution.
Book Post: Convicting the Innocent. Understanding how cognitive bias results in convicting the innocent.
My book is called “Convicting the Innocent: Where Criminal Prosecutions Go Wrong” by Brandon Garrett. It addresses common errors within our criminal justice system and how to fix them. The author makes the point that we do not and cannot know how many innocent people are languishing in our prisons. Most exonerees fight for years to obtain access to DNA testing and exoneration. The word exoneration refers to the official decision to reverse a conviction based on new evidence of innocence. The author also points out that one of the most haunting features of these exonerations is that many were discovered by chance. Most convicts who seek postconviction DNA testing cannot get it; in fact--unfairly, in my opinion--some jurisdictions even deny access to DNA testing that could prove innocence.

One of the main points the author explains early in the book is that most police, prosecutors, forensic analysts, judges, and jurors do act in good faith. They don't mean to convict innocent people, but most of them suffer from an everyday phenomenon called cognitive bias, meaning they unconsciously ignore evidence contradicting prior knowledge, or perhaps they are motivated to pursue only the guilty. The author suggests that rather than blaming individuals for their mistakes in these cases, reform should target systemic problems. We must safeguard the accuracy of crucial pieces of evidence, including eyewitness identifications, confessions, forensics, and informant testimony. The purpose of the book is to better understand why criminal prosecutions go wrong, and how we can avoid convicting the innocent.
As previously mentioned, one of the most significant problems with criminal prosecution is that it is often influenced by cognitive bias. One would think that forensic science is one of the most objective measures we could utilize in criminal prosecution; however, as the author points out, forensic science is still very susceptible to cognitive bias. Forensic analysts may be susceptible based on their role working for law enforcement and the information they are given about a case. Scientists typically design experiments to be blind, so that not everyone involved knows certain aspects of the experiment and can be influenced by bias. In contrast, forensic analysts do not do their work blind. They receive information about the crimes being investigated that may have nothing to do with the forensic analysis being performed—such as the fact that a suspect confessed, or the suspect's prior criminal record. As the U.S. Supreme Court put it, a forensic scientist may not be "neutral" because an analyst responding to a request from a law official "may feel pressure—or even have an incentive—to alter the evidence in a manner favorable to the prosecution."
To sum up the effect of cognitive bias on criminal prosecution in the last section of his book, the author mentions that modern psychology has shown in many ways, that our beliefs, hopes, and desires influence what we perceive and how we reason and behave--in other words, we are influenced by cognitive bias. Cognitive bias can occur where people tend, for example, to see themselves in a positive light. In criminal cases, for example, police may see themselves doing justice by only investigating the guilty. The result can be "tunnel vision" or confirmation bias.
In my opinion, it is pathetic and outright irresponsible of our criminal justice system to make it difficult to obtain postconviction DNA testing, and more irresponsible still of individual jurisdictions to forbid it altogether. There are innocent people suffering greatly in our prisons--a place they do not belong. Even those lucky enough to be exonerated never get the lost time back, or any form of compensation that makes up for the time lost and the suffering they went through. It is an absolute failure not only of our criminal justice system, but of our society, not to do more to address the issue of convicting the innocent. I understand there is a lot of ignorance about this issue; however, ignorance is not an excuse for social injustices occurring on a grand scale. We, as a society, have a responsibility to keep ourselves informed about not only societal, but global issues that cause great human suffering. Now that you are more informed about the issue of wrongful convictions, I hope you, as a U.S. citizen, pay more attention to it, look for candidates for legislative positions who want to address wrongful convictions and criminal justice reform, and perhaps vote for them. "Having heard all of this you may choose to look the other way, but you can never again say that you did not know." William Wilberforce (1759-1833)
I know we dove deep into the issue of wrongful criminal prosecutions, but what I am trying to highlight is how influential cognitive bias is on our society at large. Cognitive bias is ingrained in human nature; it is a part of us. That is all the more reason we should educate ourselves about it, understand how it influences the world in a harmful way, and try to learn how to avoid it.
Op-Ed Assignment
As many of you know, bias can affect even those who report the news. Those who are supposed to provide us with objective information still let their biases slip in, even if they are unaware of it. I found an article by David Leonhardt called “The Six Forms of Media Bias,” published by the New York Times.
David himself is obviously a fan of the media because he is a member of the media. However, he believes the media still deserves criticism. In this article, he describes what he believes to be forms of media bias.
One bias he notes is Affluent Bias--meaning the media is biased toward the views of people who are wealthy. David believes this is because national journalists, those who set the agenda, tend to spend more time around wealthy people, and tend themselves to be wealthier than most Americans. As an example, David notes that at a 2008 Democratic primary debate, then-ABC News anchor Charlie Gibson suggested that a middle-class family in New Hampshire might make $200,000 a year. The audience laughed.
Another bias he suggests is Bias for the New: journalists often mistake newness for importance. He suggests it has to do with the product’s name: “News.” Too often, the media emphasizes trivial stories, like political candidates taking swipes at each other, over more important ones, like a candidate’s tax policy.
He also notes Centrist Bias--the assumption that the moderate position between two extremes must be the correct one, regardless of the evidence on either side.
Clearly, members of the media can suffer from biases themselves, even when they are supposed to be objective. It is dangerous, and perhaps even irresponsible, for members of the media to be biased--even if they are unaware of the bias. Of course, we are all susceptible to our own biases; it is wired into our brains. Still, national reporters should be held to a higher standard than the average person. When reporting to a massive number of people, reporters should be more careful about letting their own biases affect their coverage. Perhaps all reporters should be continually educated on how biases can affect their coverage. They will never fully eliminate their own biases, but with effort they can minimize them.
The next article suggests the contrary to my opinion--that education on bias cannot change how people think. The article “A Better Solution for Starbucks” is written by Phillip Goff and published by the New York Times.
Starbucks has made an effort to eliminate racial discrimination in its stores by educating its employees on bias: employees went through a 4-hour training session on the subject. Phillip argues that educating ourselves on bias is an insufficient solution for discrimination.
Phillip suggests non-police options for people worried about suspicious behavior--for example, restaurants could post the phone numbers of social workers and others who are trained to de-escalate conflicts.
Phillip brings up a really interesting point. Bias training like that conducted by Starbucks rests on the idea that racism lives in a person’s heart and mind, and that eliminating it is internal work. Simply educating people on implicit bias--a bias I have previously posted about--is not enough to prevent people from being prejudiced. Phillip points out that most research on prejudice by psychologists suggests that 90 percent of behavior is driven by how we react to situations, not by attitudes. Ultimately, Phillip believes we should not educate people on bias, but rather reshape situations so that fear or dislike of people does not produce an armed response.
Phillip makes an excellent point--biases are powerful. However, I disagree with his overall stance. We are still learning about the neurological and evolutionary underpinnings of prejudice, and there certainly can be more research done on how to avoid cognitive biases. I agree with Phillip’s statement that training people to react differently to situations can have a big impact. However, I believe such training will not help unless we also teach people to change their beliefs, especially beliefs they may be unaware of.
Documentary Post: Is your brain hard-wired to be prejudiced?
There are neurological reasons we are all at least somewhat prejudiced, even if we firmly believe being prejudiced is immoral. The documentary I watched is called “Is Your Brain Unprejudiced?” Its point of emphasis is that we can be prejudiced even if we do not believe there are differences between races that deserve different treatment. To put this in perspective, the punchline of the lecture, according to the lecturer herself, is “You might not be racist, but your brain likely is.” I am sure this idea might make you uncomfortable. However, keep in mind that it is not our fault that we have cognitive biases, but to ignore them is to accept a lifetime of confusion. Recall that one of the objectives of this blog is to become aware of our unconscious inferences so that we can consciously override their impact and become more rational.
Before I get into the details of the documentary, I want to remind you of the dangerous implications of our cognitive biases--especially those related to prejudice. People often belong to multiple sociocultural groups at once; therefore, our assumptions about specific groups often lead to inaccurate generalizations about individuals. These assumptions and judgments can be dangerous; our judgment of people determines whom we befriend, associate with, and avoid. It is of paramount importance that our judgments of people be as accurate as possible, so that we do not befriend or avoid someone for the wrong reasons.
Now, into the documentary. A heavy focus of the documentary is the neurological underpinnings of bias. Let’s start with implicit versus explicit biases. Explicit biases are attitudes toward groups of people that you consciously acknowledge. For example, an elderly person who holds the belief that millennials spend too much time on social media and therefore have short attention spans has an explicit bias. Implicit biases are unconscious cognitive processes that we are unaware of. For example, if you pass a man in a dark alleyway who reaches into his coat pocket and you flinch, whereas a woman doing the same thing elicits no reaction, then you have an implicit bias that men are more dangerous than women.

Implicit biases reveal themselves when we are in fight-or-flight mode--our body’s automatic response to a stressful situation. Biases can hijack our rational minds. The amygdala--the part of the brain associated with threat detection--is activated when we are exposed to potentially threatening, or sometimes even just unfamiliar, stimuli, depending on personal experience, disposition, and so on. When this happens, our cognitive, analytical brain is essentially hijacked, overwhelmed by its emotional response. We cannot think rationally when under the control of our emotions. Emotional experiences have an intense impact on us; that is why they can create implicit biases after only one exposure to a certain stimulus or event, persist, and be difficult to eliminate.

Our brains have evolved to take many shortcuts. We make automatic, unconscious inferences about people who are not like us. Keep in mind, however, that our brains do not look for these shortcuts only in potentially dangerous situations. Our brain can never fully process all available stimuli, so it must search efficiently for shortcuts to interpret our senses. Unconscious inferences impact us more in potentially threatening situations, but they affect us all the time.
In evaluating this documentary, I did not find many signs of bias. If anything, it may have tried to sway us to trust its conclusions based on authority (ethos). Its thesis, essentially, was that our brains are racist even if we do not morally believe in racism. To be honest, this makes me uncomfortable. I think it would have been great if the filmmakers had discussed complex individual differences in the neurological wiring associated with racism. That would further open up the question: ”Is everyone at least a little bit racist, from an implicit, neurological perspective?” I would argue there are significant differences across individuals in their degree of implicit bias and prejudice. The documentary definitely tried to sway us toward the view that we are all neurologically prejudiced. That may be revealed by its decision not to question individual differences--or perhaps the filmmakers simply did not think of it. Regardless, that is definitely something they should address in the future; had they not been biased toward proving their point, they might have thought to address individual differences.
Ultimately, what I learned from this documentary is that our responses to stimuli, especially potentially threatening stimuli, stem from a need to efficiently categorize people in case we need to escape. This documentary opens up a lot of doors. It is clear that we have prejudice hard-wired into our brain's neurons (though, in my opinion, the degree varies across individuals). Now the question is: how do we counteract something that has such deep physiological roots? I think the question will be difficult to answer, but the reward of finding the answer will be worth the struggle. As soon as we can reverse engineer our brain's emotional responsiveness, we can get the most out of our cognition and judge people more accurately.
Text
Library Meeting
The consultation with the research librarian went well. Unfortunately, the Psychology research librarian could not meet with me, so I met with a librarian who does not specialize in psychological research. However, she gave me a lot of valuable advice. My goal was to start by finding databases that specialize in background information, so I could give an overview for those of you who are not familiar with cognitive biases. Most of the databases I found only covered specific cognitive biases, and the readings were very advanced. She introduced me to Credo, a database that specializes in background information. Cognitive bias is a broad, difficult topic, so background information is important for those of you who are unfamiliar with it. Once exposed to a general view of cognitive biases, you can start to dive into the specific biases that have the most relevance to your daily life.
She also gave me some advice on finding primary sources on cognitive biases. One of the best databases for scientific information is ScienceDirect. It holds a lot of scientific and medical research, so it may have plenty of information on the neuroscience of why it is so difficult for us to break the neural connections in our brains, which is part of why we seldom change our beliefs.
She also advised me to branch out in my thinking on the subject as much as possible, to ask more questions. For example, instead of just asking what the source of cognitive biases is, I might consider why cognitive biases are necessary for decision making. It may be the same question asked a different way, but ultimately it leads you to different answers and more connections. Connecting different ideas is important. Being stuck on the same question can lead to dead ends and stagnation. Sometimes the research problem is not that you cannot find the right answers; sometimes you are not asking the right questions. Because cognitive bias is a broad topic, there are many different research routes I can take. At the beginning of my research, I was narrowly focused on the neuroscience of cognitive biases. Now I am much more interested in the external factors that hinder our ability to be rational. For example, I am now interested in how advancements in technology are making us less rational. One might think technology would only make us more rational and sophisticated, but I am confident I can find a lot of evidence to the contrary. Overall, the consultation went well. I am now thinking about my research in an entirely new light, and I am equipped with the tools to find good information.
Text
Cognitive Biases
Greetings, rationality seekers. My name is Joe. I'm a Psychology major, and I have a big interest in how our inherent cognitive processes limit our capacity to accurately assess the world. I love debate and existential thinking, so it is important to me to become the most objective, rational person I can be. If you would like to know more about me, see my about me page. In this blog I am going to discuss cognitive biases. A cognitive bias is a mistake in reasoning, evaluating, remembering, or processing information. Essentially, it is a subconscious, subjective form of judgment that affects the decisions you make. This subjectivity results from our own experiences of the world. We all have reasons for perceiving the world the way we do, and we want to hold onto our beliefs regardless of contradicting information.
This blog is meant to help you understand cognitive biases, apply strategies to consciously minimize their effect, and, in turn, make better decisions. It is impossible to eliminate their effect entirely; they are ingrained deeply in the neural patterns of your brain. To help minimize their effect, this blog is going to emphasize the source of cognitive biases. Why do you hold on so strongly to your beliefs? What makes you miss pieces of information that should be obvious? Why do you so often perceive what you expect? This blog is going to look at the physiological and psychological reasons for such mistakes. Once you understand the underlying reasons for your irrationality, you can catch yourself being biased in the moment and approach problems with minimal distortion.
According to www.psychologytoday.com, our perceptions are "easily influenced," and that influence has potentially "dangerous" implications. Confirmation bias is when you interpret information in a way that confirms your preexisting beliefs, or simply ignore contradicting information. For example, the parents of a teenager who becomes addicted to drugs might ignore his behavioral and mood changes because he keeps his grades up. Motivation has a dangerous impact on our perceptions as well; we do not want to believe our child or a loved one could be suicidal, so we may overlook the warning signs.
Some of our mistakes in perception are not entirely our fault. Our cognitive biases can be exploited by media and advertisements. For example, according to www.psychologytoday.com, "as hundreds of thousands of people were dying in Syria, the media kept us focused on Gaza/Israel, sending nearly 700 journalists to that region, and hardly any to the neighboring states with masses amounts of people being murdered." The media narrowed our view, and we missed a lot of important information.
It is very important for your judgments of people to be as accurate as possible; they are how we decide whom to befriend, associate with, or avoid. However, we have a strong bias that affects how we perceive people, known as the fundamental attribution error. It is when we attribute someone's behavior to some internal characteristic rather than to the external circumstances that caused it. Not everything someone does is telling of their nature. For example, people may attribute someone's poor grades to "laziness," when really that person has some type of cognitive or intellectual deficit. Awareness of this bias should keep you from jumping to conclusions so quickly, and lead you to consider possible alternatives before assuming negative things about someone.
Understanding cognitive biases can be difficult. You must work out why you often make mistakes in processing information, while you are often unaware you are making a mistake at all. Do you see the problem? We are trying to dig into our subconscious, into what we often consider "common sense," when really we are making a processing mistake. It takes effort and interest to understand cognitive biases; however, I believe there is good reason for everyone to want to be educated about them. They affect your personal life, relationships, goals, career, and many other aspects of your life. With enough effort, you can minimize their effect and make better decisions and judgments. I know most people consider themselves rational, and I don't mean to sound condescending, but the reality is that everyone is naturally irrational. It takes effort to attain rationality, but that effort will leave you with fewer feelings of confusion and impulsiveness, and will ultimately empower you.