A pre-med student's journey through space-time. Particularly interested in the women of STEM, disability politics, and community health. Come yell with me about research journal paywalls. Art by Jean-Baptiste Marc Bourgery.
Open-mindedness can strike in even the most unexpected of places.
The Triumph of New Age Medicine
Turns out, maybe there’s something to be said for “quackademic” medicine after all.
Autism Speaks and Self-Advocacy in Mental Health
The mentally ill have a long history of being spoken over, derided, and ignored, stretching back to the colonial period and beyond. Mistreatment, misrepresentation, and stigma are all part of their daily reality, so it is no wonder that, in reaction, mentally ill people today have formed their own self-advocacy networks and ideas. One such case is the controversy surrounding the organization Autism Speaks.
Self-branded as an organization dedicated to “advances in understanding and treating the physical and mental health conditions that frequently accompany autism,” Autism Speaks was founded by the grandparents of a child with an autism spectrum disorder to promote understanding and to research causes of and interventions for autism. But many actual communities of autistic people and their families vocally oppose the organization, and for good reason. Too frequently, Autism Speaks doesn’t actually allow autistic people to speak. Its board of directors is composed almost entirely of doctors and family members of autistic people, with little to no representation for the demographic it supposedly speaks for. It pours money into research on cures and prenatal testing aimed at eliminating autism, which implies that autism is a condition in need of a cure at all, rather than of support and acceptance of difference. Autism Speaks frames autism as a “national tragedy” and a burden on society, and in one infamous case, a board member interviewed for the film Autism Every Day spoke publicly of a time when she was so distraught over her autistic daughter that she contemplated a murder-suicide by driving them both off a cliff.
Above: the puzzle piece, a symbol for autism popularized by Autism Speaks, which seems to imply that autistic people have a piece missing
So autistic communities have taken the matter into their own hands, with organizations like the Autistic Self Advocacy Network (ASAN), whose motto is “nothing about us without us.” ASAN supports respect for and acceptance of difference, equal education and employment opportunities for people with autism, and the elimination of autism-related abuse and bullying. The autism rights movement is gaining traction in its endeavor to promote “therapies that focus on coping skills rather than cures that would imitate ‘normal’ or ‘neurotypical’ people.” And increasingly, other groups of mentally ill people are adopting similar self-advocacy practices, through organizations such as TotallyADD, created by and for people with attention disorders, and DBSA, an organization for people with depression and bipolar disorder in which over half the board members have a mood disorder themselves.
It is clear that more severe forms of certain mental conditions can lead to debilitating distress or anxiety, and the fight to improve empathy and care for these conditions is absolutely worthy of activism. But other conditions, like ADHD or Asperger’s Syndrome, are treated by the medical community as illnesses not because they inherently decrease quality of life, but because they make it hard for the people who have them to fit into society’s mold. It has become apparent that mental health patient advocacy is more nuanced than simply campaigning for better conditions. Patient advocates must fight both for higher standards of care and for a decrease in the stigmatization of mental illness that leads to neurodivergent voices being ignored in favor of neurotypical ones. Doctors and families usually have the best intentions, but no one understands a condition better than someone who lives with it. And as the autism rights movement shows us, no one is more informed about a community’s needs regarding intervention than the community’s own members.
The mental health self-advocacy movement has brought a number of critical questions to the forefront: at what point does neurodivergence necessitate treatment or a cure, and who gets to decide? Should doctors, families, or patients determine care plans? Are chronic mental conditions something to be overcome, or should they be embraced as diversity? As the self-advocacy movement unfolds and expands, these questions and more will have to be addressed by society and the sub-communities it contains.
Superheroes and the Rise of PTSD
Amid the spectacle of our modern superhero craze, with Marvel movie budgets running into the hundreds of millions of dollars and pulling in billions at the box office, it’s easy to forget that the western superhero genre was born less than a century ago. The “golden age” of superhero comics began in 1938 with the creation of Superman. In the years following Superman’s big debut, comic book heroes were popping up left and right, and with World War II looming, the popularity of these new heroes quickly made them a vehicle for promoting the war effort.
The first superhero whose story was explicitly connected to WWII was Captain America, a sickly young man named Steve Rogers who underwent a dangerous experiment to become a super-soldier so he could fight for his country. Captain America’s role as an allegory for the US military is made unmistakably clear: he wears an American flag as a costume, he carries an indestructible shield that he uses as a weapon, and in his comics he fights Hitler and the Nazis directly, before the US even got involved in the war. Captain America was everything a classic hero should be: noble and brave, strong physically and morally, and distinctly masculine. He was a dazzling piece of propaganda, helping to sanitize the war and turn it into a cause people could get behind.
The Captain America of today is a very different individual. Like most modern superheroes, he has a grittier, more human edge to him now. He shows emotional vulnerability. He struggles with guilt. Captain America remains an allegory for the United States, but now he represents the people, rather than the military, and nowhere is that clearer than in his changed reactions to war.
In the modern Marvel movies, Steve Rogers is tired of war, and for excellent reason. At the end of WWII, he crashes a plane into the ocean, surviving in stasis thanks to his superhuman physiology, and is woken seventy years later to a world where everything and everyone he has ever known is gone, but war hasn’t changed a bit. Horrible things are still happening all over the planet, and it’s clear to Rogers that all the death and destruction of WWII didn’t make much of a difference in the grand scheme of things. In a more direct statement, Rogers’s closest friend in the present day, Sam Wilson, is a counselor for veterans. Moreover, it’s strongly implied that Rogers himself has some form of PTSD from his experiences in the war, shown through his hypervigilance, inability to sleep, and vivid battle flashbacks.
This may seem like a dramatic departure from Captain America’s origins as a hyper-masculine pro-war propaganda machine, but Steve Rogers’s narrative arc actually follows shifts in public perceptions of war quite closely. Before the advent of modern warfare in WWI and WWII, the beloved archetype of the valiant warrior charging fearlessly into battle was the predominant image associated with American soldiers. But in the aftermath of the World Wars, as soldiers came home with amputated limbs and deadened gazes, or in huge numbers didn’t come home at all, popular culture began to associate war with pain, shell shock, and death. As Dr. Tracey Loughran put it in her review of Masculinity, Shell Shock, and Emotional Survival in the First World War, “The martial hero no longer took centre stage unchallenged. Instead, he was superseded by, or at least jostled for space with, the coward, the frightened boy, and the shell shock victim.”
Increasingly, modern superhero franchises are taking that shift in stride. Beyond Captain America’s handling of PTSD, other popular heroes have stories dealing with trauma as well. Iron Man is shown having violent nightmares, Batman’s entire persona is crafted in response to the trauma of his parents’ deaths, and torture was the catalyst that turned Bucky Barnes into the Winter Soldier. It would seem that the modern preoccupation with damaged superheroes reflects society’s pessimistic outlook on conflict. Gone are the bright, virtuous heroes of the 40s, replaced by grimmer visions of the consequences of war.
And for the most part, audiences are reacting well to the change. Comics were long regarded as “junk entertainment,” as Jon Hogan puts it in his essay The Comic Book as Symbolic Environment: The Case of Iron Man. But, he continues, in recent years the added maturity of comics’ tone and subject matter has helped people “respect the medium as a means of telling stories of great depth.” And the record-breaking opening weekends, mushrooming fan engagement, and flourishing transformative media are further evidence that the darker flavors of the modern superhero touch audiences’ imaginations like little else can.
I am curious to see whether and how superhero narratives will reshape themselves as warfare becomes increasingly technological and remote. Perhaps it is overly optimistic to think that the US will never again fight a war on the scale of WWI and WWII, but as the US military comes to rely on drones and robotics, soldiers will grow more detached from physical danger, and attitudes about soldiers and wars will change once again. Will the battle-fatigued specter of the modern war hero persist, or will we return to the idealistic, courageous warrior of the last century once personal danger is removed from the equation?
“Why would those who have made war on society or have been a burden to it be permitted to say what shall be done with their remains?” the Washington Post asked in an 1877 editorial. “Why should they not be compelled to be of some use after death, having failed to be of value to the world during life?”
(source)
This refers to the then-common practice of using the bodies of the poor, people of color, and prisoners as cadavers for medical schools. It makes you think twice about the abhorrent costs of immoral science, the horrors bigotry can cause, and whether things have really changed as much as we think they have.
Magic: Science We Don’t Understand Yet
My class visited the Harvard Medical School museum yesterday. The curators seemed to know everything concerning medical history, rattling off details about anything from the mental status of President Garfield’s assassin to inaccuracies in fetal skeletal reconstruction. In passing, one of the curators mentioned that the museum owned a repository of information on witchcraft. I was intrigued, but witchcraft was never mentioned again. So I looked into it myself.
Upon first consideration, it might seem ridiculous to include magic in a collection of medical history. Broomsticks and cauldrons and other iconography strongly linked to our social perception of witchcraft seem to have nothing in common with the surgeries and vaccinations of modern medical practice. But historically speaking, magic and medicine have been so strongly linked as to be almost indistinguishable in records from the past, and the link goes as far back as Pliny the Elder’s Natural History, written in AD 77.
According to the Britannica article on the subject, the Natural History is notable for its attention to detail and proper sourcing, as well as for its novel assemblage of seemingly unrelated facts into a cohesive narrative. It was one of the few surviving texts detailing the lifestyles and practices of ancient Romans, and as the sources Pliny used were lost to time, his work became the de facto textbook for a classical education. Thus, because Pliny discussed magic alongside scientific and medical innovation, giving both the same weight, the two were conflated. Pliny’s work went largely unchallenged for centuries, and it was only in the late 1500s that leading scientists began to truly reject his teachings. By then, the link between magic and medicine was thoroughly embedded in western culture.
Thorndike’s History of Magic and Experimental Science, published in 1923, gives a rich account of Pliny’s attitudes towards magic and science. Pliny’s works are filled with references to magic, and he attributes magical tendencies to major historical figures ranging from Plato to Moses to Pythagoras. However, because “nearly half the books of the Natural History deal in whole or in part with remedies for diseases,” magic is linked most strongly to medicine. Pliny states that “no one doubts” that magic “originally sprang from medicine.” He believed that magic and medicine developed together, with magic as a subversive, false counterpart with only echoes of truth. Pliny expresses bitterness that false magic casts doubt into the minds of patients as to the validity of any medicine at all. Although Pliny’s distaste for magic was rooted in his disbelief in its validity, he certainly seemed to unselfconsciously believe in very similar things. For example, he cites herbal treatments which require that the plant be picked with the left hand or without looking backward, and he seems to believe in the abilities of herbal concoctions to grant grace, glory, favor, and luck. His belief only stops with the truly nonsensical, like herbs with the ability to dry up rivers or open locked doors. When considering Pliny’s confused attitudes about magic and superstition, society’s love-hate relationship with magic and suspicion of scientists and medical professionals as magic practitioners are unsurprising.
Jumping forward to the 15th century, this double-edged reliance on and mistrust of medicine was still alive and well in Europe, and nowhere is this clearer than in society’s treatment of midwifery. In 1484, Pope Innocent VIII issued an edict explicitly condemning witchcraft and approving the “just correction, imprisonment, and punishment” of witches, kicking off centuries of Inquisition-style witch trials. One of the most commonly targeted groups in these trials was midwives. In those days, witches were commonly thought to use the corpses of babies in their magic, so midwives were naturally suspect: they provided contraceptive and abortive treatments, and they delivered stillborn infants. Midwives were further targeted because they could often provide women in labor with pain relief, a seemingly magical marvel. And it may not have been mere unsubstantiated suspicion: a paper written by Thomas Forbes in 1962 suggests that 15th-century midwives often turned to occult activity out of desperation due to their low socioeconomic status (though this same paper seems to believe that witchcraft is a real phenomenon, so it must be taken with a grain of salt).
Magic was deeply associated with the medical practices of non-European cultures as well. We can see this clearly in the trope of the “witch doctor,” a figure associated with African cultures and the indigenous peoples of the Americas. Britannica defines witch doctor as “a healer or benevolent worker of magic in a nonliterate society” and classifies the term as a pejorative, and indeed, the term seems meant to evoke the sort of suspicion and doubt that the western world has always applied to witchcraft. The dichotomy between the image of a revered, educated medical professional and that of a superstitious, illiterate magical healer highlights the way connotations of magic are used by western academia to emphasize the otherness and unreliability of practices that don’t fit into the established medical power structure.
Records of unexplained events where the word “magic” never appears are telling as well: often, seemingly magical practices were reframed as religion. A 1953 article by Humphrey Humphreys explains it well: in the early days of medicine, disease was thought to be caused by the devil, so medical practitioners were primarily concerned with purging demons from the body. Priests and other religious figures were commonly called upon to heal the sick by exorcising evil spirits. Though we’re unaccustomed to considering the spiritual acts of religious leaders magical, the chanting, ritual, and symbolism of an exorcism certainly resemble our perception of witchcraft. Religion and religious healing are, in many ways, simply a benign form of magic. (And perhaps it is worth thinking twice about why unaccountable phenomena coming from white men with social capital are called religion, while the same phenomena coming from women and purportedly “uncivilized” cultures are called witchcraft.)
The traditional ties between magic and medicine continue to this day. In some parts of the world, healing magic is still a common component of medical care, and its practitioners reportedly achieve success rates comparable to those of modern western doctors. Scientists seek to explain seemingly impossible cures with catch-all terms like “placebo.” According to the World Health Organization, “alternative medicine,” including acupuncture, herbal cures, and hypnosis, has a global market of US$60 billion, and 25% of modern medications are derived from plants first used in traditional medicine. Unresearched, faith-based, and ritualistic forms of medicine are still around and as influential as ever.
This all raises the question: what is the role of magic-like healing in modern medicine? The knee-jerk reaction of a modern scientist would be to reject magic altogether; everyone knows that magic isn’t real, after all. But non-western and non-male medical practitioners, modern pharmacology studies, and centuries of accumulated cultural knowledge say otherwise. While it may be true that there are no mystical, inexplicable forces that let us manipulate nature at a whim, it is hard to deny that there are facets of science and medicine we don’t yet grasp, and the line between unexplained and impossible is often blurred. Perhaps it is time to stop automatically discrediting the collective wisdom of centuries of medical practitioners, reframe so-called “alternative” medicine as science we don’t understand yet, and start taking it as seriously as our forebears did.
Leave a comment!
This is no humbug-- or is it?
I went with my class to visit the Massachusetts General Hospital Ether Dome this week. The first public surgery using ether as an anesthetic took place there in 1846, and in the years since, the mythos of the Ether Dome has taken on a life of its own.
In the early days of modern medicine, surgery was a messy, brutal affair. The best a patient could hope for in terms of pain relief was a shot of hard liquor and something to bite down on, or a punch to the jaw to knock them cold (which, by the way, can cause serious brain damage). Doctors aimed to perform procedures as quickly as possible to minimize suffering, but even after the surgery itself, the danger was far from over. Patients were known to die from sheer trauma and blood loss, and infection ran rampant. One paper, written in 2005, claims that infections followed “practically all operations,” killing “almost half of all surgical patients.” Surgery was an absolute last resort, and for good reason.
So it was no wonder that when ether was first shown to erase all pain and much of the anxiety from the experience of surgery, it was hailed as “the greatest gift ever made to suffering humanity.” The first public demonstration of ether took place on October 16th, 1846, during a surgery to remove a tumor from Edward Gilbert Abbott. Onlooking students were shocked by the lack of response from Mr. Abbott, and when he woke from his stupor and said he’d felt nothing, surgeon John Warren turned to the crowd and exclaimed the now-famous line: “Gentlemen, this is no humbug!”
And so the legend of ether began. A memorial to ether was built in the Boston Common, titled the Good Samaritan and covered in religious imagery and classical references. Paintings were commissioned, donations were given, and everyone wanted to have their name associated with what was widely considered one of the greatest innovations in medical history.
But inevitably, as people exchanged their money for influence, the narrative surrounding ether shifted. One of the most striking things I noticed on our trip to the dome was the story behind the painting The First Operation with Ether, painted by Robert C. Hinckley in 1894, 48 years after the event itself. The painting depicts the surgery in progress, the unconscious patient surrounded by doctors and observers. Hinckley spent fourteen years researching and completing the painting, extremely mindful of the details, yet he chose to add in major medical figures of the time who weren’t actually present at the event. Even the title, “The First Operation with Ether,” is misleading, because the first operation with ether actually took place four years before the public demonstration. The painting attracted a great deal of controversy over these inaccuracies, but to this day it remains one of the most famous works of art in the medical field.
(The First Operation with Ether, 1894)
Propagandistic, misleading, and fabricated narratives surrounding the Ether Dome are numerous within the room itself as well. In one corner of the room stands a classical Greek statue, seemingly made of marble, and in another an Egyptian mummy donated to the hospital in 1823: relics that have little connection to the surgical practices of the dome but are clearly meant to imply ties to an ancient lineage of civilization. Chairs in the seating area bear plaques commemorating donors and physicians, seeming to imply that these people were involved in the creation of ether. But you would be hard-pressed to find a donor who was even alive at the time of the historic surgery.
I think the question I’m getting at is: what should we believe from the stories history tells us? The discovery, testing, and popularization of ether as an anesthetic have doubtless done humanity a great service. But the near-religious fervor surrounding ether, and the medical community’s keenness to associate itself with it, have led to the propagation of myth and to ignorance of the medical efforts that led up to ether’s popularization. So how can we, laypeople and medical experts alike, show our appreciation for scientific innovation without turning history into propaganda?
Send in your replies!
Discuss: Individualized Medicine
Something interesting I read today, from Charles E. Rosenberg’s The Tyranny of Diagnosis:
Disease categories have always linked knowledge and practice, necessary mechanisms for moving between the idiosyncratic and the generalizable, between art and science, between the subjective and the formally objective.
What really grabbed me was the distinction between idiosyncratic (meaning individual) and generalizable (meaning universal). This raises the question: why does medicine have to be generalized?
It wasn’t always this way, as Rosenberg explains later in the essay. Traditional medicine had its roots in the individual, focusing on symptoms and without a clear, statistically-based idea of what course the sickness would take. As Rosenberg states, a common cold could pass without consequence or could turn into fatal pneumonia, and either result seemed equally likely.
In fact, many doctors in the 19th century expressed skepticism about the value of formal medical classifications. Some didn’t even believe it could be done. It was only when doctors became capable of isolating the specific cause, or mechanism, of a person’s symptoms that formal diagnosis began to catch on. More and more, doctors began to view disease as something that existed independent from the body. (Fun fact: this shift in thinking came even before it was widely accepted that germs existed.)
There is certainly some validity to both schools of thought. On the one hand, the wide-scale preventative healthcare we have today, which has so drastically improved our collective quality of life, relies on the validity of medical statistics. If we didn’t believe that heavy metals are toxic, that vaccinations reduce disease rates, or that condoms prevent the spread of STIs, then a lot of the health infrastructure we count on today wouldn’t exist.
But on the other hand, one of the great quandaries of modern medicine faced by practitioners and lay-people alike is the degree to which a formal diagnosis can adequately explain a patient’s personal experience. When our symptoms don’t fit neatly into a single diagnosis, or fit into too many, our treatment can be delayed or over-complicated. Worse, parts of our personalities that don’t have anything to do with sickness can be pathologized, or our suffering can be ignored altogether.
Luckily, with advances in technology and practice, we’re moving closer and closer to a kind of medicine that has all the scientific rigor of generalized diagnosis while still taking into account the variability of symptoms across circumstances and patients, and the social norms and stigmas that define what gets called a “disease”. Stem cell research and personalized gene therapy may soon let doctors create treatments more closely aligned with a patient’s needs than ever before, and social justice advocates are actively fighting to de-pathologize LGBT, neurodivergent, and other “deviant” communities.
Personally, I think there is a lot of value in standardized medicine. It allows doctors to use generations of accumulated knowledge to pinpoint what might be the cause of their patient’s problem. Better yet, it lets the general population have a sense of what healthy looks like, so they can better care for themselves. But individualized medicine is still an ideal we should strive for, because when a doctor won’t give you a formal diagnosis, or brands you with a diagnosis you don’t need or want, your pain can slip through the cracks. And that is a sort of tyranny.
Thoughts?
#discuss#individualized medicine#history of medicine#history of science#social justice#musings#6/26/17
A side-by-side comparison of medical perspectives from antiquity in the Chinese vs. western traditions. The Chinese image focuses on acupuncture tracts, while the western image emphasizes muscle groups.
Each of us calls this object My Body; but we give it no name in ourselves, that is to say, in it. We speak of it to others as of a thing that belongs to us; but for us it is not entirely a thing; and it belongs to us a little less than we belong to it....
The Expressiveness of the Body, Shigehisa Kuriyama
#quotes#books#history of medicine#eastern medicine#western medicine#this book is about the perceptual divide#between an eastern medical system that focuses on acupuncture#and a western system that fixes on muscles#and how that came to be
Obamacare v. AHCA
There seems to be a lack of media coverage on this, and the GOP wants to vote on Thursday June 29th, which is next week. Figured I’d compile a list of facts.
The Congressional Budget Office predicts that the percent of uninsured Americans will almost double with AHCA as opposed to Obamacare, with projections increasing from 10% to 18%.
Under AHCA, Medicaid spending will be cut by $834 billion, and the expanded Medicaid coverage that Obamacare offers, adopted by 30 states to help low-income adults, will be gone by 2020.
States will receive a fixed amount of federal funding every year, as opposed to the current model, which allocates funds based on how much care that state’s patients need.
Insurance premiums will decrease somewhat, but insurance plans will cover less and out-of-pocket costs for medical care will increase, counteracting any benefit from lower insurance premiums.
Older people with low incomes who live in high-cost areas like Alaska and Arizona will be most dramatically affected, while young people with high salaries living in areas with low insurance premiums, like Massachusetts, may actually benefit.
Under Obamacare, insurance subsidies are higher for those with lower incomes and more expensive premiums. Under AHCA, subsidies aren’t tied to either of those factors, but rather to age.
Under Obamacare, people have to pay a tax penalty if they don’t buy insurance. Under AHCA, this penalty is gone, but if you go more than two months without coverage, you have to pay a surcharge of 30% of the premium when you buy a new plan (on a $400-a-month premium, for example, that’s an extra $120 a month).
Obamacare has protections against denying coverage to people with pre-existing conditions, annual and life-time insurance caps, and hiking premiums for people who get sick. AHCA repeals these protections.
AHCA eliminates the Obamacare mandate that insurers provide a basic set of benefits, including maternity care and contraceptives.
Planned Parenthood will no longer receive any federal funding for any service, even though abortions already can’t be funded by federal money.
The revised AHCA is estimated to cut the deficit by $118.7 billion, significantly less than the original version which promised to cut the deficit by $337 billion.
Medical device makers, insurance companies, and wealthy Americans all receive a big tax cut as federal health spending is slashed.
Source
*Edit: a previous version of this post said that Medicaid costs will be cut by $834 million. Actually, it will be cut by $834 billion, a thousand-fold more. My mistake.
We all move uneasily within our restraints.
An Unquiet Mind, by Kay Redfield Jamison
#quotes#books#mental illness#if you've ever known the heartbreak of weighing life-changing medications#against huge changes to your personality#then give this a read
Discuss: Whiggism
Whiggism [hwig-iz-uhm, wig-] : the belief that all of history is progress, leading inevitably from one state to a better state.
To me, this seems to be the predominant line of thinking in a lot of liberal academia. See: futurism, post-capitalist utopias, etc. There’s a general attitude that we are superior, both technologically and morally, to our predecessors. And in many ways, I agree. But that belief also implies that the information we have today is a complete synthesis of everything we’ve learned before, with no loss of knowledge. We know that isn’t true (the Library of Alexandria, etc.).
Even in moral territory, there’s no guarantee that the human race, or any particular society, is the best it’s ever been. We’ve seen societies regress morally, at least by our own standards. And going by the maxim of moral relativism, another common line of thinking in modern academia, there may not even be such a thing as absolute moral superiority. I dunno.
Thoughts?
Further reading
Happy Birthday to Susan La Flesche Picotte!
In honor of Ms. La Flesche’s 152nd birthday, here’s a quick biography!
La Flesche became the first Native American woman to earn a medical degree in the United States in 1889, overcoming the significant systemic barriers facing both Native Americans and women of her time. She graduated as valedictorian of her class at the Woman’s Medical College of Pennsylvania. At the time of her graduation, women were not allowed to vote, and Native Americans weren’t considered citizens of the United States.
WMCP, class of 1889. La Flesche is second row from back, fourth from right.
Upon completing her degree, La Flesche returned to her home reservation as a physician. So many patients insisted on being treated by her that her white counterpart quit, leaving La Flesche as the only doctor across 1,350 square miles of reservation land.
In addition to her training as a doctor, La Flesche spoke four languages, played piano, and was very well-read. She campaigned strongly for public health measures like screen doors to keep out insects and the elimination of communal drinking cups, and she resolutely fought back against the rapid spread of alcoholism in her community.
La Flesche played many roles in the Omaha community, acting as its advocate until her death in 1915. Before she died, she managed to raise enough money to build the reservation hospital she’d always dreamed of, the first in the county.
La Flesche’s hospital, built entirely with privately raised funds.
Today, La Flesche’s heroic legacy is upheld in the continuing fight for equality and good health care for Native Americans. To contribute to the cause, check out the First Nations Development Institute!
Learn more from these sources: 1 2