#but the important part is that women and men are not nearly as segregated in roles as they are irl
grouchythefish · 1 month ago
Text
when people draw murderbot as androgynous like it is described in the books instead of just defaulting to entirely masculine features: do you know I would die for you?
34 notes · View notes
hindahoney · 2 years ago
Note
Hey, I go to a combined Liberal and Reform shul and wanted to debunk some of those misconceptions on your post.
Being Reform/Liberal is not at all, even slightly, about level of observance. We have many, many men in kippot and tzitzit with payot who come to our synagogue to pray every single week. We wear kippot out and about in town, we wear Magen David proudly. We sing our prayers with all of the life and vigour of any Jews. Many of the people who pray with us also attend classes with the rabbi twice per week, in their own free time. I personally study Talmud and biblical Hebrew with my Reform rabbi every week. We have more people who keep kosher extremely strictly than people who don't. Jewish history is hugely important to us and we honour our ancestors every single day.
Reform Judaism is just about having slightly different values to Orthodox. In shul, we are taught that the difference between us and Orthodox Jews is that Reform Judaism adapts expectations of Jewish people to be reasonable for living in the modern world, whereas Orthodox values tradition and keeping things the exact same way they have been for thousands of years. The rules about electricity use on Shabbat are loosened to allow people with hearing aids to be spoken to, to allow powered wheelchair users to leave their homes, to make sure every Jew has the opportunity to get in touch with their emergency contacts. There is no "better" or "worse" denomination, only ones that fit each individual Jew best, if any.
We still abide by kosher and the teachings of Torah, but we do not place pressure on other Jews to do the same. We do not shun or scold others for not abiding by these laws, and are open-minded to the possibility that they have very good reasons for not doing so.
We adapt some traditional ceremonies, such as holding a B'nei Mitzvah for non-binary children, and adapting conversion ceremonies for trans and non-binary adults. Jewish law is much more de-gendered in a Reform setting, with the same expectations and freedoms afforded to both men and women. Many of us choose to keep to traditional gendered roles and expressions, but queer Jews are celebrated even though they are different.
We are absolutely not Jewish "in name alone". A Jew is a Jew is a Jew. Some of us are very very religious and frum, others are not, but every Jew is always welcome at our shul, because this is a community space that does not ask any Jew to 'prove' they are Jewish enough to join in with our customs, and pray with us during service.
I am disabled and queer, and due to my circumstances I must choose how to live my life Jewishly in a way that suits me. I would not be able to do nearly as many mitzvot if I tried to meet Orthodox standards -- because my needs for care and assistance would break the laws of shabbat, and I could not live up to gendered Orthodox standards very easily as a non-binary person. This is why I choose to pray at a Reform/Liberal synagogue instead of an Orthodox one -- I am more able to do mitzvot in a Reform/Liberal context. While I know there are many Orthodox synagogues that would accept me anyway, it's always a case of trying to work out which congregations I can feasibly become part of, whereas with Reform Judaism I know that I will almost never find any difficulty or judgement.
Being Reform is just another way of practicing Judaism. It isn't lesser, and it isn't less serious, or less religious, or less frum. Really, we are just like you. I think the world would be better with less segregation between denominations. Anti-Orthodox sentiment makes me sad, but I very rarely encounter Orthodox Jews who respect Reform Judaism for what it is. A lot of us don't feel safe in Orthodox synagogues because we are shunned there.
I understand someone feeling more comfortable in a Reform shul because of their gender or sexual identity. Though things have changed pretty drastically in the past few decades, and there are now many more groups helping gender-nonconforming and queer people feel comfortable in Orthodox spaces, there are still many who hold strongly to gendered traditions.
However, I need to point out that Orthodox Jews do still wear hearing aids and use motorized wheelchairs and pacemakers. If it is a medical necessity, it is permitted. In any case, I do not foresee anyone judging someone else for using a medical device on Shabbat.
Thank you for sharing. I do feel like this cleared things up for me!
22 notes · View notes
nanshe-of-nina · 3 years ago
Photo
Favorite History Books || Women in the Middle Ages: The Lives of Real Women in a Vibrant Age of Transition by Joseph and Frances Gies ★★★★☆
What are the elements that affect a woman’s life? Recent works in women’s history have tended to focus on the status of women relative to men. But the first and most important consideration in evaluating the quality of life in the Middle Ages applies equally to men and women: the technological and economic level of a low-energy but expanding society, influencing work, housing, food, clothing, health, security, comfort, and self-fulfillment.
A second basic element, affecting only women, is the state of obstetrical practice. Throughout the ages, until antisepsis and improvements in obstetrical techniques arrived in the nineteenth century, childbirth was a mortal hazard. Rich or poor, women suffered and were injured in labor; often they died. A medieval gynecological treatise, The Diseases of Women, from the medical school at Salerno, reflects the problems and horrors of childbirth in the whole pre-industrial era, during which doctors and midwives had few aids other than potions and poultices. Nevertheless, amid prescriptions for rubbing the woman’s flanks with oil of roses, feeding her vinegar and sugar, powdered ivory, or eagle’s dung, placing a magnet in her hand or suspending coral around her neck, the Salernitan text also gives sound advice, for example on breech delivery: “If the child does not come forth in the order in which it should, that is, if the legs or arms should come out first, let the midwife with her small and gentle hand moistened with a decoction of flaxseed and chick peas, put the child back in its place in the proper position.” Although abortion, with its own dangers, was practiced from very ancient times, contraception, by various methods—mechanical, medicinal, and magical—found limited use and even less effectiveness. Women had babies, successfully or otherwise.
Several other special criteria apply to the quality of a woman’s life in any historical setting. First, simple survival: in many times and on different continents, women have been victims of infanticide as a technique of selective population control. The reason, although usually rationalized in terms of the female’s alleged weakness of physique, character, and intellect, is transparently economic: the contribution in work of a daughter was often outweighed by the cost of raising her and giving her a marriage portion: investment in a daughter went mainly to the profit of a future husband. Second, conditions of marriage: the question of consent; the relative age of consent for men and women; monogamy versus polygamy, which emphasizes woman’s biological role at the expense not only of her personal, but of her social and economic roles; the seclusion of women in harems or gynaeceums, or their “privatization” at home, where they were segregated from the male spheres of business, politics, and religion; attitudes toward adultery and divorce, where a double standard nearly always prevailed.
One of the enigmas of history is its pervasive misogyny, in prehistoric and ancient times, in the Middle Ages, into the modern era. Anthropologists and historians have turned to Freud and Marx for explanations: men feared women’s sexual functions, or hated women because their mothers had failed to gratify their Oedipal longings; or they derogated them, in Engels’s words, as the “slave of [man’s] lust and mere instrument for the production of children.” From ancient times, societies have attributed sinister magical powers to women, particularly to their physiology. Pliny the Elder (first century A.D.) reported that some products of women’s bodies had marvelous properties. The odor of a woman’s burned hair drove away serpents; its ash cured warts, sore eyes, and diaper rash, and, mixed with honey, assuaged ulcers, wounds, and gout. Woman’s milk cured fevers, nausea, and many other ailments. The saliva of a fasting woman was “powerful medicine for bloodshot eyes and fluxes.” Furthermore, “I find that a woman’s breast-band tied round the head relieves headaches.”
... Aside from the mystery and magic of woman’s physiology, a historically persistent male attitude toward sex bred misogyny. Wherever sex was regarded as a weakness on man’s part and rigid codes of sexual morality were adopted, women were feared and mistrusted for their very attraction. A final element in misogyny lies in the nature of patriarchy: where males dominated, females were “other,” secondary, inferior.
Christianity was in theory egalitarian in respect to sex as to race and class: “For ye are all the children of God by faith in Christ Jesus,” wrote Saint Paul. “There is neither Jew nor Greek, there is neither bond nor free, there is neither male nor female; for ye are all one in Christ Jesus.” Unfortunately Paul muted this ringing declaration by ambivalence in his other writings, and although it stirred echoes in later sermons and texts, equality, whether between man and man, or between man and woman, was never a medieval doctrine. Theories of equality between men belong to the eighteenth century, between man and woman to the nineteenth.
In the Middle Ages there was in fact little feminine awareness, little consciousness of women as women. In spite of their disabilities, there was no protest—no “sobs and cries” of their “aeons of everyday sufferings,” to quote a modern critic.⁸ One of the few women to speak as a woman in the Middle Ages was Christine de Pisan, poet at the court of Charles VI of France, at the close of the fourteenth century. There were other women poets in the Middle Ages, but Christine is unique in speaking up for women, and in her awareness of the special role and condition of women. She was, in fact, one of the few true feminists before the modern era.
72 notes · View notes
probablyasocialecologist · 4 years ago
Text
The legacy of the United States’ founding racial territorial conquest and domination can be read off the Department of Labor’s occupational data. In 2020, the whitest and most racially segregated job in this settler state was the appraising of property (96.5% of appraisers are white), and the second whitest was managing a farm (96.3%). It is hardly a coincidence that the largest farmland owner in the United States is one of the country’s richest men: Bill Gates.
In the United States, the legacy contradictions within the food system are particularly acute. Seven out of the 10 worst-paying jobs in America are in the food system, and women are overrepresented in them. Nearly a third of families headed by single mothers are likely to be food insecure, and food insecurity is systematically higher in communities of people of colour.
There has been a boom in the service industry over the past decade. Yet when that work is occasional or characterized by shift work as so much food service work is, you’re more likely to be hungry. In part, that’s because many precarious jobs are low-wage and depend on tipping. Tipping was a European feudal relic imported to the United States by the well-travelled Victorian-era American upper class. Initially, it was widely reviled. Even as late as 1905, it was possible to find restaurants in St Louis with signs in the window announcing “No tipping! Tipping is not American.”
77 notes · View notes
96thdayofrage · 4 years ago
Text
The exhausting depravity of Floyd’s death—the indelible image of Chauvin’s knee pressed into Floyd’s neck as fellow officers looked on with indifference—served as a vivid illustration of a fact Black activists have long known: that police brutality is not only endemic in the United States, but in Minneapolis specifically. The Minneapolis City Council seemed to recognize this last summer, when city officials announced a commitment to substantive police reform—up to and including the possibility of replacing the Minneapolis Police Department (MPD). Such bold action was unthinkable prior to 2020.
In the weeks prior to Wright’s death, city legislators approved, or at least considered, a series of specific reforms of the MPD. In March, legislators put forward a ballot measure (to be voted on in November) to amend the Minneapolis city charter to deputize the city council with the authority to make significant changes to the city’s policing: once this has been approved, actions being considered include putting the MPD under the supervision of both the mayor and the city council (the MPD currently answers only to the mayor), in some way curtailing the growth of the Minneapolis police force, and even outright replacing the police department with an “office of public safety.” Piecemeal yet long-overdue changes such as banning chokeholds and mandating that officers document when they unholster their firearm already preceded this effort to reform the police via charter amendment.
While these reforms appear ambitious, they are far from the coordinated effort to “defund” the MPD that they were initially billed as being. Indeed, in many senses they coopt the language of defunding, made popular by young activists, to attempt to sneak through the exact opposite: though they would reduce the growth of the police force, they would not in any obvious way address the militarization that has made the MPD into one of the most violent police forces in the country.
The violence of the MPD is, of course, part of a national story. As scholars such as Elizabeth Hinton, Stuart Schrader, and Naomi Murakawa have shown, modern, militarized U.S. policing arose collectively out of postwar liberalism: though its precise manifestation has varied regionally, all U.S. policing relies on a rationale of “security” as a pretext for regulating the behavior of the poor people and communities of color that have been the intended recipients of social reforms since the 1960s. Out of this has arisen everything from punitive “tough-on-crime” policies—historically popular across the political spectrum—to “preemptive” policing initiatives such as Broken Windows and Stop and Frisk.
Unfortunately, our collective notion of what would constitute ideal police reform has its roots in this same context of postwar liberalism, in which private responsibility and collective securitization remain the ultimate goods that are sought. Since World War II, liberals have emphasized regulating individual behavior to correct the inequalities that police often reinforce—making the sanctity of Black communities contingent on the ability of the police to “restore peace.”
In the specific case of Minneapolis, for example, the failure to curtail police brutality—despite numerous waves of well-intentioned liberal reform efforts beginning as early as the 1920s—derives precisely from the limitations of those who sought transformative racial justice, not from the efforts of reactionaries to undermine those reforms. At many points in the postwar history of Minneapolis, police reform efforts were led by the very progressives who had helped militarize the MPD in the first place.
This was in no small part a result of progressive ideological commitments about the origins of racist policing. Believing that racist policing was mainly caused by what we’d now call implicit bias, Minneapolis progressives sought to remake the psychology of white police officers, compelling cops to interrogate their biases—all while encouraging a greater presence of police officers in Black communities and downplaying systemic and overt racism. Minneapolis progressives thus approached police reform with the premise that policing could be made more effective, more precise—and that better, not less, policing was essential to racial justice and improved race relations.
This legacy still overshadows police reform in Minneapolis, and the recounting of this history that follows—of how so many good-faith efforts failed catastrophically—should leave us deeply skeptical about the enterprise of police reform in its entirety. If Minneapolis and the nation are to escape the shadow of decades of failed police reform, they must reckon with this history. And they likely need to jettison the rubric of police reform and seek out more promising ways of conceptualizing the path toward racial justice and a society free from violence.
Historically, Minneapolis was a very white city: during the Civil War, its Black residents totaled fewer than 300. By 1940 the city’s population was still less than 1 percent Black, and most were segregated in North Minneapolis, confined by racial covenants placed on real estate. The percentage of the city that was Black remained below 5 percent into the 1970s, when it began increasing dramatically during the same period that saw whites abandoning urban centers nationwide. The city is now nearly 20 percent Black.
Despite its relatively small number of Black residents, in the 1920s Minnesota became a Midwestern hub of the Ku Klux Klan. The 1920 lynching of three Black men in Duluth spurred the passage of a state anti-lynching law in 1921 (led by Black activist and women’s rights advocate Nellie Griswold Francis), but it also stoked recruitment efforts for the Klan. The state became home to fifty-one chapters of the KKK, with ten in Minneapolis alone. Klan parades enveloped streets in Minnesota on weekends and Labor Days. Minneapolis Klan chapters counted a number of police officers as members—a fact which certainly continued to be true even after police affiliation with the Klan was officially forbidden by 1923.
Despite this, the same era saw some of the first efforts to professionalize the MPD and address issues of racial bias within the force. In 1929 the MPD hosted its first training session for police, which sought to make police “the foremost experts on crime prevention in the community.” However, professionalization as the route to police reform obscured the racial contours of policing in the city. As Black residents sought to integrate the city’s neighborhoods, incidents of indiscriminate police violence made headlines in Black newspapers, including the assault of two Black men and one Black woman by two drunk off-duty police detectives in July 1937. The beatings launched an investigation by the Minneapolis NAACP and a demand for the officers “to be immediately dismissed from the force,” but at least one of the accused detectives, Arthur Uglem, remained an MPD detective for years after the event.
In Minneapolis, as in many parts of the North and Midwest, World War II would bring greater employment to Blacks, with African American workers integrating industries—for example, the Minneapolis garment industry. The war also ignited progressives’ attention to racial inequality in U.S. cities. Near the war’s conclusion, Swedish sociologist Gunnar Myrdal published his era-defining text, An American Dilemma (1944), which argued that racism was an aberration within democracy and a betrayal of U.S. ideals. For U.S. democracy to be actualized in global terms, racial inequality would have to be vanquished at home.
Midcentury attempts to reform the MPD emerged from this context. Lyndon B. Johnson’s future vice president, Hubert Humphrey—then a young, relentlessly energetic left-liberal—would lead the charge to overhaul the department. Elected mayor in 1945, Humphrey was acutely aware of Minneapolis’s history of racism and actively pursued policies to rectify past injustices.
Humphrey also saw that civil rights—and particularly police reform—could be a coalition-building issue for moderates against the state’s more radical left, which was calling for more dramatic forms of racial redress. The creation of a Minneapolis Fair Employment Practices Commission (FEPC) to report issues of discrimination in employment; a “Mayor’s Council on Human Relations” to document public acts of prejudice against Black people; and other anti-racist measures sought to outflank the left by drawing voters to Humphrey’s coalition.
Humphrey’s ambition was tempered, however, by the limits of his office. Mayoral powers in Minneapolis were—and remain—restricted; power lies mainly with the city council. The mayor has no budgetary authority and little say in governmental appointees. Still, he did have control over one important appointee: the police chief. “The quality of law enforcement is as good or bad as [the mayor] decides,” Humphrey would later write in his memoir. Shortly after taking office, he appointed Ed Ryan, known for his hostility to MPD corruption, as chief of police.
Under Humphrey’s orders, Ryan sought to rid the department of racism. Ryan supervised a series of training seminars for MPD officers so that they could “be prepared for any disturbances resulting from racial prejudices.” Some of these seminars, held in 1946, were led by Joseph Kluchesky, a former Milwaukee police chief and pioneer in anti-bias training. Kluchesky encouraged officers to give anti-racism talks at elementary schools and in general recommended that police liaise more with Black community ambassadors—clergy and other members of the Black middle class—who together could work toward a vision of “impartial law enforcement.” The seminars thus put part of the onus for improved policing on African Americans themselves, and suggested that the solution must include Blacks being willing to welcome greater police surveillance in their communities. As historian Will Tchakirides has argued, Kluchesky’s seminars “emphasized fixing Black behavior ahead of addressing the economic underpinnings of racial inequality.”
Humphrey’s police reform efforts assumed that the solution lay in more efficient bureaucracy, not less policing. Indeed, Humphrey enlarged the MPD force, in part to guarantee more complete coverage of Black neighborhoods. Convinced that racism could be educated out of the body politic, Humphrey reflected Myrdal’s view, which held racialized policing and racial “discrimination to be an anomaly, something practiced by a few bad people.” Humphrey’s approach to police reform thus encapsulated a project of racial uplift through a preponderance of police: mandating greater communication between police officers and the public, and ingratiating police officers with the community to rectify the inner biases that manifested in police assaults.
Police reform also took on the broader aims of Cold War liberalism—an effort to ensure personal beliefs did not hamper the country’s teleological march toward its democratic providence. In a pamphlet distributed to all members of the MPD, Kluchesky argued that it was the “Nazi technique to pit race against race,” and it was the responsibility of all police officers to reject such methods and “to preserve for posterity the splendid heritage of democracy.” Police reform in Cold War Minneapolis thus entailed a mission of aligning policing with an idealized image of U.S. democracy.
The overt antiracism of the Humphrey administration—the mayor’s principled dedication to extirpate racial discrimination from the minds of Minnesotans, the passage of the Minneapolis FEPC, attempts to eliminate racial covenants in housing—left many with the impression of Minneapolis as an enlightened city.
Yet despite this hype, police brutality regularly resurfaced within the MPD. After the police beating of Black Minneapolis resident Raymond Wells by two MPD officers on May 19, 1963, Mayor Arthur Naftalin—a liberal progressive like his mentor, Humphrey—called for a larger role for the Mayor’s Council on Human Relations, where he hoped civic leaders could “broaden . . . our discussions” of race. But Naftalin rejected the idea that racism ran rampant in the MPD. While the beating of Wells was “unfortunate and deeply distressing,” it was “essentially an isolated incident,” he reasoned. The mayor had no intention of drastically shaking up the MPD’s structure.
Nonetheless, calls for change were overwhelming among Minneapolis activists, which led to the creation of the first Minneapolis Civilian Review Board soon after Wells’s beating. But the review board disbanded after only a few months over legal concerns, leaving Black residents once again without recourse if they wanted to file a complaint against police.
The 1967 Minneapolis riot tested Naftalin’s conclusion that racism did not pervade the MPD. Accusations of police brutality following a parade on Plymouth Avenue led to violence between Blacks and police on July 19, which brought more police, armed with shotguns, into North Minneapolis. Violence then escalated into a three-day riot of looting and arson, and Naftalin called up the National Guard to quell tensions. The head of the Minneapolis police union, Charles Stenvig, also wanted to deploy massive numbers of officers to the Northside, looking to take “a harder line against black militants” who he felt were responsible for the riots. But Stenvig was restrained by Naftalin, who discouraged the use of overwhelming police force. As police looked to suppress the riots, chants of “We want Black Power” echoed through the streets. The riot eventually faded out by July 23, leaving property damage but no deaths in its wake.
In the aftermath of the 1967 uprising, Naftalin said the idea that police brutality had anything to do with the riot was “preposterous” and claimed that roving gangs of Black youths had exacerbated it. Naftalin went on to convene an all-white Hennepin County grand jury, which found “no police brutality” during the riot. In fact, it recommended increasing the police force, putting more patrols into the streets to “re-establish the rapport between the people and the authorities.” The jury also blamed the Northside Black community center The Way for encouraging “hoodlums.” Even when proffering ideas on what city government could do to prevent future riots, white elites leaned into the stereotype that the economic disenfranchisement of Blacks had its roots in a culture of poverty and fractured, fatherless families.
Black residents took matters into their own hands following the grand jury’s report. In 1968, Black leaders from The Way formed Soul Force, which, along with the American Indian Movement (AIM), enlisted Black and Native men to patrol Northside streets to intervene between “potential law-breakers and law-enforcers.” The AIM/Soul Force (or “Soul Patrol”) collaboration had striking success in preventing arrests—success that coincided with the expanded welfare and criminal justice services offered by The Way—before being disbanded in 1975, partly due to harassment from MPD officers.
Naftalin declined to run for mayor in 1969. He was replaced by Stenvig, leader of the police union during the 1967 riot, who ran on a George Wallace–style platform of “law and order.” Stenvig would be mayor 1969–1973 and then again 1976–1977. During his time in office, Stenvig demonized welfare recipients, castigated taxes and government spending, touted increased prison sentences for criminals, and solidified renewed and enduring ties between the MPD and the mayor’s office.
Stenvig also helped lay the groundwork for the militarization of the MPD that we see today. By 1980 the MPD had overstaffed its force beyond its own expectations and was aggressively deploying vice squads against LGBTQ residents, raiding bathhouses and arresting gay men en masse. Throughout, the MPD maintained its reputation for corruption.
Don Fraser (another Humphrey mentee, known for reforming the Democratic Party and tackling human rights issues as a Minnesota congressman in the 1970s) hoped, once again, to change this reputation. Fraser assumed the mayoralty in 1980 and held the office for fourteen years. Like Humphrey and Naftalin, he aimed to reform the MPD by imposing external oversight of its conduct. He created a police review board in the early 1980s entrusted with the mission of fielding complaints of police brutality. But the review board had no enforcement power, and the police chief was under no obligation to heed its demands. Despite his rhetoric of reform, Fraser supervised the MPD’s “aggressive operation” of drug raids as part of the escalating War on Drugs. The MPD became increasingly dependent on SWAT teams and saw increased arrest rates. The force earned a reputation for being “damn brutal,” according to its own police chief. MPD leadership also made “fighting the crack-cocaine trade a priority,” leading to the deaths of many innocent victims, including an elderly African American couple mistaken for drug dealers.
By the mid-1980s, with incidents of police brutality unabated and unaddressed, the Minneapolis Civil Rights Commission (MCRC) encouraged Congress to investigate the MPD. Though a congressional investigation never materialized, a reimagined Civilian Review Board would emerge in January 1990. But despite so many reform efforts over the decades, nothing ever really changed.
The serial failures of police reform efforts in Minneapolis are indicative of larger failures within contemporary liberalism, and of the failure of progressives to articulate a comprehensive vision of how cities could still be safe—indeed, safer—without militarized policing. In the case of Minneapolis, it is clear that decades of militarized policing have failed to generate prosperity for all. Minneapolis ranks last among the country’s metro areas in Black homeownership and has one of the nation’s worst racial education gaps between Blacks and whites. Minnesota as a whole is next to last among states for income inequality between whites and Blacks.
A liberal concept of police reform, however ambitiously conceived, cannot rectify these metrics of injustice. The city council’s current plans are disconnected from the racialized order that will be maintained even if an Office of Public Safety patrols Minneapolis’s streets.
Minneapolis reformers can begin by rejecting the premise that policing as we know it can align with the principles of U.S. democracy—that policing can be perfected. For decades it was thought that more interaction between Blacks and whites was the cure for police violence and that the greater “visibility” of police in Black communities would reduce crime rates. This has proven false. Historian Keeanga-Yamahtta Taylor uses the term “predatory inclusion” to describe postwar efforts to encourage Black homeownership. The concept also applies to the history of police reform in Minneapolis: the various movements to encourage Black cooperation with police have persistently allowed the MPD to deflect attention from its own criminality, and put the onus on Blacks to rectify systemic injustices that the police are inevitably tasked with enforcing.
Seventies-era experimental programs such as AIM/Soul Force offer a possible route to reimagining police in Minneapolis: community-based, they sought harm reduction and tension de-escalation, and prioritized disarming suspects carrying firearms. But community policing must take place alongside municipal and federal investments that echo Hubert Humphrey’s vision of a “Marshall Plan for the cities.” Only massive expenditures on comprehensive programs of employment, health care, housing, and infrastructure can succeed in depriving racialized policing of its rationale.
Today, we must emphatically reject the central conceit of police reform that acts of police brutality are aberrations, and that they can be addressed with more and better policing. Those interested in genuine change must refuse to accept reform as the way forward, and work to build a city that brings greater justice to the families of George Floyd and Daunte Wright, and to all of Minneapolis.
6 notes · View notes
woman-loving · 4 years ago
Text
Women and Family in Korean History
Selections from A Concise History of Korea: From Antiquity to the Present, 3rd ed., by Michael J. Seth, 2020
The status of women in Silla [traditionally 57 BCE--935 CE] was higher than in subsequent periods and perhaps higher than it was in Paekche and Koguryŏ. Much of our knowledge of Silla’s family structure and the role of women, however, remains a matter of speculation. It is believed that the status of women was high compared to most contemporary Asian societies, that men and women mingled freely and participated together in social functions, and that families traced their ancestry along both their father’s and mother’s line. Women were able to succeed as the family head, and failure to produce a son was not grounds for divorce. Three women ascended to the throne—the last was Chinsŏng (r. 887–897)—although only when there was no male heir. Among royalty, about whom much more information is available, girls married between sixteen and twenty, and there was often a considerable difference in ages between partners. No strict rule seems to have existed concerning the use of paternal surnames. Succession was not limited to sons, but also included daughters, sons-in-law, and grandsons by sons and daughters. Equal importance was given to the rank of the father and the mother in determining the status of the child.[6] Kings selected their queens from powerful families. A careful reading of the historical records that were edited in later times suggests that Silla queens may have exercised considerable authority.[7] In all these ways, Korean society at this time differed from later periods, in which the position of women weakened considerably. If the above represents an accurate picture of Silla society, then the pattern of the next 1,000 years of Korean history is one of a steady decline in the status of women, of greater segregation of the sexes, and of a shift to a more patrilineal society.
[...]
As historian Martina Deuchler and others have pointed out, compared to later periods, the social position for women in Koryŏ [918--1392] times was high. Women could inherit property, and an inheritance was divided equally among siblings regardless of gender. A woman’s property was hers and could be passed on to her children. Some women inherited homes and estates. Ownership of property often gave upper-class women considerable independence. Korean women remained to a considerable extent members of their natal families, not those of their husbands. For example, if a woman died without children, her property passed on to her siblings, not to her husband. Wives were not merely servants of their  husbands. Their importance was reflected in the practice of conducting marriages in the house of the bride. There was no bride wealth or dowry, and men often resided in their wife’s home after marriage. The two sexes mingled freely. The twelfth-century Chinese traveler Xu Jing was surprised by the ease with which men and women socialized, including even bathing together.[23] Recent studies have clarified this picture of the role of women. In private affairs such as marriage, care of parents, and inheritance of land, women had equal rights with men, but they were excluded from public affairs.[24]
We do not yet have a clear picture of marriage in Koryŏ.[25] Evidence suggests that marriage rules were loose. Divorce was possible, but seems to have been uncommon; separation may have been more common. Koreans may have also practiced short-term or temporary marriages; however, the evidence of this is unclear. Remarriage of widows was an accepted practice. Marriage between close kin and within the village was also probably common. Later Korean society was characterized by extreme exogamy in which marriage between people of even the remotest relationship was prohibited, but this was not yet the case in Koryŏ times. Plural marriages may have been frequent among the aristocracy. Xu Jing said that it was common for a man to have three or four wives. Concubinage existed, but it is not known how customary it was. Evidence suggests that upper-class men married at about twenty and women at about seventeen. Men lived with their wife’s family until about the age of thirty. Widows as well as widowers appear to have kept their children. All this is a sharp contrast with later Korean practices (see chapter 7).
The Koryŏ elite was not strictly patrilineal. Instead, members of the elite traced their families along their matrilineal lines as well. This gave importance to the wife’s family, since her status helped to determine that of her children. Although the high status and rights of women in Koryŏ stood in contrast to later Korean practice, in many ways Koryŏ society was similar to Japan in the Heian period (794–1192). Much less is known about either Sillan or early Koryŏ society than about Heian Japan, but it is likely that the two societies shared a number of common practices relating to family, gender, and marriage. It is possible that these practices may, in fact, be related to the common origins of the two peoples. This is still a matter of speculation; further study is needed before the relationship between Korea and Japan is clearly understood.
Some changes took place over the nearly five centuries of the Koryŏ period. The adoption of the civil examination system in the tenth century led to careful records of family relations. At the same time, the strengthening of Chinese influences resulted in the gradual adoption of the Chinese practice of forbidding marriage among members of patrilineal kin. As Koreans began to place more importance on direct male descent and the Confucian ideas of the subordination of women to men became more accepted, the position of women declined. The state, for example, enacted laws prohibiting a wife from leaving her husband without his consent. Most major changes in family and gender relations, however, took place only after the Koryŏ period.
[...]
Korea during the Chosŏn (Yi dynasty) period became in many ways a model Confucian society. Confucian ideas shaped family and society in profound ways. Although China was the home of Neo-Confucian ideals and they were embraced by many in Japan and Vietnam, nowhere else was there such a conscientious and consistent attempt to remold society in conformity to them. The zeal and persistence by which Koreans strove to reshape their society in accordance with Neo-Confucian ideals helped to set them apart from their East Asian neighbors. These efforts initially began at the upper levels of society, but by the eighteenth and nineteenth centuries Neo-Confucian norms had prevailed, if to a somewhat lesser degree, among commoners. As Neo-Confucian values penetrated throughout all levels of society, they helped bind the Korean people together as members of a single culture even while sharp class divisions remained. While it moved Korea closer to many Chinese cultural norms, Confucianization was also a creative process of adapting the ideals that originated in China to indigenous social practices. Indeed, it was during this period that many distinctive features of Korean culture, such as its unique writing system, emerged.
The basic ideals of Confucianism centered on proper social relationships. Three cardinal principles (samgang) guided these social relationships: loyalty (ch’ung) of subjects to their ruler, filial piety (hyo) toward one’s parents, and maintaining distinction (yŏl) between men and women. Distinction meant that women had to display chastity, obedience, and faithfulness. Another Confucian formulation that defined the relationships that held society together was the five ethical norms (oryun): ŭi (righteousness and justice), which governed the conduct between ruler and ministers (subjects); ch’in (cordiality or closeness) between parents and children; pyŏl (distinction) between husbands and wives; sŏ (order) between elders and juniors; and sin (trust) between friends.[1] The ethical norms of Confucianism emphasized the importance of family relations, the hierarchical nature of society, the necessity for order and harmony, respect for elders and for authority, the importance of a clear distinction between men and women, and the subordinate status of women. Neo-Confucianists taught that each individual was to strive to cultivate his or her virtue. This was regarded as a lifelong task that involved sincere and persistent effort. Neo-Confucianists placed great importance on rituals and ceremonies, on honoring one’s ancestors, on formality and correctness in relationships, on constant study of the classics as a guide to a virtuous life, and on the importance of public service. They valued frugality, thrift, hard work, and courteousness, along with refraining from indulgence in immoderate behavior. Neo-Confucianists could be prudes, disdainful of spontaneous or sensuous behavior.
It was a philosophy that emphasized rank and status; it was important for everyone to know his or her place and role in society. Concern about rank and hierarchy had always been a feature of Korean society, but Neo-Confucian thought gave that concern ethical purpose. The Korean language reinforced rank consciousness. Lower-class people addressed upper-class persons with honorific forms that Koreans call chondaeŏ or chondaemal. As they addressed their superiors, their sentences concluded with verbal endings that indicated levels of deference. The language contained many synonyms reserved for respectful usage. Superiors in age or social status spoke in turn to their inferiors in a speech style devoid of the elaborate honorific endings and special honorific terms, which came to be called panmal. This use of elaborate speech styles indicating levels of deference, intimacy, and formality is still part of the Korean language.
The Family
Family and lineage were fundamental to the Korean Confucian order. Lineage refers to those people who directly trace their origins to a common ancestor. In Korea these lineages were called munjung. Only those who were in the direct line traced through the eldest son or nearest male relative belonged to a lineage. To keep track of lineage, Koreans began to keep chokpo, books where births, marriages, and deaths were recorded over the generations. This started to become a common practice in the fifteenth century and eventually became a universal custom, so that most Koreans, even today, can usually trace their ancestry back many generations. Families  eventually began to keep pulch’ŏnjiwi (never removed tablets) with the names of their immediate ancestors at the home of the lineage heir, normally the eldest son. Many of the ceremonies and practices associated with lineage found their way into the law code, the Kyŏngguk taejŏn, compiled in 1469. Laws required all Koreans to perform the rites to their ancestors known as chesa. In the chesa ancestral rites family members pay homage to chosang (ancestors). This emphasized that the ties of kinship extended to include the dead as well as the living. Ancestral rites became extremely important in establishing family ties. There were three basic types of chesa: kije, or death anniversary commemorations, which were performed at midnight on the eve of the ancestor’s death day; ch’arye, or holiday commemorations, which were performed on certain holidays; and myoje, or graveside commemorations performed on visits to a family member or to an ancestor’s grave (myo).
At the kije and ch’arye rites, the family members offered food and drink to the ancestral spirits. The rites came to symbolize the importance of maintaining order and properly adhering to rituals. Every aspect of the rituals followed a formal procedure. Food had to be arranged on an altar in a special order: fruit in the front row; then vegetables, soups, and meats; rice and thick soups; and spoons and chopsticks in the back. Red fruit was placed on the east and white on the west. Incense was placed in front of the food table, and a tray for wine was placed in front of the incense. Rites were performed by the eldest direct male relative, who began the ceremonies by kneeling and burning incense and then pouring three cups of wine. Others, generally according to rank, followed, prostrating themselves with their heads touching the floor three times. The eldest male then took the cup of wine after rotating it three times in the incense smoke. When the wine offering was completed the family members left and allowed the ancestor to eat. The men returned and bowed and the food was then served to the family. These rituals came to be performed  exclusively by men. Chesa rituals emphasized the importance of family, lineage, and maintaining a sense of order and propriety.
Marriage in Chosŏn Korea was characterized by extreme exogamy and a strong sense of status. Koreans generally married outside their communities and were prohibited from marrying anyone within the same lineage, even if that lineage contained up to hundreds of thousands of members, as the largest ones did. Yet the concern for status meant that marriages remained confined within a social class. In early Koryŏ times, marriages between close kin and within a village were probably common, but they became less so in subsequent centuries. The adoption of the civil examination system in the tenth century led to careful records of family relations and to the strengthening of Chinese influences forbidding marriage of patrilineal kin. However, in Koryŏ such marriages still took place. During the Chosŏn period the strict rules prohibiting kin marriages were enforced. Men and women married at younger ages than Western Europeans but not as early as in many Asian societies. In 1471, minimum ages of fifteen and fourteen were legislated for men and women, respectively. Men generally married between sixteen and thirty years of age and women fourteen to twenty; the age gap between husband and wife was often considerable. Commoners often married at younger ages than yangban [aristocracy].
Weddings underwent changes in Korea during the Chosŏn period as a result of the impact of Neo-Confucianism. Zhu Xi’s Family Rituals (Chinese: Jiali, Korean: Karye) became the basis for rules governing marriage ceremonies and practices. Koreans did not always blindly adhere to them, and  wedding practices were modified somewhat to conform to Korean customs. For example, Koreans had traditionally married at the bride’s home. But Zhu Xi and other authors stated that it should be done at the groom’s home. Scholars and officials debated whether to follow Chinese custom or kuksok (national practice). A compromise was worked out in which part of the ceremony was performed at the bride’s home, after which the couple proceeded to the groom’s family home to complete the wedding. To this day when marrying, Korean women say they are going to their groom’s father’s house (sijip kanda) and men say they are going to their wife’s father’s house (changga kanda). In Koryŏ times, many if not most newlyweds resided at the wife’s family residence. This custom, contrary to notions of patrilineal family structure, gradually died out, and brides moved into their husband’s home. In many cases, however, young couples lived with whomever’s family was nearest or with whichever parents needed care or had land available to farm.
A variation of marriage custom was the minmyŏnuri, a girl bride who entered the house as a child, often at the age of six or seven. Koreans never felt entirely comfortable with this custom, boasting that, unlike the Chinese at least, they did not take girls at infancy.[2] The girl bride would be ritually sent back to her home to reenter upon marriage, although this was not always practiced. The Yi government disapproved of the custom and set minimum marriage ages, but these were not enforced. Child marriages were practiced mainly by the poorer members of society who needed the child labor and who could not afford costly weddings. Often the family of the bride could not afford a dowry. One advantage of child marriages was that the girl would be trained to be an obedient daughter-in-law, but in general it was a source of shame and a sign of poverty. Many grooms who were too poor to obtain a bride found this to be their only option. It was not unusual for the groom to be a fully grown adult, so that an age gap of as much as thirty years between husband and wife was possible. Korean tales talk of the abuse these child brides received from their mothers-in-law. No doubt for some life was miserable. For much the same economic reasons, some families had terilsawi, or boy child grooms, although this was less common.
Great emphasis was placed on direct male descent, usually through the changja (first son). While this was always important in Korea, it was reinforced by Neo-Confucian thought, especially the influence of Zhu Xi. So important was direct male descent that even the posthumous adoption of a male heir (usually a close male relative) was necessary if a man died before leaving a male offspring. Men also took secondary wives, not just to satisfy their lust but to ensure they had male offspring. Inheritance patterns in Korea differed from those of its neighbors. While in China land was divided equally among sons, and in Japan all rights went to a sole heir, in Korea the trend during the Yi was to exclude daughters from inheritance and to give the largest portion to the first son, although all sons had the right to some property. This meant that most of a family’s property was kept intact and not divided, or at least kept in the lineage.
In short, Korean families during the Yi dynasty became increasingly patriarchal in that the authority of the males was enhanced. They became patrilineal in organization in that property was inherited through males, and that descent and the status that came with it was traced primarily through direct father-to-son or nearest male relative lines. The habit of residence in the groom’s family home after marriage reinforced male dominance. Families and lineages were exclusive; nonmembers could not be adopted into families. Nor could they join lineages, although disgraced members such as traitors and criminals could be expelled from them. Family and lineage truly mattered in Korea, as evidenced by the huge number of printed genealogies produced, perhaps unmatched in volume per capita anywhere else in the world.
Women During the Yi Dynasty
The status of women declined during the Chosŏn period. This can be attributed at least in part, if not primarily, to the fact that Neo-Confucianists stressed direct male descent and the subordination of women to men. Women were urged to obey their fathers in youth, their husbands in marriage, and their sons in old age. Moral literature written for women, a great deal of which was published under the Yi, some of it by the state itself, emphasized that women should be chaste, faithful, obedient to husbands, devoted to in-laws, frugal, diligent, and filial. To promote these values, the state in 1434 awarded honors to women for virtue. Literacy was very low among women, since the village and county schools admitted only men. The small proportion who could write, perhaps amounting in the eighteenth and nineteenth centuries to only 3 or 4 percent at the most, generally did so in han’gŭl, rather than in the more prestigious Chinese characters.
The decline in status was gradual. Households headed by women disappeared early in the Yi dynasty, but women still inherited property until the seventeenth century. Widows were no longer allowed to remarry, since they were supposed to be loyal to their husbands even after their partner’s death. As marriage customs shifted from earlier practices, brides usually left their families after marriage. A daughter was a todungnyŏ (“robber woman”), since she carried away the family wealth when she married. A married daughter became a ch’ulga oein (“one who left the household and became a stranger”).[3] This contributed to the practice of reducing or eliminating a daughter’s share of her inheritance. Since the daughter was thought to leave her family, there was less reason to bequeath a portion of the family estate to her. Women could not divorce men, but men could divorce women under the principle called ch’ilgŏjiak (seven grounds for divorce). The seven grounds were disobedience to parents-in-law, failure to bear a son, adultery, jealousy, hereditary disease, talkativeness, and larceny. So associated were women with their families that they were generally referred to by their relationship to their male family members rather than by their name. It has been suggested that by late Chosŏn, women had become “nameless entities,” referred to as “the wife of” or “the mother of (son’s name).” They had, in other words, not only lost their rights to divorce, to property, and to participation in public life; they had also lost any identity of their own.
Women could no longer freely mix with men socially, and their lives were restricted in many ways. Nae-oe pŏp, inner-outer laws, sought to keep the sexes strictly separate. The official legal code, the Kyŏngguk taejŏn, forbade upper-class women from playing games and from partying outdoors, with penalties of up to 100 lashes. Horse riding, a common activity among upper-class Koryŏ women, was forbidden by law in 1402. Women had to seek the permission of husbands or family heads before participating in social activities. Upper-class women were not allowed to attend services at Buddhist temples or any public festivals.[4] In later Chosŏn, women in Seoul were allowed in the street only during men’s curfew hours, from 9 p.m. to 2 a.m. At other times it became customary for women to wear veils when entering the street. The segregation and restriction of women became reflected in the architecture of the Korean home, which was divided between the sarang ch’ae, the outer section for men, and the anch’ae, the inner section of the house for women, also called the anbang (inner room). Even poor families often had three rooms: one for men, one for women, and the kitchen. Husbands and wives often lived virtually apart in their own home. Religious functions were separated as well, with the women in charge of kosa, offerings to household gods, and the men in charge of chesa, Confucian rites to the ancestors. Unlike in China, women were excluded from the rites to the ancestors. In Korea there was a clear gender division in ritual responsibilities.
Particularly tragic was the position of widows. Since a woman was not allowed to remarry, or head a household, once her husband died she became an inconvenience for her family. There were stories of widows being pressured to commit suicide, but this was probably rare. A widow was sometimes called a mimangin (a person who has not died yet). Among commoners and outcastes widows were sometimes married off to a poor man, sometimes to a widower who needed a wife but could not afford a marriage. The man would enter the house and carry out the woman, supposedly in a big sack, resulting in what was sometimes referred to as a “sack marriage.” This might be arranged by the widow’s family against her will.[5] Also difficult was the life of women in a household where a man took a concubine. This practice was an opportunity for a poor slave or commoner woman to enter an upper-class household. But her life could be made difficult by the jealous first wife and her children, and by the stain of illegitimacy given to her children (see below). First wives could also be made miserable by the entry of a new younger wife with whom they had to compete for their husband’s attention.
The same Confucian demand for loyalty and chastity that made remarriage unacceptable resulted in the custom of presenting a woman with a p’aedo, a suicide knife. This custom, which began among the elite, became common to all social classes in the southern regions. There were reported cases of women using the knife to protect themselves from attackers. In one such case, a government slave girl, Tŏkchi, used her p’aedo to kill a number of Japanese who attempted to rape her during the sixteenth-century invasions. But the purpose of the knife was for a woman to protect her virtue by committing suicide. A particularly sharp knife was called a chamal p’aedo after another slave girl who, after being embraced by her drunken master, always kept her knife sharpened.[6] A woman had not only to protect her honor but, most importantly, to protect her family from even the slightest hint of scandal. Sometimes even rumors of an indiscretion were enough for a woman to be pressured to commit suicide for the sake of her family’s reputation.
The sign of a married woman was the tchok, long braided hair coiled at the nape and held together with a pinyŏ, a long pin. Single women wore their long hair unpinned. Ideally women were kept from public view, secluded in their women’s quarters and venturing out only in screened sedan chairs or at night. In reality, only the upper class could afford this. Rural women worked in the fields, participating in all the tasks except plowing and threshing, which were men’s work. Women could not engage in business, but women’s loan associations, called kye, were an important source of income for rural women. Commoner and low-caste women mixed with men at festivals. The separate existence of men and women was an ideal most honored at the upper reaches of society.
There were also some exceptions to the restricted roles of women. Mudang (women shamans) had been an important part of life since at least Silla times. During Chosŏn times the great majority of shamans were women, although their social status declined as a result of the official Confucian disdain for traditional religions. Some women became entertainers. These were generally from outcaste and slave families, from whom attractive young girls were often purchased to be trained as the entertainers known in Chosŏn times as kisaeng (see photo 7.1).
Women also excelled in some performing arts, for example as singers in the nineteenth-century dramatic form p’ansori.
Perhaps the most interesting exception to the restricted lives of women was the kisaeng. The kisaeng were carefully trained female entertainers similar to the Chinese singsong girls and the Japanese geisha. Kisaeng often came from the slave class. Attractive girls would be taught to read and write, appreciate poetry, and perform on musical instruments so that they could entertain men, especially yangban. Since the lives of men and women were increasingly segregated, the kisaeng offered men the company of women who were not only attractive but able to engage in learned conversation and witty banter. There were also common prostitutes; however, the kisaeng were considered more virtuous as well as highly educated, fitting companions for upper-class men. Kisaeng could be intellectual as well as romantic companions in a way that good, virtuous Confucian wives could not. Some were official kisaeng, recruited and employed by the state and carefully trained in government-regulated houses. During the early dynasty about 100 kisaeng were recruited every three years for the court, while others were trained and sent to provincial capitals.
Most kisaeng were privately employed by the hundreds of kisaeng houses throughout the country. There were, however, also medical kisaeng who, besides entertaining men, treated upper-class women, since women of good families were unable to see male doctors who were not related to them. Others were trained to sew royal garments. Kisaeng, although never entirely respectable, were often admired and loved by men. Some were celebrated for their wit and intellect as well as their beauty and charm. A few talented kisaeng won fame for their artistic and literary accomplishments, such as the sixteenth-century poet Hwang Chin-i (see below). Another, Non’gae, according to legend, became a heroine when she leaped into the Nam River with a Japanese general during the Hideyoshi invasions. But these were exceptions; most kisaeng led humble lives in which the best they could hope for was to become some wealthy man’s concubine.
[...]
An interesting legacy of Chosŏn was women’s literature. In recent years scholars have rediscovered much of this large body of feminine writing. The percentage of women who were literate was small, since even yangban girls were discouraged from learning. Nonetheless, a small number of women became quite accomplished in letters. Lady Yun, mother of Kim Man-jung (1637–1692), is said to have tutored her two sons to pass the civil exams. Lady Sin Saimdang (1504–1551), mother of Yi I (Yulgok), wrote many works in Chinese. Hŏ Nansŏrhŏn (1563–1589), a beautiful and highly intelligent daughter of a high-ranking official, was so talented as a youth that she attracted the attention of well-known poets who tutored her. Tragically, she destroyed many of her poems before dying at the age of twenty-six. Her famous brother, Hŏ Kyun, collected what remained; these proved enough to earn her a reputation as an accomplished poet not only in Korea but also in China. Another distinguished woman writer was Song Tŏk-pong (1521?–1579), the daughter of a high official who became famous as a poet and was the author of the prose work the Diary of Miam (Miam ilgi).[17] Kisaeng such as Hwang Chin-i were often accomplished poets as well.
As in Japan, Korean women wrote primarily in indigenous script while men stuck to the more prestigious Chinese characters to express themselves. Women, if they learned to write, generally wrote in han’gŭl, which was regarded as fitting for them. Han’gŭl, in fact, was sometimes referred to as amgŭl (female letters). Women, following cultural expectations, generally wrote about family matters. But within these restrictions Korean women produced kyuban or naebang kasa (inner-room kasa). These originated in the eighteenth century and were largely anonymous. They included admonitions addressed to daughters and granddaughters by mothers and grandmothers on the occasion of a young woman’s marriage and departure from home. Young brides would arrive with these kasa copied on rolls of paper. They would pass them to their daughters with their own kasa added. Other inner-room kasa dealt with the success of their sons in taking exams, complaints about their lives, and seasonal gatherings of women relatives.[18]
Another genre of women’s literature was palace literature written by court ladies about the people and intrigues of court. A large body of this literature, much of it still not well studied, survives. Among the best known is the anonymously authored Kyech’uk ilgi (Diary of the Year of the Black Ox, 1613), the story of Sŏnjo’s second queen, Inmok. Queen Inmok is portrayed as a virtuous lady who falls victim to palace politics and jealousies. She struggles to protect her son and is imprisoned by Kwanghaegun. The tale ends when the doors of the palace where she is imprisoned are suddenly opened following Kwanghaegun’s overthrow.[19] Another work, Inhyŏn Wanghu chŏn (Life of Queen Inhyŏn), tells of the virtuous life of Queen Inhyŏn, who married King Sukchong in 1681. She too is victimized at the hands of an evil rival, Lady Chang. Today the most read of these palace works is the Hanjungnok (Records Written in Silence) by Lady Hyegyŏng (1735–1815). This is the autobiography of the wife of the ill-fated crown prince Changhŏn. Written in the form of four memoirs, it is a realistic and in most respects accurate account of her mistreatment at court, the tragedy of her husband’s mental illness and death, and the sufferings of her natal family at the hands of political enemies. Her memoirs are a literary masterpiece, and because of their honesty and her astute insights, they are a valuable window into court life in the eighteenth century. Autobiographical writings by women in East Asia are very rare, and one by a woman of such high intelligence so close to the center of political life is especially important.[20]
[...]
Few social changes [in the early modern period] marked a greater break with tradition than those that concerned women. Many Korean women embraced the new ideas and opportunities presented by a modernizing society. Korean progressives in the late nineteenth century saw the humble status of Korean women as symptomatic of the country's low level of civilization. The Kabo Reforms [1894-1896] had abolished some of the legal restrictions on women; they also abolished child marriage and ended the prohibition on widows remarrying. The cause of establishing greater equality for women, begun by the tiny number of Koreans exposed to the outside world in the 1890s, was embraced by much of the intellectual community in colonial times. Many Koreans saw the Confucian concept of namjon yŏbi (revere men, despise women) as emblematic of both the country's backwardness and its past uncritical adoption of Chinese customs. Of particular concern was the exclusion of women from formal education. Reformers noted that girls attended schools in Western countries and that Japan had drawn up plans in the 1870s to make basic education universal and compulsory for girls as well as boys. An early proponent of women's education was Sŏ Chae-p'il, whose editorial in the Tongnip sinmun on April 21, 1896, called for equal education for men and women to promote social equality and strengthen the nation. In another editorial in September that year, he argued that gender relations were a mark of a nation's civilization. Conservatives in the late Chosŏn government were less sympathetic to the need for women's education; a petition to the king by a group of women from yangban families to establish a girls' school was ignored.[25] Women's education was established by American missionaries, not Koreans. After an initial slow start, many families began sending their daughters to these new schools, and foreign missionaries commented on the enthusiasm for education among Korean women. Women graduates of these schools became active in patriotic organizations, and thousands of women participated in the March First Movement [in 1919]. It was only during the 1920s, however, that the women's movement became a major force in Korea. One of its important figures was Kim Maria. Educated in Tokyo, she formed the Taehan Aeguk Puinhoe (Korean Patriotic Women's Society) in April 1919, an organization to promote national self-determination. It worked with the Korean Provisional Government in Shanghai and in 1920 claimed some 2,000 members. The activities of this and other, mostly Christian, women's groups helped win respect for women among Korean intellectuals.
In the 1920s men and women participated in discussions about the role of women and gender relations. Feminists included Kim Wŏn-ju, who published Sin yŏja (New Woman); artist Na Hye-sŏk (1896–1948), who wrote for Yŏja kye (Women’s World); and the poet Kim Myŏng-sun. Some members of this small class of women led lives daringly defiant of tradition. They wore Western-style clothes with short skirts and bobbed hair, socialized in public, advocated free love and the right to divorce, and rejected the confinement of women to the roles of housewife and mother. These ideas, however, were too radical for most Koreans, including male intellectuals. Moderate nationalists called for an educated, healthy woman whose role in society was very much like the "wise mother, good wife" (hyŏnmo yangch’ŏ) ideal promoted by the Japanese government; meanwhile, leftist male nationalists argued for the need to subordinate gender issues to those of class.
Two individuals exemplify this new small class of “modern” women. One is Kim Hwal-lan, known to Westerners as Helen Kim. Born in 1899 to Christian parents in Inch’ŏn, she attended mission schools, became active in the YWCA, went on to Boston University, and received a PhD from Teachers College of Columbia University in 1930. After returning, she became president of Ewha College, the most prestigious school of higher education for women in Korea, a position she held from 1939 to 1961, except for a brief period (1944 to 1945) when the school was shut down by the Japanese. Pak Kyŏng-wŏn (1901–1933), daughter of a rich farmer, attended an industrial arts school in Japan and took a job as a technician in the silk reeling industry, an industry dominated by women workers. She later returned to Japan to train as a driver, a rarity for a woman, and became one of the few women to attend an aviation school. Korea’s first woman aviator, she won a number of flying competitions in Japan before perishing on a flight back to her home in Korea.[26]
The women's movement was quite political, since most writers linked the liberation of women with national liberation. While this may have made the belief in women's rights and equality more acceptable to educated Koreans, it meant that feminists subordinated their own social agenda to the nationalist political agenda. It also meant that the women's movement followed the general split between moderate, gradualist reformers and radical leftists that characterized most political and intellectual activity from the early 1920s. Moderate women reformers were associated with the YWCA and various church and moderate patriotic associations, while some thirty women with socialist and Communist leanings established a more radical group, the Chosŏn Yŏsŏng Tonguhoe (Korean Women's Friendship Society), in 1924. As part of the united front, in 1927, moderate and radical women worked together to organize the Kŭnuhoe (Friends of the Rose of Sharon). By 1929, the Kŭnuhoe had 2,970 members, including 260 in Tokyo.[27]
The colonial legacy for Korean women was mixed. In many ways the "wise mother, good wife" concept promoted by the Japanese and embraced by much of society reinforced the traditional idea of a sharply defined domestic "inner" sphere for women and an "outer" sphere of public life for men. Yet as Sonja Kim writes, this domestic space for women was "infused with new conceptualizations of equality, rights, and humanity."[28] For the vast majority of Korean women, their traditional subordinate social status remained unchanged, but the emergence of a small number of politically active and assertive women among the educated was an important precursor of more radical changes that would take place after 1945.
[...]
An extreme form of coercion [during the wartime colonial period] was the comfort women, or comfort girls. These were young Korean girls who were either recruited or forcibly enrolled as sex slaves to serve the Japanese troops. The so-called comfort girls included Filipinas and Chinese, but most were Koreans. Many of these girls were recruited under false pretenses; they or their parents were told that they were to be given well-paying jobs. In practice, they were treated miserably. After the war, these girls returned home disgraced and were forced to hide their past or live lives as unmarried and unwanted women. Between 100,000 and 200,000 Koreans became comfort women. One example was Mun Ok-ju, an eighteen-year-old woman from a poor family of casual laborers in Taegu, in southeastern Korea, who was offered "a good job in a restaurant" by two civilian recruiters. Lured by the promise of a good salary to support her family, she went along with a group of seventeen other young women between the ages of fifteen and twenty-one who were shipped off to Burma, where she "serviced" thirty men a day under conditions of virtual imprisonment. Five of the girls in her group died or committed suicide.[38]
The abuse of the comfort women has become one of the most contentious issues in colonial history. In many ways it symbolizes the brutality and exploitation of Japanese colonialism at its worst. But it was only one way Koreans were victimized. Koreans also suffered from Allied bombing while working in Japan, for example. Among the more than 2 million Koreans working in wartime Japan, at least 10,000 died from the atomic bombings of Hiroshima and Nagasaki.[39]
[...]
One of the revolutionary changes the North Korean Communists introduced was the concept of gender equality. Even the sense of womanhood as an identity was an important innovation in conservative Confucian Korean society. Women enjoyed equality in education and, at least legally, in pay. Women received equal inheritance rights, divorce was made easier, the taking of concubines was outlawed, and all occupations were in theory open to women.
Women entered the workforce in large numbers. While the ideology of gender equality encouraged this, the prime motivation was a labor shortage. The labor shortage was especially acute after 1953 and remained a problem, since so many young men were in the military and because economic growth relied on labor inputs rather than on improving productivity. In 1958, the Cabinet issued a resolution calling for women to join the workforce. Women who did not work were penalized by receiving smaller rations. The effort to free women for labor was accelerated with the 1976 Law on the Nursing and Upbringing of Children, which called for the creation of 60,000 kindergartens and pre-kindergartens that could accommodate three and a half million children, virtually all the children in this age group.[23] The day care centers also served the function of indoctrinating the young at an early age. They became a great source of pride; a visit to a model day care center was part of the standard tour for foreign visitors.
Another purpose of day care centers was to free women for the workforce. By the 1970s women made up nearly half the labor force, including 70 percent in light industry and 15 percent in heavy industry.[24] But North Koreans were still conservative enough that women were expected to take care of the housework, cook for their families, and raise children. Married women were often let out of work early to collect children and prepare dinner. According to the 1976 law, women with children under thirteen were to be let out two hours early but paid for eight hours.[25] Most of the jobs filled by women were low-paid, menial ones. Few women enjoyed high-status jobs, and it was rare for them to hold jobs as managers. Many were schoolteachers, but by one estimate only 15 percent of university professors were women. One-fifth of the delegates to the Supreme People’s Assembly, the powerless legislature, were women, but there were few women in top positions. The former Soviet intelligence officer Pak Chŏng-ae stood out, until she was purged in the 1960s. Hŏ Chŏng-suk, daughter of the prominent leftist intellectual Hŏ Hŏn, served as Minister of Justice for a while, but she too was purged in the early 1960s.[26] Later, Kim Jong Il’s sister Kim Kyŏng-hŭi wielded some power, but mainly through her husband, Chang Sŏng-t’aek.
Marriages were commonly arranged using the Korean custom of a chungmae or matchmaker, much as was done in the South. By the 1980s love matches were becoming more common, again reflecting a pattern of change similar to North Korea’s modernizing neighbors.[27] Visitors to North Korea noticed the change, with more young couples appearing together in public. But in many respects it was a puritanical society with premarital sexual relations strongly discouraged. The Law of Equality between the Sexes in 1946 made divorce by mutual consent extremely easy, and for a decade divorce was fairly common. This was a major break from the past. But in the mid-1950s, people were required to go to a People’s Court, pay a high fee, and then adhere to a period of reconciliation. As a result divorce once again became uncommon.[28] Family bonds, between husband and wife and especially between parent and child, came under official praise to an extent not found in other Communist states. The 1972 constitution stated, “It is strongly affirmed that families are the cells of society and shall be well taken care of by the State.”[29] The nuclear family was idealized and supported.
The birth rate was quite high in the 1940s, 1950s, and 1960s, then fell. This was partly due to government efforts and partly a result of the normal demographic transition as the country became more industrialized, urbanized, and better schooled. Early marriages were banned. In 1971 the recommended marriage age was set at twenty-eight for women and thirty for men. The long years of military service, small apartments, and the entry of women into the workforce all contributed to a decline in the birth rate. By 1990, the birth rate had fallen to the point that the ban on early marriages was lifted.[30]
[...]
The democratization of South Korea was part of a broad social and cultural change that included the rise of the middle class, of an industrial working class, and of Christianity, and the spread of egalitarian ideals. Another important component of this social and cultural change was the movement for greater legal and social equality for women. At first, attitudes about the role of women in society and the nature of the family changed slowly. After liberation, many South Korean officials and intellectuals were more concerned about preserving or restoring what they sometimes called "laudable customs and conduct" (mip'ung yangsok).[27] In part this was a reaction to the attempts by the colonial authorities to modify the Korean family structure to make it conform to Japanese practice.[28] In a nationalist-traditional response, when the South Korean government created the civil law code in the 1950s, the parts that governed family relations, known as the Family Law, were very conservative, adhering to a traditional patrilineal and patriarchal family structure. The code prohibited marriage between people with the same surname and the same pon'gwan; it not only obligated the eldest son to head the extended family but also gave him a greater share of inheritance. Women were excluded from heading households and received less inheritance, and at marriage they were required to become legally part of their husbands' families. In divorce, which was uncommon, men generally received custody of children. Maintaining these practices was important, it was argued, to preserve the essential nature of Korea's cultural traditions.
In the 1950s and 1960s women organized to challenge these traditions and the laws that protected them in the name of women's equality. A Federation of Korean Women's Groups (Taehan yŏsŏng tanch’e hyŏphŭihoe) led by Lee Tai-young (Yi T’ae-yŏng) (1914–1995) fought during the 1950s and 1960s for legal reforms establishing the equality of men and women in marriage, divorce, child custody, and inheritance. Lee, the daughter of a miner, worked as a seamstress before becoming South Korea’s first woman lawyer in 1952. She founded the Korea Legal Aid Center for Family Relations, a nonprofit that provided assistance to poor, uneducated women and was a champion of equal justice and rights for women. Early women’s rights advocates were up against entrenched patriarchal attitudes. With the expansion of women’s education, however, and the gradual acceptance of the ideas of equality, attitudes toward these matters began to change. Even under the very conservative Yushin period in the 1970s, a Pan-Women’s Group for Revision of the Family Law succeeded in revising the law in 1977 to give greater rights to women in these four areas: marriage, divorce, inheritance, and child custody.[29]
More significant changes took place when women’s rights became part of the great upsurge in political and social activism of 1987. In that year, female activists created the Korean Women’s Association (Han’guk Yŏsŏng Tanch’e Yŏnhap).[30] In 1989, the Family Law, in part due to the pressure from this and other groups, was again revised, with most of the old patriarchal provisions eliminated or modified. Up to that time the eldest son was still expected to succeed as the head of the house, receive extra property in inheritance, and take care of his parents in old age. Under new legal revisions, complicated by court rulings, this was no longer automatically the case. Other changes were slowly taking place. The emphasis on universal education meant literacy rates among women were as high as for men, and there was no significant difference in the percentage of women completing secondary education. But in higher education women tended to be confined to nonprofessional programs, studying home economics, English, and fine arts. South Korea lagged far behind most industrial nations in the early 1990s in the percentage of women represented in law, medicine, and the other professions. Few served in government, and they were still expected to resign from work when they married.
[...]
South Korea in the first decade of the twenty-first century was a society still undergoing rapid change. As was the case with its economic development, a social transition that took decades in most other countries occurred over relatively few years. One of the most dramatic changes was demographic. The once high birth rate fell sharply in the 1960s with a government-sponsored birth-control program. By 1983, it was only slightly above the replacement level, with women having an average of 2.1 children. Other changes also contributed to the creation of a two-child norm by the end of the 1980s: the urbanization of the population, now crammed into small apartments and townhouses; the enormous expense of education; and the high literacy rate of women were all important factors. Cultural norms changed as well in what was still a rather conformist society, and a small family with a son and a daughter increasingly became the ideal. By the late 1990s, however, the birth rate was falling again, dropping below the natural replacement rate. The sharpest drop was in the five-year period 1997–2002, blamed on the economic crisis of the late 1990s; but the return of good economic conditions did not reverse the trend. In 2004, the birth rate had fallen to 1.08, one of the lowest in the world, even lower than Japan’s 1.3 rate, which was the cause of so much concern there. It rose a bit to 1.2 by 2014, then in 2018 it fell back to 0.98, the lowest in the world. As a result, South Korea’s population, which stood at 51 million in 2019, was expected to decline to 40 million by 2056 and 20 million by 2100. By the late 2010s, officials were referring to this as an “extreme demographic crisis.”[18] Various state policies to raise the birth rate were ineffective. In late 2018, a government panel recommended improving living conditions as a more effective way to increase fertility, but the experiences of other countries with low birth rates suggested there was no easy solution to falling fertility.
This led to a related problem. South Koreans were living longer; life expectancy had reached about seventy-five for men and eighty-two for women in 2008 and was still rising, even as the birth rate was dropping. As a result, South Korea was rapidly becoming an “aging society,” with 14 percent of its population over sixty-five in 2015, a share expected to rise to 25 percent by 2030 and 38 percent by 2050, which would be one of the highest, if not the highest, in the world.[19] At that point there would be only 1.4 adults of working age for every senior citizen. The workforce began to shrink in the late 2010s. Becoming the world’s fastest-aging society was becoming the nation’s greatest challenge. The government responded by passing a law in 2013 making sixty the minimum retirement age, and it began providing financial subsidies for parents with multiple children. Local governments came up with many incentives, offering bonuses for a second and third child and free babysitting services. The government discouraged abortions. It also called for lowering housing costs, improving the balance between work and childcare, and reducing the costs of weddings. At the same time there was concern over rising poverty among those over sixty-five. In 2018 the percentage of elderly living below the poverty line (defined in relative terms) was the highest of any modern developed country, resulting in a rising suicide rate among those over sixty-five even as the overall suicide rate was declining. It also contributed to a rise in crime committed by the elderly.[20]
[...]
Family remained a central unit of society, as reflected in the welfare policies that still expected family members to take care of each other, in the corporate culture in which the chaebŏls that dominated the country were still family enterprises, and in the nearly obsessive focus on educating family members. In fact, so strong was the family-centeredness of South Korea’s society that some observers have referred to the country as undergoing familial modernization.[29] Yet perhaps no social changes in South Korea were more dramatic than those concerning gender and family. The legal codes were amended to allow women to head households, inherit property, and initiate divorce, and gender discrimination was legally prohibited by the early 1990s. Enrollment of women in colleges and universities soared past that of men in the 2000s. In 2010, women made up 49 percent of those in master’s degree programs and 31 percent of those in doctorate programs.[30] This was still behind men, but the number of women graduate students had nearly doubled in a decade. Higher education was no longer a finishing school where girls majored in home economics, English, and art. Women, however, still faced discrimination in the workplace and elsewhere. Increasingly educated, organized, and empowered, Korean women were no longer accepting the patriarchal traditions of their society. Women also campaigned against the sex industry, which had always been a major employer of women. In South Korea a double standard prevailed: it was accepted that married men frequented the “room salons” and other places that employed sex workers, while the women who worked there were deemed not virtuous. Under pressure from women’s groups, the Kim Dae Jung administration created a Ministry of Gender Equality in 2001, renamed the Ministry of Gender Equality and Family in 2005, to deal with these issues. Women in the 1990s dealt more openly with previously taboo issues such as spousal abuse and sexual harassment. During the military regimes, some women had protested against Japanese sex tourism. After 1990, women’s groups refocused on these issues, brought attention to the South Korean government’s complicity in making prostitution available in base camps used by American troops in Korea, and in other ways confronted the issue of sexual exploitation of women.[31]
From the perspective of South Korea’s historical legacy of male domination, the changing role of women was almost revolutionary, yet by most measures Korean women still lagged behind their counterparts in other developed nations. In 2006, women made up only 3 percent of executives in companies of over 1,000 employees; Samsung had only 12 women out of 1,300 officers and managers; Hyundai Motors and POSCO had no women in top positions. In 2000 the OECD found that Korea had the widest gender gap of any of its members, a finding repeated in another study in 2017. In that year Korean women earned only 51 percent as much as men. The gender gap in wages was greater not only than in all Western countries but than in most other Asian countries as well.[32] In another international study in 2017, Korean women ranked 90 out of 142 countries in political empowerment, and 121 out of 142 in economic opportunity and participation.[33] Women did make some progress in politics, although here too they lagged behind their sisters in most industrial states. In 1992, only 1 percent of the members of the National Assembly were women; by 2015, 16 percent were, but eighty-three countries had a higher percentage of women parliamentarians.[34] In the spring of 2006, Han Myung-suk (Han Myŏng-suk) (1944–) became the first female prime minister, and in 2013 Park Geun Hye became the first woman president; she was, however, impeached and removed from office four years later as a result of scandal.
Korean women typically married at the age of twenty-nine and had their first baby at thirty. Many were not marrying at all; it was estimated in 2013 that 15 percent of all women would remain single. Child-rearing and housework were major burdens; a study showed that wives devoted five times as much time to these tasks as husbands did.[35]
Another break with ancient tradition was the ending of the prohibition in the Civil Code against marriage between people who shared the same surname and ancestral home. Most Koreans share one of a small number of family names, with nearly half named Kim, Lee (or Yi or Rhee), or Park (or Pak). The surnames are broken down into clans that share the same ancestral home and reputed ancestral descent. Some clans, such as the Gimhae Kim, the Miryang Park, and the Chŏnju Lee, had hundreds of thousands of members (the Gimhae Kim alone had 1.5 million), creating hardship for young people who shared a remote and theoretical ancestry and happened to fall in love. Marriages between them were not recognized, and their children were regarded as illegitimate. A court ruling declared this law unconstitutional in 1997, and in 2002, after lobbying by reformers, the National Assembly formally repealed it.
A truly unprecedented change in Korea’s social history was the rise in the divorce rate. Up through the 1980s, divorce brought great shame and was uncommon, but by 1990 this had begun to change. Between 1995 and 2005 the divorce rate tripled. By 2005, the rate was 2.6 divorces per 1,000 people, a little higher than Japan’s 2.3 or the European Union average of 1.8, although less than the U.S. rate of 4.0 per 1,000. Many women were opting out of marriage; in one survey of college women, a third said they did not want to get married.[36] For men the problem was not enough women. This was the product of Korean preferences for sons. In the 1980s and 1990s the use of sonograms resulted in increased abortion of female fetuses, with the result that more boys than girls were born. The imbalance peaked in the early 1990s, when there were 117 baby boys per 100 girls, one of the highest ratios in the world. That surplus of boys became a serious problem in the early twenty-first century. In the 2000s, however, the gap between male and female births narrowed to just slightly above the natural ratio of 105 males to 100 females. The use of sonograms produced similar sexual imbalances in China, India, Vietnam, and other nations, but South Korea was the first Asian nation to show this sharp reversal, partly due to government measures in 1991 restricting the use of sonograms by medical personnel. Public awareness of the problem and changing attitudes also contributed. Many Koreans began to value daughters, often seeing them as caregivers in their old age. Surveys in the early twenty-first century suggested that the age-old preference for sons over daughters no longer prevailed.
Families themselves were changing. In 2017, the average household contained only 2.5 members, less than half the size of a generation earlier.[37] A quarter of Korean households were headed by women; once a rarity, this was no longer uncommon. According to one poll in 2007, one in six single women said they would be happy to have children without having husbands.[38] Even adoptions were becoming more common. Because Korean culture placed such emphasis on bloodlines, it had been rare to adopt children; as a result, agencies sprang up after the Korean War to arrange adoptions to the United States and Europe. Embarrassingly for many, Korea continued to be a source of adoptees for Westerners at the start of the twenty-first century. This was beginning to change, although slowly. Koreans were also showing more acceptance of gay and lesbian relations, and of sexual diversity in general. Still, in some ways South Koreans were socially conservative. In international surveys, they were less likely to approve of cohabitation without marriage, or to believe that people can be happy without marrying, than people in most Western nations or in other developed Asian nations such as Japan and Taiwan.
Ethnic Homogeneity and Multiculturalism
South Korea’s low birth rate contributed to one of the most radical changes in Korean society in centuries: the end of ethnic homogeneity. A factor contributing to this was the shortage of women due to the preference for males, which hit rural men hard. Few young women wanted to live on a farm, and with the supply of marriage-age men greater than that of women, they were able to avoid doing so. Consequently, many rural men sought wives from abroad. By 2006, more than a third of male farmers married foreign women, mostly Chinese and Vietnamese, but also women from other Asian countries such as the Philippines and Uzbekistan. According to the National Statistical Office in Seoul, marriage to foreigners accounted for 13 percent of all marriages in 2005; more than 70 percent of these were between Korean men and women from other Asian countries. According to one study, by 2020 Kosians (bi-ethnic children) would make up one-third of children born in South Korea.[39] In a homogeneous society such as Korea this was a startling statistic. Public awareness of inter-ethnic and interracial marriage, and of its implications for what it meant to be Korean, was heightened by the visit of Hines Ward, a Korean-born American football hero, the son of a Korean mother and a black American father, and by a popular TV drama, The Bride from Hanoi.
5 notes · View notes
uncontainedkc · 4 years ago
Text
Necessary Disruption: Housing Reimagined
“There’s no place like home!” is more than a popular line from the classic movie The Wizard of Oz. Home is a safe place, a place to grow and create a lifetime of memories with your loved ones. Home is an ideal. It is the American Dream. Sadly, home has been an unattainable circumstance for millions of humans who lived and died through various tragedies on American soil throughout our troubling history of racism, slavery and discrimination. Home continues to be a mere illusion of a reality that is completely unknowable and out of reach for some. Specifically, more than 500,000 Americans are unsheltered today. Millions more are housing insecure, including 2.5 million children. Despite the fact that housing is a basic physiological need for human survival, “home” evades millions of people in the wealthiest nation on earth, America.
The long-standing tradition of limiting generational wealth and status by prohibiting land ownership, coupled with rampant housing discrimination, is ever-present even today. Housing in this country is treated as a luxury and not as a human right. That is a problem.
A disruption is necessary.
LIMITING WEALTH BY RESTRICTING ACCESS TO OWNERSHIP OF LAND AND REAL PROPERTY IN THE UNITED STATES.
Understanding the shift we must make requires that we understand the roots of our current land ownership and housing system. Historically, housing in the United States has long been an area of explicit, strategic discrimination and oppressive practices. These practices were implemented and maintained as a way to control the mobility, status, and life opportunities of populations deemed inferior or less desirable. It was also the most effective way to concentrate power and wealth in a select group of people- white men and, by extension, white women.
From the time Europeans landed in the Americas, there has been a race for land acquisition. Once the Native Americans and the Mexican states were forcibly removed from their lands and homes via murder, enslavement, or cultural genocide, the way was made for what has become the United States of America. The stolen parcels, stained with the fresh blood of the rightful inhabitants who gave their lives defending their homes, were divided up for the new owners. When it came time to distribute the stolen land parcels, the privilege of ownership was available almost exclusively to a select class- white male immigrants.
In this country, for at least fifteen generations, land ownership was the currency by which one built and maintained family wealth and passed such wealth down to future generations. The institution of slavery ensured that ownership was a privilege specifically denied to most Black, Native and Mexican people, and their children, for fifteen-plus generations. For centuries, they built wealth for landowners while themselves owning nothing and having nothing to pass down to future generations.
There are some significant legislative landmarks that had lasting impacts on current day US housing:
40 Acres and a Mule
After the Civil War, blacks legally gained citizenship via the Civil Rights Act of 1866, which was affirmed by the 14th Amendment in 1868, and Congress passed the Southern Homestead Act. The stated purpose of the act was to allow land in southern states to be acquired by formerly enslaved people. Hence the expectation of 40 acres and a mule as recompense for generations of deprivation and imposed abject poverty. This was also seen as a way to stabilize black families and allow for a basic opportunity to build a life after the horrors they endured. However, people holding two particular occupations were excluded from being beneficiaries of the act: domestic servants and agricultural workers. As coincidence would have it (insert sarcasm and a major eye roll), formerly enslaved people, Native Americans, and Mexicans just so happened to occupy those roles in society. So white males were again legally allowed to say “Sorry, no land for ‘you people’- still.” The inability to own anything, combined with meager wages, meant no wealth in the form of land or money could be passed down to the children of Black, Native and Mexican families for another five to eight generations.
Creating the Ghettos- Redlining
Congress passed the National Housing Act of 1934, which introduced the concept of redlining. Security maps for residential neighborhoods were created across the country. The security maps designated areas of high risk- which were majority black and minority communities. These maps were created by the Home Owners’ Loan Corporation as a way to outline the neighborhoods in red (hence the term redlining) so that banks would know exactly which areas to deny mortgages or improvement loans. The lack of loans prevented home ownership, community improvement and updating, which led to crumbling infrastructure and the devaluing of those neighborhoods. The domino effect of crumbling infrastructure, no maintenance or upkeep by landlords, and ever more crowded environments drove property values down further. Since the properties were in disrepair, the property taxes collected based on their value were insufficient to fund schools at a reasonable level, resulting in a collapse of the school system. By design, the infrastructure of these redlined areas imploded- making it easy to shove minorities in but nearly impossible to get out.
Public Housing- Redlining 2.0 the new Ghettos
Low-income housing and further segregation were the end effects of The Housing Act of 1937. The intent was to provide relief from the Great Depression for standard low- and middle-income families. Over time the housing units were provided only to low-income, mostly minority families. The units were built intentionally in segregated parts of town, further entrenching segregated housing for Black, Hispanic and Asian populations.
Black WWII soldiers denied GI Bill benefits
The GI Bill was signed by FDR in 1944 to provide soldiers returning from WWII with education, training, employment assistance, and loans for farms, businesses and houses. The low-cost mortgages led to the rise of the suburbs. The problem: although blacks were technically included in the benefits of the bill, they couldn’t live in the suburbs. The discrimination was upheld because whites did not want minorities moving to their neighborhoods. They believed that minorities drove down property values. It was also considered unethical to sell a home to a black person in a predominantly white neighborhood. There were covenants and clauses to ensure homes in most suburbs could only be sold to white families.
Civil Rights Act of 1964
Enduring 250 years of chattel slavery and then 99 years of slavery in a different form brings us to 1964, when the Civil Rights Act was passed. (Of course, we are not detailing many of the tragic and important events of this time frame. It is worth noting that these years were hell for non-white people in nearly every way, shape and form!) The act prohibited discrimination on the basis of race, color, religion, sex, or national origin. So, finally, after dozens of generations of racist and discriminatory practices, we will get some housing justice and equity, right? Nope.
Even since the Civil Rights Act was passed, discriminatory practices have continually affected who owns land and real property.
Racial home ownership gaps were at their highest levels in 50 years in 2017. Home ownership rates:
79.1% of white Americans
41.8% of black Americans
This gap is even larger today than it was when deliberately racist and discriminatory redlining practices were rampant. Redlining was an effective systemic method of maintaining social hierarchy, and we still feel the effects today. It has kept blacks in certain neighborhoods and prevented them from owning land or real property. This practice resulted in another three to five generations of limited opportunities, quality of life, and generational wealth for non-white Americans. This isn’t ancient history. A person who is 56-57 years old has lived this reality.
First Generation of Legally Free and Fully Equal Human Beings
In 2020, we are now living with the first generation of African Americans deemed to be legally, fully free, equal human beings in this country. I am one such African American, born to parents who lived through segregation with no basis of wealth and systemically limited opportunities. The lack of generational ownership or wealth is critical to understanding the wealth disparity in the black middle class today. The lack of generational wealth also contributes to the lack of mobility of lower-class black Americans. This reality makes it harder- if not impossible- to accrue and pass along wealth to any future generations.
Land ownership has long been the mechanism by which wealth and status are transferred. The deliberately exclusionary nature of land and real property ownership over the past 400 years has led us to our modern-day housing crisis. Our current housing circumstance in the US is precarious, but we are here by design.
A disruption is necessary.
https://www.uncontainedlivingkc.com/post/necessary-disruption-housing-reimagined
1 note · View note
46ten · 5 years ago
Text
The Forgotten Fifth
I started this post years ago, but unfortunately since have lost many of my notes. Still, at this time (and the day after Juneteenth) I think it’s critical that we understand that Black Americans have been here since the beginning, have advocated for themselves, and have fought for themselves. Our inability to “see” Blacks in American history means we don’t understand why Native American slaughter and Westward expansion happened, we discuss the goals of the antebellum “South” as though 4 million Blacks did not live there (and comprised nearly 50% of the population in some states), we rarely bring forward the consequences of the self-emancipation of enslaved Blacks on the Confederate economy, and so on. 
It’s also critical to reject false historical narratives that place white Americans as white saviors rescuing Blacks. Within the Hamilton fandom, there is a strong white supremacist narrative embedded in the praise for John Laurens*, an individual who could not be bothered to ensure the enslaved men with him were properly clothed - which says more about his attitude towards Blacks than any high-flown language he could use to write about them as an abstraction. And if we want to praise a white person for playing a big role in encouraging the emancipation of Blacks during the American Revolution, the praise should go, in a roundabout way, to the Loyalist Lord Dunmore.
The Forgotten Fifth is the title of Harvard historian Gary B. Nash’s book, and refers to the 400,000 people of African descent in the North American colonies at the time of the Declaration of Independence, one-fifth of the total population. Contrary to how they are commonly depicted, Blacks in the colonies were not waiting around for freedom to be given to them, or to assume a place as equals in the new Republic. Enslaved Blacks seized opportunities for freedom, they questioned and wrote tracts asking what the Declaration of Independence meant for them, and they organized themselves. And they chose which side to fight on depending on the best offers for their freedom. At Yorktown in 1781, Blacks may have comprised a quarter of the American army.
Most of what’s below is taken from Wikipedia; other parts are taken from sources I have misplaced - the work is not my own.
In May 1775, the Massachusetts Committee of Safety enrolled slaves in the armies of the colony. The action was adopted by the Continental Congress when it took over the Patriot army. But Horatio Gates in July 1775 issued an order to recruiters, ordering them not to enroll "any deserter from the Ministerial army, nor any stroller, negro or vagabond. . ." in the Continental Army.[11] Most blacks were integrated into existing military units, but some segregated units were formed.
In November 1775, Virginia’s royal governor, John Murray, 4th Earl of Dunmore, declared Virginia to be in a state of rebellion, placed it under martial law, and offered freedom to enslaved persons and bonded servants of patriot sympathizers if they were willing to fight for the British. Lord Dunmore’s Ethiopian Regiment consisted of about 300 enslaved men.
In December 1775, Washington wrote a letter to Colonel Henry Lee III, stating that success in the war would come to whatever side could arm the blacks the fastest.[15] Washington issued orders to the recruiters to reenlist the free blacks who had already served in the army; he worried that some of these soldiers might cross over to the British side.
Congress in 1776 agreed with Washington and authorized the re-enlistment of free blacks who had already served. Patriots in South Carolina and Georgia resisted enlisting slaves as armed soldiers. African Americans from northern units were generally assigned to fight in southern battles. In some Southern states, black slaves substituted for their masters in Patriot service.
In 1778, Rhode Island was having trouble recruiting enough white men to meet the troop quotas set by the Continental Congress. The Rhode Island Assembly decided to adopt a suggestion by General Varnum and enlist slaves in the 1st Rhode Island Regiment.[16] Varnum had raised the idea in a letter to George Washington, who forwarded the letter to the governor of Rhode Island. On February 14, 1778, the Rhode Island Assembly voted to allow the enlistment of "every able-bodied negro, mulatto, or Indian man slave" who chose to do so, and that "every slave so enlisting shall, upon his passing muster before Colonel Christopher Greene, be immediately discharged from the service of his master or mistress, and be absolutely free...."[17] The owners of slaves who enlisted were to be compensated by the Assembly in an amount equal to the market value of the slave.
A total of 88 slaves enlisted in the regiment over the next four months, joined by some free blacks. The regiment eventually totaled about 225 men; probably fewer than 140 were blacks.[18] The 1st Rhode Island Regiment became the only regiment of the Continental Army to have segregated companies of black soldiers.
Under Colonel Greene, the regiment fought in the Battle of Rhode Island in August 1778. The regiment played a fairly minor but still-praised role in the battle. Its casualties were three killed, nine wounded, and eleven missing.[19]
Like most of the Continental Army, the regiment saw little action over the next few years, as the focus of the war had shifted to the south. In 1781, Greene and several of his black soldiers were killed in a skirmish with Loyalists. Greene's body was mutilated by the Loyalists, apparently as punishment for having led black soldiers against them.
The British promised freedom to slaves who left their rebel masters to side with the British. Thousands of refugee slaves migrated to British-occupied New York City to gain freedom. The British created a registry of escaped slaves, called the Book of Negroes, which included details of their enslavement, escape, and service to the British. If accepted, a former slave received a certificate entitling transport out of New York. By the time the Book of Negroes was closed, it had the names of 1,336 men, 914 women, and 750 children, who were resettled in Nova Scotia. They were known in Canada as Black Loyalists. Sixty-five percent of those evacuated were from the South. About 200 former slaves were taken to London with British forces as free people. Some of these former slaves were eventually sent to found Freetown in Sierra Leone.
The African-American Patriots who served the Continental Army found that the postwar military held no rewards for them. It was much reduced in size, and state legislatures such as Connecticut and Massachusetts in 1784 and 1785, respectively, banned all blacks, free or slave, from military service. Southern states also banned all slaves from their militias. North Carolina was among the states that allowed free people of color to serve in their militias and bear arms until the 1830s. In 1792, the United States Congress formally excluded African Americans from military service, allowing only "free able-bodied white male citizens" to serve.[22]
At the time of the ratification of the Constitution in 1789, free black men could vote in five of the thirteen states, including North Carolina. That demonstrated that they were considered citizens not only of their states but of the United States.
Here’s another general resource: https://www.pbs.org/wgbh/aia/part2/2narr4.html
*The hyper-focus on John Laurens is one of the ways white people in the Hamilton fandom tell on themselves - they center a narrative about freedom for Blacks around a white man (no story is important unless white people can stick themselves at the center of it, no matter how historically inaccurate!). Lord Dunmore’s 1775 proclamation, if known at all, is dismissed as merely cynical political maneuvering, while Laurens’ vision is seen as somehow noble.
**Whether Lord Dunmore’s Proclamation - encouraging enslaved Blacks to rise up and kill their owners and join the Loyalist cause - played a major role in the progress of the American Revolution was hotly debated as part of the 1619 project. 
4 notes · View notes
magicalgirlfumiko · 5 years ago
Text
The Aeon Sisterhood
Please Enter Password: ***********
Login Successful.
Please Select File:
File: aeon_sister.pdf
Document opened.
Review: The Roman Catholic Church was once a powerhouse in the world. While the Knights Templar are the best remembered of the various old Crusader orders, most of which have devolved into mere conspiracy theories, several remain around to this day, enforcing the will of the Church upon the world of magic.
Note One: The Agency has no official ties to the Church. Mages were always in positions of power in Europe during the Early Middle Ages. The affairs of the daily world were left to the Mage Council, while events of the divine were assigned to Church oversight. The first usage of magical girls by the Church came during the Children’s Crusade; traditionally the two popular crusades were led by young men, but it may have been a magical girl named █ █ █ █ █ █ █ who led them. Joan of Arc is considered a member of this movement. █ █ █ █ █ █ █ and Joan of Arc are seen as the foundational myths for the movement later known as the Sisterhood. 
Note Two: The Church officially maintains a secret group of crusaders to this day. They long ago abandoned using Templar as a title, due to the order’s loss of power in France on March 22, 1312. The abrupt reduction in power of a significant group in European society gave rise to speculation, legend, and legacy through the ages. The new secret order would be called the Aeons. This movement follows many of the lost Gnostic gospels, in which the source of all being is an Aeon, in which an inner being’s soul dwells. The theology is much more complex, and the Agency has had only limited access to understanding how the hierarchy of this movement works. It is through mastery of their Aeons that they are able to use magic, though they deny it being a type of sorcery. 
Note Three: The Aeons have a segregated set-up, meaning that male and female members belong to different branches. We will be looking at the Sisterhood.
Much like the Flowers Program, the Sisterhood has found beings that can grant miracles. During the Battle for Rome in 1944, Agency soldiers in the Allied armies noted that there were young women with angelic-like beings as their guides. By all accounts, these young women were magical girls who had been granted their powers by a miracle. It is unknown if these are actual angels or beings of a different origin.
Note Four: Letter recovered during the Battle for Rome, 1944. Translated from Italian into English.
Greetings my dear child,
It is July of 1944 AD. The fact that you are here in the holy nation of Vatican City means that you were summoned by one of the Lord’s Messengers.
This means that you have been chosen to join the Sisterhood, an agency that works for the Church. Despite our usual views on sorcery, slayings, and the like, the Sisterhood has once again called for a special group of young women who have a high magical and psychical potential.
The reasons why you have been chosen vary from a deep devotion to your faith to a totally personal raison d’être, like removing a curse.
Since you have come this far, all you have to do is take care of the sin that exists in this world.
You see, ever since the massive slaughter of many innocents in the Great War, now nearly thirty years in the past, a great host of undead and other plagues have revealed themselves to the world. Now, with the increased rate of death, seven vampires called the Sins have awoken and are seeking their final member.
Supernatural agencies from both warring parties believe that the Aeons are a danger and will do anything to stop you in your quest… No matter the dangers involved, we can grant you the one desire that you want the most in exchange for your hard work.
Choose to continue and you will now be known as an Aeon.
You are a part of a strong agency that has existed since the time of the Crusades.
This is the price of your salvation and your freedom.
Your Obliging Friend,
Archbishop Francois St. Clair
Note Five: The Sisterhood is still centered in the halls deep under the marbled churches of Vatican City. They also have chapters in mainly Catholic nations like Spain, Portugal, France, Poland, and Mexico. It is believed that their main mission remains to fight their Crusade against the vampire sisters known as the Sins. They tend to be very devoted to their cause, with a Crusader level of loyalty. The Agency has banned them from entering the Holy Land for obvious reasons.
Note Six: Like all magical girl movements, the Sisterhood wears a uniform. These accounts come from research done in the early 2000s and may be out of date by the time of reading. All members wear a white version of a nun’s outfit. There are several different ranks: novices start with a plain white outfit, followed by an outfit with a single blue trim, then two blue trims, then three. The highest levels have either two silver crosses on the cuffs or two golden crosses on each arm. The Agency has no records on what these ranks mean.
Note Seven: Since the Sisterhood tends to deal with vampires and werewolves, their arsenal is very similar to that of a Slayer. Common weapons tend to be crossbows blessed with Divine magic that fire arrows that use divine fire to purify. Holy Water hand grenades were reported during an incident in Spain in 1977. Other possible weapons include lances and swords. The Sisterhood seldom uses the armor or types of swords seen in use by Knight-class magical girls in France.
Note Eight: Agency personnel are advised to avoid too much interaction with the Sisterhood. Any interaction between them and the Flowers generally ends in a duel. Treat them with respect if they ask for your assistance. It is unknown how large their movement is, so we also recommend avoiding the important religious sites of the Church, since members are likely nearby to defend them.
█   █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █ █
It appears that much of the rest of the file has been corrupted. We sincerely apologize.
3 notes · View notes
sataniccapitalist · 6 years ago
Photo
FEBRUARY 18, 2019
31 Actual National Emergencies
 by PAUL STREET
A Wannabe Strongman’s Brown Menace Straw Man
Everyone with five functioning gray cells knows that the aspiring fascist strongman Donald Trump’s Declaration of a National Emergency on the U.S.-Mexico border is absurd.
There is no “national security crisis” of illegal immigration on the southern United States border.
Illegal crossings are not at “emergency” levels; they are at a fifty-year low.
Undocumented immigrants are not a crime and violence threat.  They are less likely to commit crimes, violent ones included, than native-born U.S. citizens.
Drugs come into the U.S. not through gaps in border fencing but primarily through legal ports of entry.
There is no big call for a completed U.S.-Mexico wall on the part of U.S. citizens on the southern border.
The United States military has not been “breaking up” and blocking “monstrous caravans” of illegal immigrants trying to harm the U.S.
The only crisis at the border is the humanitarian one created by Trump’s war on asylum-seekers and legal as well as technically illegal immigrants. The wannabe strongman has set up a ridiculous brown-menace straw man in an effort to take an unprecedented step. He wants to use the National Emergencies Act to fulfill a ridiculous campaign promise to his white-nationalist base. He wants to make an end run around Congress to spend federal taxpayer money on a project that lawmakers chose not to fund – a political vanity scheme that is opposed by 60 percent of the U.S. populace.
Actual National Emergencies
An irony here is that the United States today is in fact haunted by many actual and interrelated national emergencies.  Here below are the top thirty-one that came to the present writer’s mind this last weekend:
1. Class Inequality. America is mired in a New Gilded Age where economic disparity is so extreme now that the top thousandth (the 0.1 percent, not just the 1 percent) possesses more wealth than the bottom 90 percent of the U.S., and three absurdly rich U.S.-Americans – Jeff Bezos, Bill Gates, and Warren Buffett – possess more wealth between them than the bottom half of the country.
2. Poverty. The nation’s 540 billionaires (Trump is one of them) enjoy lives of unimaginable opulence (Trump flew off to one of his resorts to play golf after declaring his “national emergency” – an “emergency” he foolishly said he didn’t actually have to declare) while 15 million children – 21% of all U.S. children – live in families with incomes below the federal poverty threshold, a measurement that has been shown to be drastically below the minimally adequate budgets families require to meet basic expenses.
3. Plutocracy. “We must make our choice,” onetime Supreme Court Justice Louis Brandeis wrote in 1941. “We may have democracy, or we may have wealth concentrated in the hands of a few, but we can’t have both.” Consistent with Brandeis’s warning, the leading mainstream political scientists Benjamin Page and Martin Gilens find through exhaustive research that “the best evidence indicates that the wishes of ordinary Americans actually have had little or no impact on the making of federal government policy.  Wealthy individuals and organized interest groups – especially business corporations – have had much more political clout.  When they are taken into account, it becomes apparent that the general public has been virtually powerless… Government policy,” Page and Gilens determined, “reflects the wishes of those with money, not the wishes of the millions of ordinary citizens who turn out every two years to choose among the preapproved, money-vetted candidates for federal office.” Economic power is so concentrated in the US today that you can count on one hand and one finger the multi-trillion-dollar financial institutions that control the nation’s economic and political life: Citigroup, Goldman Sachs, JP Morgan Chase, Wells Fargo, Bank of America, and Morgan Stanley. “You have no choice,” George Carlin used to tell his audiences earlier this century. “You have owners. They own you. They own everything. They own all the important land. They own and control the corporations. They’ve long since bought and paid for the Senate, the Congress, the state houses, the city halls. They got the judges in their back pockets and they own all the big media companies, so they control just about all of the news and information you get to hear.”
4. Bad Jobs. Trump boasts of American job creation and the low official unemployment rate (real joblessness is a different story) while deleting the fact that tens of millions of the nation’s workers struggle with jobs whose pay lags far behind employment growth thanks to declining unionization (down to 6.5% of the private-sector workforce due to decades of relentless employer hostility), inadequate minimum wages, globalization, automation, and outsourcing. A third of the nation’s workers make less than $12 an hour ($24,960 a year assuming full-time work) and 42% get less than $15 ($31,200 a year). Good luck meeting a family’s food, rent, childcare, medical, and car payment (car ownership is often required in a nation that lacks adequate public transportation) costs on those kinds of returns on labor power. The Federal Reserve Bank of New York recently reported that a record 7 million U.S.-Americans are three months or more behind on their car payments. As the Washington Post reports: “Economists warn this is a red flag. Despite the strong economy and low unemployment rate, many Americans are struggling to pay their bills. ‘The substantial and growing number of distressed borrowers suggests that not all Americans have benefited from the strong labor market,’ economists at the New York Fed wrote in a blog post. A car loan is typically the first payment people make because a vehicle is critical to getting to work, and someone can live in a car if all else fails. When car loan delinquencies rise, it is a sign of significant duress among low-income and working-class Americans.”
5. Corporate Media Consolidation is so extreme in the U.S. now that just six corporations – Comcast, FOX, Disney, Viacom, CBS, and AT&T – together own more than half of traditional U.S. media content: print, film, and electronic. The Internet giants Google, Facebook, and Amazon rule online communication and shopping. (It isn’t just about “news and information” [Carlin], by the way. The corporate-owned mass media probably spreads capitalist, racist, sexist, authoritarian, and military-imperialist propaganda more effectively through its entertainment wing than it does through its news and public/political affairs wing. A movie like “American Sniper” beats CNN reporting bias when it comes to advancing the U.S. imperial project [see #s 28 and 29 below]. A film like Clint Eastwood’s “Gran Torino” beats the evening news when it comes to advancing racist mass incarceration and racial segregation [see #s 6 and 9 below].)
6. Racial Disparity and Apartheid. The U.S. Black-white wealth gap is stark: the median Black household holds just 8 cents for every dollar held by the median white household. Equally glaring is the nation’s level of racial segregation. In the Chicago, New York, Detroit, and Milwaukee metropolitan areas, for example, more than three in every four Black people would have to (be allowed to) move from their nearly all-black Census tracts into whiter ones in order to live in a place whose racial composition matched that of the broader region in which they reside. These two statistical measures are intimately interrelated, since housing markets distribute so much more than just housing. They also distribute access to jobs, good schools, green spaces, full-service groceries, safety, medical services, and more that matters for “equal opportunity” and advancement.
7. Gender Inequality. Among full-time U.S. workers, women make 81 cents for every dollar a man is paid. The gap is worse in part-time employment, since women more commonly work reduced schedules to handle domestic labor. Women’s median retirement savings are roughly one third of those of men. Households headed by single women with children have a poverty rate of 35.6 percent, more than double the 17.3 percent rate for households headed by single men with children. Women comprise just 27 percent of the nation’s top 10 percent of income earners, 17 percent of the upper 1 percent, and 11 percent of the top 0.1 percent. By contrast, women make up nearly two-thirds (63 percent) of U.S. workers paid the federal minimum wage.
8. Native American Poverty. Thanks to the savage white-“settler” ethnic cleansing of most of North America from the 16th century through 1900, Indigenous people make up just 1 percent of the U.S. population. The Native American poverty rate (28%) is double that of the nation as a whole and is particularly high on the often isolated, high-unemployment reservations where just over a fifth of the nation’s Indigenous population lives. Native American life expectancy is 6 years short of the national average. In some states, Native American life expectancy is 20 years less than the national average. In Montana, Native American men live on average just 56 years.
9. Racist Mass Arrest, Incarceration, and Criminal Marking. The U.S. has the highest incarceration rate in the world, fueled by the racially disparate waging of the so-called War on Drugs. The racial disparities are so extreme that 1 in every 10 U.S. Black men is in prison or jail on any given day. One in 3 Black adult males is saddled with the permanent crippling mark of a felony record – what law professor Michelle Alexander has famously called “the New Jim Crow.” Blacks make up 12% of the U.S. population but 38% of the nation’s state prison population.
10. Trumpism/Fascism. The U.S. mass media focuses so heavily on the seemingly interminable awfulness of the creeping fascist Donald Trump (whose hideous nature is a ratings bonanza at CNN and MSNBC) that it is easy to lose sight of the fascistic horror of his authoritarian and white-nationalist supporters – roughly a third of the nation. The best social and political science research on Trump’s base reveals a fascist-like movement seeking a “strong” authoritarian “leader” who will roll back civil liberties and the gains won by women and racial and ethnic minorities since the 1960s. Trumpism wants to Make America more fully white-supremacist, patriarchal, and authoritarian (“great”) Again. Herr Donald’s disproportionately armed throng of die-hard devotees backs their Dear Leader no matter how terribly he behaves. It is a grave, creeping fascist threat to democracy.
11. The War on Truth. The aspiring fascist leader Trump made on average 15 false statements per day in 2018. He had stated more than 7,600 untruths as president by the end of last year. Trump lies constantly about matters big and small. He is a practitioner of what Chris Hedges calls “the permanent lie.” It is no small matter. In his description of this as “the most ominous threat” posed by Trump, Hedges quotes the philosopher Hannah Arendt. “The result of a consistent and total substitution of lies for factual truth,” Arendt wrote in her classic volume The Origins of Totalitarianism, “is not that the lie will now be accepted as truth and truth be defamed as a lie, but that the sense by which we take our bearings in the real world—and the category of truth versus falsehood is among the mental means to this end—is being destroyed.” Trump is only the most extreme and egregious wave of fabrication in a vast sea of national deception. U.S.-Americans, once accurately described by Alex Carey as “the most propagandized people in the world,” are surrounded by duplicitous and misleading information and imagery. This constant barrage of falsehood – examples include the thoroughly untrue notion that the U.S. possessed  a “great democracy” for the Trump campaign and Russia to (supposedly) “undermine” in 2016 – threatens to exhaust our capacity to distinguish fact from fiction.
12. Gun Violence. Fully 40,000 people died from shootings in the American “armed madhouse” in 2017 (we are still waiting for the grisly statistic for 2018). The U.S. was home to 322 mass shootings that killed 387 people and injured 1,227 in 2018. Twenty-eight mass shootings, killing 36 and wounding 92, took place in January of this year. A mass shooting killed five workers in Aurora, Illinois, on the very day (last Friday) that Trump declared his fake national emergency.
13. Sexual Violence. One in 5 women and 1 in 71 men will be raped at some point in their lives in the U.S.
14. Illiteracy and Innumeracy. More than 30 million adults in the United States cannot read, write, or do basic math above a third-grade level.
15. Manufactured Mass Ignorance and Amnesia. Thanks to corporate control of the nation’s media and schools, U.S.-Americans are shockingly ignorant of basic facts relating to their own history and society. White U.S.-Americans are mired in extraordinary denial about the level of Black-white inequality and the depth and degree of discrimination faced by Black Americans today. U.S.-Americans in general know next to nothing about the criminal and mass-murderous havoc U.S. foreign policy wreaks around the world.  This renders them incapable of understanding world politics and woefully vulnerable to nationalistic propaganda and militarism. Eleven years ago, historian Rick Shenkman wrote a book titled “Just How Stupid Are We? Facing the Truth About the American Voter.” Shenkman found that a majority of Americans: didn’t know which party was in control of Congress; couldn’t name the chief justice of the Supreme Court; didn’t know the U.S. had three branches of government; believed George W. Bush’s argument that the United States should invade Iraq because Saddam Hussein had attacked America on 9/11. Ask an average U.S.-American when the American War of Independence or the Civil War or WWII were fought and why, what the Bill of Rights was, what fascism is past and present, or what the Civil Rights Movement was about, and you will get blank stares and preposterously wrong answers. A people that doesn’t know its history wanders without a clue through the present and stumbles aimlessly into the future. Real historical knowledge is a great democratic people’s weapon and it is in perilously short supply in the U.S. today.
16. The Israel and Saudi Lobbies. Israel’s power in U.S. politics and political culture is so absurdly exaggerated that a freshman Muslim U.S. Congressional Representative (Ilhan Omar) was recently subjected to a massive and bipartisan political assault absurdly charging her with “anti-Semitism” for daring to tweet seven words suggesting the elementarily true fact that the American Israel Public Affairs Committee (AIPAC) – a deeply powerful, deep-pockets public relations and lobbying organization committed to the advance of Israeli state interests – exercises money-lubricated influence on U.S. politics and policy. To visibly raise the question of Palestinian rights and Israel’s horrendous treatment of Arab peoples is to invite an onslaught from the Israel Lobby’s vicious and powerful attack-dogs. They’ve even been known to strip professors of tenure. Meanwhile, the despotic Saudi regime, possibly the most reactionary government on Earth, continues through money and other means to exercise huge influence on U.S. politics even as it senselessly crucifies the people of Yemen (with direct U.S. military assistance), cultivates terrorism across the Muslim world, and vivisects dissident journalists in its foreign embassies.
17. Neo-McCarthyism. The original Orwellian-American and Russia-mad McCarthyism of the late 1940s and 1950s has been resurrected in the post-Soviet era with a curious partisan twist. Anti-Russian hysteria has been picked up by the Democratic Party, which has been eager to blame its pathetic failure to defeat Trump on Russia’s supposedly powerful “interference in our [unmentionably non-existent] democracy” in 2016 – and to deny its politicos’ role in provoking any such relevant Russian interference as may have occurred. On the Republican side, Trump (who was mentored by Senator Joe McCarthy’s onetime chief counsel Roy Cohn!) and other GOP leaders now routinely follow in the footsteps of Joe McCarthy by calling even cringingly centrist corporate-neoliberal Democrats and everything they propose “socialist.” One of the most horrific moments in Herr Donald’s sickening State of the Union Address came when the Orange Mother of all Malignant Assholes (OMoAMA) told the assembled federal officials to “renew” the nation’s “pledge” that “America will never be a socialist country.”  Numerous Democrats, including House Speaker Nancy “We’re Capitalist and That’s Just the Way it is” Pelosi (net worth $71 million) and “progressive” U.S. Senator and presidential candidate Elizabeth Warren ($11 million) joined the GOPers in attendance in applauding that “pledge.”  McCarthyism was always and remains a richly bipartisan disease.
18. Health Care and Health. The United States’ corporate-owned and -managed for-profit health care system is the most expensive in the world but ranks just 12th in life expectancy among the 12 wealthiest industrialized countries. The U.S. spends almost three times as much on healthcare as other countries with comparable incomes. Reflecting poor, commercialized, corporate-imposed food systems and lethally sedentary lifestyles, 58 percent of the U.S. population is overweight, a major health risk factor.
19. Bad Schools. The nation’s expensive but very unequally funded schools deliver terrible outcomes. Among the 34 OECD nations, U.S. schools are the fifth most expensive, but the U.S. scores far below average in math. It ranks 17th in reading and 21st in science.
20. Child Abuse. Childhelp reports that “Every year more than 3.6 million referrals are made to child protection agencies involving more than 6.6 million children. The United States has one of the worst records among industrialized nations – losing on average between four and seven children every day to child abuse and neglect… A report of child abuse is made every ten seconds.”
21. Depression and Substance Abuse. The United States, once described by onetime U.S. Senator Kay Bailey Hutchison as “the beacon to the world of the way life should be” (in a speech supporting the Congressional authorization of George W. Bush to invade Iraq), has the third highest rates of depression and anxiety and the second highest rate of drug use in the world. “One in five adults in the U.S. experiences some form of mental illness each year,” according to the National Alliance on Mental Illness. That estimate is certainly absurdly low.
22. Immigrant Workers Without Rights. Undocumented immigrants make up 55% of hired labor on farms, 15% of laborers in construction, and 9% in both industry and the service sector. “These workers,” CBS reported earlier this year, “play vital roles in the U.S. economy, erecting American buildings, picking American apples and grapes, and taking care of American babies. Oh, and paying American taxes.”  Their technically illegal status makes them easily exploited by employers and undermines their ability to organize and fight for decent conditions both for themselves and for other workers.
23. The Dreamer Nightmare. Eight hundred thousand people living in the U.S. were brought to the country as children by parents without U.S. citizenship.  These “Dreamers’” legal status is stuck in limbo.  They are not allowed to vote. They live in the shadow of possible future deportation, with their legal status treated as a partisan political football.
24. Vote Suppression. State-level racist voter suppression and de facto disenfranchisement are rife across the United States. Among other things, this has contributed significantly to the Republicans winning the presidency in 2000, 2004, and 2016. A “gentleman’s agreement” between the two reigning political parties pushes this critical problem to the margins of public discussion. (The Democrats have widely ignored the matter while they have obsessed for two years plus about Russia’s real or alleged role in the last election.  Moscow’s influence was likely small compared to American-as-apple-pie racist voter suppression in electing Trump.) “The United States,” political scientist David Schultz noted on CounterPunch last year, “is the only country in the world that still does not have in its Constitution an explicit clause affirmatively granting a right to vote for all or some of its citizens.”
25. The Absurdly Archaic U.S. Constitution. Popular sovereignty, also known as democracy, was the late 18th-century U.S. Founders’ ultimate nightmare.  They crafted an aristo-republican national charter brilliantly designed to keep it at bay – in the darkly ironic name of “We the People.”  Two and a third centuries later, their handiwork continues to do its explicitly un- and anti-democratic work through such openly authoritarian mechanisms as the Electoral College, the apportionment of two Senators to every U.S. state regardless of population, the distant time-staggering of elections, and the lifetime presidential appointment and Senate approval of Supreme Court justices.  The preposterously venerated U.S. Constitution is an ongoing 232-year-old authoritarian calamity in dire need of a radical and democratic overhaul. It is long past time for the populace to declare a national emergency and call for a Constituent Assembly to draft a new national governing structure dedicated to meaningful popular self-rule.
26. Trump and the Imperial Presidency. The OMoAMA (Trump) is by all indications a demented and malignant narcissist, a pure sociopath, and a creeping fascist. But the fact that someone as twisted, venal, sexist, and racist as Trump can pose dire threats to humanity in the first place is in no small part a function of the extreme powers that have accrued to the United States’ constitutionally super-empowered executive branch over the many decades in which the U.S. has reigned as the world’s most powerful state.  The absurdly vast and authoritarian powers of the imperial presidency are an ongoing national and global emergency.
27. Election Madness/Electoralism. In the early spring of 2008, the late radical American historian Howard Zinn wrote powerfully against the “Election Madness” he saw “engulfing the entire society including the left” in the year of Obama’s ascendancy. “An election frenzy seizes the country every four years,” Zinn worried, “because we have all been brought up to believe that voting is crucial in determining our destiny, that the most important act a citizen can engage in is to go to the polls. …” Zinn said he would support one major-party candidate over another but only “for two minutes—the amount of time it takes to pull the lever down in the voting booth.” Then he offered sage counsel, reminding us that time-staggered, candidate-centered major-party electoralism is a very weak surrogate for real popular sovereignty, which requires regular grassroots organization and militancy beneath and beyond what his good friend Noam Chomsky has called “the quadrennial electoral extravaganza”: “Before and after those two minutes, our time, our energy, should be spent in educating, agitating, organizing our fellow citizens in the workplace, in the neighborhood, in the schools. Our objective should be to build, painstakingly, patiently but energetically, a movement that, when it reaches a certain critical mass, would shake whoever is in the White House, in Congress, into changing national policy on matters of war and social justice. … We should not expect that a victory at the ballot box in November will even begin to budge the nation from its twin fundamental illnesses: capitalist greed and militarism. … Before [elections] … and after … we should be taking direct action against the obstacles to life, liberty, and the pursuit of happiness. … Historically, government, whether in the hands of Republicans or Democrats, conservatives or liberals, has failed its responsibilities, until forced to by direct action: sit-ins and Freedom Rides for the rights of black people, strikes and boycotts for the rights of workers, mutinies and desertions of soldiers in order to stop a war. Voting is easy and marginally useful, but it is a poor substitute for democracy, which requires direct action by concerned citizens.” The reigning “mainstream” US media and politics culture is fiercely dedicated to advancing the hegemony of the major-party candidate-centered election cycle, advancing the deadly totalitarian notion that those two minutes in a voting booth once every four years – generally choosing among politicians vetted in advance for us by the nation’s unelected and interrelated dictatorships of money and empire – are the sum total of “politics,” the only politics that really matters. Since the hidden corporate control of US electoral politics on behalf of the center-right ruling class rules out victory for candidates who accurately reflect majority left-progressive public opinion, these ritual exercises in fake democracy deeply reinforce the fatalistic and false belief that most Americans are centrist and right-wing. The 2020 Democratic Party presidential candidate Iowa-New Hampshire circus is already sucking up vast swaths of cable news coverage and commentary while numerous pressing matters (like most of what is listed in the present essay) are largely ignored. It’s pathetic.
28. Guns Over Butter. Dr. Martin Luther King, Jr. rightly preached that the U.S. could not end poverty or escape “spiritual death” as long as it diverted vast swaths of its tax revenue to a giant war machine that “draw[s] men and skills and money like some demonic destructive suction tube.” Just over half a century after King said this, the United States gives 54 percent of its federal discretionary spending to the Pentagon System, a giant subsidy to high-tech “defense” (war and empire) corporations like Raytheon and Boeing. Six million U.S. children live in “deep poverty,” at less than half (!) the federal government’s obscenely inadequate poverty level, while the U.S. government maintains 800 military bases in more than 70 countries and territories around the world (Britain, France, and Russia together have a combined 30 foreign bases) and accounts for nearly 40 percent of all global military spending. It is deeply offensive that the progressive-populist (fake-“democratic socialist”) U.S. Senator and presidential candidate Bernie Sanders has repeatedly cited Scandinavian nations as his social-democratic policy role models without having the elementary Dr. Kingian decency to note that those countries dedicate relatively tiny portions of their national budgets to the military. It is disturbing but predictable that most Congressional Democrats voted for Trump’s record-setting $700 billion Pentagon budget last year. U.S.-Americans must choose: we can have democracy, social justice, guaranteed free health care, well-funded public schools, and livable ecology, or we can have a giant global war machine. We can’t have both.
29. Doctrinal Denial of U.S. Imperialism. Across the U.S. “mainstream” political and media spectrum, it is beyond the pale of acceptable discussion to acknowledge that the United States is a deeply criminal and imperialist power. The examples are endless. It is normative for U.S. cable talking heads, pundits, and politicians to discuss Eastern Europe or East Asia as if Washington has as much right to influence developments there as Moscow and Beijing, respectively. Terrible developments in the Middle East and North Africa are routinely discussed by “mainstream” U.S. politicos, talking heads, and pundits as if the United States had not wreaked nearly indescribable havoc on Iraq and Libya and the broader Muslim world. Migrants seeking asylum from Central America are regularly reported and discussed with zero reference to the fact that the United States has inflicted massive and bloody devastation on that region for decades – and without mentioning the Obama administration’s support of a vicious right-wing coup in Honduras in the spring of 2009. Reporting on the current political crisis in Venezuela comes with complete Orwellian deletion of the United States’ role in crippling the nation’s democratically elected socialist government on the model of the Nixon administration’s campaign to undermine Chile’s democratically elected socialist government in the early 1970s. No serious discussion is permitted of the historical context of Washington’s longstanding intervention and regime-change operations across Latin America. The reigning Empire-denial is absurd.
30. Amazon. Google (lol) up its mind-boggling and many-sided monopolistic reach and then thank the New York City Left for stopping this public-subsidy-sucking, zero tax-paying corporate monstrosity from setting up its headquarters in the nation’s largest city.
31. Last but not at all least, Ecocide. The climate catastrophe poses grave existential threats to livable ecology and all prospects for a decent human future. It is a national and global emergency of epic proportions. It is the single biggest issue of our or any time. If this environmental calamity is not averted soon, nothing else that progressives and decent citizens everywhere care about is going to matter all that much. The United Nations’ Intergovernmental Panel on Climate Change has recently warned that we have a dozen years to keep global warming to a maximum of 1.5°C, beyond which true cataclysm will fall upon hundreds of millions of people. Under the command of capital, we are currently on a pace to melt Antarctica by 2100. The unfolding climate disaster’s leading political and economic headquarters is the United States, home to a super-powerful fossil fuel industry with a vast, deeply funded lobbying and public relations apparatus dedicated to turning the planet into a giant Greenhouse Gas Chamber.
Towards a Green New Deal
If a vicious and moronic creeping fascist like Donald Trump can declare a fake national emergency over a non-existent crisis in order to build a political vanity wall rejected by Congress and 60 percent of the population, perhaps a future decent and democratic government sincerely committed to the common good could declare a national emergency to address the all-too-real climate crisis by moving the nation off fossil fuels and on to renewable energy sources while advancing environmentally sustainable practices and standards across economy and society. A properly crafted Green New Deal would also and necessarily address other and related national emergencies, including the crises of financial oligarchy, bad jobs, inequality, poverty, plutocracy, racial inequality, mass incarceration, untruth, inadequate health care, fascism, poor schooling, mental illness, substance abuse, gun violence, militarism-imperialism, gender disparity, spiritual death, and much more. I plan in a future essay to elaborate on what is meant by a “properly crafted Green New Deal.”
Paul Street’s latest book is They Rule: The 1% v. Democracy (Paradigm, 2014)
7 notes · View notes
anamedblog · 6 years ago
Text
Archaeology, Sports, and ANAMED
by Vera Egbers, ANAMED PhD Fellow (2018–2019)
Max von Oppenheim (sitting on the chair) during breakfast, 1913 (by Hausarchiv Sal. Oppenheim jr. & Cie., Köln)
Without a doubt, archaeology is a colonial science (e.g., Effros and Lai 2018). In particular, the birth of “Near Eastern” archaeology—as it is still called today—revolves around the imperial awakening of European nations like France, Italy, Great Britain, and Germany during the late 19th/early 20th century.
Though specific individuals of that time might have had different motivations—to prove the authenticity of the Bible, say, or to accumulate extraordinary artifacts for newly established national museums and thereby gain some fame and influence—the discipline was always connected to notions of political interest in the region, to the feeding of orientalist thought, and to power struggles among European nation states.
In the course of the 20th century, the political situation of both Europe and the Middle East changed drastically, leading also to shifts and adjustments in archaeology. But the remnants of the field’s formation are still active today and difficult to overcome. We publish mostly in English, German, or French, only gradually moving towards abstracts and full articles in Arabic, Kurdish, and, a little more so, Turkish. Military technologies like drones or satellite imagery are often used without the consent of local communities (see Pollock 2016 for further detail). And even the way we present many of our data is obviously rooted in thinking in terms of borders and territories—there are hardly any archaeological publications without maps showing the extent of the empires or chiefdoms under study. As O. Hugo Benavides recently stated during a talk at the SAA Annual Meeting, archaeology is in fact not about the past, but about the present (similarly, see Benavides 2003). More specifically, it is about politics and the question of how we connect past and present, hence creating a past for our self-understanding. It is valuable to wonder why we still undertake this colonial endeavor, why we archaeologists want to deal with the past, and also—after Foucault—what this doing does.
For me personally, a major part of being an archaeologist is exchange, travel, and sociology. In a healthy way, the discipline has lost its former allegedly greater political meaning (something some scholars still seem to refuse to acknowledge) and turned—at least partially—towards social theories that seek to understand both the socio-cultural structures of the past and their meaning and influence on the present. In this realm, archaeological projects have in recent years started to study and consider the impact excavations have on the local communities where they are based—and vice versa (see Rosenzweig and Dissard 2013).
“View of the village of Tepe from the mound of Ziyaret Tepe” (Photograph by A. Wodzińska)
While the hierarchical dichotomy between workmen and -women on the one side and (foreign) researchers on the other can only rarely be overcome, I believe that mere awareness of the “excavation-doing” in a community that lives at or next to the archaeological remains is an important step towards acknowledging the two-sided endeavor that each research project is, as well as the effect it has on all members involved.
With these broader thoughts in mind, I pondered my life as a doctoral fellow at ANAMED in the heart of Istanbul. Many topics came to my mind, like the street musicians who are so much a part of our daily soundscape, yet remain such strangers to us even after all this time; or the disappearance of Syrian forced migrants, who were so present on Istiklal Caddesi just a few years ago but have become almost invisible now.
Here, I want to dedicate this second part of my blog entry to something seemingly more trivial. Having now lived for almost seven months in the Merkez Han building of ANAMED, I noticed an almost incidental “collaboration” between fellows and locals that had been going on nearly unnoticed for years—a connection that goes quite literally under the skin: the gym. When one works all day with the mind, physical compensation is vital to maintain a healthy balance of the two. Just a ten-minute, hilly walk away from ANAMED, at a lively corner of Cihangir, a group of fellows has registered at a sports club and uses its facilities to do Pilates, Yoga, or “Total Body” workouts.
A lively corner in Cihangir, around the Firuzağa Cami (photo by Behlül'ün Gezi Rehberi)
As random or insignificant as this might sound, it literally shapes our bodies and hence our connection to the space we chose to temporarily live in. The buzzing backdrop of trashy electro-pop music, the smells of sweat and moving bodies, the Turkish instructions (“Nefes tutma!”—“Don’t hold your breath!”), as well as the smiles and chats with familiar faces contribute to a sense of belonging and community.
Weekly Schedule of the Gym
Of course, unlike an archaeologist who settles down in a village for a couple of months each year, we invade the gym as customers and as such possess a greater influence on the way we are treated and heard. For workwomen and -men on an excavation, the position with regard to underlying power relations is not the same. Yet in both cases there is more to it than a simple “employer-employee” or “seller-customer” relationship. A space with room for exchange and interaction between otherwise unconnected people is created, laying the basis for the possibility of benevolent mutual curiosity, personal growth—or just some fun.
Questionable Quote at the Gym’s entrance
For instance, each Thursday a handful of people meets in front of the gym and runs for an hour through the streets of Istanbul—to Maçka Park, the Golden Horn, even as far as the Marmara Sea. Everything you see, hear, and experience is shared with the ones you are jogging with. There is, for example, the young Turkish personal trainer who is trying to become an actor, or the Turko-British writer who could talk for hours about languages, not to forget one of the owners of the sports club, who revealed to me that ANAMED fellows had been members of his gym for years now, and that he therefore knows exactly how long most of us are living in Istanbul and why we are here. While this is not part of the official agenda of the ANAMED fellowship, it is worthwhile reflecting upon such small-scale encounters and the ways they become a part of the creation of somehow globalized subjects. When perceived and acted out self-consciously, this has the potential to set a subtle, miniature counterpart to the globally growing threat of nationalism and segregation. Apart from all this idealism—doing sports after a long day in ANAMED’s study room is also simply a lot of fun!
Street dog and cats on the way home to ANAMED—a motivator to relax too!
Bibliography:
Benavides, Hugo. 2003. “Seeing Xica and the Melodramatic Unveiling of Colonial Desire.” Social Text 21(3): 109–34.
Effros, Bonnie, and Guolong Lai, eds. 2018. Unmasking Ideology in Imperial and Colonial Archaeology: Vocabulary, Symbols, and Legacy. Los Angeles: Cotsen Institute of Archaeology Press, University of California.
Pollock, Susan. 2016. “Archaeology and Contemporary Warfare.” Annual Review of Anthropology 45(1): 215–31.
Rosenzweig, Melissa, and Laurent Dissard. 2013. “Common Ground: Archaeological Practice and Local Communities in Southeastern Turkey.” Near Eastern Archaeology 76(3): 152–58.
2 notes · View notes
douxreviews · 6 years ago
Text
Quantum Leap - Season One Review
"Oh, boy."
Quantum Leap began as a mid-season replacement in early 1989, ran for five seasons (1989-1993), and made a television star out of Scott Bakula. While it was running, it was one of my two favorite shows (the other was Star Trek: The Next Generation). There wasn't much good science fiction on television back then. Actually, there wasn't much sci-fi on television at all, unlike today's sci-fi-rich television environment.
What happens
A brilliant scientist named Sam Beckett (Scott Bakula) invents time travel. Pressured to produce results or lose funding, he tries it on himself — and wakes up in 1956 in someone else's body. With the help of his Quantum Leap Project partner Al (Dean Stockwell), who visits Sam in the form of a neurological hologram, Sam discovers that he must correct whatever it was that "went wrong" in the original timeline before he can leap out again. It is theorized by Ziggy, the artificial intelligence back at the Project, that if Sam can't make the appropriate correction in each leap, he'll be stuck in that person's body forever.
What works
There is so much to love about Quantum Leap. Fortunately, the two best things about the show are the main characters, Sam and Al, and the actors who played them. I've always thought that Sam Beckett is a dream role for an actor, and Scott Bakula was more than up to the challenge of playing a new character in a new situation every week. Okay, not exactly a new character, but he still had to play Sam's interpretation of that character, which added some acting layers while still preserving the integrity of Sam himself as a character.
Yes, Sam Beckett is just too perfect. A genius with six doctorates, his massive intellect made him capable of stepping into nearly anyone's life. What helped make Sam less perfect was that the Quantum Leap process made "swiss cheese" out of his memory. His partial amnesia also helped disconnect him from his old life, making it easier to immerse himself in the lives of the people he leaped into, an excellent plot device.
And then there is Al, who is also brilliant and multi-talented, and whatever Sam can't do while living someone else's life, like fly a plane or speak Italian, Al can step in and help. Al is also the king of double entendres and references to scoring with women, and under other circumstances, I would have found such a character repulsive. But Dean Stockwell is just so lovable in this part. He made it easy to see the humanity and goodness inside Al, right from the start. And Bakula and Stockwell played so well off each other. Even though Sam and Al were totally different people, they were believable as close friends.
The basic premise of the series is great, too; it's a fascinating framework for a time travel series. The only real limitation is that Sam couldn't travel to the future or to a time earlier than 1953. Setting episodes in the fifties, sixties or seventies made Quantum Leap all about the nostalgia, though. Gender roles, period music, historical events woven into the story like the east coast blackout and the streaking fad in the early seventies, you name it.
And then there were the clothes. I have little interest in fashion, but I love the costumes on this show. Scott Bakula looked so comfortable and natural, so right in those period outfits. Sometimes they were yummy; occasionally they were hilarious. What I enjoyed just as much was Al showing up in bizarre futuristic outfits in outrageous colors, which fortunately never became fashionable in real life. Like Bakula with the period clothes, Dean Stockwell simply made that wardrobe work. Al is a colorful character, and his wardrobe matches his personality.
What doesn't work
There isn't much I don't like about Quantum Leap. Maybe it would have been interesting if they hadn't been limited to Sam's lifespan, and the United States (and yes, brief spoiler, they do get around that occasionally in future episodes). And yes, it tends toward the procedural, since most of the episodes are Leaps of the Week, but hey, it was the nineties.
One thing did leap :) out at me during this rewatch — the show's tendency to lecture. In this abbreviated first season, we got "The Color of Truth," the first time that Sam leaped into the body of someone who wasn't a white guy like himself. Instead of just being a person of color with an important life experience that Sam had to figure out and change, "The Color of Truth" is a sixty-minute lecture on the evils of racial segregation in 1955 Alabama. Not that there's anything wrong with the topic: it was a huge and important part of the recent past, and the episode was both well-intentioned and well done. But preachiness can be a turnoff, and this wasn't the only time it happened.
Another thing I didn't like was that every episode ended in a cliffhanger as Sam leaped into his next challenge, in what always appeared to be dire circumstances. Yes, I get it, cliffhangers help bring the audience back. But I would have been a lot happier if they had simply ended each episode with Sam leaping out, who knows where.
The music replacement controversy
When Quantum Leap was initially released on DVD way back when, Universal decided not to buy the rights to a number of the songs featured on the series simply because it was prohibitively expensive. Changing the music changed the series, though, and many fans were livid about it. The worst offenders were the season two episodes "M.I.A." and "Good Morning, Peoria." (I'll talk more about why fans were upset in my review of season two.)
After some research, I can report that Amazon and Netflix fixed this serious problem; the original music is intact. (I'm writing this review in December 2016, and I live in the U.S.) Unfortunately, Netflix decided to stop carrying Quantum Leap as of January 1, 2017, when I hadn't quite finished my rewatch, so I had to move to Hulu. And unfortunately, Hulu does not feature the original music. I have no idea what is going on with the music in the DVD sets. If you plan to buy Quantum Leap on DVD, you might want to find out about the music replacement situation before purchasing, if it matters to you.
Important episodes
1.1/1.2 "Genesis (September 13, 1956)": This is a decent two-part pilot. The brave test pilots and their long suffering wives waiting at home kept reminding me of the 1983 movie The Right Stuff, which might have been their intention. (In fact, many Quantum Leap episodes remind me of specific movies.) Maybe it shouldn't have been a two-parter, though, because honestly, while Sam's "wife" was doing the laundry, I got a little bored.
This pilot does mention the possibility that Sam's leaping is being directed by God. You'd think God would have the power to fix things Herself without having to use Sam, but okay. Maybe God employs other people like Sam, too.
1.6 "Double Identity (November 8, 1965)": Best episode of the season, and an obvious tribute to The Godfather. The wedding scene where Sam had to sing and Al gave Sam the Italian lyrics to "Volare" was funny, and kept getting funnier as Sam channeled his inner lounge lizard and really got into it. In fact, it went on so long that you'd think it would stop being funny, but it didn't.
Tumblr media
(This might be a good time to mention that Scott Bakula has a beautiful, professional singing voice that they often featured in the series.)
Later, during a life-and-death situation and wearing hair clips and shaving cream, Sam had to converse in Al-prompted Italian. Bakula spoke the lines Sam didn't understand as if he were reciting poetry. And the ending, with the thousand-watt hair dryer in Buffalo causing the east coast blackout of 1965, was practically perfect.
1.9 "Play It Again, Seymour (April 14, 1953)": A very Sam Spade sort of episode with bits of Casablanca, with Sam in the body of a private eye who looked like Bogart investigating the murder of his partner. Of course, there was a dame — his partner's slinky wife, Alison (Claudia Christian, one of my favorites from Babylon 5). There was also a poorly written novel called Dead Men Don't Die, a dropper named Klapper, and every hardboiled detective cliche you can imagine.
Tumblr media
Much of "Play It Again, Seymour" was filmed in the Bradbury Building, a Los Angeles landmark that was also used as a major location in my favorite science fiction movie, Blade Runner. When I was living in L.A., I went to see the building in person. It's gorgeous.
Sam was born in August 1953, and this final leap of the season was set in April 1953. I can only assume the leap range was defined by Sam's conception, not his birth?
Bits and pieces:
-- In season one, Sam leaps into and must become: a test pilot, a professor of literature, a boxer, a veterinarian, a chauffeur, a drag-racing teenager, and a private eye.
-- There are many references to three characters we don't get to meet in this first season: Ziggy, the artificial intelligence that gives Al projections on what Sam is supposed to change; Gooshie, a little guy with bad breath who also works on the Project; and Al's current girlfriend Tina. (Okay, oops, I'm wrong. According to IMDb, Tina is the woman with the flashing earrings that Al picked up in his car.)
-- The person that Sam replaces turns up in the imaging chamber, and Sam only knows how others see him by looking in a mirror. The synchronized mirror scenes are okay, although the motions were never choreographed well enough for me to suspend disbelief. Maybe those scenes should have been done more simply.
-- In the pilot, Sam wanted desperately to contact his late father but couldn't remember his own last name. Later in the season, in a lovely scene, Sam did speak with his father on the phone but of course, didn't tell him who he was.
-- It is established in season one that animals can see Al, that Al had been raised in an orphanage, had participated in protests during the civil rights movement, and has been married five times.
-- Famous people: Sam gives teen Buddy Holly the lyrics to "Peggy Sue," and shows a tiny Michael Jackson how to moon walk.
-- Notable actors: Teri Hatcher as Sam's first love in "Star-Crossed," Mark Margolis from Breaking Bad in "Double Identity," and Claudia Christian in "Play it Again, Seymour."
-- The saga sell is fun and so are the opening credits and theme music. But come on. A little "caca"? That's childish. I'm glad they didn't retain that.
-- Scott Bakula has a streak of white in his hair. It's not artificial; he has said during interviews that he's had it since childhood.
-- We're told that you cannot fix your own life. Why?
Season one is all "leap of the week" episodes, but it's a short first season and there's nothing wrong with that. By the end, we still don't know much about Sam, Al, or the Quantum Leap Project, so there's a lot of story left to tell.
On to season two!
Billie Doux loves good television and spends way too much time writing about it.
9 notes · View notes
drunklander · 6 years ago
Link
Slavery: perhaps the last, great unmentionable in public discourse. It is certainly a topic that even today makes people very uncomfortable, regardless of their race.
American society has often expressed its internal problems through its art. Perhaps the most powerful medium for important discussions since the turn of the last century has been the motion picture.
For decades Hollywood has attempted to address the issue of slavery. For the most part, films have represented the period of enslavement in a manner that reflected society's comfort level with the issue at the time. Director D. W. Griffith's 1915 silent drama, Birth of a Nation, for instance, depicted African Americans (played by white actors in blackface) as better off as slaves. Griffith's movie showed the institution of slavery "civilizing" blacks. Birth even made it seem like slaves enjoyed their lives and were happy in servitude.
That wasn't the case, of course, but it was what white society wanted to believe at the time.
More than two decades after Birth of a Nation, the portrayal of African Americans in films had changed only a little. 1939 saw the release of one of Hollywood's most acclaimed movies, Gone with the Wind. Producer David O. Selznick believed he was serving the black community with respect — he made sure the novel's positive portrayal of the Ku Klux Klan was eliminated from the film, for example. But Gone with the Wind nevertheless treated the enslaved as relatively happy, loyal servants, a depiction that continued to reflect America's segregated society. History was made, however, when Hattie McDaniel became the first African American to win an Academy Award for her role as "Mammy." Still, her part, and the parts of the other black actors, drew harsh criticism from major African American newspapers and civil rights groups.
Nearly forty years later, one of Hollywood's most meaningful attempts to portray the period of enslavement came in 1977 with the television blockbuster mini-series, Roots. Based on Alex Haley's 1976 best-selling book, Roots: The Saga of an American Family, the mini-series was groundbreaking on many levels. It was a dramatic series with a predominantly African American ensemble that captured a record 37 Emmy nominations — television's highest artistic award.
And Roots marked the first time America witnessed slavery portrayed in detail. Along with the scenes of transporting, selling, and trading men and women were scenes showing the brutality African Americans often suffered at the hands of slave owners. The depictions of abuse and cruelty were limited, of course, by the medium and by what American society would accept at the time. In keeping with the series' marketing campaign, the show focused heavily on the family's ultimate triumphs. For all of Roots' firsts, and there were many, it was ultimately a story of resiliency.
Fast forward three-plus decades — American society is undeniably changed. African Americans are regularly featured in movies and television shows. The nation elected, then re-elected, an African American president, Barack Obama.
Drawing critical acclaim today is the movie 12 Years a Slave. 12 Years is a watershed moment in filmmaking. Not only does it feature remarkable performances, excellent cinematography, and powerful direction; it also offers the first realistic depiction of enslavement.
Unlike prior motion pictures and television shows, 12 Years does not retreat from the brutality many blacks endured. The movie is not for the faint of heart, as the violence and cruelty it portrays are not the highly stylized kind found in films like Django Unchained. 12 Years is true to the reality that for years many Americans treated fellow human beings with ruthless brutality — and that reality is harder to face.
The film, however, is not only drawing praise from critics — it recently received nine Oscar nominations, including Best Picture — but also enjoying audience appreciation. With that appreciation comes an opportunity to bring the discussion of slavery to the mainstream.
This, then, is an exciting time for the Smithsonian's National Museum of African American History and Culture. Among its many virtues, the Smithsonian is a great legitimizer with a long tradition of providing venues for Americans to examine their shared history. One of the over-arching goals of the National Museum of African American History and Culture is to create a place where issues like enslavement can be viewed through an unvarnished lens.
America today needs this discussion, and I believe it is ready for it — a conviction undergirded by faith in the public's ability to deal with and care about the issue. The great strength of history, and of African American history in particular, is its ability to draw inspiration from even the worst of times. No doubt people throughout the nation and around the world will find that inspiration when they visit the Museum and view our major exhibition on "Slavery and Freedom" when our doors open in late 2015.
Before I close, I want to recommend four insightful narratives written by African Americans during this period of American history. The first is Solomon Northup's book, 12 Years a Slave. Next is Incidents in the Life of a Slave Girl, by Harriet Jacobs. One of the first books to describe the sexual abuse and torment that female slaves endured, Incidents became one of the most influential works of its time. Our Nig: Sketches from the Life of a Free Black, by Harriet Wilson, is believed to be the first novel published by an African American in North America. Though fictionalized, Wilson's book is based on her life growing up in indentured servitude in New Hampshire. Finally, Narrative of the Life of Frederick Douglass, An American Slave, remains today one of the most important autobiographical works ever written by an American.
11 notes · View notes
xtruss · 3 years ago
Text
The Black Mortality Gap, and a Document Written in 1910
Some clues on why health care fails Black Americans can be found in the Flexner Report
— By Anna Flagg | August 30, 2021
If Black Americans died at the same rates as white Americans, about 294,000 Black Americans would have died in 2019. [Chart: each dot represents 10 people.]
Black Americans die at higher rates than white Americans at nearly every age.
In 2019, the most recent year with available mortality data, there were about 62,000 such earlier deaths — or one out of every five African American deaths.
The age group most affected by the inequality was infants. Black babies were more than twice as likely as white babies to die before their first birthday.
The overall mortality disparity has existed for centuries. Racism drives some of the key social determinants of health, like lower levels of income and generational wealth; less access to healthy food, water and public spaces; environmental damage; overpolicing and disproportionate incarceration; and the stresses of prolonged discrimination.
But the health care system also plays a part in this disparity.
Research shows Black Americans receive less and lower-quality care for conditions like cancer, heart problems, pneumonia, pain management, prenatal and maternal health, and overall preventive health. During the pandemic, this racial longevity gap seemed to grow again after narrowing in recent years.
Some clues to why health care is failing African Americans can be found in a document written over 100 years ago: the Flexner Report.
In the early 1900s, the U.S. medical field was in disarray. Churning students through short academic terms with inadequate clinical facilities, medical schools were flooding the field with unqualified doctors — and pocketing the tuition fees. Dangerous quacks and con artists flourished.
Physicians led by the American Medical Association (A.M.A.) were pushing for reform. Abraham Flexner, an educator, was chosen to perform a nationwide survey of the state of medical schools.
He did not like what he saw.
Published in 1910, the Flexner Report blasted the unregulated state of medical education, urging professional standards to produce a force of “fewer and better doctors.”
Flexner recommended raising students’ pre-medical entry requirements and lengthening academic terms. Medical schools should partner with hospitals, invest more in faculty and facilities, and adopt Northern city training models. States should bolster regulation. Specialties should expand. Medicine should be based on science.
The effects were remarkable. As state boards enforced the standards, more than half the medical schools in the U.S. and Canada closed, and the numbers of practices and physicians plummeted.
The new rules brought advances to doctors across the country, giving the field a new level of scientific rigor and protections for patients.
But there was also a lesser-known side of the Flexner Report.
Black Americans already had an inferior experience with the health system. Black patients received segregated care; Black medical students were excluded from training programs; Black physicians lacked resources for their practices. By handing down exacting new standards without providing the means to meet them, the Flexner Report was devastating for Black medicine.
Of the seven Black medical schools that existed at the time, only two — Howard and Meharry — remained for Black applicants, who were barred from historically white institutions.
The new requirements for students, in particular the higher tuition fees prompted by the upgraded medical school standards, also meant those with wealth and resources were overwhelmingly more likely to get in than those without.
The report recommended that Black doctors see only Black patients, and that they should focus on areas like hygiene, calling it “dangerous” for them to specialize in other parts of the profession. Flexner said the white medical field should offer Black patients care as a moral imperative, but also because it was necessary to prevent them from transmitting diseases to white people. Integration, seen as medically dangerous, was out of the question.
The effect was to narrow the medical field both in total numbers of doctors, and the racial and class diversity within their ranks.
When the report was published, physicians led by the A.M.A. had already been organizing to make the field more exclusive. The report’s new professional requirements, developed with guidance from the A.M.A.’s education council, strengthened those efforts under the banner of improvement.
Elite white physicians now faced less competition from doctors offering lower prices or free care. They could exclude those they felt lowered the profession’s social status, including working-class or poor people, women, rural Southerners, immigrants and Black people.
And so emerged a vision of an ideal doctor: a wealthy white man from a Northern city. Control of the medical field was in the hands of these doctors, with professional and cultural mechanisms to limit others.
To a large degree, the Flexner standards continue to influence American medicine today.
The medical establishment didn’t follow all of the report’s recommendations, however.
The Flexner Report noted that preventing health problems in the broader community better served the public than the more profitable business of treating an individual patient.
“The overwhelming importance of preventive medicine, sanitation, and public health indicates that in modern life the medical profession” is not a business “to be exploited by individuals,” it said.
But in the century since, the A.M.A. and allied groups have mostly defended their member physicians’ interests, often opposing publicly funded programs that could harm their earnings.
Across the health system, the typically lower priority given to public health disproportionately affects Black Americans.
Lower reimbursement rates discourage doctors from accepting Medicaid patients. Twelve states, largely in the South, have not expanded Medicaid as part of the Affordable Care Act.
Specialists like plastic surgeons or orthopedists far out-earn pediatricians and family, public health and preventive doctors — those who deal with heart disease, diabetes, hypertension and other conditions that disproportionately kill Black people.
With Americans able to access varying levels of care based on what resources they have, Black doctors say many patients are still, in effect, segregated.
The trans-Atlantic slave trade began a tormented relationship with Western medicine and a health disadvantage for Black Americans that has never been corrected, first termed the “slave health deficit” by the physician and medical historian W. Michael Byrd.
Dr. Byrd, born in 1943 in Galveston, Texas, grew up hearing about the pain of slavery from his great-grandmother, who was emancipated as a young girl. Slavery’s disastrous effects on Black health were clear. But by the time he became a medical student, those days were long past — why was he still seeing so many African Americans dying?
Dr. Linda A. Clayton had the same question.
Her grandfather had also been emancipated from slavery as a child. And growing up, she often saw Black people struggle with the health system — even those in her own family, who were well able to pay for care. Her aunt died in childbirth. Two siblings with polio couldn’t get equitable treatment. Her mother died young of cancer after being misdiagnosed.
By 1988, when Dr. Byrd and Dr. Clayton met as faculty members of Meharry Medical College in Nashville, he had been collecting data, publishing and teaching physicians about Black health disparities for 20 years, calling attention to them in the news media and before Congress.
In their decades-long partnership and marriage that followed, the two built on that work, constructing a story of race and medicine in the U.S. that had never been comprehensively told, publishing their findings in a two-volume work, “An American Health Dilemma” (2000 and 2001, Routledge).
Much has changed since the publication of the Flexner Report.
Racial discrimination is prohibited by law. Medical schools, practices and hospitals are desegregated.
In 2008, a past A.M.A. president, Dr. Ronald M. Davis, formally apologized to Black doctors and patients. The association has established a minority affairs forum and a national Center for Health Equity; collaborated with the National Medical Association, historically Black medical schools and others in Black health; and created outreach and scholarships.
But Dr. Clayton and Dr. Byrd have questioned whether the field is working hard enough to change the persistent inequalities. And they aren’t the only experts wondering.
To Adam Biggs, an instructor in African American studies and history at the University of South Carolina at Lancaster, Flexner’s figure of the elite physician still reigns. That person is most likely to have the resources to shoulder the tuition and debt; to get time and coaching for testing and pre-medical preparation; and to ride out the years of lower-paid training an M.D. requires.
Evan Hart, an assistant professor of history at Missouri Western State University, has taught courses on race and health. She said medical school tuition is prohibitively expensive for many Black students.
Earlier this year, an A.M.A. article estimated there are 30,000-35,000 fewer Black doctors because of the Flexner Report.
Today, Black people make up 13 percent of Americans, but 5 percent of physicians — up just two percentage points from half a century ago. In the higher-paying specialties, the gap grows. Doctors from less wealthy backgrounds and other disadvantaged groups are underrepresented, too.
This disparity appears to have real-world effects on patients. A study showed Black infant mortality reduced by half when a Black doctor provided treatment. Another showed that Black men, when seen by Black doctors, more often agreed to certain preventive measures. Data showed over 60 percent of Black medical school enrollees planned to practice in underserved communities, compared with less than 30 percent of whites.
The limits of progress are perhaps clearest in the continuing numbers of Black Americans suffering poor health and early death. Millions remain chronically uninsured or underinsured.
According to Dr. Clayton, a key problem is that the health system continues to separate those with private insurance and those with public insurance, those with resources versus those without, the care of individuals versus the whole.
During the Civil Rights movement, Medicare and Medicaid — which were opposed by the A.M.A. — passed in part because of the advocacy of Black doctors, extending care to millions of lower-income and older Americans. But the A.M.A.’s long battle against public programs has contributed to the United States’ position as the only advanced nation without universal coverage. When a social safety net is left frayed, research shows, it may hurt Black Americans more, and it also leaves less privileged members of all races exposed.
“It is basically a segregated system within a legally desegregated system,” Dr. Clayton said.
In February, Dr. Byrd died from heart failure in a hospital in Nashville at 77. Dr. Clayton was holding his hand.
Before his death, the two doctors had given hours of interviews to The New York Times/The Marshall Project over the course of six months.
Dr. Byrd said he wanted to spread awareness to more American doctors — and Americans generally — about the Black health crisis that slavery began, and that continues in a health system that hasn’t fully desegregated.
The doctors’ work showed that never in the country’s history has Black health come close to equality with that of whites.
“We’re still waiting,” Dr. Byrd said.
— This article was published in partnership with The Marshall Project, a nonprofit news organization covering the U.S. criminal justice system. Anna Flagg is a senior data reporter for The Marshall Project.
0 notes
indomitablekushite · 3 years ago
Text
Declaration of the Rights of the Negro Peoples of the World
“Declaration of the Rights of the Negro Peoples of the World”: The Principles of the Universal Negro Improvement Association

After fighting World War I, ostensibly to defend democracy and the right of self-determination, thousands of African-American soldiers returned home to face intensified discrimination, segregation, and racial violence. Drawing on this frustration, Marcus Garvey attracted thousands of disillusioned black working-class and lower middle-class followers to his Universal Negro Improvement Association (UNIA). The UNIA, committed to notions of racial purity and separatism, insisted that salvation for African Americans meant building an autonomous, black-led nation in Africa. The Black Star Line, an all-black shipping company chartered by the UNIA, was the movement’s boldest and most important project, and many African Americans bought shares of stock in the company. A 1920 Black Star Line business meeting in Harlem’s Liberty Hall brought together 25,000 UNIA delegates from around the world, and produced an important statement of principles, the “Declaration of Rights of the Negro Peoples of the World.”

Preamble

Be It Resolved, That the Negro people of the world, through their chosen representatives in convention assembled in Liberty Hall, in the City of New York and United States of America, from August 1 to August 31, in the year of Our Lord one thousand nine hundred and twenty, protest against the wrongs and injustices they are suffering at the hands of their white brethren, and state what they deem their fair and just rights, as well as the treatment they propose to demand of all men in the future.

We complain:

1. That nowhere in the world, with few exceptions, are black men accorded equal treatment with white men, although in the same situation and circumstances, but, on the contrary, are discriminated against and denied the common rights due to human beings for no other reason than their race and color. We are not willingly accepted as guests in the public hotels and inns of the world for no other reason than our race and color.

2. In certain parts of the United States of America our race is denied the right of public trial accorded to other races when accused of crime, but are lynched and burned by mobs, and such brutal and inhuman treatment is even practiced upon our women.

3. That European nations have parcelled out among them and taken possession of nearly all of the continent of Africa, and the natives are compelled to surrender their lands to aliens and are treated in most instances like slaves.

4. In the southern portion of the United States of America, although citizens under the Federal Constitution, and in some States almost equal to the whites in population and are qualified land owners and taxpayers, we are, nevertheless, denied all voice in the making and administration of the laws and are taxed without representation by the State governments, and at the same time compelled to do military service in defense of the country.

5. On the public conveyances and common carriers in the southern portion of the United States we are jim-crowed and compelled to accept separate and inferior accommodations and made to pay the same fare charged for first-class accommodations, and our families are often humiliated and insulted by drunken white men who habitually pass through the jim-crow cars going to the smoking car.

6. The physicians of our race are denied the right to attend their patients while in the public hospitals of the cities and States where they reside in certain parts of the United States. Our children are forced to attend inferior separate schools for shorter terms than white children, and the public school funds are unequally divided between the white and colored schools.

7. We are discriminated against and denied an equal chance to earn wages for the support of our families, and in many instances are refused admission into labor unions and nearly everywhere are paid smaller wages than white men.

8. In the Civil Service and departmental offices we are everywhere discriminated against and made to feel that to be a black man in Europe, America and the West Indies is equivalent to being an outcast and a leper among the races of men, no matter what the character attainments of the black men may be.

9. In the British and other West Indian islands and colonies Negroes are secretly and cunningly discriminated against and denied those fuller rights of government to which white citizens are appointed, nominated and elected.

10. That our people in those parts are forced to work for lower wages than the average standard of white men and are kept in conditions repugnant to good civilized tastes and customs.

11. That the many acts of injustices against members of our race before the courts of law in the respective islands and colonies are of such nature as to create disgust and disrespect for the white man’s sense of justice.

12. Against all such inhuman, unchristian and uncivilized treatment we here and now emphatically protest, and invoke the condemnation of all mankind.

In order to encourage our race all over the world and to stimulate it to overcome the handicaps and difficulties surrounding it, and to push forward to a higher and grander destiny, we demand and insist on the following Declaration of Rights:

1. Be it known to all men that whereas all men are created equal and entitled to the rights of life, liberty and the pursuit of happiness, and because of this we, the duly elected representatives of the Negro peoples of the world, invoking the aid of the just and Almighty God, do declare all men, women and children of our blood throughout the world free denizens, and do claim them as free citizens of Africa, the Motherland of all Negroes.

2. That we believe in the supreme authority of our race in all things racial; that all things are created and given to man as a common possession; that there should be an equitable distribution and apportionment of all such things, and in consideration of the fact that as a race we are now deprived of those things that are morally and legally ours, we believed it right that all such things should be acquired and held by whatsoever means possible.

3. That we believe the Negro, like any other race, should be governed by the ethics of civilization, and therefore should not be deprived of any of those rights or privileges common to other human beings.

4. We declare that Negroes, wheresoever they form a community among themselves should be given the right to elect their own representatives to represent them in Legislatures, courts of law, or such institutions as may exercise control over that particular community.

5. We assert that the Negro is entitled to even-handed justice before all courts of law and equity in whatever country he may be found, and when this is denied him on account of his race or color such denial is an insult to the race as a whole and should be resented by the entire body of Negroes.

6. We declare it unfair and prejudicial to the rights of Negroes in communities where they exist in considerable numbers to be tried by a judge and jury composed entirely of an alien race, but in all such cases members of our race are entitled to representation on the jury.

7. We believe that any law or practice that tends to deprive any African of his land or the privileges of free citizenship within his country is unjust and immoral, and no native should respect any such law or practice.

8. We declare taxation without representation unjust and tyran[n]ous, and there should be no obligation on the part of the Negro to obey the levy of a tax by any law-making body from which he is excluded and denied representation on account of his race and color.

9. We believe that any law especially directed against the Negro to his detriment and singling him out because of his race or color is unfair and immoral, and should not be respected.

10. We believe all men entitled to common human respect and that our race should in no way tolerate any insults that may be interpreted to mean disrespect to our race or color.

11. We deprecate the use of the term “nigger” as applied to Negroes, and demand that the word “Negro” be written with a capital “N.”

12. We believe that the Negro should adopt every means to protect himself against barbarous practices inflicted upon him because of color.

13. We believe in the freedom of Africa for the Negro people of the world, and by the principle of Europe for the Europeans and Asia for the Asiatics, we also demand Africa for the Africans at home and abroad.

14. We believe in the inherent right of the Negro to possess himself of Africa and that his possession of same shall not be regarded as an infringement of any claim or purchase made by any race or nation.

15. We strongly condemn the cupidity of those nations of the world who, by open aggression or secret schemes, have seized the territories and inexhaustible natural wealth of Africa, and we place on record our most solemn determination to reclaim the treasures and possession of the vast continent of our forefathers.

16. We believe all men should live in peace one with the other, but when races and nations provoke the ire of other races and nations by attempting to infringe upon their rights[,] war becomes inevitable, and the attempt in any way to free one’s self or protect one’s rights or heritage becomes justifiable.

17. Whereas the lynching, by burning, hanging or any other means, of human beings is a barbarous practice and a shame and disgrace to civilization, we therefore declare any country guilty of such atrocities outside the pale of civilization.

18. We protest against the atrocious crime of whipping, flogging and overworking of the native tribes of Africa and Negroes everywhere. These are methods that should be abolished and all means should be taken to prevent a continuance of such brutal practices.

19. We protest against the atrocious practice of shaving the heads of Africans, especially of African women or individuals of Negro blood, when placed in prison as a punishment for crime by an alien race.

20. We protest against segregated districts, separate public conveyances, industrial discrimination, lynchings and limitations of political privileges of any Negro citizen in any part of the world on account of race, color or creed, and will exert our full influence and power against all such.

21. We protest against any punishment inflicted upon a Negro with severity, as against lighter punishment inflicted upon another of an alien race for like offense, as an act of prejudice and injustice, and should be resented by the entire race.

22. We protest against the system of education in any country where Negroes are denied the same privileges and advantages as other races.

23. We declare it inhuman and unfair to boycott Negroes from industries and labor in any part of the world.

24. We believe in the doctrine of the freedom of the press, and we therefore emphatically protest against the suppression of Negro newspapers and periodicals in various parts of the world, and call upon Negroes everywhere to employ all available means to prevent such suppression.

25. We further demand free speech universally for all men.

26. We hereby protest against the publication of scandalous and inflammatory articles by an alien press tending to create racial strife and the exhibition of picture films showing the Negro as a cannibal.

27. We believe in the self-determination of all peoples.

28. We declare for the freedom of religious worship.

29. With the help of Almighty God we declare ourselves the sworn protectors of the honor and virtue of our women and children, and pledge our lives for their protection and defense everywhere and under all circumstances from wrongs and outrages.

30. We demand the right of an unlimited and unprejudiced education for ourselves and our posterity forever[.]

31. We declare that the teaching in any school by alien teachers to our boys and girls, that the alien race is superior to the Negro race, is an insult to the Negro people of the world.

32. Where Negroes form a part of the citizenry of any country, and pass the civil service examination of such country, we declare them entitled to the same consideration as other citizens as to appointments in such civil service.

33. We vigorously protest against the increasingly unfair and unjust treatment accorded Negro travelers on land and sea by the agents and employees of railroad and steamship companies, and insist that for equal fare we receive equal privileges with travelers of other races.

34. We declare it unjust for any country, State or nation to enact laws tending to hinder and obstruct the free immigration of Negroes on account of their race and color.

35. That the right of the Negro to travel unmolested throughout the world be not abridged by any person or persons, and all Negroes are called upon to give aid to a fellow Negro when thus molested.

36. We declare that all Negroes are entitled to the same right to travel over the world as other men.

37. We hereby demand that the governments of the world recognize our leader and his representatives chosen by the race to look after the welfare of our people under such governments.

38. We demand complete control of our social institutions without interference by any alien race or races.

39. That the colors, Red, Black and Green, be the colors of the Negro race.

40. Resolved, That the anthem “Ethiopia, Thou Land of Our Fathers etc.,” shall be the anthem of the Negro race. . . .

41. We believe that any limited liberty which deprives one of the complete rights and prerogatives of full citizenship is but a modified form of slavery.

42. We declare it an injustice to our people and a serious impediment to the health of the race to deny to competent licensed Negro physicians the right to practice in the public hospitals of the communities in which they reside, for no other reason than their race and color.

43. We call upon the various government[s] of the world to accept and acknowledge Negro representatives who shall be sent to the said governments to represent the general welfare of the Negro peoples of the world.

44. We deplore and protest against the practice of confining juvenile prisoners in prisons with adults, and we recommend that such youthful prisoners be taught gainful trades under human[e] supervision.

45. Be it further resolved, That we as a race of people declare the League of Nations null and void as far as the Negro is concerned, in that it seeks to deprive Negroes of their liberty.

46. We demand of all men to do unto us as we would do unto them, in the name of justice; and we cheerfully accord to all men all the rights we claim herein for ourselves.

47. We declare that no Negro shall engage himself in battle for an alien race without first obtaining the consent of the leader of the Negro people of the world, except in a matter of national self-defense.

48. We protest against the practice of drafting Negroes and sending them to war with alien forces without proper training, and demand in all cases that Negro soldiers be given the same training as the aliens.

49. We demand that instructions given Negro children in schools include the subject of “Negro History,” to their benefit.

50. We demand a free and unfettered commercial intercourse with all the Negro people of the world.

51. We declare for the absolute freedom of the seas for all peoples.

52. We demand that our duly accredited representatives be given proper recognition in all leagues, conferences, conventions or courts of international arbitration wherever human rights are discussed.

53. We proclaim the 31st day of August of each year to be an international holiday to be observed by all Negroes.

54. We want all men to know that we shall maintain and contend for the freedom and equality of every man, woman and child of our race, with our lives, our fortunes and our sacred honor.

These rights we believe to be justly ours and proper for the protection of the Negro race at large, and because of this belief we, on behalf of the four hundred million Negroes of the world, do pledge herein the sacred blood of the race in defense, and we hereby subscribe our names as a guarantee of the truthfulness and faithfulness hereof, in the presence of Almighty God, on this 13th day of August, in the year of our Lord one thousand nine hundred and twenty.

Source: UNIA Declaration of Rights of the Negro Peoples of the World, New York, August 13, 1920
0 notes
dipulb3 · 4 years ago
Text
The number of Black women mayors leading major cities to reach historic high. Here is why they are winning
Tishaura Jones’ victory in the St. Louis mayoral race came just two weeks after Kim Janey was appointed Boston’s first Black female mayor following the resignation of Marty Walsh, who is now the US Labor Secretary. Janey recently announced she would run for a full term in this year’s mayoral election.
With the ascension of Jones and Janey, there will be a historic high of nine Black women serving as mayors of the nation’s 100 largest cities. Other major cities led by Black women include Atlanta; San Francisco; Chicago; Baton Rouge, Louisiana; New Orleans; Washington, DC; and Charlotte, North Carolina.
Political observers say the growing number of Black female mayors signals they are gaining electoral strength and appealing to voters in races that have been historically won by White men. They say Black women have proven they are relatable and have shown an ability to lead, organize and engage new voters. Black women are also speaking out against the racial disparities in their communities at a time when the nation is having to reckon with systemic racism and police violence against Black people.
Kimberly Peeler-Allen, a visiting practitioner at the Center for American Women and Politics at Rutgers University, said as more Black women rise to political power, the electorate is seeing the importance of having diverse voices making decisions.
“Black and brown women are running with a message that is a totality of their life experiences, which transcends race or gender,” Peeler-Allen said. “And there are people who are saying ‘she may not look like me but I know we share the same experience, because she is wrestling with credit card debt, or she has a family member with addiction or she’s a small business owner, she’s a veteran.'”
Peeler-Allen said she believes the advancement of Black women in all levels of government could also be inspiring more to run for office.
In the last few years, Kamala Harris became the first Black female vice president, Ayanna Pressley became Massachusetts’ first Black woman elected to Congress, and Tish James was elected New York’s first Black female attorney general.
Stacey Abrams narrowly lost her bid to become the nation’s first Black woman governor in 2018, but is now a powerful advocate for voting rights for people of color. Some political analysts view Abrams as a viable candidate for Georgia’s gubernatorial election in 2022.
Creating equity in St. Louis
Both Jones and Janey have vowed to make racial equity a priority while reflecting on their own lived experiences as Black women.
Jones said during her victory speech that she would not stay silent or ignore the racism that has held St. Louis back.
She told Appradab she wants to address the exodus of Black residents in recent years and why they don’t feel welcome in St. Louis. The city’s Black population dropped from 51% to 45% in the last 10 years.
Jones said she wants to revitalize the northern part of the city where she grew up because the neighborhoods have been neglected.
“I am ready for St. Louis to thrive instead of just survive,” Jones said on Appradab’s “New Day” earlier this month. “We need to provide opportunities for everyone to succeed, no matter their zip code, the color of their skin, who they love or how they worship.”
Kayla Reed, executive director of the grassroots racial justice group St. Louis Action, said she believes Jones can relate to the plight of Black people in St. Louis because of her lived experience as a single mother from a marginalized neighborhood.
The city, Reed said, struggles with segregation, disparities in education, employment and housing, overpolicing and violence in the Black community.
Reed said Jones has embraced the demands of a racial justice movement that started in 2014 when unrest broke out in nearby Ferguson following the police killing of Michael Brown. Ferguson elected its first Black woman mayor, Ella Jones, last year.
Jones is listening to the concerns of organizers and giving them a seat at the table, Reed said.
“She understands the unique inequality that our communities face,” said Reed, who campaigned for Jones and sits on her transition team. “And it gives her an advantage to think through creative, innovative solutions to shift outcomes and conditions.”
Breaking the ‘steel wall’ in Boston
In Boston, Janey has promised to answer the call for equity in a city with a reputation of being racist.
Boston struggles with an enormous wealth gap, unequal economic opportunity, neighborhoods that are segregated along racial lines and disparities in access to education.
The median net worth for White families is nearly $250,000 compared to just $8 for Black families, according to a 2015 study by the Federal Reserve Bank of Boston.
There is also a racial disparity in city contract awards, with a recent study showing that only 1.2% go to Black and Latino-owned businesses. Black and Latino workers also face higher unemployment rates than White workers in Boston.
Janey wrote in a Boston Globe op-ed that she will tackle these inequities with new policies and creative solutions.
She also reflected on her experience with racism as a child on the front lines of school desegregation in the 1970s. Janey said rocks and sticks were thrown at her bus while people yelled racial slurs.
Janey told Appradab that she believes there is an added burden to being the first woman and the first Black person to serve as mayor of Boston.
“I know there is a perception and a reputation that Boston has, but I think what is important is that the reality and the opportunities that we create for residents here is one that is focused on equity, on justice, on love and ensuring that there is shared prosperity in our city and shared opportunities,” Janey told Appradab’s Abby Phillip. “It’s not to say that we’ve solved everything when it comes to racism, but I think we have come a long way.”
Tanisha Sullivan, president of the Boston NAACP, said civil rights leaders have spent decades advocating for diversity in city leadership.
Black people have been able to win seats on city council and Rachael Rollins was elected the first Black female district attorney of Suffolk County in 2018. However, Black Bostonians have hit a “steel wall” with the mayor’s office before now, Sullivan said.
“There has been more of a concerted effort and focus on breaking through with the belief that having more diversity in that office leading the way would result in public policy that was intentional about racial equity and so many other quality of life measures that would be good for our city as a whole,” Sullivan said.
Sullivan said racial justice advocates are now hoping Janey will create momentum around electing a woman of color as mayor in November.
There are two Black women — Janey and Andrea Campbell — and one Asian woman, Michelle Wu, running for mayor.
Sullivan said it is past time for a Black woman to win the mayor’s office in Boston.
“We have for generations now been the engine behind the ascension of so many others to political office,” Sullivan said. “It has been our strategy, it has been our sweat equity, it has been the soles of our shoes that have been worn out for others. It is not only our time, we have earned our spot.”
Black women mayors are a force
Jones and Janey are joining the tide of Black women mayors who have emerged onto the national stage in recent years.
San Francisco Mayor London Breed gained national attention when she was one of the first to lock down her city when the Covid-19 pandemic hit US soil last year.
Atlanta Mayor Keisha Lance Bottoms was one of the top contenders to be President Joe Biden’s running mate. She was also lauded for her assertive response to protesters looting in city streets during uprisings last summer and speaking out against Gov. Brian Kemp’s decision to lift Covid-19 restrictions last spring.
Chicago Mayor Lori Lightfoot has made headlines for defending her city and standing up to sharp criticism from former President Donald Trump who threatened to send in federal law enforcement officers to fight violent crime there.
Black women in New York are also hoping to join the short list of Black female mayors making history.
Both Maya Wiley and Dianne Morales, who identifies as an Afro-Latina, are vying to become the first Black woman to lead the nation’s largest city.
Wiley has garnered the support of Black female celebrities including Gabrielle Union and Tichina Arnold. Rep. Yvette Clarke announced earlier this month that she was endorsing Wiley.
Some activists say the success of Black women in mayoral offices is creating a pipeline for them to run for state and national office in the future.
“We still have not had a Black woman governor, we still have not had a Black woman who has been speaker of the house, there is not a Black woman now in the US Senate,” Reed said. “So, there are gaps, but I’m confident that with the election on the local level, not only are we changing things but we are building a pipeline to answer those questions.”
1 note · View note