vampyrichamster · 5 years
Text
Addendum 5/1/20: It has since been brought to my attention that my essay's framing of two biological genders, defined by genitals alone, isn't entirely accurate. Having read more about why, it's clear that even in a purely biological sense there exists great variance in genitals, gonads, chromosomes and hormones (and that's just in humans), which in combination make for surprising reinterpretations of sex. I didn't mean to ignore that variance, but now I know to consider it too.
I've been reading with interest about how research into the mechanics of genitalia is divided, by volume, between studies of male parts, female parts or both. Some of the things scientists have found across the animal kingdom are fascinating in their own right and worth a read, but this post is about how much of that research concentrates on one gender or both.
About 49% of research into sexual organs focuses on male organs alone, 44% studies both male and female organs, and the remaining 7% or so focuses on female organs alone.
First, why does this matter, and why would anyone even need to research this stuff anyway? Answering that requires us to understand the act of procreation as an arms race: a biological arms race waged between the sexes in every species so differentiated. It's about the mechanics of sexual organs, i.e. how locks (female parts) match their keys (male parts), and the lengths each side will go to in order to keep them from fitting.
The bias in research towards male organs has been attributed to a longstanding idea that female organs serve as passive receptacles, basically just waiting for the right key. This makes the keys seem more important because of their wide variety. Another reason is that male organs tend to be external and more easily observed. To see inside a lock, you usually need to cut it open.
Far from passive tubes waiting to be filled, when we have looked inside, it's become clear that female organs very much evolved to thwart their male counterparts. The race to spread your genes as successfully as possible tends to require mutual agreement. Ducks evolved vaginas that spiral clockwise, whereas drakes evolved penises that spiral counterclockwise. Unless the duck relaxes her muscles and allows a drake's penis in, even forced sex (and it turns out most duck sex is forced sex) doesn't guarantee insemination. Kangaroos have three vaginas and two uteruses, enabling a mother to carry three joeys at different stages at the same time: an embryo held for safekeeping in a uterus, a newborn joey in her pouch and a joey old enough to leave the pouch. Just as with humans, in any species, being able to decide when to have children improves their chances of survival. This means being able to mate at the right time, allow pregnancy at the right time and give birth at the right time.
Some years ago, an elected politician somewhere made the claim that in a "legitimate" rape, a woman's body has a mechanism to simply reject fertilisation at will, so pregnancy rarely results. If we could manage that, the revolution in sexual autonomy probably wouldn't work the way he intended. But that doesn't mean female human bodies simply allow fertilisation to happen between an egg and several million sperm during every act of sex. You know how they say each child is a miracle because the chances of even making a child are so tiny? For humans, that's literally true. A fertilised embryo has about a 30% chance of becoming a blastocyst (the starting cell package that eventually might implant in the uterus and grow into a baby). Here's the really cool part. Recent research has found that in order to proceed with turning into a blastocyst, the cells inside the embryo need to start expressing genes specific to embryonic development. But each cell does this autonomously, on its own schedule. Most of the time, they don't express the genes to develop further at all, or too few cells sync up for a blastocyst ever to form.
Again, this is after millions of sperm have managed to fight their way up the uterus, and assuming at least one joined with the single egg most women release each month. And it is before the blastocyst has to implant in the wall of the uterus, a step that fails about two-thirds of the time.
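To get a feel for how fast those odds compound, here's a back-of-the-envelope sketch in Python. It only multiplies the two rough rates quoted above; this is my own illustration of the arithmetic, not a model from any study.

```python
# Compound the two rough rates quoted above.
# Both numbers are this essay's approximations, not clinical constants.
p_blastocyst = 0.30    # fertilised embryo -> blastocyst (~30%)
p_implant = 1 - 2 / 3  # implantation fails ~2/3 of the time

p_both = p_blastocyst * p_implant
print(f"Chance of clearing both steps: {p_both:.0%}")  # ~10%
```

In other words, even granting fertilisation, only about one in ten embryos clears just these two hurdles.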
Here's where I need to add that, in spite of the above, I do feel a tinge of jealousy for some of the insane gadgetry our fellow animals have evolved. Dolphin vaginas have a large number of layers and folds, the better to block sperm with. Dolphins are also capable both of enjoying sex and of raping other dolphins for the pleasure of it. It's worth noting here that most species don't have parts that prevent forced sex. The locks and keys metaphor is about the race to successfully reproduce, not about successful copulation. But that doesn't diminish the usefulness of every tool we have in the kit.
Now, how is studying any of this useful for us? Understanding how sexual parts have evolved, and how they can evolve, might teach us better ways to deal with ours. Developing ever-more intricate mazes for sperm to lose their way in isn't the only thing vaginas do. Some women have malformed uteruses divided by an extra wall of tissue that prevents embryos from implanting or developing properly, yet some animals (such as rodents) normally have uteruses divided into multiple chambers. Figuring out what prompts this trait in those animals might teach us why it happens in humans and how to treat it.
It's also worth remembering that our sexual tools include the hormones secreted by our physical parts. Because these affect fertility and sexual behaviours, they are tools that can be used to promote or suppress reproductive autonomy. Traditional patriarchal thinking makes much of hormones being a source of female impairment. Mood swings during menstruation become signs that women lack the self-control and rationality to take on roles in leadership and adjudication, even over their own bodies. Pregnancy and childbirth lower their worth as career employees and weaken their commitment to succeed. Never mind that hormones also regulate male behaviour, and any imbalance could be said to impose the same impairments to clear thinking and judgement on men as it would on women.
In addition, human beings uniquely possess an array of artificial tools that control fertility, augmenting our natural locks and keys. So these too become part of the arms race. Access to hormone-based contraception becomes a sin. Imagine if all women had the tools to circumvent menstruation and determine when, or if, they wanted to be pregnant. That women have these tools but can't use them is an artificial barrier, created by people intent on control at the greatest cost to human health.
I want to be clear here that I believe the reason both genders evolved different sexual parts, the reason we have this biological arms race at all, is ultimately to create the best people we can to survive and navigate the world. Both genders are physically different. Hormones do affect behaviour, and they also confer specific benefits to our bodies that science is still discovering. But that behaviour doesn't determine how smart, educated or capable you are at your modern human job. The knowledge and skills you need to become a doctor, judge or scientist are so far removed from basic survival and procreation that they are difficult for everyone, regardless of gender.
The reason we need to study how sex works in both genders of a given species together isn't just because it's twice as interesting. We now know, for example, that most clinical trials have historically focused on male patients, which means the full effects of some medical treatments on women have been poorly evaluated. This has obvious real-life implications that could end in injury or death. Understanding the actual nature of sexual organs in both genders, and how they interact, makes good sense because it gives us the whole picture of how they, and we, work together.
I'll leave you with an unfortunate real-life example where concentrating on the keys neglected the locks. The Japanese parliament greenlit the use of Viagra in record time, before it allowed modern forms of the hormonal contraceptive pill for women. And although Japanese women can now access oral contraceptives, the pill is not covered by health insurance there. For reference, the pill was widely approved for use in other countries as far back as the 60s. This happened as part of a much wider patriarchal culture, one that assumes women are naturally predestined for care-giving and motherhood. In the meantime, the Japanese birth rate has plunged below replacement level and stayed there for years. Career women have decided that their autonomy, and the intense effort needed to uphold it, is more important. When women and men do not equally share the conversation on reproduction (when to have offspring, how, and who will raise them), society fails.
vampyrichamster · 6 years
Text
Small thoughts coalescing
Today, I read an essay that struck a chord, about uprooting yourself and reestablishing an identity in a foreign country.
Like the author, I come from a place where my passport would almost certainly be destroyed if I were ever to gain an American one. Malaysia's toxic history with granting citizenship, who it deems worthy and why, is a piece of our poisonous dialogue about race. It's a question I've pondered often, but it isn't why I am writing this.
My first introduction to moving "to the West" was really more of a hard swerve east, to Australia. I considered it a parental decision, one I was emotionally removed from; part of that remove was a choice I made to retain my own sanity. Having no say in whether or not we moved meant I was spared most of the practicalities of, say, visa applications, but faced all of the emotional challenges of being told one day that I was no longer going to live in my country.
That time in Australia was painful for a great number of personal reasons, but it was through no fault of the country or its people. In a different frame of mind than I had back then, I might have been better able to appreciate the newfound space to think and live the way I wanted to.
I consider my real migration experience to be the choice I made to come to America. That was a choice I undertook wholly for myself. I am often ashamed of talking about my migrating anywhere because it reeks of my privilege: first by being born into a family that could afford it, later by having gotten here because of the wonderful man I married.
That marriage in fact forms the very basis for how migrating abroad has allowed me to move from homesickness and being doggedly patriotic to thinking critically about my old home and my new one.
In Malaysia, my marriage wouldn't have been possible. My choice of religion is impossible there; that much I knew before I ever left Kuala Lumpur. But my birth religion also dictates who I can marry: by necessity, someone of the same religion on paper.
The first time I realised I had a choice to live as I wanted was when I filled out the form to book our wedding at City Hall here in San Francisco. This was right after Proposition 8 came into effect, taking away the brief-lived right of same-sex couples to marry within the state. So fresh was the change that the online form I was using still had two tickboxes asking whether your union was same sex, preceded by a hastily added note saying same-sex applications were no longer possible.
You get a real sense of how precious your ability to marry is when you don't have it. My marriage would never be legal in the absolute sense in Malaysia unless my husband and I pretended to be something we were not.
Being here in the US gave me the right to marry, and the right to object vehemently to theistic faith and live by that credo with no penalty to my daily life. (Shortly after I married, I first began hearing about official Muslim reeducation camps being built in Malaysia, where I spent my childhood.)
I resolved not to take lightly the privilege of being accepted at face value. So I began by reading books, any book I wanted on any subject. No post office would inspect my books and seize them for subversive content.
Afterwards, I came around to writing on some of the subjects that mattered to me but couldn't be said aloud back home. I am painfully aware, as I write this, of how many words I weave together just to subtly touch on these subjects. But where I come from, self-censorship is what you do.
The current state of the US is grave for its free press. Long before fake news was a catchphrase, my country hewed to a relentless pro-government message. Our country was united and strong. Dissidents, pretty much anyone who pointed out the cracks, were spirited away to holding camps. But you still have a press that can ask questions here. Your belittled press is ultimately still free, and its readership is used to at least the concept of a free press, so any move to restrict it will be resisted. That resilience, however fragile, should not be overlooked.
It's a resilience I'm trying to at least develop, rather than merely appreciate. The author of the essay I linked to, Xiaolu Guo, talks about hitting 30 and being confronted with the fact that 30 is old where we come from. Women in particular are expected to have met the biological benchmarks of success: marriage with infants, and the stridently competitive motherhood that raises "successful children". If you don't have a high-paying career by that point, and you're not on the substitute track of living for other people, well, what are you good for?
Ms Guo's essay asks a diametrically opposite question: After the long process of getting to freedom, who do you want to be?
vampyrichamster · 8 years
Text
Bee-pocalypse, no?
I read a fascinating article the other day that got me thinking about bees, or specifically, their oft-heralded demise. This whole year, it felt like you couldn’t approach an almond or a fruit without dire portents about droughts and bee-pocalypses. The two are intrinsically linked. In both, big agriculture takes a few hits on the nose for creating vast fields of monoculture in the deserts of California, and for planting crops treated with neonicotinoids (neonics), which have been blamed for causing Colony Collapse Disorder (CCD). In both, the sound and the fury of the activism felt like it was missing some nuances. The topics are important, but the focus wasn’t on linking all the pieces together.
When we think of beekeeping, the immediate image we get is of the private beekeeper, tending a few hives in a garden. We don’t often think of beekeeping as a large, industrial operation, and in many ways, the advertising thrives on maintaining that private, intimate image of raising bees. But the most likely candidates for CCD are battery bees (think battery chickens): hives in the hundreds and thousands, plonked in monoculture fields of canola and orchards of almonds. These bees are trucked around the country by commercial operators, to wherever a farm needs pollinating. They’re frequently stressed from the moving and supplemented with poor diets of sugar syrup. When they stop to work, their diet depends on whatever single plant the farmer grows and needs them to pollinate. If they’re lucky, stretches of wild land or strips deliberately left fallow offer some varied bee forage, but it’s never enough.
In that regard, the many travails that afflict the modern honeybee start to make sense. Malnourished bees, like malnourished chickens and malnourished people, suffer from fertility issues, are more prone to illness, and are more likely to die from even the smallest shift in their environment. A weakened immune system is a walking target for predatory vermin as well as germs; the plague of varroa mites we keep hearing is killing honeybee larvae is an indicator of this. Think about it: if you’re already under the weather with a flu, a sudden cold snap, strong and persistent air pollution or a week of stress in the office could easily be the straw that breaks the camel’s back.
It is this ecology of health that bothers me when CCD is squarely blamed on neonics. First, a specific clarification is in order. Insecticides, herbicides and fungicides can pose real threats to pollinators, and are particularly worrying when they harm the wild pollinators that preserve indigenous flora. While most of the research has understandably involved testing sublethal doses of neonics on honeybees, more research into their effects on wild pollinators is clearly needed because of the unknowns. For example, blueberries are indigenous to the Americas, where wild pollinators like bumble bees and solitary bees work in tandem with commercial honeybees. And although pollinator deaths can be the direct result of insecticide applications, such as pesticides drifting in from surrounding areas onto bees working an unrelated field, it is the long-term effect of insecticides, herbicides and fungicides that most impacts pollinator survival.
That takes in many more potential exposures than just neonics, and it has to include the wider ecology of impaired battery bee health I talked about earlier. For example, the most prevalent chemical traces on commercial bees may actually come from the fungicides used to prevent nosema outbreaks and the miticides used against varroa mites, rather than from any insecticide ingested in the field. Research into the active ingredients of the antibiotics and pesticides used to keep bees healthy shows them to be safe when used as prescribed. (Note: cows and chickens are not the only food creatures we stick full of antibiotics. One consequence of over-stressing commercial bees is that they also need antibiotics to stay healthy.)
However, active ingredients make up only a small percentage of the formulations on the market. The inactive or inert ingredients, which form the majority of a formulation, usually escape attention. Recently, there has been interest in studying the long-term effects of these inactive ingredients as well, as their toxicity to bees over time might actually be worse than that of the active ingredients.
Bees also ingest more than just neonics in the field. Herbicides and fungicides, aimed not at insects but at other problems for crops, also pose a potential threat. Again, this is like the flu analogy I used earlier. Imagine the bee as an urban office drone: stressed from work, coming down with a cold, breathing in smog and possibly neglecting their diet to cope. Taken individually, each of these factors could probably be fought off over time. Taken together, they leave the office drone almost bound for a serious health crisis.
One thing that honeybees (though much less so wild pollinators) do have over office drones is that they can replace themselves at a far greater rate. Nature makes up for the short life cycle by building redundancies into the system. Die-offs happen naturally during the winter, for example, when the whole focus of the hive is to stay warm and keep the queen alive. The shortfall is then replaced by new generations of bees bred during the warm months. In 2006, when CCD was first diagnosed, honeybees were disappearing at greater than replacement rates. But evidence suggests that periodic large die-offs of this sort go back much farther, and have happened on every continent (link goes to a PDF). In each case, there was no conclusive evidence of what caused the die-off, but a lot of the suggested causes sound awfully familiar, and happened in the sort of clusters we see today. More importantly, in each case, it’s clear the honeybees eventually rebounded.
Some evidence even seems to show that honeybee numbers are actually rising right now, which would cut against the whole pollinator die-off hysteria in general. This awesome news comes with a caveat, because this is where the whole ecology of the system gets missed in the mad rush to ban neonics, and where those almonds show up.
Among the biggest jobs of commercial honeybees in the United States is pollinating California’s almond fields. We’re looking at numbers on the order of 1.6 million colonies to keep the almond milk flowing. Almonds, and by extension bees, are a rising commodity that drives pensions and hedge funds. Major almond orchard plantings these days are more likely the work of investment firms than of family farms. The almond industry has gotten the collective dirty look in California’s ongoing drought, but most of its water these days comes from underground. As the almond frenzy digs deeper, not only is it depleting natural reservoirs that took millions of years to store, it’s also reaching parts of the underground water caches that are naturally contaminated with heavy metals. Mass pumping affects the physical geography too: laws were passed two years ago to ensure people in the Central Valley had water to drink, and to hold back sinking land (i.e. subsidence), which actually makes it harder for aquifers to recover over time.
A lot of this is our own bloody fault. California is the produce basket of the US and the world’s largest producer of various export crops; almonds are the latest popular face of that industry, having recently overtaken grapes. That greed for growing ever more commodities has meant reaching beyond the places that were ideal for food and reclaiming patches of desert. I think it is quite logical to say almond trees aren’t meant to grow in deserts. They need more water, and more frequent watering, than other sensible or seasonal food crops like grapes, strawberries and lettuce.
Monoculture growing as a whole is also bad for feeding bees, even though we need bees to keep industrial farming alive. Farmers are getting smarter about land use, such as preserving wild patches next to their fields for both honeybees and natural pollinators. Clearly though, the scale of monoculture farming means honeybees are going to be deprived of a balanced diet. That malnutrition is the first step in depleting bees’ immune systems, and in the larger ecology, one factor alongside everything else that could recreate the perfect storm of another big die-off.
Diversity is the key. This means pat answers like “It’s all neonics!” are probably worse in the long run; resources should also go towards the other players in the big picture. Studying the effects of agricultural chemicals on honeybees and other pollinators is important, but that’s clearly a much larger list of ingredients than just neonicotinoid formulations. Industrial farming isn’t going to go away. Feeding the world cannot be done by small family farms alone, and climate change requires us to look at every tool we have to survive, from rethinking where different crops are traditionally grown to breeding strains of plants resilient to new environmental challenges. It means finding better ways to create diversity in large-scale farming; those fallow and wild patches are a start. And it will more than likely mean paying more for foods we take for granted, whether because growing them has to be pared back for the greater good, or because the resources to grow them are scarce and should be treated as precious. The bees will live. Or pollinators will change up to benefit from their evolving environment. We need to give insects more credit; there’s a reason there are billions more of them than of us!
vampyrichamster · 10 years
Link
Restricting women is often among the first measures Islamists impose when they come to power. This is no accident. Whether done in the name of safety, honour or tradition, the end goal is a captive population of loyalists, beginning in the home. Deliberately keeping women's views and exposure shuttered reduces the need to build truly egalitarian gender relations. If all women believe that no amount of personal achievement matters next to serving their husbands and families, maintaining the idea that the genders are complementary rather than equal becomes simple. Complementary roles have no need for equal human rights. We no longer live in agricultural societies reliant on manual labour. Why should we superimpose laws that made sense for gender roles 1,000 years ago on modern societies with modern needs?
vampyrichamster · 11 years
Text
Retroactive logic
Many years ago, I had the privilege of interviewing a researcher who helped develop halal bone china. Normal bone china might be made from the bone ash of animals considered non-halal, including pigs. Made only from the bones of animals slaughtered according to Islamic tenets, halal bone china created a high-end product with clear value for Muslims who wanted to ensure all their eating utensils were in line with their religious goals.
In an amusing aside towards the end of the interview, I learnt that the main ingredient in fine bone china, bone ash, is obtained by processing the raw material at very high heat. The resultant ash is chemically indistinguishable whether its base material came from pigs or from cattle. The value of halal bone china is therefore one of faith, emotional resonance and product prestige — at least from a scientific standpoint.
I would like to stress here that saying this in no way diminishes the hard work and genuine research of the team involved. They created an innovative product which had the clear intention of meeting a luxury market niche. That is, they were doing exactly what researchers and innovators do best.
The question I’m posing in this essay is instead about the value of religious labelling as a marketing tool. Connecting objects and behaviours with the gods carries a lot of marketing cachet. It validates material and social products through faith, emotional resonance and prestige (from being part of an exclusive circle). The faith part is the most important aspect of that sales mechanism; the other two can work as byproducts of the faith-based marketing. Unlike with a non-religious product, any science involved in the product’s creation or marketing usually comes as an afterthought, used in service of validating the religious claim but never to disprove it. Disproving faith is ungodly, after all, and cuts into a product’s potential market.
For example, take miswak (a traditional Arab tooth-cleaning stick). Dental hygiene and clean breath were personal values promoted by no less a religious authority than Muhammad, according to hadiths, in particular through the regular chewing of miswak. Miswak (Malaysian: kayu sugi) seems to have been twigs derived from numerous local Arab trees, but especially Salvadora persica, which is native to the dry climates of the Arabian Peninsula and West Asia. It also seems that the use of miswak predates Islam: Babylonian records note its use 6,000 years ago.
At one point in Michael Pollan’s The Omnivore’s Dilemma, the author talks about how organic food manufacturers depend heavily on pastoral imagery and on emphasising a personal relationship between their customer and their food. Buying an organic chicken might involve an extensive label or signage extolling the bird’s provenance — its name, the free-range farm it was raised on, its vegetarian diet (chickens are omnivores; unless worms are vegetables by proximity) and possibly even a brief biography of its farmer. Most battery chickens get by with a line or two about how tasty they are. Is the label on the organic chicken true? Offhand and in-store, we are more than likely to take the manufacturer’s word at face value.
Modern miswak sellers employ marketing that takes the same emotive and historical cues as most naturally derived products. However, having the Muslim diaspora as an audience lends miswak particularly potent marketing value. It not only has an established history in Arab custom and traditional medicine; because Muhammad used it and connected its virtues with religious duties of hygiene in hadiths, the aura of sacred obligation (however voluntary) also comes into play.
Obviously, not all Arabs are Muslim, and neither was the consumer base for miswak ever historically just Muslims (remember, miswak use predates Islam). Therefore, modern miswak manufacturers couch their products’ marketing in a mix of general historical value and medical properties to reach the widest possible audience. In marketing to Muslim consumers, however, the text explicitly employs halal labelling and subtly harnesses Arabic-Islamist mores. When the market is assumed to be primarily Muslim, as is the case in Southeast Asia, out come the hadith quotations. Mu’min, a Malaysian halal toothpaste manufacturer, goes so far as to print a hadith quotation on the package of its B*Sugi gel toothpaste stating that the rewards for prayer are increased 70-fold if miswak is used beforehand, alongside an endorsement from a local Muslim scholar noting that using miswak toothpaste is as good as chewing the original stick.
Incidentally, organic chickens are not any less nutritious than battery birds. What differs is that one has less medication in its diet than the other; the battery bird’s drug regime is both a result of its more inhumane upbringing and a source of unintended long-term harm to consumers and the environment.
Likewise, though scientific testing does show miswak has antiseptic and nutritional qualities akin to modern toothpaste, these qualities naturally occur at lower concentrations than in said toothpaste. This dilemma of the natural stick is conveniently left unmentioned in its marketing. Modern miswak toothpaste uses miswak extract (i.e. miswak concentrate), which could vastly amplify the positive properties of miswak over its natural form, plus, depending on the product, any number of chemical additives common to non-miswak toothpaste.
Even the original miswak sticks are not exempt from modern modification. Al-Khair, a Pakistani miswak manufacturer, offers vacuum-packed miswak sticks in refreshing mint and lime flavours.
We can’t fault innovators for discovering niche products or improving old ideas for marketability. And advertisers are paid to hone marketing messages as far as they can go. But imagine how much more stable your market base is when it is a captive audience. To the faithful, religious faith (the idea that a god said a thing was good) is more compelling than the endorsement of rock stars or any scientific research. Not buying into products connected to religious obligations carries the dread taint of unfaithfulness, a potent social stigma.
From products as mundane as toothpaste to specialised prayer shawls, rosaries and circumcision clamps, faith-based marketing gives the veneer of choice with the spectre of the sacred. But wait, you might be thinking: the consumer still chooses whether to purchase a material product. There is ultimately no compulsion in religion to shop.
Imagine what Saudi Arabia would be without hosting Mecca and Medina. Would its example of Islam carry such weight with the rest of the Muslim world if it were not the focus of religious pilgrimage? Picture Jerusalem if it were a true world heritage site, and not a contentious heirloom bickered over by feuding Abrahamic faiths. What if, instead of being a historical battleground of mosques built on top of churches on top of temples, it were removed from rapturous pilgrimage? Let’s not pretend that Mecca and Jerusalem are too sacred to be money spinners. More importantly, the value of their custodianship, as political capital and religious example, is just as vital a commodity for Saudi Arabia and Israel as what it puts in their pockets.
Social behaviours and alliances are just as subject to faith-based marketing as any tangible product. At the heart of faith is a core assumption that the word of a god is good enough: doing without really knowing, on the promise that all will be revealed later. Every religion, small, fringe or established, from Hinduism to Scientology, has at least some element of suspended disbelief. This keeps them resilient in the face of evolving social mores, because mysticism is not meant to be questioned, but taken to heart. I may be stating the obvious, but it seems to me that the unknown, an assumed mystery, has far too much power over the living world, while reality is not allowed to engage the sacred nearly so thoroughly.
Science, the process of constant, rigorous, open testing, learning and debate, is hijacked by literalists to say that circumcision promotes hygiene or somehow prevents AIDS. But ask why bodies built by perfect higher powers need adjustment at all, or whether circumcision, which predates the Abrahamic faiths, simply persisted as a custom in much the same way slavery existed alongside those faiths for most of their history, and the questions stop. By reflecting the norms of early human civilisation, and perpetuating them ad infinitum, religion wears the marketing mantle of blind faith and propagates literalists of every stripe. Science bound to faith tells us that cutting is clean, even though personal hygiene is about practice and does not have to involve pain, trauma and mutilation. If anything, good hygienic practice ought to be about preventing the latter. In any other situation, amputation is a sign that all else has failed. Are penises and the clitoris born gangrenous at the tip?
Medicine in service of faith can tell us that masturbation is a sign of mental illness. It once told us cutting prevented touching: why have needs when you can excise the nerves? Good girls need cutting so they don’t turn “wild”. Maybe if they associated intimacy with pain, conjugal relations could only be about breeding. Maybe the babies they make will all be his.
Or how about this: that good boys get cut because Abraham did it and that’s how we know all good boys are disciples of Abraham’s family? Truth in marketing.
Context and history become submerged within the strictures of faith. Yet context and history are what tell us we’re alive. When we look back and see how much we’ve learned, we are moving forward. Cease asking questions and do the same things as those before you, and there is no future.
vampyrichamster · 11 years
Text
The lawyers of God
Many years ago, I picked up a copy of Feminism & Islam (edited by Mai Yamani) in a bookshop, now defunct, and remember being stunned at the variety of essayists raising questions I didn’t know other women in the Muslim diaspora asked. At the time, I was a teenager who had been raised her whole life on a state-controlled version of Sunni Islam at school, the more benevolent faith of my parents, and the strident brand of atheism that only a young person could have. In my mind, the Shia were mysterious extremists whose women draped themselves in black cloaks. They stoned adulterers to death. I’d only just learned that Saudi Arabian authorities were also in the habit of murdering their daughters in the name of honour — but the Saudis and all Arabs were just that extreme. That kind of thinking could never touch us in Malaysia.
So imagine my surprise when, here in this anthology, Shia feminists were putting forth more studied and progressive ideas about women’s independent capabilities within an Islamic context than some of their Sunni peers. I wasn’t comfortable then, and still am not now, with the idea that women had to take two steps backward to take one step forward, as the female Iranian lawyers and legislators I was reading about had, or that they should couch their arguments in the subtlest language of Islamic gender complementarity rather than equality. But this book was talking about Muslim female lawyers striving to be judges at a time when my country’s administration wouldn’t even countenance female Sharia judges. (Note: Malaysia appointed its first two female Sharia judges in 2010.)
The entire experience opened the doors to new ideas, and new confusions. For a start, contrary to what Muslims in Malaysia are generally raised to believe, the Sharia isn’t a single, coherent body of hard-and-fast laws, and never was at any point in its history. Even taking into account just the four accepted Sunni schools of law (Hanafi, Hanbali, Maliki and Shafi’i), Sharia as a body of laws continues to be an evolving entity, subject to countless interpretations. Naturally, each interpreter is certain of his righteousness. And this, unsurprisingly, causes inevitable human friction. Which of course leads to another radical idea: that the Sharia is a growing body of human legal thought.
To the vast majority of Muslims in Malaysia, whatever their personal disagreements with individual edicts, the Sharia is the infallible word of God. The humans involved with working the Sharia are just there to somehow nail God’s orders into neat boxes of local context. The connection, that interpreting the Sharia for local contexts is in fact the effort of human beings making foreign laws relevant to their individual communities, disappears. Breaking Sharia law becomes blasphemy. Following the customs and mores of people who died 1,500 years ago becomes more important than individual history and culture itself.
Heaven on Earth by Sadakat Kadri goes through the chronology of the Sharia, from the compilation of the Quran to the banning of yoga. One of the major themes of his narrative is that the more people strive to act the way the first Muslim community acted, the more their interpretations of Sharia resemble ancestor worship. The point of the matter is that no one knows for certain what those first Muslims were thinking. The founders of every Sharia school of law depended on collected oral histories passed down in the centuries after Muhammad’s death. While this process created a science out of verifying sources back to the early Muslims, a look at the compilation of hadiths more than amply suggests that every piece of the historical narrative ultimately depended on the interpreter’s personal taste. A great example from Kadri’s book pertains to Muhammad al-Bukhari, who revolutionised hadith collection by committing hadiths to paper. Until then, although written records were made of material events, Muslim leaders and scholars quite literally lived by the Arab idea that any wisdom worth knowing deserved to be learnt by heart. Compiling permanent records of Muhammad’s quotations and deeds was “presumptuous and dangerous” (pg. 82), because it set divine messages into a fallible medium — a complexity encountered earlier during the compilation of the Quran in the 7th century. al-Bukhari’s contemporaries were rightly worried that permanent recordings of holy wisdom meant separating the quest for knowledge (which in those days involved physical travel to far-flung oral historians) from the mere acquisition of knowledge (from a page).
Consider the modern example of reading about a subject on the Internet vs. studying that subject in a classroom. If a student wanted to learn about daffodils, they could get detailed information from a wiki, or they could experience the myriad forms and interpretations of the daffodil among other people, hands-on. Ideally, the best approach would combine both methods. However, unless we have no other way of obtaining information except by approaching a subject directly, the vast majority of us will choose to read about that subject and take what we’ve learned at face value. (Admittedly, this includes much of my own efforts here.)
Which brings us back to al-Bukhari. While his hadiths were predominantly an almanac of high religious duties and everyday questions, they certainly also seem to incorporate stories that reflected his personal taste. Kadri cites al-Bukhari 5.58.188, recounting the recollection of a Hijazi tribesman who, in the dark days before Islam, saw a monkey being stoned to death by its fellows for adultery; the tribesman proceeded to join them in their grisly judgment. And this is before we take into account the mystical retelling of Muhammad’s journey into Heaven (al-Bukhari 4.54.429). A fixture of many a childhood religious class, the rich details of what Muhammad saw, his winged steed and his negotiations with God over the number of daily prayers Muslims were to fulfill are the products of hadiths. They significantly expand on the journey’s brief mentions in the Quran (17:1 & 60), logging encounters not just with God, but also with Moses, Abraham, Jesus, John the Baptist, Joseph, Enoch and Aaron.
al-Bukhari’s collection became one of the major sources of the Sharia legal ideas in use today. The trend he began of collecting hadiths on paper sanctified one version of Islamic history at the expense of many others. Only apocryphal notes and myths remain of the stories that never survived as canon. Because the written word became the most widespread, alternative stories were doomed to be forgotten or deemed irrelevant. Perhaps the most pervasive victims of this silent shroud were hadiths narrated by women. A’isha, one of the most important sources of hadiths, is credited with 2,200 of them, but only 128 were included in al-Bukhari’s collection. Fatima Mernissi’s seminal The Veil and the Male Elite takes apart the history and context of hadiths that restrict Muslim women’s rights, in particular the al-Bukhari hadith that bars women from assuming leadership of their communities. Together with other books, including Leila Ahmed’s highly recommended Women and Gender in Islam, a picture emerges not of the whitewashed and streamlined view of Islamic history engraved in our childhood textbooks, but of a morass of political ambition and cultural diversity in early Islam that shaped Sharia law. The companions of Muhammad, including the men who became the first four “Righteously Guided” Caliphs, all had different ideas on chastity and women’s public visibility, even during Muhammad’s lifetime. These ideas, aligned with tribal loyalties, power and varying local traditions, were as much influences on Sharia discourse as the distant voice of God.
So when traditionists today espouse living according to the Sharia, analogous to living according to the practices of the first Muslim community, their remarkably streamlined version of history should be questioned. The askers must come from as wide and true a representation of the people whom Sharia laws affect today — namely, all Muslim and non-Muslim citizens of every Islamic state — because the founders of Sharia law were equally diverse. This not only facilitates the democratic process, but also builds an inclusive framework that prevents future dispute. Put it this way: if your sister is a Muslim and you are not, saying that laws applying only to Muslims do not affect you would only work if every Muslim convert disowned, or was disowned by, their own flesh and blood. And even that is no certain barrier between family.
The idea that every Sharia edict is God’s word must be separated from its reality: that human interpreters are the source of human Sharia law. Most importantly, that separation must acknowledge that the hadiths considered canon are distillations of a much wider patchwork meant to be read as sunna (customary). Just because one narrative of Islamic history dominates Sunni Islam, and is sanctioned, reprinted, donated and paid for by traditionists (hailing from the custodians of Sunni Islam in Saudi Arabia), does not mean that narrative is authentic by exclusion. Nor does authenticity imply unassailable holiness. To reiterate a point, hadiths are recollections that were themselves subject to individual interpretation while many of the key figures of the first Muslim community were still alive. And while the figure of God looms large in the background, at no point were these stories ever said to be the verbatim word of a god.
And so we must come to the conclusion that unless current context and history are made the basis for formulating Sharia laws, and not simply the nebulous unreason of “God said so (and thus anything with Sharia in it must be good)”, enacting the Sharia is an injustice to both Muslim and non-Muslim citizens of Islamic states. In fact, its enactment becomes farcical. Laws that govern a diverse people, that rest on so personal a compass as faith, and that cannot be questioned, are tyranny, even if that tyranny means to do good and eschew evil.
vampyrichamster · 11 years
Text
Girls before flowers
When I was nine, I got into some jokey fight with a male classmate that led to a little rough-housing. My school bus driver told the boy he should be more careful, since, “Girls are like flowers. They’re very delicate.”
Years later, in high school, I asked my Islamic Studies teacher why women wore headscarves and she replied, “When you have something precious, you take care to wrap it up away from others.”
I remember these and similar words said to me in passing. Their speakers spanned the gamut of religions, ethnicities and gender. Looked at from the microcosm of my childhood, they’re individual pieces of patriarchy.
Then, a few weeks ago, I read an anthology called Beyond Belief: The Secret Lives of Women in Extreme Religions. Most of the testimonials were from extreme Christian denominations, but among the few on Judaism was a story about a young woman’s encounter with Hassidic Jews. In it, a rabbi tells her that good Jewish women do not sing in public because “a woman’s voice is like a jewel” and “shouldn’t be flashed around”. Instead, “a jewel should be kept in a safe and treasured place.”
Delicate, unseen and unheard. At what point is a girl a human being to others? At what point is she an object that is admired and given purpose by someone else? Call this segregated lifestyle what you like: treasured, safe or treated like a queen, it seems to handily replace a girl’s ability to decide for herself with a nebulous greater good.
But if a girl is something to be protected and kept away from others, what happens when she leaves the safety of her minders? In recent times, we have seen the assault, rape, even murder of girls who went outside their designated spheres to pursue educations, jobs and the basic need to secure a better future for themselves and their families.
Even an apparently safe life of pure obedience and submission is not safe in the hands of the wrong minders. Marriage out of duty and honour, given away by well-meaning parents, is no guarantee against domestic violence and abuse by a girl’s new family.
Therein lies the problem. We can put down the grievances of girls’ lives to mere misfortune, or we can start looking at every cause in a seemingly terrible chain of events as the poor choices of real people. People who may not have thought they had a choice. People who believed the traditions that came before them have generally always worked and must somehow continue to be the right way. People, even girls themselves, who subsumed their own decisions under the weight of delicate, precious and fragile flowers — that which is called the weaker sex.
Shame, that most impermeable of veils, is an easier cure for people who stick out than rooting out the rot within, particularly when applied to a class of people deemed submissive, weaker and more breakable. You see, girls who are told all their lives that they are victims yet queens grow up to be women who tell children that girls get raped for wearing the wrong dress, or deserve harassment from strangers for walking down the street. These are the adults who tell girls they can’t go out at night, because they raised the children who believe all girls are delicate and precious — except for girls who aren’t like them.
Girls are not a subclass of humanity, but neither are they flowers. Until we assume girls are as much people with as much personal agency as anyone else, rather than a protected species, we’re never going to make society safe for girls. There is no such thing as a girl too delicate to be seen and heard. Because being a girl is not having a debilitating disease, and flowers aren’t even human.
vampyrichamster · 11 years
Text
The quest for the perfect word
As a linguist, one of the most frequent issues I encounter in my editing work is the argument for and against declaring words obsolete. One side sees the Malaysian language as a constantly evolving entity, in which the newest words are always the most correct to use. The other side is hell-bent on preserving what it considers ‘pure’ Malay words, for which the natural introduction of borrowed terms and phrases from other languages, particularly English, is an aberration to be stopped.
It’s a no-win situation. Appeasing one side just provokes the ire of the other, and frankly, I think attempting to do so is hopelessly myopic. Each side represents a single section of the rich tapestry that is a language. A language is filled with history and context. We can see the vagaries of fashionable words as absolute, here today and gone tomorrow, or we can see them as part of a continuing chain of evolving language.
Similarly, the senior words of a language, the ones that, to the best of anyone’s recollection, have always been there, do not simply vanish when new terms conveying the same meaning appear. For example, when I was in school (in the 80s and 90s), the word we used for a budget was belanjawan. Around the late 90s, I think during the Asian financial crisis, the use of bajet (a literal borrowing of the English “budget”) came into wider circulation. The national budget for 2014, as stated on the Malaysian Treasury’s website, is referred to as Bajet 2014 (The 2014 Budget). However, the rise of this new term’s popularity doesn’t mean belanjawan was suddenly erased from everyone’s memory. We did not automatically cease to use belanjawan in our daily speech. The new word simply offered an alternative that could be used in its place. That choice and variance is important for a language’s survival. We human beings can only keep discovering new concepts to describe, or in this case, new ways of looking at a present idea, and as we do so, we must also discover new words to give these ideas life.
When critics decry the ‘dilution’ of the Malaysian language with words borrowed from other languages, they ignore a fundamental aspect of Malaysian culture itself. The history of our peoples is an epic tale of migrants and intermingled settlers. Some of that tale is told in our mixed features; another part appears in our cuisine’s incredible palette of flavours. The ways that history shows itself are countless, and certainly one place it leaves its artefacts is in our language. Just two examples from an extensive list are cukup (enough) from Mandarin and barat (west) from Hindi. Many of the words we use to get through our daily lives come from the ways our ancestors – these brave new settlers from many distant lands – described their world.
It is only natural that, as their descendants, we find our ways to describe our world as well. And this is where language, apart from its value as a growing historical record, needs context.
Human beings are exceptionally gifted at discerning context. All our religions and philosophies are, at their root, discourses on the context of their time. At a very basic level, context is the meat that gives meaning to communication. Context, unlike words themselves, can vanish into the historical ether. Often, it has. We don’t use the laborious old Malay of the Sejarah Melayu (Malay Annals) in modern communication partly because most of the context behind its vocabulary is no longer relevant to our era. Yet, because context can evolve, words from the past can come back into use.
Take for example the word for “king”, raja. This word has Sanskrit roots, and arrived on our shores through the spread of Hinduism and Hindu concepts of government nearly 2,000 years ago. Until at least around the 1400s, nation states in Southeast Asia were predominantly Hindu, including those on what would become Peninsular and East Malaysia. The adoption of Indian culture was seen as aspiring to the most sophisticated standard of the time, kind of like adopting smartphones to look sophisticated now. You’re thinking that last example was too basic, if not entirely base. Well, I could have said that the Sultanates of the early British colonial period in Malaysian history were made to believe British upper-class culture was more sophisticated than theirs, a notion the British outright cultivated to help reinforce the superiority of the English over the natives. Just imagine how many tough old boots I’d have stepped on if I did.
But I digress. Raja originally came with divine associations. To be raja was to be a nation’s direct link with the (Hindu) gods. Lèse-majesté in 700 BCE was essentially an offence against the (Hindu) gods. Our word and concept for government, kerajaan, is an acknowledgement of being ruled by a raja. Circa 2013 CE, no one in their right mind would say that describing members of royalty as raja meant we lived under an ancient form of Indian governance. The context behind raja has changed. Our constitutional monarchy is still ruled by rajas, but their methods of government are entirely different from what they were. For a start, our raja is constitutionally Muslim. The word didn’t vanish into the obscure ends of the dictionary, because it was constantly used, and constantly evolved in its scope.
Naturally, many other borrowed words have enjoyed this happy fate. The colourful tapestry of language is always filled with clues about our historical evolution. Here’s a quick question: who came first to Peninsular Malaysia, the Arab traders or Islam? Early Malay kingdoms grew to prominence as major trading ports because they were centrally located between western (India and Arabia) and eastern (China and East Asia) sailing routes. Sailors from these lands settled and co-mingled with the locals. We know this because many people today carry their names. Just as not all the citizens of Arabia were Muslim during any given Caliphate, not all the citizens of Arabia today are Muslim. Thus we come to the notion that not all the Arab traders who visited the Malaccan Sultanate were necessarily Muslim. But each Arab trader brought the Arabic language, and words from that language seeped into common use, lasting to the present day. Their word for God was Allah. From pre-Islamic pagans to modern Arab Christians and Jews, Allah was God, differentiated for each religion by the context of conversation.
My point is, if speakers of Arabic have understood each other perfectly well for centuries over whose Allah is being invoked in conversation, do you suppose anyone could mistake Allah for the God of Islam if that name were exalted by a priest during mass? If all the Abrahamic faiths up to Islam are, by Islam’s own official history, part of a long chain of related messengers, then is Allah the sole property of the last message standing?
Two things here. Allah, the One God, cannot conceptually be three individual Gods, one each for the Jews, Christians and Muslims, and most certainly not if we depend on the Islamic narrative. Do we say, therefore, that by claiming Allah is just for Muslims, the One God, according to the Islamic narrative, is really a trinity – again, one for the Jews, one for the Christians and one for the Muslims? Moreover, the tapestry of language suggests that exclusivity isn’t remotely how language works. Just because Allah was used for the Christian God yesterday does not make that usage obsolete tomorrow. To say so would be an insult to human memory, and an even greater insult to the continuous flow of human culture.
vampyrichamster · 11 years
Text
Generation Hope
To me, faith has two faces. There is the private god, a belief in some higher power that gives life meaning, and in that way serves as a private moral compass when the unexpected happens.
By extension, there is the evangelical god, who may have begun as a private belief, but whose vigour and passion were deemed so important that its believer sought to spread the good word hither and thither, compelling others to share this vision.
The private god, by his whispers and quiet hand, is rarely seen and rarely acknowledged in public. The evangelical god, because it is a fire, and fire must keep burning to stay alive, sets the world alight. That evangelical god would harness anything to stay lit. If it could sell votes to children with ice cream, it would hijack an ice cream truck.
The evangelical god, therefore, is the ambition of man fueled by his innate fears. It is the tribal religion, the one that builds borders and enforces them. In time, it becomes the religion of do, not think.
Most of us forget that the Abrahamic faiths began as apocalyptic cults. Many of their prophets and believers thought the apocalypse would happen in the near future, possibly even within their lifetimes. And so the message, "To do good and vanquish all evil," is less about creating the ideal society where the good shall be rewarded and the wrong judged (and presumably punished), and more about creating that society as a means to enact the end. When the end does not come, and a new age passes, the core message adapts. Maybe the rightful ruler of all things has yet to manifest. Maybe somewhere out there, there is one last tribe of sinners left unturned.
We must remember too that back in the early days, when angels and prophets walked the earth, the earth was rather smaller for the societies involved. The sea was vast, the human vessel minute. For many, the mountains and deserts surrounding their city state were the only borders they would ever know. A single mighty army could smite a city to smithereens in the space of a day, and that scenario has played out repeatedly throughout our history as the world's borders grew ever wider. What, then, do we do now that philosophies are global, and now that every time an event of apocalyptic proportions has struck a single nation, we have simply... lived on?
I think the human mind is more adaptable than we give it credit for. We are good at making the best of things. If faith is a coping mechanism, then it is only one part of a vaster, richer mind, one capable of working towards consensus with other like minds. The ever-widening borders (the scope of what we can think up next) scare one generation, and the next generation quickly grows out of the fear. The fact that we evolve, and have evolved the reasoning to realise this, means that we, as a species, are really headed towards a season of hope. We are bigger than the shortsighted vision of a homogenous, just society based on the punitive punishment of everyone who is different from us. We are better than anyone who tells us the world is not painted in technicolour; that it is not as diverse, as thoughtful, as chaotic and as random as we think it is.
The world is beautiful because it is random, and in that randomness, we are always in the process of seeking structure.