Archive for June, 2010

So, today an Undecided reader tipped me off to an interesting read over at TheFrisky, entitled “Why Being Called ‘An Opinionated Woman’ Hurts.” Writer Chloe Angyal sets the scene:

Last weekend, I was hanging out with a male friend who I’ll call Stan. Over the course of our convo, he brought up a mutual friend who writes a rather detailed blog about her sex life. Stan was obviously disturbed by the amount of sex she appeared to be having, and the circumstances under which she’s having it. He was so perturbed that, well, the term ‘slut’ may have been thrown around once or twice.

I, of course, objected and a fight ensued. ‘Look, Chloe,’ Stan said. ‘You’re a very opinionated woman…’

I couldn’t help but notice that the tone he used for the words ‘slut’ and ‘opinionated’ sounded exactly the same.

I’m guessing you know what Angyal’s talking about. I certainly do. (Surely I can’t be the only one who’s been kicked under the table by a significant other attempting some sort of panicked Don’t-Say-It-Dear-God-Please-Don’t-Say-It!-driven damage control… ummm, can I?) I’ve never been known for my lack of opinions–or my lack of willingness to express them. And so, I guess it’s logical that I am no stranger to the way being identified as an opinionated woman can sometimes feel. But, you know, being a strongly opinionated woman and all, shouldn’t I be immune to the sting of such a sling? ‘Fraid not, folks. As Angyal points out, there are reasons why being called out as such hurts just a little bit more for the women among us.

But ‘opinionated,’ when applied to a woman, is often code for ‘uppity,’ and ‘unladylike,’ and most of all, for ‘threatening because you don’t seem to be agreeing with me like you’re supposed to.’…

Note that Stan didn’t just call me opinionated. He called me an opinionated woman, as if to suggest that I was somehow unique or unusual for having both XX chromosomes and ideas…

That’s all ‘opinionated’ really is when applied to a woman: A way to shut her down. It’s like calling her ‘ugly’ and hoping that she’ll be so upset by your slight that she’ll shut up and stop disagreeing with you. Sometimes, it works, because no one likes to be told they’re bad at being a woman.

Like calling ourselves feminists, owning our opinions is risky. Putting ourselves out there is akin to voluntarily putting ourselves in front of the proverbial jury–one that will, in all likelihood, not only hand down judgments about the opinions we put out there, but, more insidiously–and more to the point–make judgments about us based on the fact that we’re putting those opinions out there at all. And it’s hard not to be held just a little bit captive to the fear that we’ll be perceived a certain way if we step too far out of the bounds of what’s socially acceptable for a woman. As though to do so somehow makes us less of a woman. And no one wants that.

To shut up or to speak up, that is the question, and either option involves a risk. In speaking up, there’s a risk that someone will think us a little less ladylike, a little improper, a little… opinionated. But in shutting up, I’d venture to say that the risks are steeper. Namely, that every time we opt to keep our opinions to ourselves, we do a little more to convince ourselves that using our voice is dangerous. Which makes it a little bit harder to use it the next time. And the next. And in my not so humble opinion, that’s a whole lot riskier.



Or, as Wavy Gravy put it, we’re all just bozos on the bus, so we might as well sit back and enjoy the ride.

That’s all well and good in theory (and coming from the man who gave brown acid a bad name–and ice cream a good one), but who wants to admit to being a bozo? We have images to uphold here! Whether that of some iconic self–the indie artist with no interest in the mainstream or the serious writer with no time for fashion or the free-spirited adventurer with not a care in the world, or that of the superwoman who has it all–and has it all together. Whatever your role, the performance is remarkably similar. Someone asks how you’re doing; you say fine. You ask her; she says fine. Fine, then! We worry what other people think (though we’d never admit it), and, of course, we want to be happy, confident, competent, and successful. So we pretend we are. And compounding the issue is the fact that the happy, confident, competent, successful self is the self everyone else shows to us, too, which compels us to keep our dirty little secret under even deeper wraps. If she (and she and she) has it together, what the hell is the matter with me??

It’s the open secret Rumi wrote about (and to which Elizabeth Lesser makes beautiful reference here), yet one that, centuries later, we still feel compelled to keep. And that’s understandable. Who wants to admit to being afraid, uncertain, overwhelmed, clumsy, neurotic, or prone to saying the wrong thing? The thing is, though, all of those things are part of the human condition–and those things and the good things aren’t mutually exclusive. And so why should claiming them be a negative? On the contrary: I think there’s a promise of something pretty awesome that comes when we’re able to own it all. The sky doesn’t fall, but, like the curtain hiding the Wizard of Oz, the blinders do.

And then what might we see? Well, for one thing, maybe a willingness to own our complex, dualistic, not always delightful but utterly human nature can make our choices a little bit clearer. With no one to impress, no images to uphold, we’ve got a lot less to factor in. There’s a freedom there. (We can be an indie artist AND a Hills junkie. A serious writer AND a fashion slut. A superwoman who has it all AND is totally overwhelmed.) And power, too: because when we are willing to come out of the I’m Fine! closet, maybe our friends will join us.

In fact, while writing this post (multitasking, of course), I was g-chatting with a friend. “I’m currently writing a post entitled, ‘I’m a Mess; You’re A Mess,'” I typed.

Her reply? “I hope you’re not naming names.”

Well, whaddaya know. And here I thought it was just me on this bus.


Here’s a little good news for anyone who’s getting older (yeah, Peter Pan, that includes you): a recent Gallup poll has found that as people get older, they get happier.

You read that right. And I know, such a finding runs counter to the results of countless other studies and anecdotal asides. Not to mention the stereotypes of lonely spinsters, desperate housewives, cosmetically-enhanced cougars, and grumpy old men who have nothing better to do than lament the passing of the “good old days” and talk about the state of their prostate.

Researchers are equally baffled, according to a piece in the New York Times:

‘It could be that there are environmental changes,’ said Arthur A. Stone, the lead author of a new study based on the survey, ‘or it could be psychological changes about the way we view the world, or it could even be biological–for example brain chemistry or endocrine changes.’

The survey involved more than 340,000 people nationwide, ages 18-85, and basically found that people are pretty happy around age 18, but then get less and less so, until about the age of 50, at which point,

there is a sharp reversal.

And then, as everything else begins to sag, happiness starts to climb. Really. Equally surprising is the finding that happiness levels had very little correlation to any of the life biggies you’d think might affect our emotional state: the results held regardless of sex, relationship status, employment status, and whether or not the respondent had children.

‘Those are four reasonable candidates,’ Dr. Stone said, ‘but they don’t make much difference.’

So what is going on, then? Well, our pal Barry Schwartz, he of The Paradox of Choice, has a theory. When we talked to him recently for the book, Schwartz posited that what’s happening here has a lot to do with expectations, choices, and the freedom that comes when we’re able to let go of the notion that, because there are so many options out there, there must be one that’s Perfect-with-a-Capital-P… and that it’s our job to find it. Here’s some of what Schwartz had to say:

I think the fact is that you need to learn from experience that good enough is almost always good enough. It seems like settling, as you put it. Why would anyone settle?… And this is something that’s been coming up in the last few weeks, starting at age 50, people get happier. And I think a significant reason why is what you learn from experience is exactly that good enough is good enough, and once you learn that, you stop torturing yourself looking for the best and life gets a lot simpler. And I think it’s very difficult to convince a twenty year-old that that’s the way to go through life.

Difficult if not flat-out impossible. But I guess there’s a silver lining to this self-induced suffering, this lesson that only experience can teach us: We may never find perfection, but we will surely get older.

Not the best news, I realize, but certainly, it’s good enough.



Weren’t we all?

I came across that line Wednesday in a piece by Maureen Dowd, who quoted Michelle Obama as saying that her husband had spent so much time alone growing up that it was as if he had been raised by wolves.

Love that phrase, don’t you?

Think about it and you realize that, in a twisted kind of way, we’ve all been raised by wolves. As women in this new millennium, most of us are going it alone right now, figuring out how to navigate new and unfamiliar turf, without really knowing the rules once we leave the woods.

Growing pains? You bet. And you see them everywhere you look, in a variety of flavors. Here’s just a taste. In a piece in The Nation on the upcoming confirmation of Elena Kagan, Patricia J. Williams predicts that Kagan’s success as a lawyer will be characterized as “unwomanly” because, of course, success in such fields is equated with testosterone. She reminds us both how far we’ve come — and how far we’ve yet to go, noting that gender stereotyping is sometimes embedded in the language:

Forty years after the birth of modern feminism, we are still not able to think about women who attain certain kinds of professional success as normatively gendered. Officially, the English language does not have gendered nouns. Yet it seems that we do invest certain words with gendered exclusivity—nurse, fireman, CEO, lawyer—if only as a matter of general parlance. There’s a story that used to be ubiquitous about thirty years ago: a father rushes his son to the hospital after a bicycle accident. The boy is whisked into Emergency and ends up on the operating table. The surgeon looks down at the boy and gasps, “Oh, my God! This is my son!” The story would end with the question, “How is that possible?” Much puzzlement would ensue until the “Aha!” moment: the surgeon was the boy’s mother. In that era, the likelihood of a surgeon being female was so negligible that divining the answer became a kind of “test” of radical feminist sensibility.

Then there’s this: Vivia Chen’s piece from Legalweek.com, which reminds us how much of our lives are caught up in trying to navigate that odious term called work-life balance. She reports on an interview with Harvard Law School grad Angie Kim, whose sprint up the corporate ladder took a five-year detour when her second child became sick with an undiagnosed illness. A few months back, Kim did some research and found that the majority of the women in her law school class had left the fast track. But the interesting thing (another sign of shifting terrain?) is what she told Chen:

“The ‘mommy track’ was renounced at birth for sanctioning boring flextime jobs with low plaster ceilings. But some of my not-fast-track classmates are using their clout and influence to create prestigious roles. A senior partner who brought many clients to her law firm, for example, now works 15 to 40 hours per week, mainly out of her home and on her own schedule… The author of a best-selling book on negotiations launched her own conflict resolution firm with about 15 lawyers and consultants. She works from home during school hours and after bedtime and takes July and August off.”

Kim argues that “the line between the fast track and the mommy track is blurring,” and that flexibility “is infiltrating more and more jobs and replacing traditional work values – long hours, face time – as the new workplace ideal.”

Positive signs? Could be, especially when you consider that as our workplace numbers rise — and with them our economic clout — we girls are in a better position to push for changes that work for us. Let’s look at Hanna Rosin’s piece in The Atlantic entitled “The End of Men.”

What would a society in which women are on top look like? We already have an inkling. This is the first time that the cohort of Americans ages 30 to 44 has more college-educated women than college-educated men, and the effects are upsetting the traditional Cleaver-family dynamics. In 1970, women contributed 2 to 6 percent of the family income. Now the typical working wife brings home 42.2 percent, and four in 10 mothers—many of them single mothers—are the primary breadwinners in their families. The whole question of whether mothers should work is moot, argues Heather Boushey of the Center for American Progress, “because they just do. This idealized family—he works, she stays home—hardly exists anymore.”

The terms of marriage have changed radically since 1970. Typically, women’s income has been the main factor in determining whether a family moves up the class ladder or stays stagnant. And increasing numbers of women—unable to find men with a similar income and education—are forgoing marriage altogether. In 1970, 84 percent of women ages 30 to 44 were married; now 60 percent are. In 2007, among American women without a high-school diploma, 43 percent were married. And yet, for all the hand-wringing over the lonely spinster, the real loser in society—the only one to have made just slight financial gains since the 1970s—is the single man, whether poor or rich, college-educated or not. Hens rejoice; it’s the bachelor party that’s over.

Rosin doesn’t mention things like the wage gap or pervasive gender stereotyping (see above) that effectively quashes our numbers right now. But she does make an important point: if higher education is the “gateway to economic success” as well as a prereq for life in the middle class, clearly women in the not-too-distant future are going to be calling their own shots.

What those shots might be, however, is what’s so hard to figure out. In “Doing Grown-up Wrong” on siren.com, Allison Hantschel asks “what we do when we don’t have what the Joneses have and, worse, don’t even want it?” What she knows she doesn’t want: a big house in the country, a bunch of kids, a climb up the corporate ladder. What she does want? That, she doesn’t quite get.

Which brings us back to the wolves. We’ve been raised in one world and suddenly we find ourselves in another, roadmap not included. What now? Insert howl here.


… and according to a recent New York Times piece (that, as fate would have it, ran on Friday, a big birthday for yours truly; big enough to officially bump me from one age range box to the next, in fact) neither do you. Surely by now you’ve heard the phrase “extended adolescence”. And whether you take pride in or offense at the suggestion that you and Peter Pan have much in common, the fact is, according to Frank F. Furstenberg, who leads the MacArthur Foundation Research Network on Transitions to Adulthood:

people between 20 and 34 are taking longer to finish their educations, establish themselves in careers, marry, have children, and become financially independent.

I’m guessing you knew that already. And this too:

“A new period of life is emerging in which young people are no longer adolescents but not yet adults,” Mr. Furstenberg said.

National surveys reveal that an overwhelming majority of Americans, including younger adults, agree that between 20 and 22, people should be finished with school, working and living on their own. But in practice many people in their 20s and early 30s have not yet reached these traditional milestones.

Marriage and parenthood — once seen as prerequisites for adulthood — are now viewed more as lifestyle choices.

The stretched-out walk to independence is rooted in social and economic shifts that started in the 1970s, including a change from a manufacturing to a service-based economy that sent many more people to college, and the women’s movement, which opened up educational and professional opportunities.

Women account for more than half of college students and nearly half of the work force, which in turn has delayed motherhood and marriage.

You get the drift. And I’m guessing you don’t need the New York Times to spell it out for you. Because, more than likely, in one way or another, it is you. We live in a world of wildly expanded opportunities, an all-you-can-eat buffet where everything looks too damn tasty to miss out on. And women in particular have absorbed the message that to be at this buffet at all is a lucky opportunity — so of course we want to get our money’s worth. To try (it ALL) before we buy. It’s a great big world out there, and there’s no MapQuest to tap for directions — we have to figure out our path as we go. And we kinda want to do a little scouting around before we commit to one path, and forgo all the others.

To me, the most interesting question the article brings up is this: financial independence is one thing, but as for the rest of it — marriage, parenthood, and one single Career — is making such commitments all there is to being an adult? Is signing on to something — one thing — forever and ever the only thing that can ferry you over the threshold, out of NeverNeverLand and into GrownUpDom?

Maybe I’m just a product of my times, but I don’t think so. I tend to think of a grown up as someone who makes her own decisions and takes responsibility for where they lead her. And doesn’t expect every one to be right — and doesn’t expect that there’s a right answer to every one. Even — no, especially — if they lead her to dead ends, forcing her to back up and start all over again in the search for a truer fit, head held high over the nagging chorus of Why Doesn’t She Just Grow Ups that surrounds her. Even if her decisions never lead her to a mortgage, or a job that she’ll stay in until she retires, or a promise to stick with one partner til death does she part. Even, in fact, if they lead her out the window in her jammies, following a guy in green tights.



You bet your mortarboard. Stick with us, you’ll find out why.

But first, backstory: Last month, New York Times writer David Leonhardt slapped the debate about the value of our American college-for-all ethos smack-dab on our collective kitchen table. Ever since, knickers have been in a bundle all across the interwebs as readers, reporters, students, parents and, yep, even faculty members have all weighed in with, if not your basic my-way-or-the-highway answers, at least impassioned questions:

Does higher education matter? Is college worth the bucks — or the lifetime of debt? Why are graduation rates falling? And the big Kahuna: should college teach us how to think or teach us how to do?

The Times’ Jacques Steinberg weighed in a few weeks later with a piece of his own whose title said it all: Plan B: Skip College. That piece cranked up the blogosphere for days, generating a lot of, well, polite debate, much of which centered on a pretty central question that I might frame thus:

Who the hell cares about a classic liberal arts education when what we really need are techno-geeks?

The answer, I’m pleased to say, is lots of us. One comment to Steinberg’s piece was especially elegant:

Does “intensive, short-term vocational and career training” teach you to write well? Or to communicate effectively, think critically, view the world in general from a perspective other than your own? Aren’t these a few of the mandates of higher education? The argument of the scientists in this piece too narrowly focuses the situation into “student+college=career.”

A college education is supposed to prepare a student not just for a career but for an educated and informed life. Now, perhaps these goals are far too often not attained. Yes, that’s an issue that needs to be addressed. But as is typical, there is room here for a compromise: well-rounded, career-based curricula infused with a healthy dose of liberal arts, science, and math courses.

Do mail carriers need a BS degree to place envelopes in a slot? Of course not. But you’d hope that there is more to their existence than this simple task.

Even more important, though, and what goes to the heart of what we’ve been talking about in this space, is this: What if said postal worker decides that, yeah, walking around delivering the mail is cool and all that, but what I really want to do is (fill in the blank)?

Back in April, we ourselves weighed in on how that very issue — locking yourself into Plan A before you’ve even tasted Plan B — impacts young, bright women carving out uncharted territory in the new millennium:

Is it all about the treadmill? Possibly so. Which makes me wonder if this is another way in which great expectations do women in. When you’ve felt the pressure of unlimited options ever since Career Girl Barbie or whatever-her-name-was first peeked out from under the Christmas tree, do you feel the roar of indecision, the fight between the red one or the blue one, early on — and shut it down by choosing a path too soon? And then sticking with it.

That’s certainly one way to kill the angst. Whew.

But maybe that’s why choices are so loaded, too, because they become so narrowly focused — and by definition, do not include a back-up plan. Failure, that great teacher, is not an option. (Nor, for that matter, is the broad-based education. Classics, anyone?) And then, out into the real world, when that first job is more about fetching lattes than writing business plans, there’s that thing called regret.

Grass-is-greener syndrome suddenly comes calling and kicks the best and the brightest right there in the ass. That race? Yeah, not in first place anymore.

And then what? Well, for the last word on the subject — following in the wake of pieces in the New York Times by David Brooks and Stanley Fish — let’s read what Michael Roth, the president of Wesleyan University, had to say via a post on Wednesday’s HuffPo. Watch for the boldface:

It is certainly understandable that in these uncertain economic times families are more concerned than ever with the kind of education their students will receive. That’s why it’s so important to understand the deep, contemporary practicality of a liberal education. Patient and persistent critical inquiry has never been more crucial, and the development of this capacity is one of the defining features of a liberal education. One learns that successful inquiry is rigorous and innovative, and that one must be able to re-evaluate one’s own practices and prejudices. Real inquiry is pragmatic, and it is also reflexive — it includes rigorous self-examination. Given the pace of technological and social change, it no longer makes sense to devote four years of higher education entirely to specific skills. By learning how to learn, one makes one’s education last a lifetime. What could be more practical? Post secondary education, I am fond of telling the undergrads at Wesleyan, should help students to discover what they love to do, and to get better at it. They should develop the ability to continue learning so that they become agents of change — not victims of it.

Learning how to learn? An education that lasts a lifetime? Considerable bang for the buck, don’t ya think? Those thick books? Twenty-page papers? Calculus, for the love of God?

Think trip, not destination. Discovery. Prepping yourself for the road not taken. In other words: Happy Graduation!


So. Remember that old anti-drug television commercial that shouted out: Here’s your brain on drugs — then showed an egg sunny-side up, sizzling in a frying pan?

Well, these days, the sizzler is the internet, as in uber-connection. And the result is less like fried eggs than a scramble, according to a piece in the Sunday New York Times, which said:

Scientists say juggling e-mail, phone calls and other incoming information can change how people think and behave. They say our ability to focus is being undermined by bursts of information.

These play to a primitive impulse to respond to immediate opportunities and threats. The stimulation provokes excitement — a dopamine squirt — that researchers say can be addictive. In its absence, people feel bored.

The resulting distractions can have deadly consequences, as when cellphone-wielding drivers and train engineers cause wrecks. And for millions of people like Mr. Campbell, these urges can inflict nicks and cuts on creativity and deep thought, interrupting work and family life.

While many people say multitasking makes them more productive, research shows otherwise. Heavy multitaskers actually have more trouble focusing and shutting out irrelevant information, scientists say, and they experience more stress.

No shit. All of this dependence on short-term bursts of information screws with our focus and our ability to process in the long term. Hello: decision making? Actual conversation? (Is that a cell phone in your pocket or are you just bored to see me?) Some folks call it acquired ADHD. All of which brings to mind one of our posts from last November (hence the Thanksgiving reference), which made the same point. So, this being the summer rerun season, we thought we’d replay it here:

There is a point here, I promise. But first, here’s the scene. My desk, at work. A wobbly stack of books, papers and files, some dating back to last spring. A to-do list, also written last spring. On the other side of my mousepad, a pile of resumes for the letters of rec I need to write. On my computer, some 200 emails that at least have to be opened.

Plus the steady buzz of folks, either in the hall, or in my office. Kinda like a roving cocktail party, but without the booze. This is not necessarily a good thing. The latter, I mean.

My home office, not much better. At least 100 unread emails. My desk is cleaner — today — but you still never know what you’ll find. A friend once described my work-at-home digs as a junk drawer. At times, the description is apt.

On Tuesday I got up early, graded papers, scanned two newspapers, got ready for school, found and paid my Macy’s bill while my Cheerios got soggy, blew out the door and off to work, taught some classes, and met with a bunch of students who have the end-of-quarter heebie-jeebies. (They’re contagious).

Last week, we hosted a party to celebrate a friend’s engagement. Next week is Thanksgiving (Yikes! I forgot to order the turkey). It’s my husband’s and son-in-law’s birthdays. Shannon and I are knee-deep in writing this book. And this blog. My hair is stringy and I’m low on clean clothes. So here I am.

Don’t get me wrong. I fully realize that those balls I’ve got in the air mark me as a lucky woman. Nonetheless, I’m somewhat breathless just itemizing all this. I’m frazzled. Distracted. And probably like you, just a little bit crazed: Too much going on, going on all at once.

Maybe it was ever so. But now, add this. The San Francisco Chronicle has reported on some new studies on the way that techno-stimulation — texts, tweets, IMs, Facebook, news alerts, the list goes on — has led to a new form of attention deficit disorder. We’re always on. Uber-connected. Addicted to short bursts of constant information. And despite our best intentions, we get sucked in. All of which, experts say, impacts our ability to analyze. From the story:

“The more we become used to just sound bites and tweets,” [Dr. Elias Aboujaoude, director of the Impulse Control Disorders Clinic at Stanford University] said, “the less patient we will be with more complex, more meaningful information. And I do think we might lose the ability to analyze things with any depth and nuance. Like any skill, if you don’t use it, you lose it.”

Dr. John Ratey, an associate clinical professor of psychiatry at Harvard Medical School, uses the term “acquired attention deficit disorder” to describe the way technology is rewiring the modern brain.

I don’t know about you, but I really don’t need to know what Suzy from Ohio is doing every five minutes. And yet. There’s the seduction of the buzz, the flash. She has me at beep-beep.

Which brings me belatedly to my point: Is all this stuff, this stimulation, this juggling, cluttering up our already cluttered brains to the point where we are not only overwhelmed — but chronically undecided?

The science suggests the answer is yes. Shannon wrote earlier on our blog about the Paradox of Choice, about how the more choices that confront us, the less likely we are to make one — or to be happy with it when we do. There’s the iconic jam study, where shoppers confronted with 24 jars of jam — versus just six — walked away empty-handed. And the pivotal Magical Number Seven study, which dates back to the 1950s and found that the human brain has trouble processing more than seven items at a time. That study was the basis for similar research in 1999 by Stanford marketing professor Baba Shiv, then an assistant professor at the University of Iowa. He sent two groups off to memorize a series of numbers. One group had to memorize three. The other, seven. At the end of the task, the groups were given their choice of a treat: gooey chocolate cake or fruit salad. The three-digit group overwhelmingly chose fruit. The seven-digit group — cake. The point? Overwhelmed with the memory task, the rational brain of the seven-digit folks begged off and let the emotional side take over.

Shannon wrote recently about Zen and the art of multi-tasking, where, really, what we need to do when we drink tea is to just drink tea. I wrote about the need to just play cards. Put all of this together and I think you’ll find that maybe, for our own mental health, not to mention our ability to make decisions, we need to turn down the chatter.

Sixties guru Timothy Leary (he of LSD fame) once exhorted the youth of the day to “turn on, tune in, drop out.” I’m thinking it’s time to flip the switch: Turn off, tune out, drop in.

But wait. Did that make the slightest bit of sense? Not sure. I’m off to find some chocolate cake.


If it sounds like the above could be the title of a horror flick, well, you’re not far off. This Sunday, in our local paper, I came across the following bit of clever repartee between Mick LaSalle, our often irreverent film critic, and a loyal reader, and was suddenly loaded for bear.

I feel obliged to point out that the column was brought to my attention by my husband, who enjoys a good rant as much as any of us. Stay tuned, but first check this:

Dear Mick LaSalle: I just saw “Aberdeen” (2000), featuring an actress new to me: Lena Headey. I looked up what else she has done, only to find that since “Aberdeen” she has made, for the most part, a series of second-rate horror flicks. What happens in a case like this? Poor management? A really bad agent? Blacklisting? Frank Flynn, Eureka

Dear Frank Flynn: No, it’s worse: two X chromosomes. Welcome to my world, Frank. Every year, I see actresses do great work in films and then disappear. In another generation, a studio would have nurtured them, and in other countries, filmmakers would build films around their talents. Not in the English-speaking world. Even established stars, such as Naomi Watts, Halle Berry and Ashley Judd, can go five years without getting a role worthy of their talents. In another country, they’d have two or three strong roles a year. What’s Catherine McCormack doing these days? Or Claire Forlani, Chad Morgan, Natasha McElhone, N’Bushe Wright, Bai Ling, Natasha Gregson Wagner or Alison Elliott? All of them have shown exceptional ability or charm or both onscreen, working in major films. All are still working, but much of that work is under the radar. Headey is doing better than most, in that she starred in a major action movie (“300”). Basically, women in Hollywood need to look convincing swinging a mace – and attractive with bloody fangs. Then they’ll never starve.

Okay. I have never seen Aberdeen, and I confess I don’t know Lena Headey. (Wait. Did I just make LaSalle’s point?) But Naomi Watts, Halle Berry and Ashley Judd? Ready to be put out to pasture as either deranged ex-wives or district attorneys? (Note what’s happened to former sexpot Sharon Stone on Law & Order: Special Victims Unit.)

For years we have decried the fact that the fat guy always gets the cute girl in the movies. We have for years ranted: about the schlubby guys on TV who have the slim trim wives; about the loser guys who end up with, you know, Katherine Heigl; about the sweet young things who are wooed by the guys old enough to be their grandpas. (That movie with Sean Connery and Catherine Zeta-Jones as the love interests? Stop me before I poke myself in the eye with a sharp stick.)

What we want to know first is why we pay money to watch this junk. Unless I’m living in an alternate universe, it’s not believable. Or very entertaining, either. Last I checked, most sane women are not pining after some pudgy dude with a receding hairline and a bad choice of pants. Right?

But the real question is why this stuff gets made, and why women — at least as far as American media are concerned — are considered washed up by the time they get the first intimations of crow’s feet. Yeah, yeah, we know: Meryl Streep and Helen Mirren are still star quality and hooray for them. In every possible way. But are they the exceptions that prove the rule?

I think the answer may have something to do with gender parity, and here’s what we journalist types would call the nutgraf — somewhat buried, in this case — or the big picture stuff. In 2009, the Hollywood Writers Report by the Writers Guild of America, West (WGAW), found that women and minorities had not made any significant hiring gains since 2005, with women writers making up roughly one quarter of the field. Repeat: one quarter. The report states:

“Women, who account for slightly more than 50 percent of the U.S. population, remain underrepresented in television employment by 2 to 1 and in film employment by nearly 3 to 1. Their salaries, too, show a discrepancy: white men, $98,875, versus women, $57,151, for a whopping wage gap of $41,724.”

Are you kidding me? Read it again. Is it any wonder that we’re made to believe that the old guy gets the girl? Of course, that’s just the movies. Hollywood fantasies. But look at the damage those ridiculous media images have done to women’s self-image. Our conception of ourselves. Ugh, right?

But now, let’s use movies as metaphor: What happens when women are relegated to one quarter of other segments of our society — like government, boardrooms, the offices down the hall where policy is made? Think about it.

As in movies, so in life. And ain’t that the curse of the double-X.


Guess who’s calling herself a feminist? I’ll give you a hint: she doesn’t read much but cooks a mean moose chili, and while she isn’t a big fan of hopey changey stuff, she has been known to engage in such enlightened chants as “Drill, baby, drill.” (Though she’s been conspicuously quiet on that subject as of late.) Oh, and she’s super-mavericky, too.

Yeah, her. Sarah Palin’s taken to calling herself a feminist (never mind the fact that, during the 2008 presidential campaign, she told Katie Couric that, no, she was not a feminist)–and many other self-described feminists are none too thrilled about it.

And, frankly, I’m terrifically torn. Yes, a part of me believes that part of the reason young women are so reluctant to call themselves feminists is that, at times, the movement has seemed exclusionary. Elitist. Historically, such charges aren’t entirely unfounded. I have even argued that, to my mind, feminism can be boiled down to the simple idea that women are people.

So why aren’t I jumping up and down, welcoming a woman from the other side of the aisle to the party? Well, just being a woman isn’t enough. And being a woman who’s championed causes antithetical to the interests of women as a whole certainly seems like an adequate deal-breaker, doesn’t it?

The ultimate irony may be in the event itself, the one at which Palin dropped the F-bomb (upwards of ten times): It was a speech given to the Susan B. Anthony List, an anti-choice group. During the speech, she said the suffragettes were the real feminists (disregarding all that’s come since–you know, like the women who fought to pry open the doors through which Palin walked to get where she is today)–and that they were pro-life. She went on to disparage pro-choice feminists, suggesting that, by championing a woman’s right to choose, they’re really saying they just don’t believe women can handle motherhood and work, and

send this message, that ‘Nope, you’re not capable of doing both. You can’t give your child life and still pursue career and education. You’re not strong enough; you’re not capable.’ So it’s very hypocritical.

Implicit in such a statement is, of course, the idea that a woman who’s capable of doing both will have the benefit of enough support from the social structures around her to make it possible to do both–an argument that’s tough to make, given, you know, the ERA that was never passed, the fact that we’re still underpaid and underrepresented, not to mention the issues of inadequate, unaffordable child care and–until recently–health care (reform of which Palin feverishly campaigned against).

But back to Palin and her F-bombs. Jessica Valenti, who made a compelling argument that Palin’s feminism is not feminism at all, but rather disingenuous pandering for women’s votes come midterm time, lays it out thus:

A related strategy for Palin and fellow conservatives is to paint actual feminists as condescending hypocrites who simply don’t believe in young women… Palin’s “pro-woman sisterhood,” however, “is telling these young women that they’re strong enough and smart enough, they are capable to be able to handle an unintended pregnancy and still be able to… handle that [and] give that child life.” (Unless of course, these young women were unlucky enough to live in Alaska when then-Gov. Palin cut funding for an Anchorage shelter for teenage moms.)

Ahem, who you callin’ a hypocrite, Sarah?

But then, just when you’re ready to banish her from the kingdom forever, there’s this, from Meghan Daum at the L.A. Times.

The word in question, of course, is “feminist.” It may be the most polarizing label on the sociopolitical stage (it makes “environmentalist” or even “gay-rights advocate” seem downright banal), but Palin seems to have stopped dancing around it and finally claimed it as her partner. Granted, this is a conditional relationship; there’s a qualifier here as big as Alaska…

Now, there are a lot of ways in which [Palin’s] logic is contorted, not least of all the suggestion that supporting the right to choose represents a no-confidence vote for the idea of mothers leading fulfilling professional and personal lives. But putting that aside, I feel a duty (a feminist duty, in fact) to say this about Palin’s declaration: If she has the guts to call herself a feminist, then she’s entitled to be accepted as one.

Now, while a part of me agrees with Daum’s perspective, another part of me agrees with the in-your-face take offered by Kate Harding, who wrote on Jezebel that:

The problem is, words mean things. I could start calling myself a red meat conservative, or campaign for those of us who are against the death penalty to “reclaim” the term “pro-life,” but at some point, the relationship between your beliefs and your choice of words either passes the sniff test or it doesn’t. And someone who actively seeks to restrict women’s freedom calling herself a feminist is, not to put too fine a point on it, a liar. There’s a difference between a big tent and no boundaries whatsoever; if Palin’s “entitled to be accepted” as a feminist just because she says she’s one, then the word is completely meaningless–as opposed to merely vague and controversial. And I might just start calling myself a “right-winger” because I’m right-handed, or a “fundamentalist” because I believe everyone deserves a solid primary education, or a “birther” because I once hosted a baby shower.

Is she or isn’t she?? More troubling than all of this, to me, though, is this: it seems that what’s really happening here is that feminism is again being reduced to an issue of reproductive rights. Don’t get me wrong: I happen to believe a woman’s right to choose is critically, critically important–and while I find it nothing short of ludicrous to call an anti-choice argument “feminist,” something (else) about this entire debate rubs me wrong (and not just the high-school-clique-ish nature of the whole does she belong or doesn’t she question). I just think that every time we frame feminism as about abortion rights and nothing more, we take the focus off of what it’s really all about. And that is, of course, that women are people–people who deserve equal access, representation, freedom, pay, and support from all aspects of the social structures that circumscribe their lives. And while feminism may indeed be nothing more than the radical notion that women are people, a feminist is someone who puts her money (or her votes) where her rhetoric is. So, Sarah, call yourself whatever you want. But talk is cheap–and your record speaks for itself.

