(This article was one I originally wrote for The American Word)
Feminism seems to be in the news a lot lately; the news focuses on a different celebrity’s opinion about feminism and women’s rights every other day. Within the last week, Carly Fiorina, the only woman in the Republican presidential field, blasted a “progressive view” of feminism, saying that it isn’t working. Teen star Zendaya also was in the news this week for sharing her personal definition of feminism.
But what is feminism? Feminism is defined by Merriam-Webster’s Dictionary as “the theory of the political, economic, and social equality of the sexes.” Essentially, it means that you believe that men and women should have equal rights and opportunities.
The reason this subject continues to get so much press is that, while many celebrities clearly believe in what feminism means, many still shy away from the term.
There is a slew of celebrities who suffer from a fear of the “f-word.” Celebrities like Carrie Underwood have actually acknowledged that part of the reason they do not call themselves feminists is because of the stigma often associated with this word. “I wouldn’t go so far as to say I am a feminist, that can come off as a negative connotation. But I am a strong female,” said Underwood. There are also some stars who are more than a little confused as to what the term really means, like Taylor Swift. “I don’t really think about things as guys versus girls. I never have. I was raised by parents who brought me up to think if you work as hard as guys, you can go far in life.”
At the same time, many celebrities have embraced the term “feminism” and have called themselves feminists. Ellen Page does not understand “why people are so reluctant to say they’re feminists. Maybe some women just don’t care. But how could it be any more obvious that we still live in a patriarchal world when feminism is a bad word?” Others like Lena Dunham have acknowledged that “women saying ‘I’m not a feminist’ is [her] greatest pet peeve.”
This leads to the question: is it important for celebrities to publicly say that they are feminists?
There are two sides to this issue: the first claims that celebrities’ power and tendency to be in the spotlight can be used to further important conversations; the second says that the same power makes the celebrities themselves, rather than the issues, the focus of those conversations.
It is important, though, to ask these celebrities why they are feminists; many people are genuinely interested in creating a productive dialogue about this subject, and the platform that celebrities command allows that conversation to take place. While celebrities won’t be criticized for saying “I’m not a feminist,” they might be criticized for what comes after the word “because.” As Amanda Duberman of the Huffington Post stated, “the ‘what’ is the headline, but the ‘why’ is the teachable moment.”
At the same time, we also have to be aware of the celebrities themselves becoming the central focus of these comments as opposed to what they’re actually saying. Celebrities are increasingly becoming a part of an effort to re-brand feminism, as if there is the perfect combination of words and images that will make the issue of gender equality more appealing to the masses and erase the stigma behind the “f-word.”
The logic is that we are more likely to embrace feminism and feminist messages when they are delivered in the right “package.” This package generally includes youth, a particular kind of beauty, and fame, qualities that most celebrities embody.
In my opinion, this side of feminism is the side that has the most problems. A lot of people are willfully ignorant about anything regarding feminism, including what the word means and what the movement aims to achieve. However, a pretty young woman mentions feminism, and all of a sudden that broad ignorance disappears or is set aside because, at last, we have a more tolerable voice proclaiming the very messages feminism has been trying to impart for so long.
As long as we continue to focus on who the next celebrity feminist might be and how they might package this message, we avoid discussing the real problems and inequalities that women face, not just in the United States, but all across the world.
When we as a society focus on celebrity feminists as the subject of our news and not what they are saying, we are choosing to avoid having the difficult conversations about the pay gap, the all-too-often sexist music we listen to, the limited reproductive freedom women are allowed to exercise, and the continuous sexual harassment and violence too many women face, especially on college campuses. In reality, the only way to bring about the changes that this society will require to one day resolve this issue is by having these important conversations.
I truly think that feminism will ultimately bring us a better world, one that will be better for all people. But in order for this to happen, the topic needs to be discussed freely, without a stigma, and certainly without celebrities’ names being the focus of the conversation.