By Vivian Wilson

Opinion: Astrology’s hidden wisdom

While astrology is often met with a degree of snark for its lack of scientific backing, young people are flocking toward it now more than ever. Astrology has a certain staying power.

It may have lost its footing as a science, but astrology hasn’t fallen by the wayside. We know now that there isn’t scientific backing behind it, yet many of us still read our horoscopes and dread Mercury retrogrades.

Astrology holds importance and was crucial to the origins of our understanding of the universe. Part of the development of geometry is credited to astrological calculations made by ancient Babylonians. Galileo and Kepler were expert astrologers as well as significant contributors to the field of astronomy.

It’s proven to be somewhat influential as a tool for prediction and pattern recognition in recent history. President Ronald Reagan would use the writings of astrologer Joan Quigley to plan his schedules. Blending astrology and economics, using astrological transits as a predictive tool for investing in the stock market, is also gaining traction.

Rachel Dominguez, tarot reader and owner of Third Eye Psychic Salon in Minneapolis, said despite criticism, astrology and other readings draw in a variety of clientele. 

“People will still seek it out,” Dominguez said. “I’ll still have scientists, atheists, come and get a reading. I will still have religious fundamentalists and conservative Christians and Muslims come get readings because we offer at least the possibility of certainty. People want to know. People want to be certain about things.”

Maybe this goes back to the intuitive wisdom that astrology provides. Perhaps not through scientific reason or empirical evidence, but as a tradition used to understand our universe for much of human history. Astrology is by no means scientific. However, scientific and useful are not interchangeable. 

To believe in something unprovable by scientific measures is not anything new or unheard of, either. 

Astrology is a source of faith and healing for many individuals. It plays a major part in Hindu culture, where astrologers are often consulted to aid with major life events, such as marriages or financial decisions. 

Astrology operates somewhere between science and spiritualism. Its core principles rest on the idea that our natural surroundings and universe influence our behavior.

It’s not based on pure fact but relies upon the transits and principles of the stars in our own universe, which we know to exist. It’s not based upon pure belief in something unprovable, but belief in certain patterns outside our physical world that may influence our behavior and predispositions.

Surena Singh, a first-year student at the University of Minnesota, said she believes astrology affects her mood at least to a certain degree, and that planetary transits can affect people’s dispositions.

“One day you’re feeling fine,” Singh said. “Then halfway through the day, it’s like your whole mood is changing. You just want to tune everything out. What is the reason for all of this?”

Astrology combines nature and nurture by providing us with an unconventionally holistic view of ourselves.

Birth charts go deeper than just the date of birth, with birth time and place taken into account to create a larger picture of how we interact with our environment. A birth chart can reveal aspects of our personality that are not defined by, but are affected by, the universal events that took place when we were born.

It provides us with a snapshot of the circumstances under which we entered this life that we otherwise wouldn’t think to consider. Contrary to popular belief, it doesn’t have to rely on reductive stereotypes such as “Tauruses are lazy” or “Geminis are two-faced.” It can offer incredible insight into one’s personality when evaluated on a larger scale.

Dominguez said good astrologers use birth charts to explore a person on a deeper level and begin an honest exploration of someone’s tendencies in life and love. 

“A good astrologer is going to look at your chart and say, ‘All right, here’s the parts of your chart that might get in the way of a long-term relationship,’” Dominguez said. “Or maybe I look at your chart and (your partner’s) chart and I see some incompatibilities or some potential conflicts that, if certain situations might arise, it might be really challenging and stressful to your relationship.”

There’s a certain beauty in having a complete stranger read your entire life.

As I mentioned earlier, astrologers can act as sources of catharsis and healing for many. Dominguez said astrology can be a helpful source outside of conventional religious or psychological methods.

“There is a quasi-confessional aspect, I think, too,” Dominguez said. “I think people come in and they want validation for the things that their free will or their brain or their heart has told them that they have to do that might make them feel guilty in some way. Whether some religion says it’s a sin, or some person got their feelings hurt or whatever.”

Astrology gives us greater insight into our connections with the universe, whether it be to answer questions about human nature, predict the stock market, or receive some sort of answer for why we are the way we are.

It’s no wonder we tend to look toward universal phenomena and patterns to explain and predict our uncertain world. 

Of course, astrology is in no way a substitute for mental health treatment. I’m advocating for its potential as a springboard toward the type of thought that gives a person the agency to get to the root of their problems. 

Dominguez said astrology can be a helpful tool toward self-actualization. 

“There’s a reason that you’re here,” Dominguez said. “There’s a reason those stars are looking down on you. And so we just try to figure that out, help people. Not even say, here’s your life’s purpose. It’s like, let’s try to read you in a way that clicks something in your head that becomes a catalyst for you to really figure that out for yourself and discover that.”

Maybe astrology’s continued legacy of cultural relevance provides us with some reassurance. That, for the problems we may not be able to understand, there may be an answer somewhere in our physical reality, as opposed to beyond the curtain, or outside of this realm. 

Maybe the answers to questions we can’t quite pose lie just outside the boundaries of this world or were within us all along.


Opinion: Why do we fall for partisan politics?

Our current system of partisan politics is a sick joke at the expense of the American people, and it’s gone on for far too long.

Politics are not meant to be amusing, and our conflation of entertainment and policy has resulted in a jumbled, black-and-white political landscape that is equal parts false advertising and fluff.

We have more in common than we’re led to believe, and our media landscape exacerbates the division that gets in the way of true bipartisan negotiation and progress. We’ve been warned about this from the very beginning.

There’s been a lot of speculation and fanfare claiming that Republicans and Democrats have become irreparably disparate. Reports from Brown University, NBC and Pew Research indicate a growing hostility between Americans along party lines.

Not everything is as it seems here, though.

The popular narrative around how polarized the Democratic and Republican parties have become is only half the story. Our harsh divisions along party lines reflect only our hatred of the parties themselves; in many ways, the American consciousness has remained steadily centrist.

In general, voters across party lines tend to hold similar opinions on many issues. Even the party figureheads tend to concede to similar positions. We saw it with jobs, tax cuts and immigration this past election.

A Gallup poll of voters ranked the economy as their most important issue, and Howard Lavine, professor of political science at the University of Minnesota, said voters have stayed consistent on economic issues for the most part.

“I will say that the mass public has not become more extreme in its views on at least economic issues,” Lavine said. “Over the past 40 years, most people are still centrist, so we haven’t seen a move out to the extremes. The Democrats are a little bit more to the left, the Republicans are a little bit more to the right because mostly they’re just following the party elites. But they, at least on issues, are not that polarized.”

According to the Political Compass, which aims to measure individuals based on their political ideologies regarding social and economic issues, all four of the main presidential and vice presidential candidates in this election, both Republican and Democrat, were placed in the upper-right quadrant.

Vice President Kamala Harris and President-elect Donald Trump are more disparate in ideology than in the previous election, with the difference between Trump and President Joe Biden being almost insignificant as modeled by the compass. However, the gap between Trump and Harris doesn’t seem to be much more extreme than it was between former presidential candidate Hillary Clinton and Trump in 2016. Still, all of these examples remain within the upper-right quadrant.

These charts don’t indicate a consistent trend of polarization, especially given that they are all still within the same quadrant.

Maybe we can all share a small sigh of relief, given that we have more in common than we originally thought, and our differences are in fact not irreparable.

It seems as if partisan politics in the U.S. tends to encourage and foster division. As we’ve made minor shifts here and there, our perceptions of those on the other side of the ticket have grown more and more distorted.

We’ve been warned against this from the very beginning.

George Washington’s famous farewell address asserted that factions, or as we now interpret them, political parties, turn politics into a dividing force instead of a unifying one, which runs counter to the mission of democracy.

Washington specifically said political parties only serve as vehicles to fragment the nation.

“Unprincipled men will be enabled to subvert the power of the people and to usurp for themselves the reins of government,” Washington said. “Destroying afterwards the very engines which have lifted them to unjust dominion.”

This sentiment rings true today.

Politics have become more gamified and rooted in entertainment than ever before. We have a 24-hour news cycle and social media platforms that popularize short-form content without nuance. Everything needs to grab our attention and be digestible within a short time frame. There’s no time for clarity of argument or acknowledgment of the other side in a 30-second clip.

Alma Quiroz, a first-year University student, said she believes the media can make division worse.

“The media has definitely made it more polarizing,” Quiroz said. “It’s becoming more about (a politician’s) political party rather than the things that they’re going to bring into office.”

Lavine said a large component behind the increased polarity between Democrats and Republicans is social media’s interest-based algorithms.

“Social media stories are curated by algorithms that let you see more of what you seem to want to see, which is more and more one-sided coverage of politics,” Lavine said. “I think young people in particular, are particularly polarized given the kinds of media that they pay attention to.”

We are incentivized to see only the validity of our own concerns and often don’t even think about the common or middle ground between us and our political opponents.

We need to embrace politics for all of its nuance and pragmatism, not shy away from it. Black-and-white thinking is hurting all of us, and we don’t need to grow so detached from an entire other half of the population based upon a few issues, given how much we actually have in common.

By continuing to reduce the other side, we too are complicit in harming our democratic process. It may be hard, but we should all try to separate ourselves from reductionist narratives and stereotypes based on our political leanings, and see what we can do to amend the polarization in our country.

It starts with us.


Opinion: All of our brains are rotted

I’m a frequent consumer of the internet, and it’s gotten to the point where I encounter something referential to Skibidi Toilet, Sigmas or, most recently, influencer-turned-boxer KSI’s new song “Thick of It” every time I open my phone.

This so-called “brain rot” content has taken the internet by storm. The Skibidi Toilet series collected 65 billion views on YouTube in 2023, and KSI’s infamous song reached number 64 on the Billboard Hot 100.

The title “brain rot” can be a misnomer, as not all brain rot is created equal. The type of content varies. Brain rot is not only mindless content aimed at young audiences like Skibidi Toilet or Subway Surfers gameplay but also self-aware satire of this type of humor and the people who consume it. Popular creators include Natalie Tran and Evan Cronin, among others.

This type of content is more of a spectacle than anything. What constitutes brain rot is not merely lowbrow, nonsensical humor but the discourse surrounding it as well.

We don’t only watch this content unironically. We consume it nonetheless, but how we consume it can make the greatest difference.

These two modes of consumption are not mutually exclusive, as the line between them is extremely fine. While it’s not exactly brain fuel (very little short-form content is), the specific term “brain rot” feels unfair.

If it’s being consumed by people who know exactly what it is, then it’s not brain rot, is it?

This, however, does not extend to small children. Research consistently shows that young children should have extremely limited access to screens, let alone short-form, fast-paced content.

Little entertainment media is intellectually stimulating nowadays, so what difference is there really between watching a random cartoon and consuming a Minecraft gameplay video?

Why do we, as self-aware consumers of this content, feel the need to take such a harsh moral stance on this purported genre that is in nearly no way worth the mental effort? Why do we feel the need to rename the age-old phenomenon of mindless entertainment once again with brain rot? Does it really serve us any benefit to be so alarmist about a subject so trivial?

Maggie Hennefeld, professor of cultural studies and comparative literature at the University of Minnesota, said there are many comparisons to draw between film in its infancy and the content we consume online today.

“Some of the earliest films were less than 30 seconds long, and they just existed to capture everyday reality or to represent some kind of silly accident or surprise,” Hennefeld said. “There are films from the 1890s about boxing cats. We all know that a lot of the internet today is dedicated to people’s cats, right? So there are a lot of similarities in terms of the short-form absurdities that sort of go viral on the internet today, and what cinema was used for in its very first decade.”

In this way, brain rot, and a lot of what has come to define the online space, is not exclusive to the internet. At this point, it is inextricably linked to visual media itself.

As people, we sometimes want to immerse ourselves in material that is not necessarily substantive. We don’t always need to read Dostoyevsky at the beach or watch a Ted Talk on our lunch break. Sometimes a break can just be a break.

According to Mira Jasmin, a third-year student, brain rot can be a mental reset.

“I do kind of feel like I spend time on social media to sort of turn off my brain,” Jasmin said.  “Sometimes I want to see content that doesn’t really mean anything, just because I’m trying not to critically think at that moment. I think there’s a lot of stuff on social media that’s kind of stupid, but sometimes I’m okay with absorbing some of the stupid stuff.”

Em White, a second-year student, considers brain rot content a guilty pleasure.

“I’ll watch (YouTube Shorts) when I get ready,” White said. “I will walk around my own home, scrolling on and on, and I’m like, ‘Wow, this sucks, but I love it.’ It might be damaging my brain, but I’m having a good time.”

Our culture has a rocky relationship with media that we consider “lowbrow” or not up to whatever arbitrary intellectual standard we consider acceptable for consumption.

In the public sphere, it seems that a lot of people tend to criticize these forms of entertainment while privately indulging in them. In this way, brain rot content is not any different from traditional guilty pleasures like reality television, sappy romance novels or silly cartoons. 

No matter how much discourse takes place, the numbers don’t lie. People like this form of entertainment, and will continue to tune in as long as it’s produced. It will continue to be produced as long as it’s profitable, and this loop will continue until a new form of entertainment in this same ilk comes along.

There is no need to continue the shame cycle that has dominated a large portion of our media landscape for decades at this point. Brain rot is just the most current iteration of the same trend we’ve been seeing for years.

As people who can healthfully consume content on screens, we don’t need to be so harsh or critical of what we consume as long as we are aware of what we’re consuming and it’s part of a balanced media diet.

Realistically, even those of us who aren’t brain-rot connoisseurs, such as myself, need not moralize so much of our media consumption, so long as we’re media literate and aware of what we’re consuming and why.

The kids are alright, and they have been for well over a hundred years.


Opinion: A new lens on procrastination

Our world has made procrastination normal. 

It’s led many of us into a seemingly endless spiral. We wait until the last minute to do everything all at once right before it’s due and then need so long to recover from the exhaustion that the cycle repeats itself seemingly forever. 

Procrastination is a self-fulfilling prophecy that many find hard to escape. It’s a miserable process that eats up time and is no way to live. 

It’s so much easier to continue this cycle than it is to fix it, though. Idle time is the devil’s workshop, and we live in a world where boredom is more avoidable than almost any other feeling. With constant access to entertainment and stimulation, it is fully possible to scroll and procrastinate until our eyeballs fall out. 

Procrastination feels like an inevitable reality of life, especially in college.

Liza Meredith, licensed psychologist and teaching focus professor at the University of Minnesota, said she’s seen how common it is. 

“I’ve taught at the University of Minnesota for eight years, and in one of my first psychology lectures, I almost always ask students, ‘Have you procrastinated?’” Meredith said. “Basically, the whole time I’ve worked here, people have said yes. So it’s really common and normal.”

We begin by not doing our work until the day of the deadline, then we escalate into avoiding our problems until they become unavoidable. Procrastination is far-reaching and may lead many into an existence of distraction and stasis, where life becomes a constant dopamine chase. 

Procrastination is the theft of our lives by nobody other than ourselves. 

However, certain enterprises stand to benefit greatly from this self-sabotage and even encourage it through dopamine-hacking technology and addictive interfaces. I’m talking about social media.

The time we lose doesn’t just disappear. On these apps, time is money.

According to Meredith, procrastination is often exacerbated by digital distractions because of the way our phones and the internet hack our brain chemistry.

“There’s always something enticing to look at on our phones that’s probably impacting our reward system in our brain more so than the tasks that we’re supposed to be doing,” Meredith said. “I think it’s really easy to get caught up in something sort of mindless and easy versus doing the thing that takes more cognitive effort.”

The more time we spend on an app or website, the more data the algorithms can gather from our interactions to sell to advertisers, who then sell products back to us, and the more time we spend seeing these advertisements.

While we have nobody to blame for our procrastination but ourselves, we are playing right into the hands of people who stand to profit from these habits. This makes procrastination not only normalized but incentivized, both by our brain’s reward system and by social media companies.

A lot of it can boil down to brain chemistry and social media, but it also has a lot to do with everything outside of the dreaded work itself. If it becomes enough of a problem, life itself becomes merely an existence of two modes: work and avoiding work.

There is no time or capacity for actual relaxation or downtime because deadlines loom ever closer behind every break and, as a result, we live in constant fear and dread. By delaying work in this way, procrastinators put off not only the task itself but the true enjoyment of life outside of their obligations and responsibilities.

It’s a slippery slope.

On a larger scale, it makes sense. Procrastination makes it normal to write off problems until they’re entirely too big to deal with and need immediate attention. It’s in part caused by the importance of day-to-day needs and realities, but it’s no coincidence that problems seem to snowball.

The way a lot of our societal structures are set up reflects this.

Instead of working to prevent crime, we punish criminals retroactively. Our country has become so indebted that the shutdown of our government is normalized and many workers endure periods without pay. We go through midlife crises, where only toward the latter half of our lives do we work to amend our dissatisfactions so we don’t leave with unfinished business before our final deadline.

We don’t have the time or ability to fix the real, big-picture problems because we are so busy rushing to finish before the next deadline or next crisis. In all of this confusion, we fail to recognize the root issue behind all of it.

We are kept in a state of perpetual distraction from the real issues that dictate how we live our lives through procrastination. We can never go back in time and amend the decisions of our pasts, yet we are forced to continue our mistakes to get anything done. The cycle continues.

Mitigating procrastination is not impossible, though. 

John Kammeyer-Mueller, an industrial relations professor at the University, said the best way to curb procrastination is to remove as many distractions as possible to fix your brain chemistry.

“Let yourself be bored,” Kammeyer-Mueller said. “Just like flat-out getting used to being bored.… Like if you’re driving someplace, don’t have any music on, don’t have any podcasts on, just get used to having no stimuli for a little bit — which is, again, really hard because there are so many things in our digital culture that make it hard to do that, and we’re not used to doing that. It does seem to be a way to acclimate yourself to not having that constant reward system.”

Procrastination, for all of its difficulty to solve, is not inevitable. It’s not just a behavior but a reflection of one’s struggles to see the bigger picture and move forward. 

While we cannot change the past, we can do our best to work toward a future where we don’t have so much regret and stress. The thing about procrastination is that it’s not only the work we’re avoiding. It’s the rest of our lives. 

Instead of continuing this self-loathing and self-perpetuating cycle, we should all work to spite the entities that stand to benefit from it by taking back time we could use to actually enjoy ourselves, rest, heal or do quite literally anything else.

As Annie Dillard famously said, “How we spend our days is, of course, how we spend our lives.”


Opinion: Why fall is so nostalgic

Leaves crunch beneath you with each step as you trudge home. It’s a good thing the chill in the air keeps you on your path because if you weren’t just slightly too cold you’d stay out forever. 

The sun is beaming on your face, peeking through trees and casting a golden glow on the path ahead. You aren’t in a rush, though. You know when you get back, there will be a blanket with your name on it and some flannel pajama pants waiting at your closet door. 

You’re biding time, like we all do during this time of year. As you wait out the clock and watch the sunset, you wonder how it’s always felt like this. Year after year, you drag your feet as winter draws upon you like an awful shadow. This feeling doesn’t ever change.

The energy behind fall as a season is fascinating on all fronts, from aesthetic to cultural. It is an extremely reflective and nostalgic season. Fall represents a transition between the two most opposite seasons, not only in landscape but also in sentiment.

Maybe that’s why it feels so ancient.

Fall represents the juxtaposition between the warm, lively summer and the cold, barren winter. It’s a season in limbo, defined by both reflection of the months previous, and anticipation of what’s to come. 

Autumnal aesthetics are of particular interest and importance. We live in a highly visual culture, and seasons are often defined by evocative imagery and defining colors, often seen through fashion as it adapts to our environment. 

It feels as though fall traditions can be more archaic than those of other seasons. Fall seems to have an eerie energy surrounding it, serving as a backdrop for many age-old urban legends and tall tales. This may be due to the confluence of weather changes, time-honored traditions and the existential threat of the passing of time represented by both of these factors. This is greatly reflected in the visuals that have come to define autumn in the cultural imagination, such as gothic architecture, campfires and imagery relating to the changing of fall colors.

Marilyn DeLong, professor emeritus in the College of Design at the University of Minnesota, said fall in the aesthetic realm is defined by protection and camouflage with our physical environment. It is also a continuation of basic winter preparation taught to us in our youths by those who knew better, as this wisdom is passed down through generations.

“The fall aesthetic is defined by autumn hues of gold, orange, deep red, and brown,” DeLong said. “Layering becomes a way to add warmth and a feeling of coziness. A T-shirt and denim blue jeans can be layered with a cardigan or sweatshirt.  Our feet transform from wearing sandals to being covered with stockings for warmth. Even our brightly colored toenails can take a rest as they hide in the loafer or bootie. Covering the head and feet is something we heard from our mothers about the necessity of keeping warm.”

Ada Gabert Nicholson, a third-year College of Design student and member of Golden Magazine, said fall is defined, in part, by a homespun feeling. 

“Fall specifically feels like a close-to-home connotation,” Nicholson said. “At least for me, I always think of my family, my home and kind of where I came from, but also just changes, like resetting and preparing for winter.”

The emphasis placed on fall’s antiquity cannot be overstated. It has an inherent nostalgic quality, reflected in sweaters, classic earth tones and comfortable clothing that never really looks out of place, nor does it go out of style. 

Modern fashion hasn’t changed the fall uniform, despite trends. 

People have worn sweaters and boots for generations now. Fall is a season defined aesthetically by staples that, more or less, remain constant. As our physical surroundings become more volatile and unpredictable, we stick to what we know.

To Emily Pham, a second-year student, sweaters and cardigans are defining staples of fall fashion, as are minimal accessories, since the feeling of cold metal on the skin is especially unpleasant as the temperature lowers.

Khalid Mohamed, a third-year student, said his personal philosophy for fall fashion is that it’s meant to be classic and functional, with a defining motif being earth tones that mimic scenery and outfits that allow for an easy transition between warmer days and colder nights.

The purpose of fall fashion is not to be overly aestheticized or impractical. There is an emphasis on classic and basic outfits, with minimalism being a defining element for many people because of its functionality. 

People tend to go back to what they know rather than experimenting with something completely new because it’s more manageable on a sensory level, and less risky with unpredictable weather. This leads to a timeless fall look that builds upon tried and true outfits mimicked from older generations. 

What is “classic” can look and feel vintage. This is a large contributor to the sense of nostalgia that fall brings. Similar staples can be defined as anything from country chic to bohemian to dark academia.

A lot of what defines fall aesthetically, at least in fashion, is timeless. It’s in direct contrast with the consumer-manufactured need for new and more. What we already have will work. It has for as long as we can remember. 

Another reason why fall feels so old is its preoccupation with death and decay. 

The seasonal changes that come with fall have existed long before us and will continue to repeat long after we’re gone.

Fall reminds us of the passing of years and that, like the leaves, plants, and insects, we too are mortal. 

Perhaps we tell each other ghost stories, folk tales and urban legends to commiserate over this. Some celebrate Halloween every year and invite young ones into this tradition. They come for the candy and stay for the community, as the rot continues through Thanksgiving.

It seems to me that it’s no coincidence that Halloween, the holiday with a tradition built around fear and legend, happens earlier in the season and further from winter than Thanksgiving. It’s a holiday built around anxieties of loss and an uneasiness toward what lurks in the shadows that grow longer with each day.

Thanksgiving, however, happens well into the decaying process. It is centered around not only acceptance of the changing landscape and life cycle, but also gratitude and appreciation for loved ones.

Fall is a season that reflects our anxieties and hopes back onto us, and its aesthetic constancy across generations serves as an indication of time’s passage and the life cycles it coincides with. 

Its timeless nature only furthers its mysticism and continues to make it a subject of intrigue for not only myself, but much of the cultural imagination at large.

The feelings that fall has evoked for centuries continue to be celebrated, honored and passed down. It not only represents fear, grief and acceptance, but embodies them in a way that makes it reflective of, and as old as, life itself.


Opinion: Fame is just a job

The cultural iconography of the small-town girl turned big-time pop star is one of the most tried and true tropes of American popular culture. While it is a prevalent American dream, it often seems to turn young starlets’ lives into nightmares. 

With the rise of social media, it’s easy for the public to feel like they’ve broken the fourth wall. We as consumers of celebrity culture and media love to claim our righteous take on authenticity and transparency. Of course, the media has caught wind of this and now, it seems nearly every celebrity is “authentic” and “relatable.” 

They’re abandoning stage names, they’re doing live videos on tour, they’re inviting us into their homes and giving us makeup tutorials. Celebrities are using the same social media platforms we do to sell their authenticity to us. 

We complain about being sold to, yet we always take the bait. This brand of pseudo-authenticity is still feigned, though. The live videos have product placements, as do the home tours and the makeup tutorials. 

Celebrities are selling themselves to us. This isn’t a new phenomenon. 

Christopher Terry, media law professor at the University of Minnesota, said celebrities’ names, likenesses and most notable attributes are protected under laws similar to copyright. This is called the right of publicity. 

The right of publicity protects celebrities and public figures from being misrepresented, according to Terry. It distills their likeness down to their associations. Essentially, celebrities own rights to their persona because they’re famous for a reason. 

Celebrities are hot commodities and have certain traits that define them, their trajectory and their marketability, according to Terry. Celebrity itself is a business and a monetary asset that requires legal protection from misrepresentation. To a large extent, celebrities are brands themselves. In order to become famous, the three-dimensional reality of a person must be flattened into a few distinct, summarized and well-defined boxes to check.

This can mess with a person’s identity and legacy. Celebrities are known by the world for a few things that are usually not authentic. Those traits are used to define them, often for the rest of their lives and beyond. 

An infamous case of this is Marilyn Monroe — or Norma Jean Baker. 

Over the course of her lifetime, Monroe went from child bride to one of the most iconic women on the planet. She’s become synonymous with midcentury American femininity, and her image has proliferated since her untimely death. Monroe became more of an icon and symbol than a person.

Ruth DeFoster, an assistant professor who teaches media and popular culture at the University, said Monroe and Norma Jean couldn’t be more different. 

“It’s not her, it’s not who she was, the image of her,” DeFoster said. “The famous Andy Warhol shot for example, which I think is the one we all associate with her, that wasn’t her… she was this very laid back down to earth person.” 

The discrepancy between who Monroe was and who she was made out to be lies at the heart of her exploitation, according to DeFoster. 

“The way that she was really exploited as this vapid, dumb blonde,” DeFoster said. “That’s really the archetype of the character she played her entire life. There’s no question that she was subsumed by this character Marilyn.”

There is a new wave of change and awareness sweeping the public consciousness. A new star has risen this past summer. Chappell Roan, or Kayleigh Rose Amstutz, has had a meteoric rise over the past year. 

The Missouri native broke the record for the largest daytime crowd at Lollapalooza, opened for Olivia Rodrigo on her Guts World Tour, reached number four on the Billboard 200 albums chart with her debut album “The Rise and Fall of a Midwest Princess” and has earned the unofficial title of “your favorite artist’s favorite artist.”

Within a year, Roan went from barely being able to afford rent to one of the most lauded figures of pop music’s last decade. Most recently, she’s come under fire for her anti-establishment attitude, particularly surrounding her stardom. 

Roan’s stage persona is a very meta representation of the transformation required of celebrities. “When I’m on stage, when I’m performing, when I’m in drag, when I’m at a work event, when I’m doing press… I am at work,” she wrote in an Instagram post. “Any other circumstance, I am not in work mode. I am clocked out.”

By having a drag persona, Roan implicitly separates her personhood from her fame. Kayleigh Rose and Chappell Roan are two separate people. One is a person and the other is an entity for public consumption. 

Of course, stage names and name changes are nothing new in show business. There is a precedent for a Jekyll-and-Hyde duality in celebrity women, seen from Norma Jean (Marilyn Monroe) to Stefani Germanotta (Lady Gaga).

However, her likening of Hollywood to a day job is notable. Roan is, in a way, protecting herself through her bluntness and refusal to play by the rules. Monroe and many other stars were exploited by the conflation of their public and private personas.

Roan won’t stand for it, and it’s unsurprisingly making waves. 

By shedding light on the inauthentic nature of her fame, and celebrity culture in general, Roan is actually extremely candid and truthful. While a large public sentiment is that her mindset is naive and the 24/7 harassment and exploitation is part of fame, maybe it’s time we reconsider why we think that is.

Roan stated in the same Instagram post, “I feel the most unsafe I have ever felt in my life.” Why is it that she should have to put her life in jeopardy for simply being a performer? 

A big part of Roan’s appeal is her authenticity and willingness to say the unsaid. Here, she’s saying something that the public really ought to know by now: celebrities don’t owe anyone authenticity, and frankly, have never tried to give it to anyone. Why should they? 

Shouldn’t celebrities have some license to personhood outside of what they do for work, especially when we’ve seen time and time again how the conflation between celebrity alter egos and the people behind them has led to all kinds of exploitation? Why do so many people so desperately want to see another young star exploited by the entertainment industry? We’ve seen it for as long as the concept of celebrity as we know it has existed, from Clara Bow to Britney Spears. 

Don’t we want to see real change, like we all performatively claim? 

Fame in and of itself requires a transformation. Celebrities aren’t people, nor are they just like the rest of us. They’re functionally brand mascots. We don’t care about what Ronald McDonald does outside of work. Why should we care what Chappell Roan does?
