As of right now, I’m not a published opinion writer. But I have a body of work from my Fall 2024 Opinion Writing seminar at Duke with Frank Bruni, and it’s some of my favorite writing yet. Check out some of my work from the class below.
Whenever I need to empathize with the politically incorrect, I think about incest.
I’m a progressive college student, about as ‘woke’ as they come. I’m bisexual. My sibling is transgender. I have all the patience in the world for pronouns and very little for those who don’t.
But when adults who grew up in a different time turn their noses at two men kissing, I have a new tactic for understanding them. I imagine the two men are brothers.
Hear me out.
Today, it’s not controversial to be repulsed by incest. It’s illegal in 48 states, and it doesn’t even fall into the improper-but-innocuous category of crimes, like marijuana possession; it is morally reprehensible. A Schedule 1 offense.
But by the same logic I use to justify the legal and social permissibility of homosexuality, incest should be perfectly okay. The government does not get a say in what you do with your body. Or who you sleep with. Love is love, after all.
It’s more complicated than that, I know. But it seems that, especially if the couple opts not to reproduce, all that distinguishes incest from other unconventional relationships is a moral stickiness that is tricky to name and even trickier to get rid of.
Yet I can picture a world, 50-something years from now, where the ever-progressing wave of political correctness finally engulfs incest, swallowing up any dissidents in its cancel-culture tide. A world where incest is okay, and you either swim with the current or drown.
In that world, we woke watchdogs would become the bigots. But are we so wrong to pause before diving in?
The resistance to immediate agreement is not an indictment of character – it’s critical thinking. A core democratic value. But our intellect and youth feed our hubris, and we read hatred into older generations’ hesitancy to hop on board with our definition of the politically correct.
P.C. is notoriously unpopular. A 2021 study found that only 4 in 10 Americans think people should tailor their speech to avoid offending others. Duke senior Lucan Franzblau, who comes from a conservative New Jersey town, says he knows many who voted for Trump, “not for any of his policies, but for his refusal to bow to woke culture.”
And it’s fair to be skeptical.
The term first emerged in the 1930s among Soviet Communists to mock their own deference to the regime. As Boston University Professor Angelo Codevilla writes in his history of P.C., political correctness was used as a reminder that “the Party’s interest is to be treated as a reality that ranks above reality itself.” He says one such reference might look like this:
“Comrade, your statement is factually incorrect.”
“Yes, it is. But it is politically correct.”
Political correctness literally referred to obedience under a prevailing power, even when it meant abandoning reality. Now, Americans reject it on the same grounds.
American P.C. is characterized by the policing of language in service of an ideal: a world in which we are cleansed of our social sins. But the right idea does not mean the right execution.
I’m too young to be scarred by the Communist paranoia that ripped through the U.S. during the Cold War. But the fear of losing individuality under a totalitarian regime is legitimate baggage that generations still carry. To them, political correctness smells suspiciously familiar.
The fears of an American-Communist future can be somewhat assuaged by the fact that the policing of language comes from the people, not the government. But our political correctness follows a top-down approach nonetheless.
Of all the hierarchies that progressives attempt to level, there is one glaring omission: the diploma divide. Research shows that the educated fare better in almost every way – life expectancy, marital success, social satisfaction, and income – and political correctness is no exception.
Though the process by which a term goes from conventional to canceled is too vague to attribute to one group, the educated lie at its center. Academia is an exclusive circle where progressive ideas bloom, and enforcing P.C. standards nationwide is like hosting an exam but giving the answer key to the privileged. You’d be pissed, too.
It’s not just that progressivism and academia are intertwined. It is a privilege to have the freedom of mind to mull over microaggressions. If we’re looking at Maslow’s hierarchy of needs, political correctness must be the cherry on top.
Thus it is we young thinkers at elite universities, agile and unburdened, who cast our picture-perfect P.C. blanket over the world, while everyone else suffocates beneath it.
And we often get it wrong. The term ‘Latinx’ was meant to offer a gender-neutral identifier for people of Latin American descent, but many Latinos rejected it emphatically. One poll reported that only 2% of Latino voters identified as Latinx, and 40% found the term offensive. Plus, the word is awkward to pronounce: few Spanish words end in two consonants.
This is the hypocrisy of P.C.: the people who enforce it are rarely the people it claims to protect. Yet our modesty remains low, and the price for a misstep high.
While many P.C. pioneers understand that a one-time offense is not grounds for social expulsion, a large-scale rejection of the movement is seen as an ethical offense. And ironically, it drums up some name-calling.
Donald Trump’s presidential campaign doubled as a referendum on political correctness. And, decisively, Americans affirmed that offensive language wasn’t a dealbreaker. Liberal accusations flew in the wake of his win: anyone who voted for Trump is a bigot.
But, as columnist David Brooks pointed out in his election postmortem for the New York Times, “There will be some on the left who will say Trump won because of the inherent racism, sexism and authoritarianism of the American people. Apparently, those people love losing and want to do it again and again and again.”
It’s a critical misunderstanding.
From the inside, failure to subscribe to P.C. language means moral deficiency. But most on the outside aren’t rejecting political correctness for its ideology. They are rejecting a restriction on what they can say. They are rejecting an objective definition of how to think. And they are rejecting our appointment of ourselves as the ruling voice.
It didn’t have to be this way. Unfortunate Communist roots aside, the rollout of the modern P.C. movement did a poor job of distinguishing itself from an infringement on free speech, too focused on instant perfection to be inclusive, too sprawling to be reasonable. And we’re still not getting the message.
Go search up a modern glossary of offensive language. You’ll find the classics, but you’ll also find dozens of newly canceled terms you hadn’t thought twice about. ‘Brainstorm,’ a potential slight to the neurodiverse. ‘Spooky,’ for its echo of an anti-Black slur.
Many of these even make their supposed victims chuckle. “I’m partially blind,” said Frank Bruni, Duke professor and longtime Times contributor, “and I don’t care if you say ‘blind-spot.’”
Our adolescent zeal propels a desperate rush to enshrine more and more words into the Hall of Slurs, where they may only be revisited as reminders of the harshly offensive history we are heroically cutting ties with. It does more to stoke our egos than to fuel genuine progress.
Endless and occasionally inaccurate indictments of language in the pursuit of an ideal are nothing new. In his analysis, Codevilla highlights this throughline: “There [is no] endpoint to what is politically correct, any more than there ever was to Communism.”
But the lack of an endpoint doesn’t invalidate the effort. It brought us every one of our social justice victories. The goal of an equitable world is a worthy pursuit. But by being absolutist in our demands, we abandon the humility necessary to navigate progress. What won’t bend will break.
Youth is an asset in a P.C. world. Even beyond the political paranoias we are ignorant enough to be immune to, or the stereotypes we are sharp enough to see through – our brains are more malleable. Our neural grooves shallower. Our learning quicker.
But our time in the spotlight will fade. History churns onward. Soon, a new class of even fiercer linguistic revolutionaries will take the stage. And when they do, and they chide us for our unwillingness to surrender wholly on an issue with residual moral stickiness – like incest – we’ll appreciate their grace.
This election season, there’s a sense that democracy is on the line. The fear of the unraveling of government as we know it buzzes around both sides of the political aisle. It’s not unfounded – the Supreme Court ruling that presidents may not be prosecuted for “official acts” does feel a little dystopian. But while everyone’s eyes are on the presidential race, smaller-name candidates find their own ways to erode our electoral integrity.
Enter Joyce Craig, the Democratic candidate for governor of New Hampshire.
After a stint on the school board, Craig earned her local-government claim to fame in 2018, when she became the first female mayor of Manchester. She served three terms. Now she faces Republican Kelly Ayotte in the race for governor – and current polls show them neck-and-neck.
I’m a registered Democrat. Many of Ayotte’s conservative policies rattle me. But what rattles me more deeply is that Craig has been cutting corners.
In the months leading up to the primary, the four contenders – Craig, Ayotte, Cinde Warmington and Chuck Morse – were invited to a smorgasbord of public events. The latter three candidates showed up consistently, seizing face-to-face time with locals at every opportunity. But all summer, there was an empty seat for a no-show Craig.
See, Craig doesn’t quite shine before a crowd. In interviews, she relies on the same opaque platitudes that define American political rhetoric, but she does so without a lick of charisma. Her voice trembles. Her eyes betray internal torture. As NH Journal’s analysis put it, “Joyce Craig reacts to news cameras the way vampires react to sunshine.”
Even when cameras aren’t around, she remains in hiding. Typed email replies from her PR team shield her from press interview requests. At an interfaith forum in July, Craig’s campaign team asked an NHPR reporter to leave the premises.
It’s not just that her talking points aren’t delivered with silky elocution. Charm is merely a bonus for a politician, not a prerequisite. It’s that Craig’s anxiety to stay out of the spotlight leads her to shut down important questions – and voters’ sole access to her.
And voters do feel left in the dark. Craig’s tenure as mayor, which she touts as a success, left many Manchester citizens confused. New Hampshire’s biggest city was wracked with homelessness before, during, and after she served. Reddit users debate whether she had any impact at all, and reach no conclusion.
Marge Gruzen, an undecided voter in Exeter, said, “I haven’t been in a situation where I’ve heard Joyce Craig speak. I just want to hear her.”
In a couple of ways, running for public office is like applying for any other job. You pitch yourself to your employer, and you fulfill the rounds of interviews they put you through. And as a public servant, your employer is the people.
By sidestepping these standard expectations, Craig is exploiting the unfortunate truth of our democratic hiring process: that often, we don’t get to pick who we want. It’s the lesser of two evils. She’s betting that our standards are low enough to elect her anyway.
When politicians aren’t willing to throw themselves to the wolves, they are relying on voters to select them by the vague national platform they’re affiliated with. They are presenting themselves as a pawn for their party, not a pioneer for their people. And they are expecting that to be enough.
Partisan voting permits this kind of mediocrity, and Craig’s success depends on that fact. As another NH Journal analysis said of Craig during her 2023 campaign for mayor, “She was counting on Manchester Democrats voting like Democrats first, and citizens who live in Manchester second.”
But in 2023, Manchester called her bluff. She lost. Now, New Hampshire has an opportunity to do the same.
While Ayotte is guilty of speaking in the same relentless nothings that wear listeners down to utter inattention, she still shows up. She does regular ‘conversations’ with Granite Staters, standing alone on a small stage while voters fire questions at her from all sides.
This is by no means extraordinary – but that’s the point. Ayotte isn’t above the process. Craig is.
When a politician does a bad job, the people ultimately hold the right to fire them. And when Craig isn’t even meeting the bare requirements, her application should be thrown out.
Democracy is at stake when public officials stop thinking on their feet. Politicians should be fighting against the current, not floating in their tubes down the lazy river. Disinterest in the democratic process is a true danger.
And Craig’s disinterest, whether fueled by pure shyness or not, renders her ineligible. She is simply underprepared. She couldn’t even get the most important facts down: at the end of a primary debate against Cinde Warmington, Craig finished with, “I ask for your vote on September 15th.” Election day was September 10th.
She’s gambling on you not noticing. Show her that you do.
Two summers ago, I was the only woman living in a house with four men. Gender didn’t play much of a role in the roommate dynamic – if anything, I did the least cooking and cleaning of anyone. But for a few hours each day, I was the odd one out.
Their favorite pastime was Super Smash Bros. (Smash), a Nintendo fighting game. When they played, they would disappear into the indecipherable rainbow flurry on the screen, exclaiming at seemingly random intervals and ignoring me.
I got tired of being left out. So, I joined in.
Smash has a simple game structure. You pick from an array of cartoon characters, each with different capabilities, and fight to the ‘death’ on a floating island by jumping, shielding, attacking and running.
At first, I was bumbling around the screen. The boys leapt over and slid under me, anticipating my mistakes and punishing me for them. Their fingers danced across their controllers, flicking knobs and tapping triggers deftly. Mine mashed the nearest buttons in a panic.
But as I began to process the madness, I slowed down. I stopped jumping directly into the boys’ patient attacks, and my roommate Ethan noticed.
“You’re starting to think,” he said. “I can see it.”
Playing a video game is like learning a language. From the outside, it seems like gibberish. A rainbow flurry on a screen. But as you immerse yourself, you begin to catch the nuance in the nonsense. The air kick that hit because your opponent expected you to jump. You realize the men on the couch aren’t simply staring into a colorful abyss – they’re communicating.
Video games get a lot of flak. They trigger the same widespread anxiety that accompanies the release of any new medium – like the 1600s, when scholars fretted that books would rot people’s brains, or the 1960s, when they feared television would do the same. Each new generation is born into a world of unfamiliar entertainment technology, and the adults just don’t know how to cope with it.
When video games burst into the mainstream in the 1970s, they prompted similar panic. But after the Columbine High School massacre in 1999, a new unease emerged: that violent video games might spur school shootings.
Columbine, Colorado, was an upper-middle-class, mostly white community when the massacre happened. People clamored for an explanation. Video games took the fall.
Dr. Alexander Kriss, a psychologist and author of The Gaming Mind: A New Psychology of Videogames and the Power of Play, reflected on this era.
“Shame, discomfort, and awkward feelings about video games pre-existed [Columbine],” he said. “But it was really at the turn of the millennium that people started to see video games as something dangerous.”
Yet the research didn’t validate the rumor. Psychological studies found conflicting results, but the emerging consensus – supported by a 2011 Supreme Court ruling – was that any link between violent games and violent behavior is insignificant. Other factors, like family life and economic status, have a much heavier hand in the matter.
(The same thing happened with television – despite the Surgeon General declaring TV violence to be a public health crisis in 1969, no research ever found a conclusive correlation.)
But the stigma did little to slow video games’ takeover. Gaming has eclipsed film and music as the giant of the entertainment industry, raking in $282 billion per year from 3 billion players worldwide.
By the 2020s, the video-games-cause-violence argument had grown stale. Today, 26% of Americans simply believe video games are a “big dumb waste of time.” The mob of protective parents has dispersed into a judgy peanut gallery, milling about and casting condescending eyes upon fanatic gamers. I know, because I was part of it.
It’s easy to look down on something we don’t understand – especially when it fits neatly into the generational brain-rot narrative. I watched the boys, entranced by the flashing colors, and scoffed. How juvenile.
But whether it’s because we fear video games or we can’t grasp them – or we harbor a slight bitterness about not grasping them – we’re missing the point.
First, video games are too broad a category to be distinctly good or bad. Dr. Shai Ginsburg, director of Duke University’s GameLab – a group dedicated to studying games as a cultural product – puts it this way: “Some books are horrible, some television shows are horrible, and some games are horrible. But you don’t blame the medium.”
Some games are pedagogical tools. Last year, nearly 1.5 billion hours were spent learning new languages on Duolingo. Some are thought-provoking, using complex storylines to pose societal critiques. Ginsburg’s favorite is ‘Detroit: Become Human,’ which explores a near-future where androids gain sentience.
“They are interesting stories, they raise interesting questions,” Ginsburg said. “I think it was really, really smartly made.”
Compared to the throng of video games that boast academic or philosophical merits, Smash may exist closer to the bottom of the cultural totem pole. But still, its value prevails. Like any sport, it encourages competition, connection, and mastery.
In my house, Ethan played as Diddy Kong, a nimble, mischievous monkey who trips up his opponents from a distance with his banana peel. He can stand up to heavyweight characters only through precise combinations and technical dodges that take hours of training to perfect. Ethan has logged about 1,500.
Off screen, Ethan is a meticulous problem-solver, preferring playful teasing to outright aggression. Diddy Kong suits him.
Meanwhile, James (another roommate) finds himself in ROB, the fire-puffing, laser-shooting robot. ROB gets his claws a little dirtier than Diddy Kong, relentlessly chasing his opponents and giving them his best shot. For James, ROB is an apt choice – neither the man nor the machine is afraid to get in your face.
As Ethan and James play, the epic battle unfolding is not just between Diddy Kong and ROB, but the men themselves. And though their character choice rings true, their self-expression stretches beyond that. Like any professional athlete, they have the dexterous fluency to inject personality into their gameplay.
Chase (another roommate) attributes his affection for Smash to the joy of competition. “That’s how my dad raised me,” he said. “If something is worth doing, it’s worth being the best at.”
I’ll never be the best at Smash. I’m a decade behind in experience, and my fingers still tend to fumble: I’m not fluent yet. But I’m competent enough to say this with confidence – it’s a craft. If video games are modern-day art, Smash is a collaborative self-portrait. Don’t let your screen-time anxiety spoil the view.