Author: Ellen Squires

Distinction should achieve personal fulfillment

Throughout most of my college career, I have to confess that I have been a prolific “apply-er.” Summer jobs? Why not apply to twenty? Internship that looks like it would be 5 percent cool, 95 percent mindless busy work? Might as well! This was the logic that dictated my decisions during my first three years at St. Olaf. Now I’m a senior, and while I cannot claim to be that much older and wiser (it’s only been four years, after all), I’m starting to reconsider the idea of applying for something just because I can.

I first received a message announcing the opportunity to apply for distinction this fall. Two years ago, I would have signed up without batting an eye. But after four years of applying for everything that lands in my inbox, my “just do it” mentality was finally put to the test.

I have to admit, my instinct was to go for it. Distinction sounds really great. The college describes it as “a recognition of the academic or artistic excellence of a student’s work in the major.” It is like a pat on the head in a very visible, public manner. As humble as we all try to appear, it still feels good when people acknowledge our accomplishments. Most of us have spent days laboring in the library until midnight, forgoing social events to eke out the last page of that midterm paper. We have suffered through lackluster group projects, attended lethargic lectures and read every last bit of Augustine’s “Confessions” – twice. So why not seek out a title that will give formal legitimacy to four years of intense academic effort?

After reflecting for quite some time on whether I should apply, I think I might have finally found an answer. I realized that distinction is not something I want to pursue. To me, academic “success” is cumulative, and it is not always easily measured through formal awards such as distinction. Educational achievements are more than GPAs and test scores and big projects. Completing a final distinction project is a nice, neat way to tie together four years of study within your major, but it alone cannot convey the meaning of the time you spent working towards that major.

To me, success is less about a title and more about the relationships formed within the departmental community and the experiences that have been personally and academically meaningful. If you want the title of distinction to formally acknowledge your dedication, that is fine. But if not, you should not feel pressured or forced into pursuing it.

In the end, completing a distinction project will not make my experiences any more valuable for me personally. Sure, it would bolster my resume, but I like to think that my other experiences, both inside and outside of class, are a better representation of the work that I did and the person that I have become.

At a college of overachievers, a lot of us seem to suffer from what I call “homepage syndrome.” We crave recognition and fall prey to comparing ourselves with other students, especially our high-achieving peers whose faces grace the St. Olaf home page. Distinction and similar academic awards are emblematic of this broader cultural paradigm at St. Olaf that can quickly become unhealthy. In comparing ourselves to others and vying for the most academic awards, we detract from the sense of community. This unspoken ethic to overachieve is probably what guilted me into blindly applying for so many opportunities in the first place.

I want to make it clear that distinction is not an inherently bad thing, and I do not mean to belittle the achievement in any way. If it is something you choose to pursue, that is great. I am simply re-evaluating the tendency to apply for something solely because it is an opportunity to add to a list of titles and achievements. I am also challenging the largely implicit belief that gaining distinction means success, and opting not to pursue it means failure. This dichotomous thinking can wreak havoc on our self-esteem and personal well-being. Whether or not you choose to pursue distinction, maybe we would all do well to remember that success is measured in many ways. That alone deserves some recognition.

Ellen Squires ’14 squirese@stolaf.edu is from St. Paul, Minn. She majors in biology and environmental studies.

Graphic Credit: CAROLINE WOOD/MANITOU MESSENGER

Dorm Gourmet: March 14

My dad cycles through hobbies like most people do fad diets. Our garage is a graveyard for old skis, woodworking equipment and oil paints, all relics of his abandoned pursuits. Thankfully, I sometimes get to profit from these pursuits. I was especially thrilled when I came home over Interim break to learn that his new hobby involved cooking.

“I have a surprise for you,” he told me when I walked into the kitchen. He opened the door of the refrigerator and pulled out a small plastic bottle – liquid vegetable rennet. After experimenting with fermenting cabbage, making his own butter and pickling asparagus, he decided it was time to try one of the world’s tastiest foods: cheese.

Cheese has an interesting history. It is rumored to have been discovered by accident when traders carried milk in casks lined with animal stomachs. An enzyme in the stomach lining, rennet, caused the milk to curdle, transforming it into cheese.

Thankfully, there are other ways to get rennet that don’t involve the inside of a cow’s stomach. Rennet is available from online cheesemaking supply companies in many forms: liquid or tablet, vegetable or animal.

After exhausting the online collection of food blogs, my dad and I decided to try our hand at mozzarella, billed as a “beginner” cheese. The ingredients were simple enough: a gallon of whole milk, vegetable rennet, citric acid powder and salt.

The process was more exciting, a mere 30-minute cycle of stovetop heating, microwaving and hand-forming. After this, the recipe promised, we’d end up with a lovely sphere of fresh mozzarella, ready to be drizzled with olive oil and sprinkled with a generous handful of herbs.

The first step in the process was the least remarkable. The milk was heated to a temperature hot enough to let the enzymes work their magic, separating the whey (the liquid part) from the curd (the solid part). At the end of this stage, the curd was a semi-solid, chunky mess. We then formed it into a loose ball and microwaved it in one-minute intervals until it was a stretchy but solid log that looked a bit like taffy. After doing our best to form it into a passable ball, we were done – mozzarella! It really was as easy as promised.

When I see fresh mozzarella, my mind immediately goes to caprese. And so, despite the subzero temperatures, we pretended it was July and combined mozzarella with thick slices of tomato. Perched atop a crusty slice of ciabatta and topped with ribbons of fresh basil, it was nothing short of divine.

Making cheese appeases both my inner scientist and foodie. I made it at home, but it’s perfectly adaptable to a dorm kitchen. Just make sure you have a clean microwave, a big pot and 30 minutes. I promise it will impress your friends and your palate. Now I’m just hoping this is a hobby of my dad’s that will stick around.

Here’s the recipe, condensed and adapted from “The Pioneer Woman Cooks”:

Dissolve 1½ teaspoons citric acid powder in ¼ cup water. Pour 1 gallon of milk into the solution and stir. Heat to 90 degrees over medium-low heat, then remove the pan from the burner and add ¼ teaspoon liquid rennet. Stir briefly, cover and let sit for 5 minutes.

Cut the curd into a 1-inch checkerboard pattern with a spatula. Return the pot to the burner over medium heat and stir it gently until the temperature of the whey (the liquid that separates from the curd) reaches 105 degrees. Transfer the curd to a colander set over a bowl and let the whey drain. Remove the cheese and squeeze to drain excess whey.

Transfer the cheese to a microwave-safe bowl and microwave the curd on high for 1 minute, then pour off as much whey as you can. Microwave it again on high for 35 seconds, then press the curd together again to drain the whey. Repeat. Knead in the salt and roll it under itself until it forms a neat ball. Set in an ice water bath until cool.

squirese@stolaf.edu

Dorm Gourmet: October 25

It seems like pumpkin is the new “it” thing in the world of fall food. Pumpkin spice lattes have become synonymous with autumn, and the iconic fall gourd is making its way into everything from Greek yogurt to bagels to Pause shakes. So when I found myself with a spare can of pumpkin, I realized that there was an endless list of possible things to do with it.

A quick scan of Pinterest proved that I was right. I browsed through an interesting set of options:

Pumpkin pie protein smoothie? Nope.

Pumpkin fritters? Intriguing, but nope.

Pumpkin casserole? Gross.

Then it hit me. Right between pumpkin cheesecake and pumpkin macaroni and cheese was a fall classic: pumpkin bread. Pumpkin and bread are two of my favorite things, so I was surprised that I’d never made it before. In fact, I’d never even eaten pumpkin bread. Feeling like I couldn’t join the growing masses of pumpkin devotees without first trying the basics, I decided to give it a go.

The next step: find a recipe. Despite the onslaught of new diets and nutritional guidelines, I’ve retained a firm belief in my German grandmother’s conviction that fat and sugar really do make things taste better. So, I skipped over recipes with anything like “low fat” or “healthy” in the title on my quest for the most fat- and sugar-laden recipe I could find. The winner? 3.5 cups of flour, 2.5 cups of sugar, 1 cup of oil and 4 eggs, mixed together with a can of pumpkin and spices and baked into two loaves of fall goodness.

Lured by the promise of a share in the end product, a couple of friends joined me in our honor house kitchen to do the deed. It was almost too easy. After throwing everything into a large bowl and stirring it into a thick batter, we poured the mixture into loaf pans, and it was ready to go.

As the bread baked in the oven, a smell not unlike the famous Malt-O-Meal scent permeated the house. One by one, my housemates came back from campus gushing about the smell and dropping not-so-subtle hints that they’d appreciate a share of the spoils.

After a full hour in the oven, we pulled out two perfectly golden loaves, rounded at the top with a narrow slit beginning to form down the middle. I’m a notoriously impatient baker and eater, so I cut a thick slice just a few minutes after the loaves came out of the oven.

Still steaming hot, the first bite melted in my mouth as my taste buds picked out the subtle hints of nutmeg and cinnamon.

While the bread was perfectly delicious when eaten plain, I later discovered that it was also good slathered with a generous swipe of homemade almond butter and a drizzle of real maple syrup. Paired with a bitter autumn ale, it was even better.

With my first loaf of pumpkin bread in the books, I feel like I can join the throngs of pumpkin addicts everywhere. And for this, a little thanks is due. I’m indebted to my sugar-loving grandma. I’m indebted to my helpful and only slightly self-interested friends, and I’m indebted to that clever soul who thought it was a good idea to smash a gourd into a mushy pulp and make it into a loaf of bread that’s perfect for a fall afternoon and perfect for sharing with friends. If pumpkin is the new trend, I’ll be the first to jump on board.

Focus technology on needs, not trivialities

We in America have a lot of faith in technological progress. Chances are, most people will tell you that we’re better off now than we were 50 years ago. The world is a better place, we like to think, because of our latest and greatest inventions.

A healthy optimism is all well and good, but we’re deluding ourselves: Technology hasn’t improved the world like we think it has. This myth of progress, fueled by technological advances, is both misplaced and problematic. We see the immediacy of technological change in our own lives, but despite the unprecedented rate of innovation, quality of life isn’t improving globally. Technology is just making life a whole lot easier for those lucky enough to have access to it.

Falling prey to the myth of progress can carry steep consequences, blinding us to the real, pressing issues of the world. To define progress in exclusively technological terms is to remove the focus from the problems that really need our attention, like global hunger and poverty. Rather than defining progress by the newest metal rectangle to come off Apple’s production line, we would do better to focus on widespread, global improvements in the quality of human life.

I can’t be a complete Luddite and claim that technology is harmful and regressive. Personally, I would be reluctant to surrender the cell phone and laptop that have become daily fixtures of college life. On a grander scale, technological advances have allowed us to feed more people, extend human life and connect people around the world in astounding ways. Reverting to pre-Internet, pre-modern-medicine days would hardly be desirable for anyone. Technology has undoubtedly contributed to progress in certain areas of the world, but there is a danger in including only these advances in our definition of progress.

From my admittedly naïve, privileged college student perspective, the problem is rooted in a failure to connect technological progress to meaningful improvements in human life on a global scale. This can breed complacency toward more pressing, unaddressed problems. While we see progress in the newest photo-sharing app, many people around the world believe progress means easier access to the food and medicine they need just to live. Being on the receiving end of technological improvements, it’s easy to forget about the majority of the world’s population who are largely excluded from reaping the benefits of recent “progress.”

With a considerable amount of new technology accessible only by those in wealthy countries, technological progress can contribute to the increasing division of global society into the haves and have-nots. While our lives become more convenient, we ignore 85 percent of the world’s people who live in developing countries. Excluding this substantial majority from our definition of progress results in a greatly divided and inequitable world.

Changing the way we measure progress might require us to add a moral dimension to our definition. In his diary about life in a work camp during World War II, Langdon Gilkey states that “technological advance spells ‘progress’ only if men are in fact rational and good.” When man defines progress in terms of self-interest, he notes, we risk being sent on a crash course toward fictional dystopian societies like those described in “Brave New World” and “1984.”

Technology doesn’t have to be a divisive force. We aren’t on an irreversible trajectory toward a real-life “Brave New World.” In fact, when properly used, technology has the power to unite us around a common goal of human betterment. The focus just needs to shift away from convenience and limited accessibility and toward tangible gains in quality of life for those who need it the most. Technology can make the world a better place, so long as we invest in technologies that will produce measurable improvements on a global scale. This isn’t a myth: This is progress.

Wolf hunt raises moral, ecological concerns

While the closest most of us have come to wolves is on some hipster graphic tee, the recent controversy over the Minnesota wolf hunt has earned this species some unwanted attention. This is more important than T-shirt graphics; this is serious. Opening the recently delisted Minnesota wolf population for hunting is an irresponsible and unnecessary move that ignores a complex ecology while threatening to compromise the integrity of the species. It’s highly risky, and its consequences might not be fully understood or realized until it’s too late.

The history of the wolf in Minnesota reads like a soap opera turned success story: picture “Air Bud” for wolves. Their habitat already fragmented by human colonization, wolves were targeted from the get-go. They were trapped, hunted, poisoned and even became the focus of government programs intended for their elimination. Then came the Endangered Species Act, a landmark piece of environmental legislation that provided protection for the gray wolf. Wolves were first listed as “endangered,” and as populations grew, they were granted the more benign “threatened” status. The gray wolf has since been delisted, and the Minnesota population has been considered stable for the past 10 years.

The recent success of wolves has provided much of the impetus for the hunt, with concern for livestock spurring calls to actively decrease the wolf population. This response is both ignorant and selfish, a thinly-veiled excuse on the part of those who stand to gain from the hunt.

A simple consideration of ecology raises ample concerns about the hunt. First, 10 years in ecological time is barely a drop in the proverbial bucket, and it’s certainly no sign that the population is permanently stable and immune to future declines. To make this assumption is to deny the complex and intricate workings of a dynamic ecosystem. Wolves are a textbook keystone species, exerting a disproportionate influence on their ecosystem. This means that tinkering with the wolf population will have far-reaching effects on other members of the ecosystem.

And it’s not just deer, traditional prey for the wolves, that stand to be affected. When wolves were reintroduced to Yellowstone in 1995, river health improved as grazers spent less time near the water and bank vegetation was restored. In tampering with these intricate relationships, the wolf hunt threatens to wield an undue impact on the entire natural web. It’s not just wolves who stand to lose.

Even setting the science aside, wolf hunting is simply unwarranted. The alleged problems stemming from the wolf population are unsubstantiated. First, the danger to livestock is minimal, and could even increase as a consequence of the hunt. As the population becomes more stressed, individual wolves grow weaker and are more likely to turn to livestock for food. It’s also unlikely that the wolf population could ever spiral out of control, thanks to the natural checks it performs on itself. When a wolf population reaches its habitat or resource limits, an increase in wolf-on-wolf kills restricts population size. Additionally, a full third of the wolf population dies naturally of starvation every year, without any human influence.

Wolves are a fragile species in a fragile ecosystem. With the population currently at 3,000 individuals, allowing 400 wolves to be killed is unnecessary, unwise and uncalled for. And even if the worst potential consequences of the hunt aren’t fully realized, it still can’t be deemed wise.

Suppose that I’m wrong. Suppose that the wolf population really is stable and unaffected by the hunt. Does that give us license to kill? Maybe it won’t drastically alter the ecosystem or threaten the integrity of the wolf population, but does that mean we should allow the killing of hundreds of wolves so a handful of hunters can get an adrenaline rush? That raises deeper ethical questions, not just scientific ones. Like, do we have a right to enable mass killing of a species for no justifiable reason? Sounds like bad karma to me. And history certainly isn’t our ally either.

If we aren’t careful, the wolves gracing those beloved tees might become a historical relic, an homage to the majestic wolf that once was.

Ellen Squires ’14 squirese@stolaf.edu is from Andover, Minn. She majors in biology and environmental studies.
