A good friend of mine develops load management systems for a major airline. The goal of those systems is to maximize revenue by selling every available seat on every flight at the highest possible price.

“It’s the same for admissions, right? Fill the dorms and charge the most you can?” my friend asked.

In general, the answer is no, because admissions offices have many more goals than revenue alone. They need to balance gender, academic profile, legacy commitments, athletic needs, diversity, and dozens of other considerations.

But admissions offices still had better hit their revenue number … and with only one “flight” a year.

My friend’s reaction? I’m paraphrasing, for professional reasons: “Holy wow. Your clients have it way harder than we do.”

They do. And on top of that, there’s very little room for error. Unfortunately, every year, we’re seeing more and more smart, talented, ethical enrollment managers lose their jobs.

Meanwhile, airlines get to test and make adjustments all the time. They have access to enormous data sets. They can measure things like the number of searches that begin to converge around cities (and know when those searches signal future purchases). The higher flight prices out of Louisville on the Sunday after NACAC were undoubtedly driven by this data.

Over the course of any given week, every major airline runs dozens of A/B tests to see if a small price increase depresses purchases, or conversely, if a small decrease suddenly generates a surge in interest. Worst-case scenario? They take a loss on some of the flights out of the hundreds they’ll run that day.

Enrollment managers don’t have that luxury. They can’t run multiple tests themselves – nor should they try. The risk of missing their class goals is too great. A safer and more valuable approach is to tap into outside experience and expertise.

Over the course of any given year, Fire Engine RED will complete over 40 A/B tests related to Student Search. These tests run the gamut: modeling parameters, send dates, send times, sender name (individual vs. institution), sender title, message sequence, message content, calls to action, subject lines, new sources for names, the impact of print, the impact of digital, and many more.

Our breadth of clients means we can run dozens of A/B tests with minimal risk to any single client, while delivering maximum insight and value for all our clients.

The majority of our tests reveal something small, or even inconclusive. But cumulatively, even small things add up. As a result, our clients get measurably better results every year.
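To make “inconclusive” concrete: whether an A/B result is a real effect or just noise usually comes down to a significance check. Here’s a minimal sketch of one standard approach, a two-proportion z-test on response rates. The numbers below are purely hypothetical, and this is a textbook illustration, not a description of Fire Engine RED’s actual methodology.

```python
import math

def two_proportion_z(responses_a, sent_a, responses_b, sent_b):
    """Compare variant B's response rate against variant A's.
    Returns the two observed rates and the z-statistic."""
    p_a = responses_a / sent_a
    p_b = responses_b / sent_b
    # Pooled rate under the null hypothesis that A and B perform the same
    p_pool = (responses_a + responses_b) / (sent_a + sent_b)
    se = math.sqrt(p_pool * (1 - p_pool) * (1 / sent_a + 1 / sent_b))
    z = (p_b - p_a) / se
    return p_a, p_b, z

# Hypothetical test: each variant sent to 5,000 students.
# Variant A drew 250 responses; variant B drew 310.
p_a, p_b, z = two_proportion_z(250, 5000, 310, 5000)
print(f"A: {p_a:.1%}  B: {p_b:.1%}  z = {z:.2f}")
# A rough rule of thumb: |z| > 1.96 is significant at the 95% level;
# anything smaller is the "inconclusive" case described above.
```

With small differences and typical send volumes, z often lands well under 1.96, which is why most individual tests read as inconclusive on their own and only the accumulated pattern across many tests is trustworthy.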

And occasionally, we find something dramatic. A couple of years ago, we tested a change to one client’s message sequencing and found that it increased junior and sophomore response rates by nearly 20%. We then executed a second test with a very different client and saw an even better result. By the end of the following week, we were able to implement this change for every one of our Student Search clients.

In a future blog post, I’ll return to the airline-model comparison to discuss what colleges should be measuring in order to develop an effective data-driven enrollment operation.

In the meantime, if you have an idea for an A/B test and want a low-risk way to test it, we’re your team.

Read more by Jeff:

Now Is The Beginning
Keep Calm, Act Expeditiously, Look Ahead
First, Tie Your Shoes
What I Wished I Knew About Student Search When I Was Dean


Jeff McLaughlin is the Executive Vice President of Data, Strategy & Analytics at Fire Engine RED. He has led Fire Engine RED’s data team since 2015. Prior to joining Fire Engine RED, he was Dean of Admissions and Financial Aid at St. Olaf College in Northfield, MN.