How to Prevent Recruit-to-Deny and Reject-to-Preempt Admissions Strategies
by Michael C. Dorf
A recent NY Times article highlights one of the more despicable practices one sees in the college admissions game -- what has come to be known as "recruit to deny." Here's how it works: US News and other purveyors of college ratings and rankings include "selectivity" among the factors on which they evaluate colleges. The harder it is to get into a college, the more selective that college is. Selectivity is expressed as a ratio of applicants who are admitted to applicants who apply: The lower the ratio, the more selective the college. A college can improve (i.e., decrease) its selectivity ratio by increasing the denominator, i.e., by encouraging applications from more students it expects to reject. (Encouraging applications from students a college expects to accept will decrease selectivity, because it will increase the numerator as well as the denominator; since the ratio is less than one, adding to both in general increases it.) Accordingly, as the Times article notes, many colleges reach out to prospective applicants with recruiting material, creating false hope that they will be admitted, when the college only values them as filler for the denominator. Such colleges "recruit to deny."
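The ratio arithmetic driving this incentive can be sketched in a few lines of Python; all figures here are invented for illustration:

```python
def selectivity(admitted, applied):
    """Acceptance rate: the lower the rate, the 'more selective' the ranking."""
    return admitted / applied

# Hypothetical baseline: 2,000 admits out of 10,000 applicants.
base = selectivity(2_000, 10_000)             # 0.20

# Recruit 5,000 extra applicants the college intends to reject:
# the denominator grows, the numerator doesn't, and the rate falls.
recruit_to_deny = selectivity(2_000, 15_000)  # ~0.133 -- "more selective"

# Recruit 1,000 extra applicants the college would admit:
# both terms grow, and (since the rate is below 1) the rate rises.
recruit_to_admit = selectivity(3_000, 11_000) # ~0.273 -- "less selective"

print(base, recruit_to_deny, recruit_to_admit)
```

The asymmetry is the whole game: only applicants destined for rejection improve the number the rankings reward.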
The practice is despicable for various obvious reasons. Sometimes colleges waive application fees for students they recruit to deny, but not always. When they don't, they waste those applicants' fees. Even if a college waives its application fee, there are fees associated with sending standardized test scores. There is also a cost in time. Although most colleges accept the so-called "common application," submitting additional applications can nonetheless be time-consuming, because many colleges ask additional, school-specific questions. Applicants who are recruited so the college can deny their applications might also waste their (and their parents') time and money by visiting the campus. They might make costly decisions to forgo certain other applications to colleges that would actually have admitted them. At the end of the process, there is both the sting of rejection and the sense of betrayal. I imagine a great many applicants who were victims of recruit-to-deny efforts feeling more than a little miffed that a college that went out of its way to tell them how great they were then rejected them.
Below, I'll offer a simple suggestion for eliminating the incentive for recruit-to-deny. But first I want to make a couple of observations about how it fits with other admissions practices.
The Times article linked above focuses particular attention on the racial dimension of Harvard's recruit-to-deny strategy. It says that Harvard "intentionally [solicits] applications from a large portion of African-Americans in particular who effectively have no chance of getting in." Why would Harvard do that? I don't know, but I have a hypothesis.
Aware that its admissions policies are under legal attack, Harvard engages not only in recruit-to-deny for the reason other colleges do (i.e., to produce a better selectivity rating) but specifically recruits large numbers of African American applicants it intends to deny as a means of justifying its affirmative action policy: With a larger base of African American applicants, Harvard can claim that the odds of admission for an African American applicant are comparable to those of applicants of other racial groups, where odds are determined by considering the ratio of people of a racial group admitted to those of that group who apply.
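The hypothesized mechanism is, again, just ratio arithmetic. Here is a sketch in Python, with all numbers invented purely for illustration:

```python
def admit_rate(admitted, applied):
    return admitted / applied

# Invented numbers. Overall pool: 2,000 admits from 40,000 applicants.
overall = admit_rate(2_000, 40_000)   # 0.05

# Suppose 200 admits come from a group with 2,500 organic applicants:
# that group's admit rate would look conspicuously high.
organic = admit_rate(200, 2_500)      # 0.08

# Solicit 1,500 additional applications from the group -- applicants who
# will be denied -- and its rate is pulled down to match the overall rate.
padded = admit_rate(200, 4_000)       # 0.05

print(overall, organic, padded)
```

Padding the denominator changes nothing about who is admitted, but it lets the college present equal admission odds across groups.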
Do I know that's what Harvard is up to? No, it's just speculation. Harvard denies that it engages in recruit-to-deny at all and claims that it is simply looking for diamonds in the rough, especially among African American students. That's possible. I'm not going to try to nail down what's actually going on at Harvard, because I see recruit-to-deny as a broader and systemic problem.
Before coming to my proposed solution, let me pause over a related admissions policy, which I'll call "reject-to-preempt." In addition to rating colleges based on selectivity, the various ratings organizations also rate them based on yield, the fraction of students admitted who enroll. A high yield is ostensibly a marker of quality because it shows that students admitted want to attend. However, colleges can and do game their yield by rejecting applicants who are "too good."
I can explain with an example. Let's say that the median GPA of students admitted to College X is 3.6 and the median SAT score is 1200. Deploying reject-to-preempt, the college would reject an applicant with a 4.0 and a 1450, because its admissions personnel believe it very unlikely that the applicant, if admitted, would matriculate rather than attend a more selective college that would accept someone with those grades and scores. College X would be delighted to have the applicant enroll, but because that outcome is so unlikely, X denies admission to prevent the applicant's near-certain decision to go elsewhere from dragging down its yield.
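The yield arithmetic mirrors the selectivity arithmetic, only in reverse. A sketch with invented numbers:

```python
def yield_rate(enrolled, admitted):
    """Fraction of admitted students who enroll."""
    return enrolled / admitted

# Hypothetical College X: 1,000 admits, of whom 400 enroll.
base = yield_rate(400, 1_000)            # 0.40

# Admit 100 over-qualified applicants, of whom only 5 enroll:
# yield falls, so the ranking suffers.
with_stars = yield_rate(405, 1_100)      # ~0.368

# Reject those 100 preemptively instead, and yield holds at 0.40 --
# at the cost of the 5 students who actually would have come.
preempt = yield_rate(400, 1_000)         # 0.40

print(base, with_stars, preempt)
```

On these (made-up) numbers, College X buys a higher reported yield by turning away a handful of strong students who genuinely wanted to attend.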
Some number of preemptively rejected applicants might actually have attended X, either because they were unexpectedly rejected at more selective colleges or because X has some special features that appeal to them. When a college rejects these applicants, it harms the applicants and itself. The ego blow to students who are preemptively rejected may not be so great if they are admitted to more selective colleges they'd actually prefer attending, but even so, the experience could provoke anxiety and even raises the possibility of striking out entirely.
Why would preemptive rejection lead to an excellent student striking out? Consider that guidance counselors and others advise students to apply to three categories of colleges: (1) "safety" schools, where the applicant's numbers essentially assure admission; (2) "match" schools, where the applicant has a good but not certain chance of admission; and (3) "reach" schools, where admission is unlikely but not out of the question. Reject-to-preempt effectively eliminates the safety-school category, thus making the whole process less predictable for applicants. Because there is no guarantee at "match" schools, an excellent student could end up being rejected everywhere that student applies.
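The strike-out risk can be roughed out with a little probability. The 60% admit chance below is hypothetical, and the independence assumption is unrealistic (admissions decisions are correlated in practice, which makes the true risk higher than this sketch suggests):

```python
# Hypothetical: with safeties eliminated, an applicant applies to 6 "match"
# schools, each with an assumed 60% chance of admission, treated
# (unrealistically) as independent coin flips.
p_admit = 0.60
n_schools = 6

# Probability of being rejected everywhere: (1 - p) raised to n.
p_shut_out = (1 - p_admit) ** n_schools

print(f"{p_shut_out:.2%}")  # ~0.41%
```

Small but nonzero under generous assumptions; with correlated decisions or fewer applications, the odds of striking out grow quickly. A guaranteed safety drives them to zero.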
Now on to solutions. I've called recruit-to-deny and reject-to-preempt despicable policies, but they are logical responses to ratings and rankings based on selectivity and yield. The obvious way to make the despicable policies go away would be for the relevant ratings and ranking organizations to stop weighting selectivity and yield.
But wait. Wouldn't that throw away the baby with the bathwater? Aren't selectivity and yield valuable measures of a college's quality? The short answer is no. There's no baby here, just bathwater.
Suppose a high school senior is trying to decide whether to apply to or to attend (after getting admitted to) College Z. The senior wants to know all sorts of things. What is the student/faculty ratio? Is the college especially good in the student's expected major? What sort of study abroad programs are there? What are the job prospects and graduate school admissions experience of recent graduates? Is it a big school or a small one? Urban, rural, or suburban? Does one have to audition to be in the marching band?
In addition to all that and more, a prospective student will want to know the educational qualifications of likely classmates. Knowing the range of standardized test scores and high school grades of the other students who matriculate will provide useful data. Will I be among the most prepared? The least prepared? Average? Numerical data about the students who actually attend can be helpful for this evaluation. But the numbers of the students who don't attend College Z provide no information along this dimension, regardless of whether those non-attending students were rejected (and thus figured into selectivity) or chose not to attend (and thus figured into yield).
To be sure, a student who is trying to decide whether to apply to College Z might want to know her odds of admission. If selectivity were not game-able by tactics like recruit-to-deny, it would provide some useful guidance. But much, much more useful guidance comes from knowing the odds of admission for an applicant with a particular profile. If the ratings organizations wanted to provide actually useful selectivity data for this purpose, they would do two things: (1) they would not base rankings on selectivity at all; and (2) they would publish admissions odds broken down by applicant profile (e.g., ranges of grades and test scores). It might seem odd not to rank based on selectivity, but even if selectivity is not gamed, it's not useful as a measure of quality, only as a measure of likelihood of admission.
What about yield? Isn't it a cheap and dirty way to figure out revealed preferences? Again, no.
For one thing, students have all sorts of reasons to accept or not accept a college's offer of admission. Many colleges use non-need-based financial aid to recruit top students. That can boost yield without saying much of relevance. Suppose that a college has higher yield than it otherwise would because it gives generous financial aid to the top 10% of each admitted class. A student who is not offered such aid faces a very different choice from the one faced by those who chose to matriculate with the aid. Yield does not reveal such differences.
More broadly, yield provides information based on revealed preferences of people who know virtually nothing -- those deciding whether to attend but who haven't yet attended. Insofar as yield aims at providing a kind of market assessment of a college's value, ratings/rankings organizations would do much better to provide information that comes from students who already attend or have attended the colleges. Put bluntly, it's much more valuable to know whether students at a college got educations they valued and landed good jobs or graduate school admissions than to know what fraction of students who had the chance to go to that college chose not to.