Demonstrated interest is one of the few admissions variables you can actually control. Most families don’t know which schools are watching.
Consider the awkward position a Tulane admissions reader sits in every January. Tulane admits about 14 percent of applicants, a rate that puts it among the most selective schools in the country. But Tulane’s yield (the share of admitted students who actually enroll) trails peers like Cornell and Notre Dame by a wide margin. So the reader is doing two jobs at once. They have to figure out who is admissible. They also have to figure out who will actually show up. Demonstrated interest is the tool that helps them answer the second question, and it influences who gets through the door on the first.
Of the 312 selective colleges in our dataset, 175 (about 56 percent) report that they factor demonstrated interest into their admissions decisions. Thirty-nine schools go further, rating it Important or Very Important. At those schools, whether you visited campus, opened the admissions office’s emails, attended a virtual information session, or logged into your applicant portal may have directly influenced whether you were admitted.
Demonstrated interest, or DI, is the admissions world’s most paradoxical variable. It is widely used and poorly understood. Most families have heard the term, but far fewer know which specific schools track it, how seriously they weight it, or what behaviors actually count. Unlike GPA or test scores, DI is not reported on any standardized form. Unlike extracurriculars, it does not appear in the application itself. It lives in the background, in CRM databases and tracking pixels and event sign-in sheets, and it influences outcomes in ways that are deliberately opaque.
This article maps the landscape. Drawing on self-reported data from the Common Data Set, we identify which schools consider DI, how the pattern varies by selectivity and institutional type, and what it means practically for students building their college lists.
The Landscape: Who Cares, and How Much
Every year, colleges complete the Common Data Set, a standardized self-reporting instrument that includes a question about how various factors are weighted in admissions. For demonstrated interest, schools choose from four levels: Very Important, Important, Considered, and Not Considered.
Among our 312 selective schools, the distribution breaks down this way: 11 schools rate DI as Very Important, 28 rate it as Important, 136 rate it as Considered, and 129 rate it as Not Considered (one school did not report). The 175 schools in the first three categories represent a majority of the selective landscape, and they span every tier of selectivity, from Bates and Tulane (both Most Selective) to Appalachian State and Gonzaga (Less Selective).
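For readers who want to sanity-check these tallies against a CDS compilation of their own, the arithmetic is simple. The sketch below is illustrative only: the file name and the di_rating column are assumptions about how such a dataset might be laid out, not the actual schema behind this analysis.

    import csv
    from collections import Counter

    # Minimal sketch: tally the CDS C7 ratings from a CSV export of the dataset.
    # The file name and the "di_rating" column are hypothetical, not the real schema.
    with open("selective_colleges_cds.csv", newline="") as f:
        ratings = [row["di_rating"] for row in csv.DictReader(f)]

    counts = Counter(ratings)
    for level in ("Very Important", "Important", "Considered", "Not Considered"):
        print(f"{level}: {counts.get(level, 0)}")

    considers = sum(counts[level] for level in ("Very Important", "Important", "Considered"))
    total = len(ratings)
    print(f"Considers DI in some form: {considers} of {total} ({considers / total:.0%})")

Run against the dataset described in the methodology below, the last line would report 175 of 312 schools, or about 56 percent.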
The schools that rate DI as Very Important are an eclectic group: the U.S. Naval Academy, the Air Force Academy, Olin College of Engineering, Dickinson College, American University, Spelman College, Morehouse College, Wabash College, Mercer University, Quinnipiac University, and Georgia College. What they share is a strong institutional interest in enrolling students who have made a deliberate, informed choice to attend. Several of these schools also have relatively low yield rates, so DI serves a practical function: it helps predict which admitted students will actually show up.
The Important tier includes some surprising names. Tulane (14 percent admit rate) and Bates (13 percent) are among the most selective schools in the country, and both explicitly rate DI as Important. Lehigh, Kenyon, Pitzer, and Trinity College, all Highly Selective, do the same. At these schools, DI is not a tiebreaker for borderline cases. It is a named factor in the evaluation process, weighted alongside grades, test scores, and extracurriculars.
The much larger Considered category (136 schools) includes a long list of elite institutions: Duke, Dartmouth, Northwestern, Bowdoin, Colby, Rice, Haverford, Middlebury, Tufts, and USC, among many others. “Considered” is deliberately vague. It could mean anything from a formal scoring rubric to an informal glance at the CRM log. The fact that these schools report it at all signals that the information is available to readers and can influence decisions at the margin.
The Selectivity Split
The most important pattern in the data is the selectivity split. At the Most Selective tier (the Ivies, Stanford, MIT, Caltech, and their peers), 56 percent of schools report that they do not consider DI. Only two Most Selective schools (Tulane and Bates) rate it as Important, and none rate it as Very Important. The remaining 40 percent or so list it as merely Considered.
This pattern makes intuitive sense. The most selective schools receive so many qualified applications that they have no difficulty filling their classes. Harvard, Yale, Princeton, Stanford, Columbia, and Brown all report DI as Not Considered. They simply don’t need the signal. Their yield rates are high enough, and their brand power strong enough, that predicting which admitted students will enroll is not a significant challenge.
The picture shifts as you move down the selectivity spectrum. At the Highly Selective level, 64 percent of schools consider DI in some form. At the Selective level, 56 percent. At the Moderately Selective level, 69 percent. These are schools that face real yield management challenges. They admit thousands of students, many of whom are also admitted to peer institutions, and they need tools to predict who will actually matriculate. DI is one of those tools.
The practical implication is stark. A student applying to both Brown (DI: Not Considered) and Lehigh (DI: Important) should understand that these schools are working from very different information. At Brown, your application file is your application file. At Lehigh, your digital footprint (every email opened, every campus visit logged, every webinar attended) may be sitting alongside your transcript in the reader’s queue.
Private, Small, and Yield-Conscious
Beyond selectivity, two other variables strongly predict whether a school tracks DI: institutional control and size.
Among private institutions, 70 percent consider DI in some form. Among public institutions, only 35 percent do. The disparity reflects the different enrollment challenges each type faces. Private schools, which lack the built-in applicant pipeline of in-state residents, have to work harder to attract and retain students. Tracking DI helps them identify applicants who are actually interested, rather than students padding their lists through the Common App.
The size pattern is even more pronounced. Among small schools (1,000 to 3,000 students), 81 percent consider DI. Among very large schools (over 10,000), only 24 percent do. This is partly mechanical. A school with 1,500 undergraduates and 400 freshmen can afford to track individual engagement behaviors and incorporate them into file review. A school with 40,000 students and 8,000 freshmen cannot. The logistics of tracking and weighting DI at that scale are prohibitive.
The result is that DI functions as a sorting mechanism that disproportionately affects students applying to private, mid-size colleges, exactly the institutions that dominate the heart of the selective landscape. Schools like Lehigh, Kenyon, Dickinson, Bates, Trinity, Tulane, and Syracuse are the places where DI is most likely to make a tangible difference in your admissions outcome.
What Schools Actually Track
The Common Data Set asks schools whether they consider DI. It does not ask how they track it. Through conversations with admissions professionals, published guidance, and industry reporting, a fairly clear picture has emerged of the behaviors that constitute demonstrated interest at schools that track it.
Campus visits
Campus visits are the gold standard. Signing in at the admissions office, attending an official tour or information session, or meeting with an admissions officer on campus creates a clear, timestamped record of engagement. At schools that rate DI as Important or Very Important, a campus visit is widely considered the strongest possible signal. Virtual visits and virtual information sessions have become more accepted alternatives since COVID, though in-person visits still carry the most weight at schools that distinguish between the two.
Email engagement
Email engagement is tracked more widely than most families realize. Many admissions offices use CRM systems (Slate is the most common in higher ed admissions) that log when prospective students open emails, click links, and download materials. A student who consistently opens and engages with communications signals ongoing interest. A student who ignores every email signals the opposite. This is one of the easiest DI behaviors to maintain and one of the most commonly overlooked.
Interviews
Interviews, where offered, serve a dual purpose. They allow the school to evaluate the student, and they allow the student to demonstrate interest. Among the 39 schools that rate DI as Important or Very Important, 21 offer admissions interviews. At schools like Dickinson, Lehigh, Kenyon, Trinity College, and the University of Rochester (all of which weight DI as Important or Very Important), requesting an interview is a tangible way to signal commitment.
Portal activity
Portal activity has become a meaningful signal. Once a student creates an account on a school’s applicant portal or inquiry form, the admissions office can track how often they log in, whether they complete optional profile fields, and how they engage with content. Some schools also track whether a student has attended regional admissions events, college fairs, or school-specific webinars.
The “Why Us?” supplemental essay
Supplemental essays that ask “Why us?” are themselves a form of DI tracking. A well-researched, specific essay that references particular programs, faculty, or campus features signals real engagement. A generic essay that could apply to any school signals the opposite. At DI-sensitive schools, the “Why us?” essay is doing double duty. It is both a writing sample and a DI indicator.
Why “Not Considered” Doesn’t Always Mean Not Noticed
Up to this point, this article has drawn a clear line between schools that officially track DI and those that do not. That line is useful. It is also, in important ways, misleading.
The Common Data Set is a self-reporting instrument, and the question about demonstrated interest asks schools to characterize their formal policy. A school that checks “Not Considered” is telling you that DI is not a named criterion in its evaluation rubric. It is not telling you that admissions officers are unaware of whether you visited campus, that your regional representative does not remember meeting you at a college fair, or that your interviewer’s notes will not mention your obvious enthusiasm for the school’s marine biology program.
Anyone who has spent time inside an admissions committee knows that the gap between official policy and actual practice can be substantial. Admissions is a human process conducted by people who read thousands of files and make judgment calls under uncertainty. When a reader is deciding between two otherwise comparable applicants (similar grades, similar test scores, similar extracurricular profiles), and one of them visited campus, attended an information session, and wrote a “Why us?” essay that referenced a specific conversation with a professor, while the other applied cold with a generic supplement, the reader’s sense of who is more likely to enroll can tilt the outcome. That tilt does not require DI to be a formal scoring criterion. It just requires a human making a judgment.
This is particularly relevant at the most selective schools, where the official party line is that DI does not matter. Harvard, Princeton, Stanford, and their peers have good reasons to say this. They receive far more qualified applicants than they can admit, and acknowledging a DI preference would create pressure to visit campus that would disadvantage lower-income and geographically remote students. The policy is well-intentioned and, as a formal matter, likely accurate. But it would be naive to believe that an applicant’s engagement with the school (or lack thereof) never surfaces in committee discussions, in interviewer assessments, or in the subtle calibrations that readers make when a file is on the bubble.
This isn’t a conspiracy theory. It’s a recognition of how human decision-making works in any setting where subjective judgment is involved: a job interview, a grant review, a tenure decision. In all of these contexts, factors that are not on the official rubric can influence the outcome. College admissions is no different.
The practical takeaway is simple. Demonstrate interest everywhere you apply, regardless of what the CDS says. At schools that officially track DI, the benefit is direct and measurable. At schools that officially don’t, the benefit may be indirect: a campus visit that enriches your supplemental essay, an interview that produces a memorable write-up, a connection with a regional representative who later advocates for your file in committee. You can’t know in advance which of these touchpoints will matter, but the downside of engaging is essentially zero and the potential upside is real.
Like many things in college admissions, you can’t always take an institution at its word. The formal policy is one thing. The lived reality of how 40 or 50 admissions officers read files and make decisions is something else. Students who understand this distinction have an advantage that no checkbox on the Common Data Set can capture.
What This Means for Families
Check every school’s DI status before you build your strategy
The Common Data Set for most schools is publicly available online (search “[School Name] Common Data Set”). Look at Section C7, which lists the importance of various admissions factors including “level of applicant’s interest.” If a school lists DI as Important or Very Important, everything you do (or fail to do) from your first inquiry forward is potentially being tracked. If it’s listed as Not Considered, your formal DI score may not exist, but informal impressions still matter.
Start engagement early and keep it sustained
At DI-sensitive schools, don’t wait until application season to show interest. Sign up for the mailing list during junior year. Open the emails. Attend a virtual event. Visit campus if you can. Request an interview if offered. Each of these touchpoints adds to your profile in the admissions CRM. The students who engage consistently over 12 to 18 months look very different from those who appear for the first time when they submit their application in January.
Visit campus when you can, even at schools that say DI doesn’t matter
If a school rates DI as Important or Very Important and offers admissions interviews (Lehigh, Dickinson, Kenyon, Trinity, the University of Rochester), a campus visit paired with an interview is the single highest-value DI action you can take. Even at schools that officially ignore DI, a campus visit gives you material for a stronger “Why us?” essay, a relationship with a regional admissions officer, and a more informed sense of whether the school is right for you. The visit helps your application even when no one is formally scoring it.
Don’t confuse “official policy” with “no benefit”
Harvard, Yale, Princeton, Stanford, MIT, Caltech, Columbia, Brown, UPenn, Cornell, Johns Hopkins, Vanderbilt, Emory, and Georgetown all report DI as Not Considered. That means your engagement won’t appear as a scored line item on your evaluation. It doesn’t mean your regional representative won’t remember you from a college fair, that your interviewer’s enthusiasm won’t color their write-up, or that a committee member won’t note that you referenced a campus visit in your essay. Engage with these schools on their own terms, not to game a DI score, but to build the kind of familiarity that produces stronger applications and subtler advantages.
The “Why Us?” essay is your last, best DI signal at every school
At every school that considers DI, the supplemental essay asking why you want to attend is effectively a DI test. Even at schools that don’t formally track DI, the “Why us?” essay is your opportunity to prove you’ve done your homework. Be specific. Reference programs, courses, research opportunities, campus organizations, or conversations with current students. A reader can tell the difference between an applicant who has engaged seriously with the school and one who swapped in a name. That distinction influences outcomes whether or not “demonstrated interest” appears on the rubric.
Understand the equity dimension
DI tracking structurally advantages students with resources: those who can afford campus visits, those whose parents know to track admissions emails, those who have counselors reminding them to log into portals. First-generation students, rural students, and students from under-resourced schools are less likely to engage in the behaviors that DI systems measure, not because they are less interested, but because they are less coached. If this describes you, focus on the free and accessible DI signals: opening emails, attending virtual events, and writing a specific, well-researched supplemental essay.
What the Checkbox Doesn’t Tell You
Demonstrated interest occupies an unusual position in the admissions landscape. It’s less visible than test scores, less discussed than extracurriculars, and less understood than financial aid. At 175 selective colleges, it is a named factor in the evaluation process, and at 39 of them, it is weighted as Important or Very Important. At the remaining 129 schools that officially do not consider it, the reality is more nuanced than the policy suggests.
There’s an irony at the heart of DI: the schools most likely to track it are the ones least likely to explain how they track it. And the schools that say they don’t track it are not necessarily immune to the impressions that engagement creates. Admissions offices have strong incentives to monitor engagement (it helps predict yield) and weak incentives to disclose the specifics (transparency would invite gaming). The result is an information asymmetry that rewards high-information families and penalizes everyone else.
This article can’t eliminate that asymmetry, but it can narrow it. You now know which schools are officially watching, and why you should act as if all of them are.
Building a List That Accounts for DI
If you’re putting together a college list right now, the DI status of each school is one more variable to factor in alongside selectivity, fit, and affordability. The schools where DI is weighted heavily are often the same ones where careful list-building and early engagement can shift outcomes. We work with families on building lists that match the student’s profile to the schools where their odds (and effort) are best invested. If the patterns above are pointing you toward a different list than you started with, that’s a useful signal.
Methodology: This analysis draws on the Common Data Set (CDS) reports from 312 selective colleges and universities, compiled into a master dataset that includes institutional characteristics, admissions data, and self-reported factor weightings. The demonstrated interest variable comes from CDS Section C7, where schools indicate the importance of “level of applicant’s interest” using a standardized four-point scale: Very Important, Important, Considered, or Not Considered.
Selectivity tiers (Most Selective, Highly Selective, Selective, Moderately Selective, Less Selective) are from the master dataset and are based on a composite of admission rate, test scores, and yield. Institutional control (Public, Private) and size categories (Very Small through Very Large) are also from the master dataset.
Several limitations apply. First, CDS reporting is voluntary and self-reported; schools may interpret the categories differently. A school that reports DI as “Considered” may weight it very lightly or quite heavily depending on context. Second, the CDS does not disclose how DI is tracked or operationalized, so our discussion of tracking methods draws on publicly available admissions industry sources rather than institutional disclosures. Third, a school’s DI policy can change year to year; the data here reflects the most recent available CDS filing for each institution. Finally, one school did not report its DI status and is excluded from percentage calculations.
Data sources: Common Data Set reports compiled into a master dataset of 312 selective colleges. Admissions factor weightings from CDS Section C7. Interview availability from CDS Section C3. All percentage calculations by the authors.
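The selectivity, control, and size splits reported earlier come from exactly this kind of grouping. As a hedged illustration (not the authors’ actual code), here is how those shares could be reproduced in Python with pandas, assuming hypothetical column names rather than the dataset’s real field names:

    import pandas as pd

    # Minimal sketch of the cross-tabulations described above. Column names
    # ("di_rating", "tier", "control", "size_category") are hypothetical stand-ins
    # for whatever the master dataset actually calls these fields.
    df = pd.read_csv("selective_colleges_cds.csv")
    df = df.dropna(subset=["di_rating"])  # drop the one non-reporting school

    considers = {"Very Important", "Important", "Considered"}
    df["considers_di"] = df["di_rating"].isin(considers)

    # Share of schools that consider DI in some form, by selectivity tier,
    # institutional control, and size category.
    for col in ("tier", "control", "size_category"):
        share = df.groupby(col)["considers_di"].mean().mul(100).round(1)
        print(f"\nPercent considering DI, by {col}:")
        print(share.to_string())

The boolean column makes the percentages a simple group mean, which is the same calculation behind figures like “70 percent of private institutions” and “24 percent of very large schools.”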