Why students quit band: what a structural model of 253 9th graders tells us

Based on Corenblum & Marshall (1998): The band played on: Predicting students' intentions to continue studying music.

A 9th grade student leaving the band classroom

In a rush?

We've broken down the insights into easy-to-digest Blini-bites at the end of this article. Jump straight to any question.

TL;DR
  • Socioeconomic level and teacher evaluations were the two strongest predictors of whether ninth-grade students intended to stay in band, both directly and through other variables like parental support and outside musical interests.
  • Students' own attitudes toward band had no meaningful relationship with their intentions to continue, which means liking band class is not the same as planning to stay in it.
  • The more positively a teacher rated a student's performance, the more that student believed their parents supported the program, and perceived parental support independently predicted intentions to continue.
  • Course grades did not predict retention at all once teacher evaluations were accounted for, suggesting that the holistic judgment teachers form about students matters more than a letter grade.
  • Dropout risk at the high school transition is not a single-cause problem: it is a web of social, economic, and evaluative signals that reinforce each other, which means single-intervention fixes are unlikely to work.

Every band director has watched a student disappear between June and September. The kid was fine in class. Decent player. Showed up consistently. And then one day in the fall, they're not on the roster. You find out later they didn't re-enroll. No warning, no conversation, just gone.

What if there were signals you could have read months earlier? Not gut feeling, but measurable, structural predictors of who was likely to stay and who was quietly deciding to leave?

That's exactly what Barry Corenblum and Eric Marshall set out to investigate in their 1998 paper in the Journal of Research in Music Education, "The band played on: Predicting students' intentions to continue studying music." Marshall was a working band teacher at the time, which gives the research a grounding that pure psychology papers sometimes lack. The two of them built and tested a structural equation model using data from 253 ninth-grade band students across seven schools in Winnipeg, Canada. Their goal was to map, with statistical rigor, which factors actually predict whether a student intends to take band again the following year, and which factors merely seem like they should matter.

Some of what they found confirms what experienced teachers already suspect. Some of it is genuinely surprising. And one finding in particular should change how directors think about their daily interactions with students.

Why ninth grade is the critical moment

Corenblum and Marshall chose ninth grade deliberately. They cite Timmerman (1977), who found that the largest single-year drop in band enrollment happens when students first enter high school. That transition is a natural decision point: new schedule, new social environment, new demands on time. Students who were drifting away in middle school often make their exit official at exactly this moment.

This matters because ninth grade is when the warning signs, if they exist, are most likely to convert into actual withdrawal. Understanding what drives that conversion is not just academically interesting. It is operationally useful for any director managing that grade level, which is often the most volatile in an instrumental program.

The student sample reflected real-world diversity in socioeconomic background, and the seven participating schools each had at least one full-time, trained specialist in secondary music education. Band class sizes ranged from 30 to 50 students, and most participants had been in a band program since seventh grade.

What the study actually measured

The researchers collected data on a wide range of variables: students' self-reported socioeconomic indicators (parental occupation, number of instruments owned, personal instrument ownership), current academic grades across multiple subjects, outside musical activities (choir, playing with friends, private lessons, and similar), and students' perceptions of the attitudes of their parents, their band teacher, and their school toward the band program. They also captured two attribution dimensions through factor analysis: one they called "Strategy" (students who believed effective learning approaches would improve their performance) and one called "Pessimism" (students who had largely given up on effort as a path to improvement).

Band teachers, meanwhile, completed their own evaluations of each student independently, rating performance on a 5-point scale, ranking students relative to peers, and estimating current and historical grades. These ratings were combined into a composite "teacher evaluations" variable.

The criterion variable was simple: did the student intend to take band next year? Yes or no.

Structural equation modeling (SEM) was used rather than standard regression, which allowed the researchers to examine both direct effects (does variable X predict intention directly?) and indirect effects (does X predict intention through its influence on Y, which then predicts intention?). This is important because it reveals the mechanism behind a relationship, not just the fact that a relationship exists.
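To make the direct-versus-indirect distinction concrete, here is a minimal path-tracing sketch. This is not the authors' actual model: only the .25 direct path is reported in the paper, and the two coefficients along the indirect route are illustrative placeholders. In a path model, an indirect effect is the product of the coefficients along the chain, and the total effect is the direct effect plus the sum of the indirect routes.

```python
# Path-tracing sketch: indirect effect = product of coefficients along the
# chain; total effect = direct effect + indirect effect.
# The .25 direct path is from the paper; the other two values are
# hypothetical, for illustration only.

direct_teacher_to_intent = 0.25   # teacher evaluations -> intentions (reported)
teacher_to_parent_support = 0.40  # hypothetical path coefficient
parent_support_to_intent = 0.30   # hypothetical path coefficient

indirect = teacher_to_parent_support * parent_support_to_intent
total = direct_teacher_to_intent + indirect

print(f"indirect effect: {indirect:.2f}")  # 0.40 * 0.30 = 0.12
print(f"total effect:    {total:.2f}")     # 0.25 + 0.12 = 0.37
```

The point of the decomposition is that a variable can matter twice: once on its own, and once by shaping an intermediate variable that itself predicts the outcome.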

The model fit the data well by standard SEM criteria: chi-square (1224) = 1269.04, p = .18; comparative fit index = .99; standardized root mean squared residual = .08. All path coefficients reported were significant at p < .05 or better.

The two variables that drove everything

Two exogenous variables predicted student intentions to continue: socioeconomic level and teacher evaluations. Both had direct effects on the criterion, and both also operated indirectly through chains of other variables.

Socioeconomic level predicted perceived parental support (Beta = .79) and outside musical interests (Beta = 1.22), both of which independently predicted intentions. Teacher evaluations predicted intentions directly (Beta = .25) and also predicted students' perceptions of their parents' attitudes toward band. Notably, grades showed no significant paths to the criterion despite being correlated with teacher evaluations (r = .28). The model accounted for 28% of the variance in intentions.

The direct effect of teacher evaluations on intentions is worth sitting with. It means that how a teacher judges a student's musical competency, separate from that student's official grade, is an independent predictor of whether that student will come back next year. That is not a small thing. It suggests that the evaluative relationship between teacher and student carries information about a student's likely trajectory that a gradebook cannot fully capture.

The indirect pathway is equally striking. Higher teacher evaluations predicted more positive student perceptions of parental support. The more a teacher rated a student well, the more that student believed their parents were behind the program. And perceived parental support, in turn, predicted intentions to continue.

Corenblum and Marshall offer a plausible interpretation: students who are performing well receive more encouragement in class, and that encouragement may signal to them that their parents' support is warranted. Or, more simply, students who feel successful in band have more reason to believe the adults around them approve of their participation. Either way, the teacher's evaluation is not just a private judgment. It propagates through the student's social perception in ways that affect the student's own future plans.

The chain from school climate to student intentions

Hypothesis 2 in the original model predicted that perceived school support would predict band-teacher attitudes, which would predict student attitudes, which would predict intentions. The data confirmed this chain.

Socioeconomic level strongly predicted perceived school support (Beta = 1.00). Students from higher socioeconomic backgrounds were significantly more likely to believe their school valued the band program. That perception of school support predicted student perceptions of band-teacher attitudes (Beta = .52). And teacher attitudes, as perceived by students, predicted student attitudes toward band (Beta = .55).

The social transmission chain

SES →(Beta = 1.00)→ School support →(Beta = .52)→ Teacher attitudes →(Beta = .55)→ Student attitudes

This is a social transmission of norms that moves from the institutional level down to the individual student. Students do not form their attitudes about band in isolation. They read the room, and they read whether the adults around them seem to believe the program matters. When that signal is positive, student attitudes follow. When it is absent or mixed, the ripple effects travel all the way down to enrollment decisions.

The implication for teachers is uncomfortable: individual effort within a program that lacks institutional support operates against a significant structural headwind. A director who believes in their program but teaches in a building where the administration treats band as a scheduling inconvenience is working uphill every day, and the students can tell.

Why good grades don't keep students in band

One of the more counterintuitive findings in Corenblum and Marshall's study is that course grades did not predict intentions to continue. At all. Not directly, not indirectly through any other variable.

This runs against a straightforward competency-motivation story where doing well academically in band translates into wanting to stay. And it is inconsistent with some prior work, including Harrison, Asmus, and Serpe (1994), which the authors cite.

Their explanation is methodological: grades and teacher evaluations covaried significantly (r = .28), and teacher evaluations were more strongly correlated with other variables in the model. When both compete in the same SEM, teacher evaluations absorb most of the predictive variance, leaving grades with little unique explanatory power.

Grades vs. teacher evaluations

A B+ in band and a teacher evaluation of "developing, shows potential" are not the same thing, even if both appear positive. The teacher's composite judgment integrates information about trajectory, engagement, and potential that a grade alone cannot.

What does this mean in practice? It means the holistic judgment a teacher forms about a student, informed by performance ratings, class rankings, and historical grades, carries more signal about that student's likely persistence than any single grade does. This also has an implication for teacher self-awareness. If your evaluation of a student is a predictor of whether they stay, and if that evaluation is not the same as their grade, then your judgment matters. The subtle signals you send to students about where you see their ability and potential are not neutral. They are consequential.

What "liking band" can and cannot predict

Here is the finding that probably diverges most sharply from intuition: student attitudes toward band did not meaningfully predict intentions to continue.

In the SEM, student attitudes toward the band program showed an unexpected negative path to intentions (Beta = -.27), which sounds alarming until the authors explain it. Post-hoc analysis showed that the bivariate correlation between attitude and intention was near zero (mean r = .09). The negative path coefficient in the model reflects a statistical suppression effect, not a genuine reversal. The honest reading is: student attitudes have no meaningful relationship with whether students intend to stay.

This is worth saying plainly, because it contradicts a lot of conventional wisdom. A student who reports enjoying band class, finding it interesting, and valuing music is not significantly more likely to re-enroll than a student who reports more mixed feelings, once you account for socioeconomic level, teacher evaluations, and perceived support.

Corenblum and Marshall's interpretation draws on a body of social psychology research showing that attitudes are frequently weak predictors of future behavior. What predicts behavior better are factors that initiate and maintain it over time: perceived support from significant others, felt competence, and environmental resources. A student can genuinely enjoy band and still decide not to re-enroll because their parents are indifferent, their teacher evaluations are mediocre, and they don't play outside of class. Liking something is not enough to sustain participation when the structural conditions don't support it.

For teachers, this means surveys about student attitudes, though useful for other purposes, are probably not reliable early warning systems for dropout risk. A student who says they love band on a survey in October may not be on your roster in September.

Socioeconomic level: the background variable that touches everything

Socioeconomic level is what researchers call a proxy variable. It doesn't directly cause anything; it represents a cluster of conditions that enable or constrain the things that do. As Corenblum and Marshall explain, higher socioeconomic backgrounds provide the financial means to own instruments, access lessons, and participate in extracurricular musical activities. They also tend to produce households where academic and artistic achievement is normalized and encouraged, where parents are more likely to attend performances and reinforce the value of music education.

In the model, socioeconomic level predicted both perceived parental support (Beta = .79) and outside musical interests (Beta = 1.22), and it predicted perceived school support (Beta = 1.00). All three of those variables, in turn, fed into the chain that ended at intentions to continue.

One particularly interesting finding: perceived parental support was negatively associated with outside musical interests (Beta = -.30). Students who believed their parents strongly supported band were less likely to engage in extracurricular musical activities. The authors speculate that students may have interpreted their parents' support for band as implying that band was sufficient, reducing the perceived need for additional musical involvement. Or students may have believed their parents saw outside musical activities as competing with academic priorities. Either interpretation points to how parental attitudes shape student behavior in ways that are not always straightforward.

What this means for teachers in lower-income schools is significant, and Corenblum and Marshall address it directly. Programs in economically depressed areas operate with less perceived institutional support, less parental musical involvement, and fewer outside musical resources for students. Teachers in those contexts cannot change the socioeconomic reality, but they can adapt. The authors suggest that music instruction sensitive to the musical traditions of students' racial and ethnic backgrounds may be more effective at reducing dropout than traditional band programming, because it builds on existing student strengths and interests rather than importing a framework that may feel culturally foreign.

That is not a criticism of traditional band programs. It is a recognition that what works in one context does not automatically transfer to another, and that directing energy toward student strengths is more likely to produce retention than doubling down on a model that doesn't fit the community.

What teachers can actually do with this

The findings from Corenblum and Marshall point toward a few concrete areas of practice, though they're worth framing honestly: the model accounts for only 28% of the variance in intentions. A lot is going on that the study didn't measure. Scheduling conflicts, social dynamics, competing extracurricular options, family circumstances that have nothing to do with music. No model of this kind captures the whole picture.

With that caveat on the table, here is what the data actually supports:

  • Teacher evaluations matter more than grades. If you evaluate students' musical performance in a composite way, track that data, and notice trends, you may be seeing something that a gradebook doesn't tell you. A student whose composite evaluation is trending downward over a semester is not the same as a student whose official grade is holding steady.

  • Perceived parental support is downstream of your evaluation of the student. When you communicate positively about a student's progress and potential, that student is more likely to believe their parents support the program. Conversely, when students feel like their teacher doesn't rate them highly, they may read that as a signal that even their parents think band isn't right for them. The way you talk to students about their progress has social ripple effects you may not be aware of.

  • School climate and institutional support are real variables. If your administration doesn't visibly support the band program, students from lower socioeconomic backgrounds especially will notice. Parent concerts, assembly performances, and public recognition of the program are not just nice-to-haves. They are inputs into the social perception chain that Corenblum and Marshall document.

  • Outside musical engagement correlates with retention. Students who play with friends, sing in choir, or pursue music beyond class are more likely to stay. This isn't something you can force, but it's something you can encourage. A casual ensemble before school, a recommendation to join a community youth band, a suggestion to a parent that private lessons would be beneficial: these recommendations are not just about skill development. They are, the data suggests, related to whether students see music as part of their identity outside the classroom.

  • Attitude surveys won't give you an early warning. If you're relying on students saying they enjoy band to identify who's at risk of leaving, the research suggests that's not a reliable signal. What matters is not how students feel about band but whether the structural conditions for continued participation are in place: positive teacher evaluation, perceived parental support, school-level endorsement, and outside musical activity.

The harder implication is this: if you want to know who is at risk, you need visibility into practice behavior, engagement patterns, and evaluative trajectory over time, not just periodic check-ins on attitude.
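One way to operationalize "evaluative trajectory over time" is to look at the slope of a student's composite evaluation scores across grading periods. The sketch below is hypothetical, not from the paper: the data shape (a list of period scores) and the flagging threshold are assumptions a director would tune to their own rating scale.

```python
# Sketch: flag students whose composite evaluation is trending downward.
# Data shape and threshold are hypothetical, not from the study.

def slope(scores):
    """Least-squares slope of scores against their 0..n-1 period index."""
    n = len(scores)
    mean_x = (n - 1) / 2
    mean_y = sum(scores) / n
    num = sum((x - mean_x) * (y - mean_y) for x, y in enumerate(scores))
    den = sum((x - mean_x) ** 2 for x in range(n))
    return num / den

def at_risk(evaluations, threshold=-0.2):
    """True if the per-period trend falls below the (arbitrary) threshold."""
    return slope(evaluations) < threshold

print(at_risk([4.0, 3.8, 3.5, 3.1]))  # steady decline -> True
print(at_risk([3.0, 3.2, 3.1, 3.3]))  # roughly flat   -> False
```

The grade-versus-evaluation distinction from the study shows up here: both example students could hold a steady B in the gradebook while only the first is sliding on the teacher's composite judgment.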

Source: Corenblum, B., & Marshall, E. (1998). The band played on: Predicting students' intentions to continue studying music. Journal of Research in Music Education, 46(1), 128-140.


How we think about this at Blini

We read Corenblum and Marshall early in our thinking about why band programs lose students, because the question they asked is the same one that motivated us: what can you actually see before a student leaves?

Their answer, at the structural level, is that dropout risk is distributed across a network of variables: teacher evaluations, perceived parental support, socioeconomic context, outside musical engagement. None of these are single, visible events. They are slow-moving conditions that accumulate over months. By the time a student is visibly disengaged in rehearsal, the decision has often already been made.

Where Blini fits

Blini doesn't measure socioeconomic level or parental attitudes. We're a practice tracker, not a sociological profiler. What we do track is home practice behavior: whether students are practicing at all, how consistently they're showing up to the instrument, how long their sessions run, and whether those patterns are shifting over time.

That's directly relevant to one of the paths Corenblum and Marshall document. Outside musical activity was a significant predictor of intentions to continue. Home practice is the most common form of musical activity outside class. A student who stops practicing at home is not just falling behind technically. They may be quietly withdrawing from musical identity itself, weeks or months before anyone at school notices.

The teacher dashboard in ensemBlini shows this at the class level and the individual level. You can see which students haven't logged a practice session in two weeks. You can see whether a previously consistent practitioner has gone quiet. You can't see why, but you can see that it happened, and that's the first step to having a conversation before it becomes a withdrawal.
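The "gone quiet" check is simple to state precisely. The sketch below is an illustration of the idea, not ensemBlini's implementation: the log format (student name mapped to a list of session dates) is an assumption made for the example.

```python
# Sketch: find students with no logged practice session in the last 14 days.
# The log format is hypothetical; the real data model may differ.
from datetime import date, timedelta

def quiet_students(logs, today, gap_days=14):
    """Return students whose most recent session is older than gap_days,
    or who have never logged a session at all."""
    cutoff = today - timedelta(days=gap_days)
    flagged = []
    for student, sessions in logs.items():
        if not sessions or max(sessions) < cutoff:
            flagged.append(student)
    return sorted(flagged)

logs = {
    "Avery": [date(2024, 3, 1), date(2024, 3, 20)],
    "Blake": [date(2024, 2, 1)],  # quiet for weeks
    "Casey": [],                  # never logged a session
}
print(quiet_students(logs, today=date(2024, 3, 25)))  # ['Blake', 'Casey']
```

Note that the check treats "never practiced" and "stopped practicing" the same way, which matches how a director would want to triage: both are conversations worth having.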

The screen problem

One of the structural arguments in Corenblum and Marshall's paper is that extracurricular musical activity has to overcome costs not associated with regular school activities. Getting together to play with friends, attending choir, taking private lessons: these all require time, transportation, and sometimes money. Adding app friction to home practice, asking students to open a phone, log in, record themselves, or interact with a screen before they can start, adds to those costs in a small but real way.

Blini removes that friction entirely. The device clips to the music stand. The student picks up their instrument and plays. Nothing else is required during the session. If that sounds like a minor convenience, consider it from the perspective of a 14-year-old who is already ambivalent about practicing. Every additional step between "sitting down" and "making music" is a reason to do something else instead.

What the teacher gets

When practice data flows into ensemBlini, you get class-level visibility without chasing down anyone. You can see participation rates week over week, spot the students whose session frequency is dropping, and have data-informed conversations with parents instead of vague impressions. If Corenblum and Marshall are right that perceived parental support is shaped partly by how well-rated a student feels in the program, then having a concrete practice log to show a parent ("your child practiced five times this week and averaged 22 minutes") is a way to support that perception with real evidence.
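The parent-facing summary in that example is just an aggregation over a week of sessions. A minimal sketch, assuming session durations are stored in minutes (the data shape is hypothetical):

```python
# Sketch: summarize a week of practice sessions for a parent conversation.
# Session durations (in minutes) are hypothetical example data.

def weekly_summary(durations_min):
    """Return (session count, average minutes per session)."""
    if not durations_min:
        return 0, 0.0
    return len(durations_min), sum(durations_min) / len(durations_min)

count, avg = weekly_summary([20, 25, 18, 22, 25])
print(f"practiced {count} times this week, averaging {avg:.0f} minutes")
```

Five sessions averaging 22 minutes is the kind of concrete, non-judgmental datum that turns a vague "is my kid keeping up?" conversation into a specific one.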

Blini-bites

Quick answers to the most common questions.

1. What are the strongest predictors of whether a high school student will continue in band?

According to Corenblum and Marshall (1998), writing in the Journal of Research in Music Education (Vol. 46, No. 1, pp. 128-140), the two strongest predictors were socioeconomic level and teacher evaluations of student performance. Both predicted students' intentions to continue in band directly, and both also predicted intentions indirectly through other variables like perceived parental support and outside musical interests. Course grades, surprisingly, did not independently predict intentions once teacher evaluations were accounted for.

2. Does a student liking band class predict whether they'll stay in the program?

Not meaningfully, according to Corenblum and Marshall (1998). When the structural model was fully specified, student attitudes toward band showed no significant positive relationship with intentions to continue. The bivariate correlation between attitude and intention was near zero (mean r = .09). Perceived parental support, teacher evaluations, and outside musical activity were all stronger predictors than how positively students reported feeling about band.

3. How does the teacher's evaluation of a student affect retention?

In Corenblum and Marshall's (1998) model, teacher evaluations predicted intentions to continue both directly (Beta = .25) and indirectly through students' perceptions of parental support. The more positively a teacher rated a student's musical performance, the more that student believed their parents supported the program, and perceived parental support independently predicted intentions to continue. This suggests that teacher evaluations carry social signals that students use to interpret the attitudes of others around them.

4. Why does the biggest dropout from band happen at the high school transition?

Corenblum and Marshall (1998) cite Timmerman (1977) in noting that the largest single-year decrease in band enrollment occurs when students first enter high school. The transition brings new scheduling demands, new social dynamics, and a natural decision point about which activities to continue. The ninth-grade year is when cumulative risk factors like low teacher evaluations, weak perceived parental support, and limited outside musical activity are most likely to tip a student toward withdrawal.

5. Do course grades predict whether students stay in band?

No, based on Corenblum and Marshall's (1998) findings. Despite being correlated with teacher evaluations (r = .28), course grades did not predict intentions to continue in band through any path in the structural model. The authors suggest that teacher evaluations, which integrate ratings, rankings, and historical performance data, carry more unique predictive variance than a course grade does on its own.

6. How does socioeconomic level affect band retention?

Corenblum and Marshall (1998) found that socioeconomic level was a strong predictor of retention through two main paths: it predicted perceived parental support (Beta = .79) and outside musical interests (Beta = 1.22), both of which independently predicted intentions to continue. Socioeconomic level also strongly predicted perceived school support (Beta = 1.00), which fed into a chain reaching student attitudes and eventually intentions. Higher socioeconomic backgrounds provide financial resources, time, and a cultural environment where musical participation is normalized.

7. Does playing music outside of class reduce dropout risk?

Yes, according to Corenblum and Marshall (1998). Outside musical interests, including activities like singing in choir, playing with friends, or taking private lessons, were a significant predictor of intentions to continue in band. These activities are also predicted by socioeconomic level, since they involve costs not associated with regular school activities. Students who engage musically beyond class hours appear to be developing a musical identity that supports continued participation.

8. How does parental support influence whether students stay in band?

Corenblum and Marshall (1998) found that perceived parental support predicted intentions to continue in band, and that this perception was shaped by both socioeconomic level and teacher evaluations of the student. Importantly, it is the student's perception of parental support, not necessarily the parents' actual attitudes, that appears to matter. Research cited by the authors suggests that children's beliefs about parental attitudes can diverge significantly from those parents' actual views.

9. What can band directors do to reduce dropout risk based on this research?

Corenblum and Marshall (1998) suggest several practical directions. Communicating positive evaluations of student progress can shape students' perceptions that adults around them support the program, which independently predicts intentions to stay. Encouraging outside musical involvement, including informal music-making with peers, may also reduce risk. In lower-income schools, adapting programming to reflect students' existing musical interests and cultural backgrounds may be more effective than maintaining traditional band formats that feel disconnected from students' lives.

10. Is it worth surveying students about their attitudes toward band as an early warning for dropout?

Based on Corenblum and Marshall (1998), attitude surveys are unlikely to serve as reliable early warning signals. Student attitudes toward band had no meaningful relationship with intentions to continue once other predictors were accounted for. Factors like perceived parental support, teacher evaluations, and outside musical activity were far stronger predictors of who intended to stay. If the goal is early detection of dropout risk, behavioral indicators (like whether a student is engaging musically outside class) are likely more informative than attitude reports.

11. What percentage of dropout variance can be explained by known factors?

Corenblum and Marshall's (1998) structural model accounted for approximately 28% of the variance in students' intentions to continue in band. The authors acknowledge that factors not measured, such as competing extracurricular demands, scheduling conflicts, and non-musical priorities, likely account for meaningful additional variance. This finding is a reminder that even a well-specified model captures only part of the story, and that individual circumstances often matter in ways that aggregate data cannot fully predict.

12. How does perceived school support affect student retention in band?

In Corenblum and Marshall's (1998) model, perceived school support (students' beliefs about whether their school values the band program) predicted student perceptions of band teacher attitudes (Beta = .52), which in turn predicted student attitudes toward band. Perceived school support was itself strongly predicted by socioeconomic level (Beta = 1.00), meaning students from lower-income backgrounds were less likely to perceive institutional support for music. This chain of social perception links the school environment to individual student decisions about whether to continue.


See which students are quietly pulling back before they quit

ensemBlini shows you practice patterns at the class level the moment data starts flowing in.

Try the demo