What Soviet-Era Production Can Teach Us About Innovation In Higher Ed
Last month’s celebration of the 30th anniversary of the fall of the Berlin Wall had me thinking back to my graduate studies of Soviet-type economies. A consistent theme: When bureaucrats control production, perverse incentives take hold and innovation comes to a standstill.
Soviet central planning authorities set performance measures for each industry based on some key metric such as quantity or tonnage of product produced. Because bonuses were tied to these metrics, plant managers had a perverse incentive to meet the quantitative target but nothing else. Little attention was paid to quality. Innovation and product differentiation took a back seat to meeting the target.
Soviet propaganda poster by Nicolai Kotcherguin: “On the ruins of capitalism, let us walk towards …”
The phenomenon was well-recognized enough to be lampooned in popular culture. Americans who came of age in the 1980s remember the Wendy’s ad depicting a Soviet fashion show. In a heavy Russian accent, the announcer called out, “Day-vare…Evening-vare… Svim-vare” as the model lumbered up and down the runway. The joke? She wore the same gray sack-like garment, no matter the occasion. Economist Paul Craig Roberts recalled, “a famous Soviet cartoon depicted the manager of a nail factory being given the Order of Lenin for exceeding his tonnage. Two giant cranes were pictured holding up one giant nail.”
Though apocryphal, the Soviet fashion show and the big-nail story pointed to a widely felt frustration within Soviet economies. The singular focus on a metric imposed from the top down obliterated any incentive to innovate or to differentiate production according to what customers might actually want.
One can hear similar frustrations in Patricia McGuire’s recent commentary. McGuire, president of Trinity Washington University, asks, “Has there ever been an enterprise that produced so much data to so little effect as higher education? We are drowning in data, awash in analytics… Here’s a heretical thought: Perhaps the problem is not a lack of data, but rather, that metrics alone are a poor measure of accountability.”
McGuire is identifying a phenomenon that Sovietologists will recognize. While measurement is necessary, in higher ed we seem to fetishize specific metrics at the cost of what’s meaningful. She notes, for example, that low six-year graduation rates are reported as if the institution is failing its students. But in the case of non-elite schools, many students intend to stay only for a year or two, as a stepping stone toward a more specialized or more selective institution. In other words, the institution that helps its students achieve their educational goals is doing its job, but its retention rate may not tell that story.
As McGuire notes, “statistics are no substitute for professional judgment about the meaning of data for a specific institution. Unfortunately, magazine rankings and the federal College Scorecard choose to present isolated data points as institutional quality measures without interpretation.”
To be fair, it’s easy to see why metrics loom so large in higher ed. College enrollment is on an eight-year decline. Administrators are struggling to justify the rising price of a four-year education. Prospective students and their families rarely have a clue about how to assess the quality of the higher ed choices before them. The tuition-paying, tax-paying public is, understandably, eager for clarity and accountability.
And there are forces that attempt to push in this direction, from federal compliance standards, to regional accreditation, to state commissions of higher education. But does the measurement make a difference? Does it actually improve the quality of the education students receive or inform the college search process?
Erik Gilbert, professor of history and associate dean at Arkansas State University, is skeptical. As an academic administrator, Gilbert is deeply involved in the assessment of student learning. So when it came time to help his own son select a college, you might think he would read up on the institutions’ assessments of learning outcomes. What did it say, he wondered, that he didn’t? “It says that I, like virtually everyone else, don’t think that good assessment makes good universities and well-educated students or that bad assessment makes bad universities and poorly educated students. In fact, I am starting to wonder if assessment may actually do more harm than good.”
The main harm that standardization inflicts may be the disincentive it creates to innovate. While regional accreditors profess that they have no interest in overriding the distinct mission and culture of individual colleges and universities, the reality is that the institution’s safest course toward maintaining accreditation—and the flow of federal and state funds that tie to it—is to “run with the herd” and mimic the programs, personnel, and capital investments of other institutions. Innovate too much—cut areas that are weak to feed areas of strength, for example—and you give up the safety of the herd.
This will sound unfair to anyone who does the hard work of building a truly distinctive program—the signature interdisciplinary initiative, say, that actually is transformational for the students who take part. But evidence of that transformation will not get captured in standardized assessment metrics imposed from the top down. Worse, the investments required to stay with the herd mean fewer dollars available to scale those innovative high-impact programs. In fact, standardized assessment virtually guarantees that such programs will remain small and peripheral to the core business of the institution, feeding the sameness we see across the industry.
The motivation to impose standardized measures from the top down is understandable. But as with Soviet-era production, such metrics are, inevitably, more “accountability theater” than a sincere effort to measure what’s meaningful.