Thursday, August 27, 2015

Higher Education: Both a Private and a Public Good


In the September 2015 issue of Harper’s Magazine, William Deresiewicz writes about “How college sold its soul to the market.”  He argues that American higher education has abandoned its traditional obligation (in the words of one institution’s founder in 1920) “to develop in its students the ability to think clearly and independently, and the ability to live confidently, courageously, and hopefully.”  Instead, he writes, American colleges and universities have become focused on several buzzwords:  leadership, service, and creativity—things, notes Deresiewicz, that have little to do with thinking or learning.  He ties this change to the rise of “neoliberalism,” and notes:  “The purpose of education in a neoliberal age is to produce producers,” adding that “only the commercial purpose survives as a recognized value.”  The notion that higher education “might prepare you for life by inciting contemplation and reflection,” he reports, “. . . is typically dismissed.”
            Historically, American higher education has been grounded in a social mission.  The early church-related private colleges—Harvard, Yale, etc.—emphasized the liberal arts in order to prepare leaders for their communities.  The 1828 report of the Yale College faculty defended the liberal arts curriculum against what Yale faculty saw as an external threat:
It is said that the public demand that the doors should be thrown open to all; that education ought to be modified, and varied, to adapt to the exigencies of the country, and the prospects of different individuals; that the instruction given to those who are destined to be merchants, manufacturers, or agriculturists should have special reference to their professional pursuits. 
            In response, they called for a prescribed curriculum organized around the “discipline and furniture of the mind.”  Notes the report:  “Our object is not to teach that which is peculiar to any one of the professions, but to lay the foundation which is common to them all.”
            A generation later, two factors began to change that view.  One was the rise of research as a core academic mission, which created increasingly specialized academic studies.  The second was the blossoming of the Industrial Revolution and with it new professions, new disciplines—the social sciences, for instance—and a vastly larger number of students who needed higher education in order to meet the demands of work in a vastly more complex economy.  Normal schools—which evolved into our systems of state colleges and universities—were created in the 19th century to train the multitude of teachers needed to educate the children of immigrants.  At the same time, Land Grant Universities were established to ensure education in “the practical and mechanical arts” and to translate research into practice.  In the 20th century, community colleges emerged to meet local workforce needs.  All of these innovations served to create professionals for both a personal and a social benefit.
            The result was a complex system of higher education institutions across the nation—more than 3,000 colleges and universities—offering a variety of curricula formed around a diversity of research interests, state and local economic and workforce needs, and social philosophies.   This diversity has been both a social and economic advantage over decades of rapid social, political, economic, and technological transformation.  It has given us multiple starting points for new ideas and multiple perspectives on how to achieve innovation and positive social change.  That diversity is as important today as it was during the height of the Industrial Revolution.  In fact, diversity may be more important today, given the rapidity of change in almost every aspect of our lives.
            The American Association of Colleges and Universities (AAC&U) is focusing on this issue in a couple of ways.  As reported recently in Inside Higher Ed, the association defines liberal education as an:
“approach to learning that empowers individuals and prepares them to deal with complexity, diversity and change. It provides students with broad knowledge of the wider world (e.g. science, culture and society) as well as in-depth study in a specific area of interest.”
            AAC&U’s longstanding Liberal Education and America’s Promise (LEAP) project is attempting to reposition the liberal arts within the curriculum.  As Inside Higher Ed reported, “The decade-old effort seeks to bridge what AAC&U sees as a false dichotomy between the intellectual and the practical in higher education, with a narrow, vocational training for some students on one side, and a more ethereal, high-minded liberal education for the lucky few at residential colleges.”  Part of that bridge is a focus on instructional process: the idea that students learn best via “deep, hands-on learning with collaborative assignments and major ‘signature’ projects.”
            Clearly, the issue here is not just a tug-of-war between liberal education and professional education.  Instead, the challenge is for higher education to prepare students not only for jobs but also to face the challenges of change and to take their place as members of a community—a reminder, I suppose, that many of the costs of an individual’s higher education are paid by taxpayers through direct state appropriations to colleges and universities and through state and federal scholarships and loans.  For much of my career in higher education, there has been a debate over whether higher education is a public good or a private good.  We need to acknowledge that a higher education—and our curriculum—should serve both goals.