The cat is slowly scratching its way out of the bag. More people are becoming aware of the colossal waste of money, tragic waste of young people’s time, and cruel imposition of stress and anxiety produced by our coercive educational system.
Children come into the world biologically designed to educate themselves. Their curiosity, playfulness, sociability, and willfulness were all shaped by natural selection to serve the function of education (here). So what do we do? At great expense (roughly $15,000 per child per year for public K-12), we send them to schools that deliberately shut off their educative instincts—that is, suppress their curiosity, playfulness, sociability, and willfulness—and then, at great expense and trouble, very inefficiently and ineffectively try to educate them through systems of reward and punishment that play on hubris, shame, and fear.
Research shows that for far less expense, and with joy rather than pain, we can facilitate, rather than suppress, children’s and teens’ natural ways of educating themselves with excellent results (see here and here). More families are becoming aware of this and are finding ways of removing their children from imposed schooling in favor of self-directed education (here).
Most of my previous writing about education has to do with the years that we unfortunately think of as “the K through 12 years” (as if education is or ever could be a graded thing in which learning is staged along an assembly line). I have written about how doing away with the whole graded system and letting young people do and learn whatever interests them at any given time, in age-mixed settings, works so well in schools such as Sudbury Valley and the many other settings that have been developed to facilitate self-directed education (e.g., here and here).
But what about those years of schooling that we call “higher education,” especially the four years toward a college degree? Many young people, because of family and societal pressure, see that as essentially compulsory, too. For them, college is just a continuation of high school—grades 13, 14, 15, and 16. And those years of schooling are even more expensive than the earlier ones, an expense that must generally be paid by the parents or through loans that can saddle a person with debt for decades. Moreover, there is growing evidence that very little is actually learned in those years. Fundamentally, college is a socially sanctioned system of discrimination.
Here’s how one college professor, Shamus Khan, who is critical of the endeavor he is part of, has put it: “I am part of a great credentialing mill. … Colleges admit already advantaged Americans. They don’t ask them to do much or learn much. At the end of four years, we give them a certificate. That certificate entitles them to higher earnings. Schools help obscure the aristocratic quality of American life. They do so by converting birthrights (which we all think are unfair) into credentials (which have the appearance of merit).”
Recent studies have documented the paucity of actual learning that occurs during the college years. Because of the way we structure it, education in college is the one commodity for which people try to get as little as possible for their money. This was true even when I was in college decades ago, and it is even truer today. Research shows that college students’ average study time has declined from about 25 hours per week in 1960 to about 12 hours now, and that students commonly avoid courses that call for original writing or considerable amounts of reading.
College administrators have long argued that the main benefit of college is a gain in critical thinking, but systematic studies show that such gains are actually quite small overall, and for approximately 45% of students they are non-existent. I have so far been unable to find any evidence that critical thinking improves over four years of college any more than it would have, in the same or similar people, if they had spent those four years doing something else. In a recent survey by PayScale Inc., 50% of employers complained that the college graduates they hire aren’t ready for the workplace, and the primary reason they gave was a lack of critical-thinking skills. The rote ways of learning that are endemic to high schools, and that involve little or no critical thinking, are increasingly the ways of college as well. My own observations suggest that critical thinking grows primarily through pursuing one’s own interests and engaging in serious, self-motivated dialogues with others who share those interests, not from standard classroom practices.
I don’t know just how or how fast the change will happen, but I think the days of K-12 and four years of college are numbered and sanity will begin to prevail in the educational world. I envision a future with something like the following three-phase approach to education:
1. Discovery: Learning about your world, your self, and how the two fit together.
The first 15 to 18 years of a person’s life are ideally, in this view, years of self-directed exploration and play in which young people make sense of the world around them, try out different ways of being in that world, develop and pursue passionate interests, and create at least a tentative plan for how they might support themselves as independent adults. This is what happens already with young people educating themselves in schools or learning centers designed for self-directed education, or in home-and-community-based self-directed education (commonly called “unschooling”). In my vision for the future, publicly supported learning-and-recreation centers will enable everyone, regardless of family income, to educate themselves well in these ways (here).
2. Exploring a career path.
One of the many problems with our current educational system is that even after 17 years of schooling, including college, students have very little understanding of potential careers. The only adult vocation they have witnessed directly is that of classroom teacher. A student may have decided, for some reason (maybe because it sounds prestigious), to be a doctor, or a lawyer, or a scientist, or a business executive, but the student knows little about what it means to be such a thing.
In the rational system of education that I have in mind, students would spend time working in real-world settings that give them an idea of what a career entails before they undertake specialized training for that career. For example, the person interested in becoming a doctor might work in a hospital for a period of time, maybe as an orderly or a medical assistant. Maybe it would be an official apprenticeship, with a bit of course work as part of it, or maybe just a regular job. By this means, the person would see and interact with doctors in their real-world practice and experience directly some of what it is like to be a doctor, which would enable him or her to make an informed decision about this as a career path. Do I like being in hospitals and around sick people? Do I have the kind of compassion and fortitude, as well as thinking skills, required to be a good doctor? If the answer is no, then it is time to try out a different career path.
The same is true for any other career. The person interested in law might work in a law office; the person interested in being a scientist might work as a lab assistant or field assistant; the person interested in becoming an engineer might work as an engineering apprentice. In this way, they would further their education and gain real-world experience while drawing at least some income rather than accumulating debt. In the process, the person would get to know, and be known by, professionals in the realm of his or her potential career, who could write recommendations that would help in applications for further training or advancement.
Already many companies, recognizing that a typical college education doesn’t prepare people well for their kind of work, have apprenticeship programs. According to the U.S. Labor Department, the number of apprenticeships available in the United States rose from about 350,000 in 2011 to about 450,000 in 2015 and is continuing to rise. As examples, BMW has an apprenticeship program in Spartanburg, SC, for training engineers (here), and at least one commercial insurance company offers apprenticeships in claims adjustment and underwriting (here)—jobs that formerly required a college degree.
3. Becoming credentialed for specialized work.
For some sorts of work, it is crucial to be sure that the people doing it know what they are doing. Those are the jobs for which specialized training, guided by experts and evaluated by rigorous testing, may be essential. Before I engage a surgeon, dentist, lawyer, electrician, or plumber I want to be sure that the person has been credentialed and licensed through means that include proof of competence. This is the only phase of the educational system where testing should be essential. Such credentialing might in some cases be part and parcel of an apprenticeship, or in other cases occur in schools for professional training, such as medical, engineering, or other vocational schools. So, the young woman who has explored a medical career by working as a medical assistant might, at some point, apply to medical school. For admission, she would have to present evidence that she knows what she is getting into and has prepared herself adequately to begin such training; and then, at the end, she would have to prove competence in whatever medical specialty she had chosen.
With this system, I think we will have far fewer unhappy doctors, lawyers, and business executives than we have now, and far more happy ones.
I’ve described all of this as a vision for the future, but it is a future that is already en route to becoming reality. As I said, ever more families are finding alternatives to standard K-12, and ever more businesses are finding that they would rather train employees themselves, through apprenticeships and other means, than rely on college degrees as evidence of competence. The numbers are still relatively small, but they are growing.
What will happen, in this vision, to the educational institutions we currently have in place? The graded K-12 schools will gradually disappear, replaced by age-mixed learning centers supporting self-directed education. Universities will continue on, with public support, as centers of research and scholarship. They will not enroll “students,” as we think of them today, but, like other institutions, will bring in assistants and apprentices, some of whom may move on, through experience and desire, to become full-fledged scientists and scholars. Community colleges, which already provide useful, often hands-on training for a variety of careers at relatively low cost, may expand and become part of a growing system of apprenticeships that involve some classroom training related to potential employment.
And now, what do you think about this? … This blog is, in part, a forum for discussion. Your questions, thoughts, stories, and opinions are treated respectfully by me and other readers, regardless of the degree to which we agree or disagree. Psychology Today no longer accepts comments on this site, but you can comment by going to my Facebook profile, where you will see a link to this post. If you don't see this post at the top of my timeline, just put the title of the post into the search option (click on the three-dot icon at the top of the timeline and then on the search icon that appears in the menu) and it will come up. By following me on Facebook you can comment on all of my posts and see others' comments. The discussion is often very interesting.
1. Erik Hayden. Study says college students don’t learn very much. The Atlantic, Jan. 18, 2011.
2. Richard Arum & Josipa Roksa. Academically Adrift: Limited Learning on College Campuses. University of Chicago Press. 2011.
3. Douglas Belkin. Exclusive test data: Many colleges fail to improve critical thinking skills. The Wall Street Journal, June 5, 2017.
4. David Paulson. Apprenticeships: College without debt. USA Today, March 23, 2016.