There has arisen a kind of parallel network – a lot of it is on the Internet, a lot of it is free – where people teach themselves things, often very effectively. But there is a kind of elitist bias: people who are good at using this content are people who are already self-motivated.
The better technology gets, the more human imperfections matter. Think about medicine: the better pharmaceuticals get, the more it matters which people neglect to actually take them in the right doses. Education is entering the same kind of world. There’s so much out there, on the Internet and elsewhere. It’s great; but that means that human imperfections, like just not giving a damn, will matter more and more.
What concrete changes would I make in schools? The idea that you need to take a whole class to learn some topic is absurd. Whatever you’ve learned is probably going to be obsolete. A class is there to spur your interest, to expose you to a new role model, a new professor, a new set of students. We should have way more classes that are way shorter. It should be much more about learning, more about variety. Give up the myth that you’re teaching people how to master some topic; you’re not! You want to inspire them; it’s much more about persuasion, soft skills.
Related, competency-based education:
For the most part, colleges and universities have changed very little since the University of Bologna gave the first college lectures in 1088. With the exception of Massive Open Online Courses, or MOOCs—free lectures and courses on the Internet—most university learning still requires students to put their butts in seats for a certain number of hours, complete a list of courses, and pass tests demonstrating that they learned from those courses (or were able to cram successfully over the course of a few days).
But a new model is upending the traditional college experience, and has the potential to change the way universities—both new and old—think about learning.
Called competency-based education, this new model looks at what students should know when they complete a certain degree, and allows them to acquire that knowledge by independently making their way through lessons. It also allows students who come into school with knowledge in a certain area to pass tests to prove it, rather than forcing them to take classes and pay for credits on information they already know.
A model that focuses horizontally on the accreditation function of schools, rather than competing with the full vertical stack offered by a university. Seems like a model that could be useful in companies, or to companies, as well. Today, for many job functions, say product management, a college resume only obliquely hints at competencies; it functions more as a signal of one's generalized learning ability and willpower.
Software engineering interviews have a version of competency tests in the form of coding questions or challenges, but many business competencies aren't tested well in a live interview. The ideal interview spans a longer period of time and goes into more depth; in its fullest form it may simply be an internship. But not all candidates are willing or able to do an internship, especially those who aren't in college or just graduating.
Face-to-face interviews are good for testing chemistry (which makes them a reasonable method for job roles where that's a key attribute, like sales), but they are susceptible to all sorts of unconscious biases and often just flatter interviewers into believing in their powers of observation. It would be interesting to compare face-to-face interviews to a competency-based interviewing method that eschews in-person exchanges altogether. As radical as that might sound, I'm confident it would reduce many types of discrimination.