In defense of teaching yourself the material

Robert Talbert
6 min read · Mar 11, 2021


Having students teach themselves things is a feature, not a bug in higher education and we need to stop apologizing for it.

Photo by Philippe Bout on Unsplash

I believe in lifelong learning. Not in a hazy, college-mission-statement sort of way but as a concrete goal for college pedagogy, realized in our teaching and course design and actualized in the day-to-day work of the class. I’m interested in, and have been evangelizing for, pedagogical models that don’t just talk about lifelong learning but make it real, by giving students tangible tools and experiences to get them engaged with it now and not hoping that lifelong learning sort of “happens” at some point in the future. I really don’t care about any other kind of teaching. In fact, pedagogical models that don’t give students these tools and experiences, along with real support and guidance, are just performance art — cosplaying as educators. Such models are not serious about students and have no place in the post-pandemic world, except maybe in the movies.

What are those tools and experiences? I’ve been exploring that question for years, not only as a classroom instructor but as a parent and a human being in an ongoing state of development. I have studied people who seem to have taken up lifelong learning successfully, adapting to change and perhaps reinventing themselves long after formal schooling was over. There seem to be two important skills all these people have in common: the ability to self-regulate and the ability to self-teach.

I’ve written about self-regulation before (here, here, here), but I don’t believe I’ve ever addressed self-teaching directly until this post. It may be the first post on this, but it won’t be the last. We in higher education need to come to grips with the idea that college students can, and should, engage in intentional self-teaching while in college. Colleges that can make this happen, and do it well, will thrive; colleges that can’t or won’t will slip into irrelevance, and justifiably so.

There’s a lot to discuss here. For now, I am just going to define what I mean and then lay out some theses that will get developed further, later.

What do I mean by “self-teaching”? I mean exactly what it says, no more and no less: Self-teaching is the act or process of approaching a body of knowledge and using informational and intellectual tools at hand to internalize it, without complete dependency on another human being. This is also called autodidacticism. The Wikipedia article at that link says that

Generally, autodidacts are individuals who choose the subject they will study, their studying material, and the studying rhythm and time.

I would add that autodidacts also engage in self-regulation, monitoring their affect and motivation while learning; they have some sense of when they do or do not know “enough” relative to their goals for learning, and they know how to use the intellectual and informational tools at their disposal to learn more if they believe they need to.

As I mentioned in my recent talk on flipped learning in a post-pandemic setting, the notion that students should not self-teach college-level material — in fact that they cannot, and so it’s highly inappropriate to ask them to do so — is baked into the axioms of the higher education world we created. How many of us have ever seen the comment “We’re expected to teach ourselves the material” listed as a positive student perception? But I think that assumption is broken — not only objectively false, but fundamentally unworkable for the future of higher education as we move forward from the pandemic.

Here are five theses about this. I do not offer empirical support for these but rather just want to nail them to the door of higher education for now.

  1. Self-teaching is not only appropriate for higher education, it is the one true end goal of higher education. It’s called “higher” education for a reason — it’s education about education itself. Meta-education would be an acceptable synonym. If we run students through our organizations, and take their money, but do not equip them to survive as learners without us — if students can’t and don’t want to teach themselves new things — it doesn’t matter what their diploma says: We failed them.
  2. All human beings have an innate ability and desire to teach themselves. You can see this just by spending time around kids. I have three of them in my house, all teenagers. I have participated in their growth and learning for 17 years, and everything that is important to them is something that, ultimately, they taught themselves. This ability is what makes us human. The ongoing desire to learn is what keeps us human. Taking it away is therefore dehumanizing.
  3. Higher education institutions must orient themselves toward teaching students how to teach themselves, or risk becoming irrelevant. I’ll add to the above that self-teaching (and self-regulation) are also highly valuable job skills. During my time at Steelcase, I learned that what the company wanted was not so much a recent college graduate with straight A’s, but someone who could learn quickly and get up to speed without having to pull someone else off their job to teach them. So self-teaching is not only the ultimate goal of higher education and the main instantiation of lifelong learning, it’s also what gives graduates a competitive advantage on the job market and sets them up not to be stuck in a loop for their careers. I want my students to be able to say truthfully in an interview, when asked why they should be hired: I know how to learn things, and learn fast, without a lot of hand-holding. That is music to the employer’s ears. The word will get out about which colleges equip students well in this area. Similarly for those that don’t.
  4. That innate ability is raw and therefore needs guidance, structure, and practice. But it’s one thing to teach yourself how to ride a bike or play Call of Duty, and another thing to teach yourself what the definition of a derivative is, or how to find the zero divisors in a ring. We all know that college students know how to learn, but we also perceive that this ability is rough around the edges. That’s where we faculty come in. We do need to engage students in self-teaching because this is what college is for. But you don’t do that by simply stepping away from the course and letting them handle it all themselves. College is hard; students need support.
  5. Frameworks exist for doing all of this, and we need to move toward those frameworks and away from others. The first step toward support is the widespread adoption of teaching frameworks that support self-teaching. Flipped learning is one such framework. I evangelize for flipped learning because self-teaching and self-regulation are at the heart of the process — measured out appropriately so that students’ self-teaching is limited to the lower third of Bloom’s Taxonomy. It engages students in appropriate practice for self-teaching in a zero-sum way that does not add to their workload; it just swaps one kind of work for another, and it does so with a structure in place that helps them bootstrap into it. Other frameworks are also available. Frameworks that promote dependency on others, on the other hand — I’m looking at you, lecture — are things we should be actively moving away from, right now.

It’s time for colleges and universities to stop apologizing for having students teach themselves things. Instead, it’s time to lean into it, and do it right.

Originally published at http://rtalbert.org on March 11, 2021.

Written by Robert Talbert

Math professor, writer, wannabe data scientist. My views != views of my employer.