I’ve run into a number of educators recently who were critiquing the trendiness of modern education. “There is always something new,” they explain, “but these things never last.” In fact, I’ve heard that dozens of times over the years when it comes to online learning. It was even a question asked at my thesis defense for my master’s on online learning in the 1990s: Isn’t this just yet another passing educational fad? Almost twenty years later, I can say with confidence that it is not a passing trend.

Nonetheless, many seem even quicker to judge something new in education as a passing or fading trend. The verdict often seems to depend upon how long the concept makes frequent headlines in the news and blogosphere. The assumption is that if people are no longer writing articles about it, it must be a passing trend.

MOOCs are a good example of this. In 2013 and early 2014, MOOC headlines were all over the place. There were bold claims that they would disrupt higher education and just as bold rebuttals that they would never replace what we do in traditional education. There were debates about their uses and other musings about how they might supplement middle and high school curricula, provide new employable skills, serve as a low-cost and high-impact form of professional development for teachers, and just serve as a way for more people to gain access to useful learning experiences apart from enrollment in a University or expensive tuition.

Of course, there was also no shortage of critiques as the first hints of data analysis came out about retention rates. People wrote about low “retention rates” as if they were proof that MOOCs are a failure. At the same time, others challenged this critique, noting that the intent of the learner is more important than some traditional measure of success used in formal schooling.

Then things slowed down over the last few months of 2014. There were fewer headlines (but still plenty). As such, I’ve had multiple conversations and listened to speakers use this decrease in media coverage as evidence that MOOCs are on the decline, that this was more hype than substance.

The problem is that this is not accurate. I reached out to the people at edX in November, inquiring about their enrollment. Following is their response.

Hi Bernard,

Thank you for your edX question. Please find our enrollment stats below.

October 2013: 2.31 million enrollments

October 2014: 6.26 million enrollments

Thank you.

Best,

R.

From 2.31 million to 6.26 million in one year! That sure doesn’t seem like a decline to me. If a sector of formal education saw that much of an increase in a twelve-month period, it would certainly be in the headlines. The same is true for growth in almost any sector. My point is that there is a difference between the facts and the frequency or nature of media coverage. An innovation exists apart from its media coverage, and we are wise not to judge things too quickly based upon what we are seeing in our favorite education news sources.
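
For perspective, here is a quick back-of-the-envelope calculation of that growth, using only the two figures from the email above (a minimal sketch in Python):

```python
# Year-over-year growth in edX enrollments, per the figures quoted above
start = 2.31e6  # October 2013 enrollments
end = 6.26e6    # October 2014 enrollments

growth = (end - start) / start
print(f"Growth: {growth:.0%}")  # prints "Growth: 171%"
```

In other words, enrollments nearly tripled in twelve months.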

In the case of MOOCs, I don’t think we’ve seen the last of the articles and blog posts. I suspect that there will be an ebb and flow to the coverage, but beneath all that we continue to see steady growth, new experiments, new successes, new challenges, and new opportunities. Yet another educational technology initiative (like online learning before it) is likely to become a persistent and impactful part of 21st-century education.

I apologize. This title is deceptive. Research doesn’t suggest that we should ban laptops, but I suspect that some will jump to such conclusions after reading a recent report comparing note taking on laptops versus pen and paper.

Perhaps you saw the articles showing up on the web based upon a June 2014 study published by Pam Mueller and Daniel Oppenheimer. The original report was titled, “The Pen is Mightier Than the Keyboard: Advantages of Longhand over Laptop Note Taking.” If you have access to an academic library, I encourage you to find and read the original report in Psychological Science instead of getting the information second-hand from here or any other blog post or article.

The focus of the report is upon the impact of taking notes on laptops in class compared to using pen and paper. Early in the essay, they explain two hypotheses behind the benefits of note taking: the encoding hypothesis and the external storage hypothesis. The encoding hypothesis suggests that the benefits of note taking come from the process one goes through to take notes. As people take notes, they summarize, put things into their own words, create concept maps, etc. In contrast, people who just write word for word what is being spoken are less likely to get the encoding benefits of note taking. There are also external storage benefits, which refer to the ability to review the content later. There is a record of research indicating that both are benefits of note taking.

With this in mind, the researchers set up an experiment involving 67 Princeton students watching a 15-minute TED talk while taking notes in their ordinary way. Some were asked to do it on a laptop. Others used pen and paper. After 30 minutes of activities that distracted students from thinking about the video, they were given an assessment. The researchers found a positive correlation between the amount of notes taken and performance on the assessment. They also found a negative correlation between verbatim note taking and performance on the assessment. Students were more likely to take notes verbatim when using a laptop, although they also tended to take more notes.

Then they conducted a second study. This time they examined 151 students from UCLA. Again, students were asked to watch a video and take notes with the following instructions:

“We’re doing a study about how information is conveyed in the classroom. We’d like you to take notes on a lecture, just like you would in class. Please take whatever kind of notes you’d take in a class where you expected to be tested on the material later—don’t change anything just because you’re in a lab.” (p. 1162)

A second group got these instructions:

“We’re doing a study about how information is conveyed in the classroom. We’d like you to take notes on a lecture, just like you would in class. People who take class notes on laptops when they expect to be tested on the material later tend to transcribe what they’re hearing without thinking about it much. Please try not to do this as you take notes today. Take notes in your own words and don’t just write down word for word what the speaker is saying.” (p. 1162)

They completed the study using a similar approach to the first, taking the groups through activities that would distract students from thinking about the video, followed by an assessment. The goal of this second study was to determine if simple instructions about the downside of verbatim note taking might mitigate the negative impact of such a strategy on a laptop. They found that longhand note takers “performed better” on the assessment. The students who took notes longhand wrote less but also included less verbatim content. The authors explain that, “The instruction to not take verbatim notes was completely ineffective at reducing verbatim content” (p. 1163).

They did a third study as well, this time having participants listen to less interesting lectures and then come back a week later to take a test on the content. Some were given ten minutes to study their notes. Others took the test right away. In the end, participants who took longhand notes and had a chance to study outperformed all other groups.

In discussion of the three studies, the authors wrote, “The studies we report here show that laptop use can negatively affect performance on educational assessments, even—or perhaps especially—when the computer is used for its intended function of easier note taking” (p. 1166).

Now that I’ve briefly described the study (it is better to get a copy and read it for yourself), let’s get back to the terribly misleading title of my post, “Research Report Suggests That We Should Ban Laptops & Require Note-taking with Pens.” That title is a stretch. This study isn’t adequate to conclude such a thing, but it does challenge us to ask some questions. What are the potential implications of this study? Should it lead us to ban laptops from classrooms, requiring students to take all notes using pen and paper? Or, while not the main purpose of this study, perhaps this serves as a wake-up call that it isn’t enough to instruct students to take notes and throw out a few words about how to do it. Reading this study, I was compelled to further understand the research on the most effective strategies for note taking in general. What really helps us learn? Once we discover that, what if we intentionally, persistently taught (not just told, but taught) students to be excellent note takers, with excellence defined by the extent to which the notes help us remember and learn?

A study like this shouts for us to use the digital revolution in education as an opportunity to get informed about something that has a long history in education but limited common knowledge among teachers about what truly does and does not work…something like note taking. How many other common practices in classrooms are similarly promoted without a substantive understanding of the research behind them? As I ask myself this question, I must confess that my list is long. I have so much to learn. This is a wake-up call for us to dive into the research, maybe to conduct some of our own, and to build a growing and solid set of research-informed principles that can guide how we help students become high-impact learners.

In the meantime, perhaps there is wisdom in being cautious about simply adding a new technology to an old practice, expecting no impact or maybe even hoping for something better. As schools move to one-to-one programs, this report is an important caution. How are those one-to-one schools teaching students to leverage these tools to improve their learning (based upon empirical research, not just assumptions) or to set them aside for more effective alternatives (when the research supports that)? How are we teaching students to use note taking as an opportunity to think deeply about what they are learning, to grapple with the content in ways that are likely to increase understanding and retention of what is learned? Or, perhaps there are strategies that are completely different from traditional note taking, whether one is working with pen and paper or a laptop. Research reports like these remind me that, while there is much that we know about effective learning, there is so much more for us to learn. There are so many more studies to conduct.

I’ve been an educator for twenty years. As I was participating in a lively Twitter chat recently, the moderator asked what professional development advice we would give to first-year educators. I had no problem thinking about my own failures and challenges through the years and listing off a half-dozen tips. However, if I had to rank them, the one that I would put at the top of the list is this: Be an open book.

I’ve written before about my first weeks as a middle school educator years ago, when I struggled with classroom management. What made the difference between my success and failure in those early weeks and that first year was one critical decision. Almost everything in me wanted to close my classroom door, hide my limitations as an educator, and hope that the problem would go away or that I would figure it out on my own. That decision would have ended my career as an educator. Instead, thanks to a wonderfully open and non-judgmental principal, I found the courage to walk into his office and explain my situation, my fears, and my limitations as a teacher. I asked for help.

I’d love to say that ever since that time I’ve been completely comfortable opening up about my shortcomings and not trying to hide them, but that would not be the truth. I am, however, much more comfortable with being open, because I know that it can make me better. It can help me become the type of educator to which I aspire, or at least get closer to that ideal.

This requires vulnerability, being what I am calling an open book. It means not just letting people look into your classroom and life as an educator, but asking…even begging for as much feedback as you can get from them. I’m talking about being really curious about how you are doing. Ask anyone and everyone to observe and share their thoughts and insights. Learn to use that feedback to grow as an educator. It might mean inviting one or more colleagues, asking students to give you frequent feedback, or asking people who might have no direct connection to your teaching but can offer a fresh perspective and a different set of insights.

There is good research to show that a key to growing and improving as an educator is what is called reflective practice. This is developing the ability to reflect on your practice as an educator, to review and critically analyze what you did, the results, and how you might adjust future behavior to get better results. Reflective practice is evident in most if not all people of excellence, whether it is a concert pianist, a pro golfer, a dancer, a comedian, a motivational speaker, a small business owner, a researcher, or an educator.

However, simply reflecting is not enough. You also need accurate feedback about what happened. Just asking yourself how you did and what results ensued might produce self-deception as much as self-discovery. This is where we benefit from getting feedback from multiple sources and perspectives. It doesn’t mean that you have to treat a student’s perception as 100% accurate. Nor do you need to accept without doubt the observations of a colleague. However, they all provide input. Combined, they are likely to give you a richer and more accurate understanding of what is taking place. This means setting aside your ego, degrees, titles, and credentials. We can get excellent feedback from almost any source. Even if we don’t agree with their observations, they are giving us insight into how different people perceive our teaching, and that is valuable.

This, plus the habit of reflective practice, prepares you to adjust your behaviors, collect more data from multiple sources, and see if you are making progress. This simple approach can help you increase student motivation and engagement, improve the performance of as many learners as possible, build more positive relationships with students, teach certain concepts more accurately and in depth, improve the classroom ethos, or strengthen any other valued aspect of your work as an educator.

It starts with a desire to improve and a willingness to do what it takes to improve, and this is about more than professional reading, attending conferences, and going to professional development days. No presentation, conference session, or book will make you a better educator. Head knowledge is never enough. Excellence in teaching comes from practice, reflection, an openness to input from others, rich feedback, and adjusting your behaviors accordingly. Yes, there are many great concepts that can be learned through books and presentations, but it isn’t until you practice them and incorporate these other elements of reflection, feedback, and adjustment that you reap the benefits.

I attended the mid-year graduation ceremony recently at the University where I’m honored to work, teach and serve. At the beginning, the President shared a few opening remarks. He said something about the “credential” or diploma that students would soon receive. “Your degree is not as much a certificate of completion as it is a marching order,” he explained. While I followed along with the rest of the ceremony, this short statement sent me on a two-hour mental journey.

Read my blog long enough, and you’ll see that I often write and reflect about credentials. However, the claim in this statement from the President offered a perspective that contrasts with many current conversations about academic credentials. In some ways, his statement represented the diploma in a fascinating and different light. I’m sure he also sees the diploma as recognition for accomplishments and evidence of learning over the past years, but in this case, he represented the diploma as a form of marching orders, a sending off. The temporal destination is unknown, but the charge is clear. Graduates are sent off from our University as representatives, ambassadors. In fact, at my school, Concordia University Wisconsin, we sometimes refer to members of this community as Concordians. We have certain core values that make up what it means to be a Concordian. While we embrace the diverse gifts, talents, abilities, and callings of each person, we also seek to nurture a set of common core values and convictions that collectively represent who we are as individuals and a community.

Diplomas really do have this element to them. There is a brand associated with different diplomas. That is why many people think of a diploma from Harvard differently than a diploma from the local community college…but this identity starts before getting the diploma. Even being a Harvard dropout or a current student at Harvard starts to open doors for people. If you are someone associated with that brand and learning community, there are benefits. It could also be said that there are likely expectations of anyone associated with that brand as well.

This got me thinking about open badges in a new way, a new possible application of them. I’ve been thinking about open badges as a way to recognize or make visible some sort of achievement or accomplishment, or as a symbol provided when a person demonstrates competence in an area. I still think of them in that way. Yet, what is keeping us from also using them as a way to identify affiliation with the brand of a movement, community, organization, or something else of value? What if we issued badges at the beginning, before there is an actual accomplishment, achievement, or demonstrated competence? What if the badge were used to mark one’s start and commitment to a brand?

The fields in the Open Badge Infrastructure (OBI) already lend themselves toward such an application: description, criteria, issuer, issuing date, expiration date, etc. The expiration date would be a way to check in on a person’s commitment to the brand or community. Does it persist? Do they have new actions or accomplishments that can be recognized or made visible with additional or supplemental badges? Even without expiration dates, it would be easy enough to stamp badges with dates of membership, service, or affiliation, allowing them to be yet another way to visually represent the “brands” with which people have been or are currently affiliated. The description or criteria fields could just as easily describe the nature and extent of the affiliation. In the end, such uses could be a way to generate an entirely new form of visual and verifiable resume.
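
To make the idea concrete, here is a rough sketch of how such an affiliation badge might be represented. The field names loosely mirror the OBI fields mentioned above; the values, the URLs, and the specifics of the example are hypothetical illustrations, not an actual implementation:

```python
# A hypothetical affiliation badge, modeled as a simple Python dict.
# Field names loosely follow the OBI fields named above (description,
# criteria, issuer, issue date, expiration date); all values are invented.
affiliation_badge = {
    "name": "Concordian",
    "description": "Marks membership in and commitment to the "
                   "Concordia University Wisconsin community.",
    "criteria": "https://example.edu/badges/concordian/criteria",  # hypothetical URL
    "issuer": "https://example.edu/issuer",                        # hypothetical URL
    "issuedOn": "2015-01-15",  # issued at the start, not after an accomplishment
    "expires": "2016-01-15",   # expiration as a periodic check-in on the affiliation
}
```

Renewing the badge each year the affiliation persists would be one way the expiration date could serve as the check-in described above.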

Perhaps there are already cases or user stories of open badges being applied in such a way, but I have not noticed them. As such, this possibility creates an entirely new (at least to me) set of options for how badges, which are notably visual symbols with metadata, might serve as an additional way to manage and represent oneself online. Some might argue that this is just as easily done by listing your affiliations on a resume, and perhaps that is just as effective. Yet, one distinction here is that the affiliation and symbol are issued by the organization, adding a level of verification and potentially a measure of credibility and “klout” that exceeds self-reporting.

Have you seen such use of open badges? What possibilities can you imagine? What challenges and opportunities are created by such a use?

At EDUCAUSE 2006, Georgia Nugent, the then-president of Kenyon College, gave a talk on “The Tower of Google.” You used to be able to listen to the entire talk here, but the link on their site stopped working a couple of years ago. It was a thought-provoking presentation, and one of her self-made buzzwords stuck with me. She described her background as a classicist, but also explained her hope for the potential of technology in higher education. She called herself a Luddvocate (here is a quick Wikipedia primer on Luddites if that is a new term for you). I can relate.

I still find that the most intriguing books about technology were written by the self-proclaimed or often-labeled neo-Luddites (Mumford, McLuhan, Ellul, Postman, Kirkpatrick Sale, Larry Cuban, Sven Birkerts…). These neo-Luddites craft messages of caution. They plead for counting the cost of our technological escapades. They challenge the notion that technology is the savior of the greatest social and human needs, and they highlight the adverse impact of technology on society. I read these texts and find myself shouting more than a few inner Amens to their sermons. These thoughtful texts give a perspective that I believe is valuable and needed in the modern world. Of course, there are some who, like the original Luddites, turn to violence and destruction (e.g. Theodore Kaczynski), and I’m quick and clear about rejecting those methods of dissent.

Luddism is not about being anti-technology, in the same way that the Amish are not anti-technology. As I’ve written before, the Amish are not anti-technology as much as they are pro-community. Similarly, Luddism is about counting the cost of technological progress, not assuming that new technology is always a universal gain for humanity. It is recognizing the values-laden and intrinsically political nature of each technology. New technologies lead to new winners and losers. Luddism champions and gives voice to the losers in the race for technological progress. The original Luddites were moved to action by new machines displacing workers in the textile industry. For the sake of increased productivity, machines replaced people, and that affected workers’ ability to feed their families. Luddism is about challenging us to be users of technologies instead of allowing ourselves to be used by them.

Then there is the advocate. While I’m not sure how Nugent might further elaborate on what it means to be an advocate, I confess that I became one because I didn’t think I had much of a choice. Just as the original Luddites lost their fight against the vision of progress brought about by the industrial revolution, I suspect that many neo-Luddites will experience the same. What is a person to do? I distinctly remember struggling with this question throughout the middle of the 1990s, with my first regional presentation being about the negative implications of technology in education, delivered at an educational technology conference in Chicago. I still remember the line of software vendors glaring at me with their arms crossed along the back of the surprisingly standing-room-only session. I quoted Neil Postman freely as I warned about the “Faustian bargain” of new technologies. I didn’t, however, call for rejecting these developments. Instead, I took the hopeful position of striving to influence which technologies to amplify and which to muffle. I called it values-driven decision making. Identify your core values and convictions and let them drive your decisions about technology. Once you have clarity about those core convictions, you will be able to decide where, how, when, and about what to advocate. I felt good about my talk until the first question during the question-and-answer period, a principal wanting a list of the best software to use in his kindergarten classes. So much for values driving the decisions. I learned early on that technology in education had become a value of its own.

I found that many of my concerns about technology were connected to what I considered to be the dehumanizing effects of the industrial revolution. Don’t get me wrong. I liked many of the benefits of the industrial revolution. It is just that I had, and have, hope that emerging democratizing technology might assist us in mitigating some of the negative aspects of that era.

I had visions of Brave New World, 1984, The Giver, and the dozens of dystopian stories of our future. So I chose to advocate for technology that seems to amplify values of democracy, access, and opportunity. It is what inspires me about the possibilities of everything from blended learning to online learning, alternative education to self-directed learning, open education to open badges and micro-credentials, personalized learning to adaptive learning software, project-based learning to social media, personal learning networks to communities of practice. It is why I can geek out about designing high-impact online learning communities while being a passionate supporter of existing and emerging physical third spaces (thank you, Ray Oldenburg) that conjure a spirit of community. It is why I lobby for choice and variety in options for formal education. It is why I have degrees in both the humanities and instructional technology. It is why I am a champion for formal higher education while calling for those same institutions to resist the temptation to claim a monopoly on learning and knowledge, as if either were a commodity to be bought and sold. Each of these represents deep-seated personal values and convictions about truth, beauty, goodness, purpose, and what it means to be human. It shapes how I write and speak about the future, both forecasting and striving to create or influence possible futures. Ironically, it is the Luddite in me that drives me to be such an advocate for life and learning in the digital world. The Luddite is the one who cries out that our ideas have consequences, that our convictions matter, that human access and opportunity are noble causes, and that all three should inform the futures that we help create.

“We’ve always done it that way.” Universities are rich with traditions and history, but it would be a mistake to think that what we see and experience in the Universities of the last 50 years mimics what came before them. Yes, perhaps certain teaching practices and structures have persisted. However, the curriculum has been in flux, adjusting to the broader changes in society.

Look at the history of higher education and it is a history of change. The first Universities in the world were in Morocco, Egypt, and what is now Iran. Those were founded between 800 and 1100 AD. The first Western Universities emerged near the end of the 11th century: the University of Bologna, the University of Paris, and the University of Oxford. In the earliest Universities, areas of study were not nearly as extensive. Theology, medicine, and law were among the dominant areas of study in these early years, and the modern concept of academic disciplines did not come along until the 1800s, spreading around much of the globe by the end of that century. Essentially, these disciplines emerged with the scientific revolution, with different disciplines eventually representing distinct methods and approaches to seeking and understanding “truth.” For example, we saw a shift from “natural historians” to physicists, biologists, and chemists. Early in the 20th century, we saw the growth of new disciplines in the social sciences, resulting in programs like psychology and sociology. It was not until the mid to late 1900s that we saw rapid growth of modern programs like nursing, business, and a host of specializations in areas like gender and ethnic studies. As such, the modern idea of a University offering hundreds of majors is indeed a modern idea. Many of the largest disciplines in colleges today have a relatively short history.

It is no surprise to see yet another expansion of University degrees. The scientific revolution brought forth distinct majors in the hard and soft sciences. The industrial revolution brought about a myriad of professional and career-track majors. Now, in the 21st century, we see another collection of degrees emerging in response to the broader trends in society. This time we see interdisciplinary programs addressing the nature of life in an increasingly digital world. Consider that none of the following degrees existed thirty years ago, and some did not exist even ten years ago.

  1. MA in Telecommunications with an emphasis in Digital Storytelling – Ball State University
  2. MA in New Literacies and Global Learning – North Carolina State
  3. PhD in Media Psychology – Fielding Graduate University
  4. MS in Game Design – Full Sail University
  5. Master of Internet Communications – Curtin University
  6. MA in Social Media – Birmingham City University
  7. MS in Digital Marketing – Sacred Heart University
  8. MA in Digital Humanities – King’s College London
  9. MFA in Digital Arts and New Media – University of California Santa Cruz
  10. MS in CyberSecurity – University of Maryland University College
  11. MBA with a specialization in E-Business – Eastern Michigan University
  12. Master of Distance Education – University of Maryland University College
  13. MA in Digital Journalism – National University
  14. MS in Digital Forensics – University of Central Florida
  15. Doctor of Ministry in Leadership in Emerging Culture – George Fox University

New degrees are emerging in response to the digital age, ranging from education to business, criminal justice to psychology, literacy to theology, journalism to communication. Some look at such programs with concern that Universities are over-specializing, but this seems to be representative of a century-old trend in higher education. As new areas of need and interest emerge in society, higher education responds with new majors, degrees, and specializations. Even as new fields emerge, some of those fields converge to create new, interdisciplinary areas. This is the case in an area like educational technology, which has roots in library science, audiovisual studies, educational psychology, and even military training.

There is something different about some of these newer degrees. While some are still quite broad (like Internet studies or digital arts), others are very specialized. The scientific revolution produced physicists and biologists, and those developed into distinct fields with unique methodologies. Many of these new majors are not fields as much as they represent distinct skill sets and competencies, or the ability to apply the core aspects of a field or area of study in a new or distinct context. These are also areas that seem to be far more fluid and fast-moving, leaving one to wonder whether University degrees are the most responsive and effective ways to prepare people in these areas.

While some Universities are creating such specializations with the hope of reaching and recruiting new students, it is uncertain whether these hyper-specialized degrees provide the breadth necessary in a constantly changing digital world. It is no coincidence that the 15 degrees listed above are graduate degrees. Scan the workplace for people with these degrees and you are likely to see a massive number of them working outside the specialization represented in the degree. Graduates of these programs who are working in the specialties are often working alongside peers with comparable ability who do not have such specialty degrees. As such, these are not gate-keeper degrees. While one might opt to pursue such a degree as a means of preparation, there are equally accepted alternatives, even simply demonstrating that you are competent to do the job. A person with 3-5 years of experience as a successful marketer in digital spaces will probably beat out the recent graduate of a digital marketing degree who hasn’t actually done it. The degree doesn’t have greater value than comparable experience in the marketplace. This is different from past eras of new degree growth.

This leaves space for innovation and micro-disruptions. While I do not expect to see higher education institutions moving away from adding more such degrees in the near future, I expect these specific areas to be prime candidates for the trends toward nano-degrees, certificate programs, and more granular training programs recognized by digital badges and other such credentials.

Five years ago, I attempted something for the first time as a college professor. I’d done it as a K-12 teacher over the years, but this was a first with traditional undergraduate students. I redesigned an entire undergraduate course around six self-directed learning projects, one for each unit in the course. I still kept a midterm and final exam to test understanding of the “grammar” of the course. However, I threw away every other graded assessment. Instead, at the beginning of each unit, students had the challenge of proposing a project that tied directly to the learning objectives for that unit. The proposal needed to cover the following elements.

1. What is the question that will drive my inquiry?

The question should be compelling, provocative, deep, substantive, and it should drive you to explore and discover something that matters to you and others.

2. How will I pursue answers to this question?

It might include a tentative reading list, field trips, observations, interviews, experiments, research, a review of the peer-reviewed literature, participation in online or other communities, or anything else that might help. This part of the proposal didn’t need to be complete, but the student had to show an initial and tentative plan.

3. How will I document my journey?

This should be in a form that allows the instructor/coach to review it at any point, and it needs to be updated at least twice weekly, though daily is recommended. I also encouraged this to be designed in a way that classmates and others could view it and provide feedback. Students could use a shared Google Doc (or folder), a wiki, a blog, a YouTube video diary, or any other format that met the above criteria.

4. What culminating product, project or performance will be the result of my work?

This should be something that demonstrates the learning gained by pursuing answers to that driving question. I encouraged students to do something that was valuable to them and beneficial to a specific person or group of people. In some ways, this added a service learning element to the project, something that resonates with my deep conviction that a great education is about discovering one’s calling, which is always found in love for and service to others in some way.

5. Who will be the target audience for my product, project, or performance?

While sharing it with classmates is nice, I challenged students to find the audience that would most benefit from the work and share it with them, preferably in a presentation to them. I went back and forth on this element over the years, but some of the best projects were consistently the ones created and presented to a real-world audience.

6. What is the tentative timeline for this journey?

Since this was done in a traditional semester class and each project was for a unit, there was a fixed maximum amount of time, but if students requested more time, I always gave it to them. In fact, I remember a student coming to me midway through a unit, troubled that she would not be able to finish it in time. I simply said, “No worries. Just change the due date.” She looked at me, confused, so I said it again, explaining that she was in charge of the timeline, not me. I was just there to help her meet her goals.

If students were doing all this work, what were you doing as a teacher?

This was the best part. I didn’t have to do anything. I just sat around and checked Facebook status updates. I’m just kidding. In fact, I was never so busy and never had so much interaction with students as I did in the classes with these projects. I was a coach, mentor, guide, concierge, intellectual match-maker, and resource. I met with students individually and in small groups, giving them tips and resources on how to frame their questions, how to identify great resources, and how to connect with experts and groups related to their projects. In fact, along the way, I built my own personal learning network by helping students reach out and connect. I also provided optional mini-lectures on topics and skills relevant to their work, and I coordinated activities to help them work their plans, workshop their projects with peers, and surface new possibilities.

I never used my cell phone more in class. I would be talking with a student or small group about something, and they would have a pressing question. If it was about an author or other person, I would just look up their contact information and call them on the spot, asking if I could put them on speaker phone. Or, I would craft an email and try to broker introductions. I was modeling what I hoped the students would gain the competence and confidence to do, and many of them did it.

What was the result?

Without question, running a class this way resulted in the most impressive student work and thinking that I’d ever seen as an educator. I learned so much from these students and their inquiries. Many of their projects, with a little polishing, were of publishable quality. They were substantive and proposed compelling solutions to pressing problems in the world, especially the world of education (I’m an education prof).

It wasn’t all good. There were four common challenges.

1. Time Management

Some students just didn’t stick to their plan. They were so used to cramming for tests, papers, and projects that they had lost, or never developed, the time management skills so valuable in working on an in-depth and extended project. This was evident in their learning journal entries, so I would reach out to them individually, trying to help them build this valuable skill, but it was not easy going for some of the students. In fact, I found myself helping students systematically develop time management skills to find success in this new model.

2. Limited “Research” Literacy

Some students really struggled at first with how to explore answers to a question. They had spent little time really getting to know libraries, Internet search strategies, and the like. In addition, I was challenging them to use the skills of qualitative and quantitative researchers, so I taught optional mini-lessons on different strategies: observation strategies, interviewing techniques, how to reach out to a stranger, how to set up an informal thought or life experiment, how to make sense of a quantitative research report, etc. It was very rewarding to watch the lights turn on in their minds when they began to imagine the possibilities of using these strategies to reach a personally meaningful goal.

3. Time

I mentioned time management before, but I discovered quickly that many students were used to getting through college courses with a minimal time investment. Many skimmed instead of reading deeply and pondering. They did what they needed to pass or get the desired grade on the test or paper. At the same time, many worked long hours or had extensive time commitments with extracurriculars and more. Perhaps this is part of time management, but I really saw it as just not having a lifestyle that left room for deep, really deep learning.

Add to this the fact that some students were busy meeting the demands and expectations of 4-5 other professors at the same time, and it becomes hard to really lose yourself in a line of inquiry like this. It is why such things work so much better in schools that embrace them on a school-wide level, and why some people don’t discover the joy of this approach until they are out of school (either because they drop out, graduate, or never start). As it stood, I often found students cutting short the depth of an inquiry because of the demands of other courses. There was only so much time in the day, week, and semester; and each student prioritized their work differently.

4. Discomfort with the “Lack of Direction”

I gave more direction than ever in this approach, but I didn’t tell them what to learn so much as I told them that I was there to help them develop the competence and confidence to learn on their own. There were still students who nearly demanded that I tell them what question to ask, what resources and methods to use, and what projects to create.

I remember reading the course evaluations of a couple such students. “People don’t learn by asking their own questions and seeking answers. They learn by a teacher telling them what to learn.” “I don’t have time for this ____. You are the teacher, so shouldn’t you be teaching me instead of telling me to teach myself?” “This is too hard.” I took every one of these statements to heart because they revealed so much about the beliefs, values, perceptions, and experiences of these students. In the end, there were some who were genuinely philosophically opposed to such an approach. I was teaching future educators, and some of them truly believed that good teachers tell you what to do, when to do it, and how to do it.

When I was convinced that this was truly a deep-seated philosophical conviction of a student, I tried to channel it by suggesting a couple of strong resources that related to those convictions. In fact, some of these students ended up creating the most amazing projects by tapping into the community of fellow essentialists, perennialists, or classicists. That love for content and ideas was often a great foundation for this sort of work.

Helping Students Be More Self-Directed

In the end, while I too easily recall my failures in these classes, my failure to light the spark of self-directed learning in some students, I have so many amazing memories of students who got it. Boy did they get it, and I pray that they still have it and use it. School has a way of taking it away from you sometimes through teachers who dictate, direct, and demand more than spark, ignite, and fan into flame. As a teacher, there are definitely times to direct. However, this experiment with my students left me with strong convictions about the transformational power of student-centered projects. When I look at this type of inquiry and learning I see a bit of the spirit of Abraham Lincoln, Ben Franklin, Booker T. Washington, Andrew Carnegie, Frank Lloyd Wright, Ansel Adams, Thomas Edison, the Wright Brothers, and hundreds of others. I think this is a spirit that we want to spread today.