What is the future of educational publishers and content providers? As more content becomes freely distributed online, and as more creative (and sometimes free) products and services help aggregate, curate, chunk, edit, and beautify this content, there are questions about the role of educational publishers and content providers. While there is something to be said for a one-stop shop for content, that might not be enough to secure a solid spot in the marketplace of the future, especially given that content is not the only thing for which people are shopping. Some fear or simply predict the demise of such groups, but I expect a long and vibrant future. In fact, over the past decade or two, we’ve already witnessed publishing companies rebrand themselves as education companies with a broader portfolio of offerings than ever before. They’ve done so by adding experts in everything from educational psychology and brain research to instructional design, software development to game design, educational assessment to statistics, analytics, and testing. These are exactly the types of moves that will help them establish, maintain, and extend their role in the field of education. This is a shift from a time when many educational publishers and content providers would suggest that they should leave the “teaching” up to the professional educators. Now, more realize that there is not (nor has there really ever been) a clear distinction between the design of educational products and services and the use of them for teaching. Each influences the other, and an understanding of educational research is just as critical (if not more so) for those who design and develop the products and services that inform what and how educators teach students.

According to this article, the preK-12 testing and assessment market is almost a 2.5 billion dollar market, “making them the single largest category of education sales” in 2012-2013! Much of this is the result of efforts to nationalize and standardize curriculum across geographic regions (as with the Common Core), allowing education companies to design a single product that aligns with the needs of a larger client base. However, even apart from such moves toward standardization, more people are becoming aware of the possibilities and impact of using feedback loops and rich data to inform educational decisions.

This is just the beginning. If you are in educational publishing or a startup in the education sector, this is not only a trend to watch, but one to embrace. Start thinking about the next version of your products and services and how learning analytics and feedback loops fit with them. If you look at the K-12 Horizon Report’s 5-year predictions, you see learning analytics, the Internet of Everything, and wearable technology. What do all three of these have in common? They are an extension of the Internet’s revolution of increased access to information, but this time they increase access to a new type of information and make it possible to analyze and make important decisions based on the data. Now we have come full circle. Data is experienced by learners. The actions and changes of the learner become new data points, which give feedback directly to the learner, to a teacher, or to the product that provided the initial data. Then a new action is taken by the learner, teacher, and/or interactive product, and the cycle continues (see the following image for three sample scenarios).

Interactive Text Feedback Cycles
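To make the cycle concrete, here is a minimal sketch in Python of one version of it: an interactive text that records each learner action as a data point and feeds it back into its next decision. The class names, the five-event window, and the thresholds are all illustrative assumptions, not a description of any real product.

```python
# A minimal sketch of the feedback cycle: learner actions become data points,
# and those data points feed back into what the product serves next.
# Class names, the five-event window, and thresholds are illustrative assumptions.
from dataclasses import dataclass, field


@dataclass
class LearnerEvent:
    """One data point produced by a learner's action."""
    item_id: str
    correct: bool


@dataclass
class InteractiveText:
    """An interactive text that adapts its difficulty from learner data."""
    difficulty: int = 1
    history: list = field(default_factory=list)

    def record(self, event: LearnerEvent) -> None:
        # The learner's action becomes a new data point...
        self.history.append(event)
        # ...which immediately informs the product's next decision.
        recent = self.history[-5:]
        accuracy = sum(e.correct for e in recent) / len(recent)
        if accuracy > 0.8:
            self.difficulty += 1  # the learner is ready for harder material
        elif accuracy < 0.4 and self.difficulty > 1:
            self.difficulty -= 1  # step back and reteach

    def next_item(self) -> str:
        # The same signal could instead go to a teacher dashboard or the learner.
        return f"item at difficulty {self.difficulty}"


# One turn of the cycle: data -> learner action -> new data point -> adjustment.
text = InteractiveText()
text.record(LearnerEvent("q1", correct=True))
print(text.next_item())
```

The same loop could just as easily route the feedback to a teacher dashboard or back to the learner, which is the essence of the three scenarios pictured above.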

Some (although an increasingly small number) still think of the Internet and digital revolution in terms of widespread access to rich content. Those are the people who think that digitizing content is adequate. Since the 2000s, we’ve experienced the social web, one that is both read and write. Now we live in a time where those two are merged, and each action, individually and collectively, becomes a new data point that can be mined and analyzed for important insights.

While there are hundreds of analytics, data warehousing and mining, adaptive learning, and analytic dashboard providers, there is a powerful opportunity for educational content providers who find ways to animate their content with feedback, reporting features, assessment tools, dashboards, early alert features, and adaptive learning pathways. Education’s future is largely one of blended learning, and a growing number of education providers (from K-12 schools to corporate trainers) are learning to design experiences that are constantly adjusting and adapting.

The concept that we are just making products for the true experts, teachers, is noble and respectable, but the 21st century teacher will be looking for new content and learning experiences that interact with them (and their students), tools that give them rich and important data (often real-time or nearly-now) about what is working, what is not, who is learning, who is not, and why. They will be looking for ways to track and monitor learning progress. A content provider that does not do such things will be in jeopardy, unless it offers content so scarce or in such high demand that it can’t be easily accessed elsewhere.

As such, content still matters. It always will. However, the thriving educational content providers and publishers of the 21st century understand that the most high-demand features will involve analytics, feedback (to the learner, teacher, or back to the content for real-time or nearly-now adjustments), assessment, and tracking.

I’ve run into a number of educators recently who were critiquing the trendiness of modern education. “There is always something new,” they explain, “but it never lasts.” In fact, I’ve heard that dozens of times over the years when it came to online learning. It was even a question asked at my thesis defense for my master’s on online learning in the 1990s: Isn’t this just yet another passing educational fad? Almost twenty years later, I can say with confidence that it is not a passing trend.

Nonetheless, many seem even quicker to judge something new in education as a passing or fading trend. The judgment now often seems to depend upon how long the concept makes frequent headlines in the news and blogosphere. The assumption is that if people are not writing articles about it, it must be a passing trend.

MOOCs are a good example of this. In 2013 and early 2014, MOOC headlines were all over the place. There were bold claims that they would disrupt higher education and just as bold rebuttals that they would never replace what we do in traditional education. There were debates about their uses and other musings about how they might supplement middle and high school curricula, provide new employable skills, serve as a low-cost and high-impact form of professional development for teachers, or simply serve as a way for more people to gain access to useful learning experiences without enrolling in a university or paying expensive tuition.

Of course, there was also no shortage of critiques as the first hints of data analysis about retention rates came out. People wrote about low “retention rates” as if they were proof that MOOCs are a failure. At the same time, others challenged this critique, noting that the intent of the learner is more important than some traditional measure of success used in formal schooling.

Then things slowed down over the last few months of 2014. There were fewer (but still plenty of) headlines. As such, I’ve had multiple conversations and listened to speakers use this decrease in media coverage as evidence that MOOCs are on the decline, that this was more hype than substance.

The problem is that this is not accurate. I reached out to the people at edX in November, inquiring about their enrollment. Following is their response.

Hi Bernard,

Thank you for your edX question. Please find our enrollment stats below.

October 2013: 2.31 million enrollments

October 2014: 6.26 million enrollments

Thank you.

Best,

R.

From 2.31 million to 6.26 million in one year! That is nearly a tripling of enrollments, an increase of roughly 171% in twelve months, which sure doesn’t seem like a decline to me. If a sector of formal education saw that much growth in a twelve-month period, it would certainly be in the headlines. The same is true for growth in almost any sector. My point is that there is a difference between the facts and the frequency or nature of media coverage. An innovation exists apart from its media coverage, and we are wise not to judge things too quickly based upon what we are seeing in our favorite education news sources.

In the case of MOOCs, I don’t think we’ve seen the last of the articles and blog posts. I suspect that there will be an ebb and flow to the coverage, but beneath all that we continue to see steady growth, new experiments, new successes, new challenges, new opportunities, and yet another educational technology initiative (like online learning) is likely to become a persistent and impactful part of 21st century education.

I apologize. This title is deceptive. Research doesn’t suggest that we should ban laptops, but I suspect that some will jump to such conclusions after reading a recent report comparing note taking on laptops versus pen and paper.

Perhaps you saw the articles showing up on the web based upon a June 2014 study published by Pam Mueller and Daniel Oppenheimer. The original report was titled, “The Pen is Mightier Than the Keyboard: Advantages of Longhand over Laptop Note Taking.” If you have access to an academic library, I encourage you to find and read the original report in Psychological Science instead of getting the information second-hand from here or any other blog post or article.

The focus of the report is upon the impact of taking notes on laptops in class compared to using pen and paper. Early in the essay, the authors explain two hypotheses about the benefits of note taking: the encoding hypothesis and the external storage hypothesis. The encoding hypothesis suggests that the benefits of note taking come from the process one goes through to take notes. As people take notes, they summarize, put things into their own words, create concept maps, etc. In contrast, people who just write word for word what is being spoken are less likely to get the encoding benefits of note taking. There are also external storage benefits, which refer to the ability to review the content later. There is a body of research indicating that both are benefits of note taking.

With this in mind, the researchers set up an experiment involving 67 Princeton students watching a 15-minute TED talk while taking notes in their ordinary way. Some were asked to do it on a laptop. Others used pen and paper. After 30 minutes of activities that distracted students from thinking about the video, they were given an assessment. The researchers found a positive correlation between the amount of notes taken and performance on the assessment. They also found a negative correlation between verbatim note taking and performance on the assessment. Students were more likely to take notes verbatim when using a laptop, although they also tended to take more notes.

Then they conducted a second study. This time they examined 151 students from UCLA. Again, students were asked to watch a video and take notes, with one group receiving the following instructions:

“We’re doing a study about how information is conveyed in the classroom. We’d like you to take notes on a lecture, just like you would in class. Please take whatever kind of notes you’d take in a class where you expected to be tested on the material later—don’t change anything just because you’re in a lab.” (p. 1162)

A second group got these instructions:

“We’re doing a study about how information is conveyed in the classroom. We’d like you to take notes on a lecture, just like you would in class. People who take class notes on laptops when they expect to be tested on the material later tend to transcribe what they’re hearing without thinking about it much. Please try not to do this as you take notes today. Take notes in your own words and don’t just write down word for word what the speaker is saying.” (p. 1162)

They completed the study using a similar approach to the first, taking the groups through activities that would distract students from thinking about the video, followed by an assessment. The goal of this second study was to determine if simple instructions about the downside of verbatim note taking might mitigate the negative impact of such a strategy on a laptop. They found that longhand note takers “performed better” on the assessment. The students who took notes longhand wrote less but also included fewer verbatim notes. The authors explain that, “The instruction to not take verbatim notes was completely ineffective at reducing verbatim content” (p. 1163).

They did a third study as well, this time having participants listen to less interesting lectures and then come back a week later to take a test on the content. Some were given ten minutes to study their notes. Others took the test right away. In the end, participants who took longhand notes and had a chance to study them outperformed all other groups.

In discussion of the three studies, the authors wrote, “The studies we report here show that laptop use can negatively affect performance on educational assessments, even—or perhaps especially—when the computer is used for its intended function of easier note taking” (p. 1166).

Now that I’ve briefly described the study (it is better to get a copy and read it for yourself), let’s get back to the terribly misleading title of my post, “Research Report Suggests That We Should Ban Laptops & Require Note-taking with Pens.” That title is a stretch. This study isn’t adequate to conclude such a thing, but it does challenge us to ask some questions. What are the potential implications of this study? Should it lead us to ban laptops from classrooms, requiring students to take all notes using pen and paper? Or, while not the main purpose of this study, perhaps this serves as a wake-up call that it isn’t enough to instruct students to take notes and throw out a few words about how to do it. Reading this study, I was compelled to further understand the research on the most effective strategies for note taking in general. What really helps us learn? Once we discover that, what if we intentionally and persistently taught (not just told, but taught) students to be excellent note takers, with excellence defined by the extent to which the notes help us remember and learn?

A study like this shouts for us to use the digital revolution in education as an opportunity to get informed about something that has a long history in education but about which teachers have limited common knowledge of what truly does and does not work…something like note taking. How many other common practices in classrooms are similarly promoted without a substantive understanding of the research behind the practice? As I ask myself this question, I must confess that my list is long. I have so much to learn. This is a wake-up call for us to dive into the research, maybe to conduct some of our own, and to build a growing and solid set of research-informed principles that can guide how we help students become high-impact learners.

In the meantime, perhaps there is wisdom in being cautious about simply adding a new technology to an old practice, expecting no harm or maybe even hoping for something better. As schools move to one-to-one programs, this report is an important caution. How are those one-to-one schools teaching students to leverage these tools to improve their learning (based upon empirical research, not just assumptions) or to set them aside for more effective alternatives (when the research supports that)? How are we teaching students to use note taking as an opportunity to think deeply about what they are learning, to grapple with the content in ways that are likely to increase understanding and retention of what is learned? Or, perhaps there are strategies that are completely different from traditional note taking, whether one is doing it with pen and paper or a laptop. Research reports like these remind me that, while there is much that we know about effective learning, there is so much more for us to learn. There are so many more studies to conduct.

I’ve been an educator for twenty years. As I was participating in a lively Twitter chat recently, the moderator asked what professional development advice we would give to first-year educators. I had no problem thinking about my own failures and challenges through the years and listing off a half-dozen tips. However, if I had to rank them, the one that I would put at the top of the list is this: Be an open book.

I’ve written before about my first weeks as a middle school educator years ago, when I struggled with classroom management. What made the difference between my success and failure in those early weeks and that first year was one critical decision. Almost everything in me wanted to close my classroom door, hide my limitations as an educator, and hope that the problem would go away or that I would figure it out on my own. That decision would have ended my career as an educator. Instead, thanks to a wonderfully open and non-judgmental principal, I found the courage to walk into his office and explain my situation, my fears, and my limitations as a teacher. I asked for help.

I’d love to say that ever since that time I’ve been completely comfortable opening up about my shortcomings and not trying to hide them, but that would not be the truth. It is true that I’m much more comfortable with being open, however, because I know that it can make me better. It can help me become the type of educator to which I aspire, or at least to get closer to that ideal.

This requires vulnerability, being what I am calling an open book. It means not just letting people look into your classroom and life as an educator, but asking…even begging for as much feedback as you can get from them. I’m talking about being genuinely curious about how you are doing. Ask anyone and everyone to observe and share their thoughts and insights. Learn to use that feedback to grow as an educator. It might mean inviting one or more colleagues to observe, asking students to give you frequent feedback, or asking people who might have no direct connection to your teaching but can offer a fresh perspective and a different set of insights.

There is good research to show that a key to growing and improving as an educator is what they call reflective practice. This is developing the ability to reflect on your practice as an educator, to review and critically analyze what you did, the results, and how you might adjust future behavior to get better results. Reflective practice is evident in most or all people of excellence, whether it is a concert pianist, a pro golfer, a dancer, a comedian, a motivational speaker, a small business owner, a researcher, or an educator.

However, simply reflecting is not enough. You also need accurate feedback about what happened. Just asking yourself how you did and what results ensued might lead to self-deception as much as self-discovery. This is where we benefit from getting feedback from multiple sources and perspectives. It doesn’t mean that you have to treat a student’s perception as 100% accurate. Nor do you need to accept without doubt the observations of a colleague. However, they all provide input. Combined, they are likely to give you a richer and more accurate understanding of what is taking place. This means setting aside your ego, degrees, titles, and credentials. We can get excellent feedback from almost any source. Even if you don’t agree with their observations, they are giving you insight into how different people perceive your teaching, and that is valuable.

This, plus the habit of reflective practice, prepares you to adjust your behaviors, collect more data from multiple sources, and see if you are making progress. This simple approach can help you increase student motivation and engagement, improve the performance of as many learners as possible, build more positive relationships with students, teach certain concepts more accurately and in greater depth, improve the classroom ethos, or strengthen any other valued aspect of your work as an educator.

It starts with a desire to improve and a willingness to do what it takes to improve, and this is about more than professional reading, attending conferences, and going to professional development days. No presentation, conference session, or book will make you a better educator. Head knowledge is never enough. Excellence in teaching comes from practice, reflection, an openness to input from others, rich feedback, and adjusting your behaviors accordingly. Yes, there are many great concepts that can be learned through books and presentations, but it isn’t until you practice them and incorporate these other elements of reflection, feedback, and adjustment that you reap the benefits.

I attended the mid-year graduation ceremony recently at the University where I’m honored to work, teach and serve. At the beginning, the President shared a few opening remarks. He said something about the “credential” or diploma that students would soon receive. “Your degree is not as much a certificate of completion as it is a marching order,” he explained. While I followed along with the rest of the ceremony, this short statement sent me on a two-hour mental journey.

Read my blog long enough, and you’ll see that I often write and reflect about credentials. However, the President’s statement posed a perspective that contrasts with many current conversations about academic credentials. In some ways, his statement represented the diploma in a fascinating and different light. I’m sure he also sees the diploma as recognition for accomplishments and evidence of learning over the past years, but in this case, he represented the diploma as a form of marching orders, a sending off. The temporal destination is unknown, but the charge is clear. Graduates are sent off from our University as representatives, ambassadors. In fact, at my school, Concordia University Wisconsin, we sometimes refer to members of this community as Concordians. We have certain core values that make up what it means to be a Concordian. While we embrace the diverse gifts, talents, abilities, and callings of each person, we also seek to nurture a set of common core values and convictions that collectively represent who we are as individuals and a community.

Diplomas really do have this element to them. There is a brand associated with different diplomas. That’s why many people think of a diploma from Harvard differently than a diploma from the local community college…but this identity starts before getting the diploma. Even being a Harvard dropout or a current student at Harvard starts to open doors for people. If you are someone associated with that brand and learning community, there are benefits. It could also be said that there are likely expectations of one associated with that brand as well.

This got me thinking about open badges in a new way, a new possible application of them. I’ve been thinking about open badges as a way to recognize or make visible some sort of achievement or accomplishment, or as a symbol provided when a person demonstrates competence in an area. I still think of them in that way. Yet, what is keeping us from also using them as a way to identify affiliation with the brand of a movement, community, organization, or something else of value? What if we issued badges at the beginning, before there is an actual accomplishment, achievement, or demonstrated competence? What if the badge were used to mark one’s start and commitment to a brand?

The fields in the Open Badge Infrastructure (OBI) already lend themselves toward such an application: description, criteria, issuer, issuing date, expiration date, etc. The expiration date would be a way to check in on a person’s commitment to the brand or community. Does it persist? Do they have new actions or accomplishments that can be recognized or made visible with additional or supplemental badges? Even without expiration dates, it would be easy enough to stamp badges with dates of membership, service, or affiliation, allowing them to be yet another way to visually represent the “brands” with which a person has been or is currently affiliated. The description or criteria fields could just as easily describe the nature and extent of the affiliation. In the end, such uses could be a way to generate an entirely new form of visual and verifiable resume.
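To illustrate, here is a rough sketch of what such an affiliation badge might look like, rendered as a simple Python dictionary. The field names echo the OBI fields mentioned above, but the specific badge name, URLs, and dates are hypothetical placeholders, not a complete or authoritative OBI assertion.

```python
# A hypothetical "affiliation" badge using the OBI-style fields named above.
# Everything here (names, URLs, dates) is an illustrative placeholder.
import json

affiliation_badge = {
    "name": "Concordian",  # affiliation with a community, not an achievement
    "description": "Issued at matriculation to mark membership in, and "
                   "commitment to, the university community.",
    "criteria": "https://example.edu/badges/concordian",  # placeholder URL
    "issuer": "https://example.edu",  # the organization vouches, not the individual
    "issuedOn": "2015-01-15",  # marks the start of the affiliation
    "expires": "2016-01-15",   # a natural point to re-check the commitment
}

print(json.dumps(affiliation_badge, indent=2))
```

Because the badge is issued and hosted by the organization rather than self-asserted, the expiration and reissue cycle becomes the “check-in” on the affiliation described above.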

Perhaps there are already cases or user stories of open badges being applied in such a way, but I have not noticed them. As such, this possibility creates an entirely new (at least to me) set of options for how badges, which are notably visual symbols with metadata, might serve as an additional way to manage and represent oneself online. Some might argue that this is just as easily done by listing your affiliations on a resume, and perhaps that is just as effective. Yet, one distinction here is that the affiliation and symbol are issued by the organization, adding a level of verification and potentially a measure of credibility and “klout” that exceeds self-reporting.

Have you seen such use of open badges? What possibilities can you imagine? What challenges and opportunities are created by such a use?

At Educause 2006, Georgia Nugent, the then-president of Kenyon College, gave a talk on “The Tower of Google.” You used to be able to listen to the entire talk here, but the link on their site stopped working a couple of years ago. It was a thought-provoking presentation, and one of her self-made buzzwords stuck with me. She described her background as a classicist, but also explained her hope for the potential of technology in higher education. She called herself a Luddvocate (here is a quick Wikipedia primer on Luddites if that is a new term for you). I can relate.

I still find that the most intriguing books about technology were written by the self-proclaimed or often-labeled neo-Luddites (Mumford, McLuhan, Ellul, Postman, Kirkpatrick Sale, Larry Cuban, Sven Birkerts…). These neo-Luddites craft messages of caution. They plead for counting the cost of our technological escapades. They challenge the notion that technology is the savior of the greatest social and human needs, and they highlight the adverse impact of technology in society. I read these texts and find myself shouting more than a few inner Amens to their sermons. These thoughtful texts give a perspective that I believe is valuable and needed in the modern world. Of course, there are some who, like the original Luddites, turn to violence and destruction (e.g. Theodore Kaczynski), and I’m quick and clear about rejecting those methods of dissent.

Luddism is not about being anti-technology in the same way that the Amish are not anti-technology. As I’ve written before, the Amish are not anti-technology as much as they are pro-community. Similarly, Luddism is about counting the cost of technological progress, not assuming that new technology is always a universal gain for humanity. It is recognizing the values-laden and intrinsically political nature of each technology. New technologies lead to new winners and losers. Luddism champions and gives voice to the losers in the race for technological progress. The original Luddites were moved to action by new machines displacing workers in the textile industry. For the sake of increased productivity, machines replaced people, and that affected their ability to feed their families. Luddism is about challenging us to be users of technologies instead of allowing ourselves to be used by them.

Then there is the advocate. While I’m not sure how Nugent might further elaborate on what it means to be an advocate, I confess that I became one because I didn’t think I had much of a choice. Just as the original Luddites lost their fight against the vision of progress brought about by the industrial revolution, I suspect that many neo-Luddites will experience the same. What is a person to do? I distinctly remember struggling with this question throughout the middle of the 1990s, with my first regional presentation being about the negative implications of technology in education, delivered at an educational technology conference in Chicago. I still remember the line of software vendors glaring at me with their arms crossed along the back of the surprisingly standing-room-only session. I quoted Neil Postman freely as I warned about the “Faustian bargain” of new technologies. I didn’t, however, call for rejecting these developments. Instead, I took the hopeful position of striving to influence which technologies to amplify and which to muffle. I called it values-driven decision making. Identify your core values and convictions and let them drive your decisions about technology. Once you have clarity about those core convictions, you will be able to decide where, how, when, and about what to advocate. I felt good about my talk until the first question during the question-and-answer period: a principal wanting a list of the best software to use in his kindergarten classes. So much for values driving the decisions. I learned early on that technology in education had become a value of its own.

I found that many of my concerns about technology were connected to what I considered to be the dehumanizing effects of the industrial revolution. Don’t get me wrong. I liked many of the benefits of the industrial revolution. It is just that I had, and have, hope that emerging democratizing technology might assist us in mitigating some of the negative aspects of that era.

I had visions of Brave New World, 1984, The Giver, and the dozens of dystopian stories of our future. So I chose to advocate for technology that seems to amplify the values of democracy, access, and opportunity. It is what inspires me about the possibilities of everything from blended learning to online learning, alternative education to self-directed learning, open education to open badges and micro-credentials, personalized learning to adaptive learning software, project-based learning to social media, personal learning networks to communities of practice. It is why I can geek out about designing high-impact online learning communities while being a passionate supporter of existing and emerging physical third spaces (thank you, Ray Oldenburg) that conjure a spirit of community. It is why I lobby for choice and variety in options for formal education. It is why I have degrees in both the humanities and instructional technology. It is why I am a champion for formal higher education while calling for those same institutions to resist the temptation to claim a monopoly on learning and knowledge, as if either were a commodity to be bought and sold. Each of these represents deep-seated personal values and convictions about truth, beauty, goodness, purpose, and what it means to be human. It shapes how I write and speak about the future, both forecasting and striving to create or influence possible futures. Ironically, it is the Luddite in me that drives me to be such an advocate for life and learning in the digital world. The Luddite is the one who cries out that our ideas have consequences, that our convictions matter, that human access and opportunity are noble causes, and that all three should inform the futures that we help create.

“We’ve always done it that way.” Universities are rich with traditions and history, but it would be a mistake to think that what we see and experience in the Universities of the last 50 years mimics what came before them. Yes, perhaps certain teaching practices and structures have persisted. However, the curriculum has been in flux, adjusting to broader changes in society.

Look at the history of higher education and you will see a history of change. The first Universities in the world were in Morocco, Egypt, and what is now Iran, founded between 800 and 1100 AD. The first Western Universities emerged near the end of the 11th century: the University of Bologna, the University of Paris, and the University of Oxford. In the earliest Universities, areas of study were not nearly as extensive. Theology, medicine, and law were among the dominant areas of study in these early years, and the modern concept of academic disciplines did not come along until the 1800s, spreading around much of the globe by the end of that century. Essentially, these disciplines emerged with the scientific revolution, with different disciplines eventually representing distinct methods and approaches to seeking and understanding “truth.” For example, we saw a shift from “natural historians” to physicists, biologists, and chemists. Early in the 20th century, we saw the growth of new disciplines in the social sciences, resulting in programs like psychology and sociology. It was not until the mid-to-late 1900s that we saw the rapid growth of modern programs like nursing, business, and a host of specializations in areas like gender and ethnic studies. As such, the modern idea of a University offering hundreds of majors is indeed a modern idea. Many of the largest disciplines in colleges today have a relatively short history.

It is no surprise to see yet another expansion of University degrees. The scientific revolution brought forth distinct majors in the hard and soft sciences. The industrial revolution brought about a myriad of professional and career-track majors. Now, in the 21st century, we see another collection of degrees emerging in response to broader trends in society. This time we see interdisciplinary programs addressing the nature of life in an increasingly digital world. Consider that none of the following degrees existed thirty years ago, and some did not exist even ten years ago.

  1. MA in Telecommunications with an emphasis in Digital Storytelling – Ball State University
  2. MA in New Literacies and Global Learning – North Carolina State
  3. PhD in Media Psychology – Fielding Graduate University
  4. MS in Game Design – Full Sail University
  5. Master of Internet Communications – Curtin University
  6. MA in Social Media – Birmingham City University
  7. MS in Digital Marketing – Sacred Heart University
  8. MA in Digital Humanities – King’s College London
  9. MFA in Digital Arts and New Media – University of California Santa Cruz
  10. MS in CyberSecurity – University of Maryland University College
  11. MBA with a specialization in E-Business – Eastern Michigan University
  12. Master of Distance Education – University of Maryland University College
  13. MA in Digital Journalism – National University
  14. MS in Digital Forensics – University of Central Florida
  15. Doctor of Ministry in Leadership in Emerging Culture – George Fox University

New degrees are emerging in response to the digital age. There are degrees ranging from education to business, criminal justice to psychology, literacy to theology, journalism to communication. Some look at such programs with concern that Universities are over-specializing, but this seems to be representative of a century-old trend in higher education. As new areas of need and interest emerge in society, higher education responds with new majors, degrees, and specializations. Even as new fields emerge, some of those fields converge to create new, interdisciplinary areas. This is the case in an area like educational technology, which has roots in library science and audiovisual studies, educational psychology, and even military training.

There is something different about some of these newer degrees. While some are still quite broad (like Internet studies or digital arts), others are very specialized. The scientific revolution produced physicists and biologists, and those developed into distinct fields with unique methodologies. Many of these new majors are not fields as much as they represent distinct skill sets and competencies, or the ability to apply the core aspects of a field or area of study in a new or distinct context. These are also areas that seem to be far more fluid and fast-moving, leaving one to wonder whether University degrees are the most responsive and effective ways to prepare people in these areas.

While some Universities are creating such specializations with the hope of reaching and recruiting new students, it is uncertain whether these hyper-specialized degrees give the breadth necessary in a constantly changing digital world. It is no coincidence that the 15 degrees listed above are graduate degrees. Scan the workplace for people with these degrees and you are likely to see a massive number of them working outside the specialization represented in the degree. Graduates of these programs who are working in the specialties are often working alongside peers with comparable ability who do not have such specialty degrees. As such, these are not gate-keeper degrees. While one might opt to pursue such a degree as a means of preparation, there are equally accepted alternatives, even simply demonstrating that you are competent to do the job. A person with 3-5 years of experience as a successful marketer in digital spaces will probably beat out the recent graduate of a digital marketing degree who hasn’t actually done it. The degree doesn’t have greater value than comparable experience in the marketplace. This is different from past eras of new degree growth.

This leaves space for innovation and micro-disruptions. While I do not expect to see higher education institutions moving away from adding more such degrees in the near future, I expect these specific areas to be prime candidates for the trends toward nano-degrees, certificate programs, and more granular training programs recognized by digital badges and other such credentials.