The New GED Test: Who are the Winners & Losers?

On January 2, 2014, a new version of the GED (General Educational Development) test was published. What changed, and what are the results? Let’s answer the second question first. According to this NPR article, the number of graduates declined by 85% after the new test was introduced, dropping from more than 400,000 people passing the test in 2012 to fewer than 60,000 in 2014. So what changed?
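For a quick sanity check, the rounded figures above are consistent with the reported 85% decline. Here is a rough sketch using approximate numbers (an assumption on my part, since exact totals vary by report):

```python
# Approximate GED passer counts (rounded figures from the NPR article above;
# exact totals may differ slightly).
passers_2012 = 400_000
passers_2014 = 60_000

decline = (passers_2012 - passers_2014) / passers_2012
print(f"Decline in GED graduates: {decline:.0%}")  # -> 85%
```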

  • They doubled the price from $60 to $120 and partnered with the for-profit Pearson to administer the test.
  • A credit card is the standard form of payment (although they have other options as well).
  • It is now a computer-based test.
  • They changed the standards upon which the test was built, aligning it with the Common Core State Standards.

Advocates for the new test argue that it more accurately reflects the standards that high schools expect for a traditional diploma and that employers expect when hiring people with a diploma or its equivalent. Yet simply looking at the four changes above, there appears to be more to the story.

I’m compelled to return yet again to one of Neil Postman’s important questions for evaluating new “technologies” or innovations: “Who are the winners and losers?” How might we answer that question for this new test? Certainly Pearson has something to gain, since this is a new revenue stream for the company. What about employers? Are employers raving about how GED graduates of 2014 are so much more effective on the job than those who earned their GED in previous years? What about the fact that so many fewer GED graduates are available? I suppose there may be some employers out there whose businesses can handle simply leaving positions vacant until a larger percentage of people can pass the new test and therefore be qualified for employment. Or they can just hire people who already have a GED or diploma. More likely, people who pursued the GED after January 2014 but failed the test are simply out of luck. They may be just as qualified as a test-taker from a year earlier, but they get passed over for the job because they don’t meet the minimum requirements.

That brings us to thinking about who the losers are with this new test. The people who don’t pass are an obvious group in this category. Before I even get to the test itself, I’ll start with the payment system. A credit card is the standard and expected method, although other options are accommodated. Even so, the payment process has been complicated by this new test. Why put unnecessary barriers in front of students? How does this new payment system represent the best interests of these students?

Now to the impact of failing this test. People who don’t pass it have fewer opportunities available to them. What happens to people without a high school diploma or GED? While I respect the caution against confusing correlation with causation, people without a GED or diploma are more likely to be unemployed, imprisoned, and/or stuck in a cycle of poverty. We can dismiss these realities by explaining how we must maintain high standards and “academic rigor” through this new GED, but this is not about maintaining. They raised the bar, and to the best of my knowledge, they did so without any substantive body of research showing how this will produce greater social good, better serve the well-being of people in GED programs, or even benefit the employers of people with a high school diploma or GED. Simply saying that these new standards will produce more capable employees is not adequate. Show me the evidence. Look at the types of skills required of people in jobs with a prerequisite of a GED or high school diploma but no higher credential. Show, through a workforce skills assessment, that the old test inadequately prepared people for those jobs while the new test does. That will at least help me reconsider my position.

This is too massive a shift to just guess or even lean on what some claim to be common sense. There are too many complexities, too many people’s life situations at stake, and too little hard data to support the decision. If we really wanted to pursue such a development without such massive potential negative implications, I have another simple suggestion, one that was overlooked or disregarded. Why not design a test based upon both the 2013 and 2014 standards, recognizing those who meet or exceed the 2014 standards as graduates with an honors GED and the others with a standard GED? This way we raise the bar without using a nation of GED students as guinea pigs.

Yet that is not what happened. They changed everything overnight and have done little to address some of the potential harm inflicted on people and society. As a result, some states have abandoned this test as a requirement for a GED, opting for alternatives and deviating from what had been the standard GED test for over 60 years. I guess if you are doubling the price of the test, you can handle losing half your customers.
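The arithmetic behind that closing quip is worth making explicit: at double the price, revenue breaks even only if at least half the customers remain. A minimal sketch, using the rounded figures above and treating passer counts as a rough proxy for paying test-takers (an assumption, since many who pay do not pass):

```python
# Price per test before and after the 2014 change.
old_price, new_price = 60, 120

# Break-even point: doubling the price exactly offsets losing half the customers.
customers_2012 = 400_000
print(old_price * customers_2012 == new_price * (customers_2012 // 2))  # True

# But passers fell ~85%, far more than half. If test-taker counts fell
# similarly, revenue fell as well (passers used here as a proxy for customers).
print(old_price * 400_000, "vs", new_price * 60_000)  # 24000000 vs 7200000
```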

Getting Good at Getting Good #deliberatepractice

In the 1993 film Groundhog Day, Phil Connors, a less-than-pleasant weatherman, finds himself repeating the same day over and over again. The movie tells the story of how this man used this strange experience to learn. He learned more about a special woman in his life. He learned to play the piano. He learned how to become a good person. This is a comedy, so he also takes advantage of the situation to make more than a few careless decisions and take risks he would never have taken before. After all, he had a daily “do over” regardless of the outcome.

My favorite scene in the movie is the piano scene. He is on stage playing these impressive jazz riffs on the piano, something he couldn’t do the “day” before. It was a simple but brilliant reminder about how we get good at things. We practice. We do it over and over again. The moment we start to talk about “being good” at something, the conversation often turns to the nature versus nurture debate. Some people are just born musically gifted. Others are not. Or maybe it is math, basketball, sales, leadership, listening, conducting ethnographic research, teaching, photography, or starting a successful business. I don’t deny the role of genetics. It is just that the vast majority (as in 99.999999%) of people don’t become world-class in any of these things on the basis of genes alone. For that, we need lots of practice.

When I think about people from my life who have become truly exceptional in their field or discipline, I see an obvious pattern. Not only were these people devoted to practicing and refining their skills in that one impressive area; they often wired their brains to think about many areas of life in the same way. These are people who learned the benefit of practice and developed the mindset that they could get better at pretty much anything through practice and persistence, whether it was gardening, playing an instrument, playing chess, an athletic pursuit, playing cards, home decorating, or fixing cars. They set a goal, practiced, and modeled a wonderfully deliberate, thoughtful, reflective approach that showed a commitment to improving.

People get really good at something through a process that is simple but profound, something that we can easily doubt or forget, only to find ourselves regretting it years later. We get good through deliberate practice. Both of those words are critical. We all know that practice is important, but that lesson doesn’t come to life until we experience the benefits of practice and reap the rewards. It isn’t just practice. Bad practice is a great way to stay bad at something. That is where the first word is so important, “deliberate.”

Interestingly, the classic article about deliberate practice was first published in 1993, the same year that Groundhog Day hit the theaters. Ericsson, Krampe, and Tesch-Romer published “The Role of Deliberate Practice in the Acquisition of Expert Performance.” The article starts:

The theoretical framework presented in this article explains expert performance as the end result of individuals’ prolonged efforts to improve performance while negotiating motivational and external constraints. In most domains of expertise, individuals begin in their childhood a regimen of effortful activities (deliberate practice) designed to optimize improvement. Individual differences, even among elite performers, are closely related to assessed amounts of deliberate practice. Many characteristics once believed to reflect innate talent are actually the result of intense practice extended for a minimum of 10 years. Analysis of expert performance provides unique evidence on the potential and limits of extreme environmental adaptation and learning.

This is not mindless practice and repetition. It is deliberate. The structure of the practice matters. An example in the Ericsson, Krampe, and Tesch-Romer article has to do with handwriting. I’ve been writing for a long time, but my handwriting still looks like chicken scratch. It isn’t enough that I’ve been doing it for years. If I want to drastically improve the quality of my handwriting, that will require deliberate practice that is structured in a way that will help me improve. Sometimes that comes from an expert coach or mentor. It might develop as I watch and systematically learn from others. It can happen in several ways. What is important is that it moves from simple experience and repetition to something more intentional, systematic, and structured in a way that results in increased performance over time. Things like feedback and reflective practice become important, allowing me to learn from my experiences and to improve upon past performances. As such, the article notes four important elements of practice:

  • motivation that results in attending to the task(s) and extending the effort necessary to improve,
  • practice that accounts for prior knowledge and skill (different types of practice for different levels of experience and background are often important),
  • frequent feedback about the quality of the person’s practice, and
  • repetition.

Put these four together and we get deliberate practice.

If I were starting a new school, business, or any organization, I would want to fill it with teachers or employees who’ve been poisoned by the joy and addiction of deliberate practice. I want people who know what it takes to get good at something, and who know it from direct experience. I’m especially interested when I see a person who’s done this in several unrelated domains. This demonstrates to me that they know how to learn something new. As long as I’m convinced that they are committed to getting good at what we are doing in the school or organization, and I see a track record of learning to get good at things, I see promise.

It is popular these days to create top ten lists of important skills for young people in the 21st and 22nd centuries, but I’m not going to give a full list of ten. I’ll just start with one. A powerful 21st century skill is getting good at getting good at something. Perhaps this is a bit too simplistic, but I might even be willing to drop my list of 21st century skills down to five if I can make this one of them. Imagine what would happen if we set aside long lists of standards and outcomes for a handful of life-changing skills like this. What would happen if we had learning organizations that nurtured young people who were world-class at becoming really good at things…at anything they set their minds to doing?

This is not a simple task. You don’t get really good at something in a semester, maybe not even in a year or four years. So our understanding of time and pace might have to change. Becoming world-class is usually a multi-year task, often a decade or longer. Also, the four conditions for deliberate practice are not easily dropped into many traditional schools and classrooms. There are policies, practices, and traditions that stand in the way. Perhaps that is why so many people develop their life’s passions and pursuits beyond the walls, confines, and hours of the school day. They do it in areas where they can engage in long-term deliberate practice.

This is not just an important attribute for formative education. I consider this an important social good when we are talking about adult education, workforce development, and addressing skills gaps in society as well. Workforce development divorced from personal development may address immediate needs in industry along with the worker’s immediate need for a paycheck. However, what happens when that task is no longer in demand? That person risks being out of a job. That is why I contend that the most humane approaches to workforce development help people achieve specific job skills but also offer them guidance on developing life skills that will allow them to thrive in a workplace of constantly changing demands. That is why we invest in helping people discover the skill (and joy) associated with using deliberate practice to get good at something new. Without such a skill, I’ve seen too many people become bitter, feeling trapped and disenfranchised, overwhelmed and at the mercy of a single employer. If I truly value human agency, then my vision for education has to include helping people learn the “secret” of becoming skilled.

Conference on Meaningful Living & Learning in a Digital World 2015

Like many of you, my inbox fills up quickly. If I check my mail before going to sleep, I wake up at 6:30 or 7:00 AM to 50+ new emails. A third are from people in different time zones. The other two-thirds are newsletters, Google news alerts (my daily me), ads, and announcements. Amid that influx of emails, one subject stood out last week: a conference on Meaningful Living and Learning in a Digital World, scheduled for May 27-29 in beautiful Savannah, Georgia. It was refreshing to see a conference devoted to the human side of living and learning in an increasingly high-tech world.

As some of you know, my doctorate is in instructional technology, but in some ways I hacked the program to focus upon the social, cultural, psychological, and philosophical side of life and learning in a technological world. In fact, to add more of a humanistic bent to my doctoral work, I completed a second master’s in humanities while writing my dissertation.

I’ve always been drawn to questions about human implications, whether it was the unexpected health implications of children’s early immersion in technology-rich contexts or what Neil Postman calls the Faustian bargain of technology. As part of this thinking, I’ve gone on inquiry walkabouts that included the study of hacker culture, Amish culture, the history of the Luddite movement, and neo-Luddite perspectives on technology. As much of an advocate as I am for educational innovation and leveraging technology for social good, I continue to welcome the civil war that goes on inside of me regarding the unexpected and/or negative impacts of technological advancements.

Our innovations will almost always develop faster than our ethics and moral compass in the digital world. That is why conferences like these are refreshing and important. I’m not sure that I’ll be able to make it this year, but I wanted to at least post this to demonstrate my support for it. Thank you to all who are making such an event possible!

Workforce Development, the Skills Gap, & the Limits of the College Solution

I am starting to turn a corner in my thinking about workforce development and equipping people for many of their life callings. I’m still an academic, and I’m not ready to throw out the idea of the university. Sectors that old don’t just disappear overnight. They adjust, adapt, and pivot, but they do not usually disappear. Yes, individual colleges and universities have had to close their doors, but the idea of the college and university remains alive and well. Nonetheless, conversations about workforce development call for a multifaceted approach that embraces college as one of many important elements.

It would be a mistake to think that institutions of higher education have stuck around in largely unchanged formats, or that they are the sole means of preparing people for work and life. Higher education has experienced massive overhauls over the last thousand years: who is admitted (and who is not), who is taught (and who is not), how progress is determined, how learning is categorized, the concept of disciplines and fields of study, how credentials are earned (and what they mean), what is learned, how it is learned, when it is learned, where the learning takes place, how it is funded, how success is measured, and what (if anything) is measured. These have been in flux for over a millennium. At the same time, we still see many ways in which people prepare for life and work that do not include college or what is sometimes considered the “traditional college experience.”

This is sometimes not understood, or it is forgotten. “The university” or “the college degree” is often discussed as if it were one universal thing. Journalists and academics who experienced residential four-year college programs too often write and think as if that were the typical college experience, when it is far from it. There is less attention paid to the commuting community college student, the evening student pursuing a degree while working and/or caring for family, the online learner, or the student studying at a satellite site of a university that doesn’t have athletic teams or even student clubs and groups. Scan the list of the largest colleges in the United States, and the top of the list includes names like the University of Phoenix, Ivy Tech Community College, Ashford University, and American Public (Military) University. That is a bit deceptive, however, because narrowing it to the largest undergraduate institutions results in a completely different list: University of Central Florida, Texas A&M, Ohio State, Penn State, and the University of Texas at Austin. Regardless, there are no universally accepted elements of the perfect college experience.

My point is that there is value in thinking about preparation for work and life more broadly than college and degrees. Following are two ways to do that, especially with workforce development in mind.

Alternatives to College

There is a growing national conversation in the United States about the skill gaps of the present and future. How do we prepare people for the job openings of the future? I commend President Obama’s call to consider free community college as part of the solution. That will help. However, we would be unnecessarily limiting ourselves if we only looked to colleges and universities. I am increasingly confident that we will find far more scalable and sustainable solutions if we expand our awareness of the possibilities. To focus only on colleges and universities to address the skills gap would be like focusing on adding more restaurants to solve a problem of hunger and malnutrition. There are other ways.

  • There are promising education startups.
  • There are efforts where employers are taking the initiative to build their own training programs for certain prospective employees.
  • There are partnerships between companies needing new employees with certain skills and training providers (sometimes colleges and universities, but often just education companies).
  • There are self-study opportunities that lead to assessments where people can demonstrate competence and/or readiness for a given job.
  • There are well-known certifications through professional organizations and companies that open the door to certain employment opportunities.
  • There are alternate credentialing systems (think digital badges) that can be used to document learning regardless of the learning pathway. With the continually growing number of free and open resources for learning online, why not embrace them as ways to help people prepare for various jobs, leveraging open badges or other emerging credentials to signify when someone is ready?

Think About More than Degrees

Even when we think about the college and university as a solution, it is important not to get too sidetracked by the idea that a degree is the only valid or useful outcome. Learning takes place even when the final credential is not awarded. How might we recognize and credential that progressive learning so that people can step out and into a job, only to return later for further training and new opportunities? What would it look like to design the college or university experience (or at least some of them) with such built-in options? What about a program where you are credentialed for certain jobs after a semester or a year, but you keep studying while working in that job? Now you have employment, but you are also working to become prepared for a more advanced position in the near future. This might help address workforce development and the cost of higher education at the same time. This is not how schools or most government entities related to education think. As it stands, schools would be penalized for such a model because students would be recorded as drop-outs when they are really just stop-outs, and the stopping out led to employment. This is not a quick change, but it is doable, and I expect to see more higher education institutions re-discovering that they have much more to offer in terms of preparation for work than courses, credits, and degrees.

How do we address issues related to workforce development and skill gaps? The answer doesn’t come from just trying to produce more college graduates. That can be part of the solution, but I contend that we will make much more progress by 1) increasing access and opportunity for traditional college degrees, while also 2) reimagining college in terms of progressive credentialing, and 3) looking beyond college for solutions.