The Promise, Peril, and Possibility of Data, Analytics, and AI in Higher Education (5 of 7): Organizational Health Dashboards

A university with a series of off-campus locations reviewed its dashboard of key performance indicators. After a couple of years of enrollment decline, and informed by the location dashboards, leaders decided to close two locations. Only, in doing so, they inadvertently closed two of their most financially sound locations. They also unknowingly closed locations that offered an unrecognized benefit: serving as a form of brand awareness for the university's online degree programs. What they expected to be a cost savings turned into a multi-million dollar loss for the college.

How is this possible? It was a poorly developed dashboard, and it was used just as poorly. Many people in higher education are rushing to build impressive dashboards of key metrics, but many are not spending enough time thinking about what they need, the affordances and limitations of different data dashboards, and how to extract the most helpful and actionable insights from those dashboards. In some ways, the use of data in higher education today is going through the same struggles as educational technology in the 1990s. When the combination of computers and the Internet started showing up in schools, people were so focused on the cables and hardware that most failed to devote enough thought and time to how any of it would fit into specific educational goals. They also didn’t figure out how to equip educators with the necessary skills (not technology skills as much as new teaching skills). The same thing is happening in the age of data dashboards in education.

When a school begins to identify key performance indicators, collect data, and build dashboards, the job has just begun. It might be argued that the job should have begun long before that, with a clarification of goals and values. Once goals are established, then we can ask how to gain rich and valuable insights about those goals. At this point, people are persistently tempted to go with what is easiest to measure instead of measuring what is most important to the school’s goals and values. Every single time, doing this creates a new set of goals and values (even if they look and sound similar).

When I first started to make use of dashboard data for building an online education unit, we had a team of four leaders who met twice a week for two hours at a time. We did this for six months, analyzing reports and key indicators of success. Through this we developed a shared understanding of the data. We learned how to make sense of the nuances. This might sound time-consuming, but it was worth every minute.

We learned how to use the data in service of our goals and values, and we helped each other resist the temptation to shift our work to be in service of the data. For example, an admissions director, striving to meet enrollment goals, found himself tempted to count certain new enrollments just to reach his target more easily. Only, those new enrollments were for a special program that we subsidized out of a separate budget. Putting those enrollments in the mix would have misrepresented the financial situation of the college. He might have felt better about his “numbers,” but that was too narrow a view. As a result, we created a different way of measuring enrollment and a more nuanced set of goals for his team. Without meeting and developing a shared understanding, we would have missed this important consideration about overall organizational health.
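To make that concrete, here is a minimal sketch (in Python, with hypothetical names such as `funding_source`) of how a more nuanced enrollment metric might keep subsidized special-program enrollments visible as their own count rather than folding them into one headline number:

```python
from dataclasses import dataclass

@dataclass
class Enrollment:
    student_id: str
    program: str
    funding_source: str  # hypothetical field: "standard" or "internal_subsidy"

def enrollment_report(enrollments: list[Enrollment]) -> dict[str, int]:
    """Report standard and subsidized enrollments as separate counts
    so a single headline number cannot hide the difference."""
    standard = [e for e in enrollments if e.funding_source == "standard"]
    subsidized = [e for e in enrollments if e.funding_source == "internal_subsidy"]
    return {
        "standard_enrollments": len(standard),
        "subsidized_enrollments": len(subsidized),
        "total_enrollments": len(enrollments),
    }

# Example: 120 standard enrollments plus 15 subsidized ones show up as
# three numbers on the dashboard, not one inflated total.
```

The point is not the code itself but the design choice: the dashboard reports separate numbers instead of one, so the conversation about enrollment targets and the conversation about finances stay connected.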

People find themselves treating dashboards and metrics like a game in which the goal is to reach a certain level or number of points. Only, that is usually too simple an approach, and it rarely helps us understand the overall health of the program or organization. It leads to thinking in silos.

Dashboards and metrics are certainly useful in determining the health of an organization. As a baseline, we usually want data about pre-enrollment patterns and trends; admissions data; early enrollment data; data that provide insight into student progress, retention, success, and satisfaction; a way to gauge employee satisfaction, engagement, and quality of work; data about student learning; and key financial indicators. It is imperative that we figure out how to gain actionable insight in these areas through some sort of shared effort. This is not an “IT thing” to be passed off or entrusted to the technically minded. It is best done as a cross-unit effort that benefits from various voices and perspectives.
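For teams working out the technical side, even a simple, shared catalog of indicators can anchor that cross-unit conversation. The sketch below is illustrative only; the metric names and owning units are placeholders that a real team would define together:

```python
# A hypothetical catalog of baseline indicators. The categories mirror
# the list above; the specific metrics and owning units are placeholders.
ORG_HEALTH_INDICATORS = {
    "pre_enrollment":   {"owner": "Marketing",        "metrics": ["inquiries", "applications_started"]},
    "admissions":       {"owner": "Admissions",       "metrics": ["applications_completed", "admit_rate", "yield"]},
    "early_enrollment": {"owner": "Registrar",        "metrics": ["deposits", "first_term_registrations"]},
    "student_progress": {"owner": "Academic Affairs", "metrics": ["retention_rate", "course_completion", "satisfaction"]},
    "employee_health":  {"owner": "Human Resources",  "metrics": ["engagement_survey", "turnover"]},
    "student_learning": {"owner": "Assessment Office","metrics": ["program_outcomes_met"]},
    "finance":          {"owner": "Finance",          "metrics": ["net_tuition_revenue", "cost_per_completion"]},
}

def owners() -> set[str]:
    """Every unit that needs a seat at the table for the dashboard conversation."""
    return {area["owner"] for area in ORG_HEALTH_INDICATORS.values()}
```

Agreeing on even an informal structure like this forces the question of who owns each area and helps keep the dashboard from becoming an “IT thing.”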

Outside groups will promise to build beautiful dashboards about organizational health. While you can tap into external expertise, beware of giving away too much. Engaging an internal team is an opportunity to create a shared vocabulary while infusing your distinct beliefs and values into your use of data. If you opt for an outside group, seek true partners, not just people who are willing to do it for you and then hand it over. That rarely turns out well.

Especially in this tumultuous time for higher education, it is important to have a means of monitoring the health of the organization, and you probably already have access to the data needed for this. Yet it is equally important to recognize that what you measure, how you measure it, and how you think about, analyze, and discuss what you measure will be a reflection of your core beliefs and values as an organization.