This has been a big year for DataArts and for all of us who work with its Cultural Data Profile (CDP) data. DataArts is a valued, primary data partner of Southern Methodist University's National Center for Arts Research (NCAR), and data from its CDP power NCAR’s Key Intangible Performance Indicators (KIPI) dashboard and inform our reports and white papers.
A key reason it was a big year was massive change, ultimately for the better. It centered on the conversion from an outdated survey that required every respondent to see and work through more than 1,250 possible responses to a new, modular format with more internal checks and balances.
Three adages rang true for us at different moments during the journey from the old system to the new. They serve as reminders of my very human tendency to blame the system when the real nature of the error was PICNIC (Problem In Chair, Not In Computer).
Adage 1: Change is not for the faint of heart. Every time I get a new cell phone I cringe at the prospect of what might go wrong in the transition. In fact, my aversion to the hassle-factor is so high that I only get a new cell phone when the manufacturer tells me that my phone is so old that it will no longer be supported. There is fear of losing valuable information. One wrong move in the transition process and you risk losing your contacts or erasing your calendar.
Building up to the big DataArts Cultural Data Profile conversion, I suffered the same anxiety. The very idea of having to re-map 1,250 variables was enough to make me want to close up shop. An awful lot can go wrong with that much information in play, such as loss of data reliability or loss of data entirely. As a researcher, lose either and you lose trust in your work. I kept hoping there would be a back-door way to change without really changing, like getting data exports that would look exactly like the ones we had always received, but faster and better.
Adage 2: Change is never easy. You fight to hold on. You fight to let go. My new phone anxiety extends from losing information to resistance to learning a new system. It is disorienting. The only way to get through it is to invest time reading and following the instructions. Who has time for that? And yet who has time to suffer the consequences of not doing it?
Transitioning to anything unfamiliar is uncomfortable and frustrating at first, but just because something is unfamiliar doesn’t make it wrong. Both DataArts’ and NCAR’s reputations rest on the cleanliness of the data. Data integrity means that the data is accurate, consistent, and reliable. It is the opposite of data corruption, or unintentional changes to information. Data integrity requires scrupulous attention to detail.
DataArts invested significant time re-mapping the old CDP survey questions to the new ones. So did we. We put on our hip boots and waded into the weeds of every new line item to understand how it relates to line items in the old survey. When we finished, we started again from the top, just to be sure. It is the only way to get through the transition and come out the other side with confidence. Our colleagues at DataArts were tremendously helpful in providing documentation and answering my reams of pesky questions along the way.
The new CDP is arranged quite differently from the old one. It collects a lot of new information, and the order of the retained questions has changed. Initially, it is disorienting and hard to find what you’re looking for. Many questions are asked differently, so what used to be three line items may now combine into one, and vice versa. When all is said and done, no information was lost; it just takes a while to find its new location.
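To make the re-mapping concrete, here is a minimal sketch of the kind of crosswalk work involved, written in Python. The field names and the one-to-many groupings are hypothetical placeholders, not actual CDP variables; the point is simply that a new line item may inherit from one old item or sum several.

```python
# A minimal sketch of an old-to-new crosswalk. All field names are
# hypothetical placeholders, not actual CDP variable names.

# Each new-survey field maps to the old-survey field(s) it replaces.
CROSSWALK = {
    # one-to-one: same question, new location
    "earned_revenue_ticket_sales": ["ticket_sales"],
    # many-to-one: three old line items roll up into one new item
    "contributed_revenue_individual": [
        "individual_gifts_board",
        "individual_gifts_major",
        "individual_gifts_other",
    ],
}

def map_old_record_to_new(old_record: dict) -> dict:
    """Translate one organization's old-survey record into the new layout,
    summing old line items that the new survey combines."""
    new_record = {}
    for new_field, old_fields in CROSSWALK.items():
        new_record[new_field] = sum(old_record.get(f, 0) for f in old_fields)
    return new_record

# Example: an old-survey record with three separate individual-giving lines
old = {
    "ticket_sales": 250_000,
    "individual_gifts_board": 40_000,
    "individual_gifts_major": 100_000,
    "individual_gifts_other": 15_000,
}
print(map_old_record_to_new(old))
# {'earned_revenue_ticket_sales': 250000, 'contributed_revenue_individual': 155000}
```

The real exercise, of course, involved far more line items and plenty of judgment calls about which old questions truly correspond to which new ones.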
The reward for the big investment required to connect the dots between the old and new surveys is the assurance that the integrity of the data remains intact. NCAR is in the process of launching an update to its KIPI Dashboard, which draws on both the old and new CDP surveys to show each organization its raw data trends and its KIPI trends. We could not be more pleased to report that our greatest fears about loss of data integrity in the transition were unfounded. There will be some wonkiness in some organizations’ scores in some years, but no more so than before the conversion. For the most part, the trends follow a smooth and logical pattern. When they don’t, it is due to unusual activity that occurred in those years for an organization or to the user entering erroneous information (e.g., reporting 1,000,000 employees rather than $1,000,000 in employee compensation).
Adage 3: Off with the old and on with the new. The thing is, once you live with the new phone for a while, you realize that it actually has some nice new features.
The new CDP survey order actually makes more intuitive sense (e.g., questions about pricing now sit next to questions about earned revenue). Being modular, it collects far more comprehensive detail on the types of activity relevant to the kind of organization you tell it you are, while freeing you from wading through irrelevant questions (e.g., it asks for a lot of meaty detail about exhibits only if you selected “Exhibits” as a program activity). A few sections and questions from the old survey have been eliminated entirely, but DataArts did its homework first and determined that users found them unnecessary. If enough users really miss a question, DataArts is open to reinstating it. Best of all, it has upped the ante on internal checks that help users catch their own mistakes, which ultimately increases the cleanliness of the data.
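For readers who like to see what such an internal check might look like, here is a minimal sketch in Python, assuming simple plausibility rules. The thresholds and field names are illustrative assumptions on my part, not DataArts’ actual validation logic; they simply flag the kind of slip described above, where a dollar figure ends up in a head-count field.

```python
# A sketch of a plausibility check on survey entries. Thresholds and field
# names are illustrative assumptions, not DataArts' actual validation rules.

def flag_implausible_entries(record: dict) -> list:
    """Return warnings for values that look like data-entry mistakes."""
    warnings = []
    employees = record.get("total_employees", 0)
    compensation = record.get("employee_compensation", 0)

    # An employee count in the millions almost certainly belongs in a dollar field.
    if employees > 100_000:
        warnings.append(
            f"total_employees = {employees:,}: did you mean to report this "
            "amount under employee compensation?"
        )
    # Compensation reported with zero employees is another common slip.
    if compensation > 0 and employees == 0:
        warnings.append("employee_compensation reported but total_employees is 0.")
    return warnings

# Example: the mistake described above, 1,000,000 typed into the employee field
print(flag_implausible_entries({"total_employees": 1_000_000,
                                "employee_compensation": 0}))
```

Checks like these don’t stop a determined typo, but they prompt the user to look twice before submitting, which is where data cleanliness really begins.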