5 KPIs to Measure Your Chief Analytics Officer
In which Jill wonders whether you REALLY know how your CAO is doing.
I love the variety of people who reach out to me via my Upside Q&A column. It's no surprise. Analytics transcends geographies, industries, and job titles. The most recent question comes from a chief analytics officer (CAO) who's looking for different ways to define his -- and the team's -- success.
It's a deceptively knotty problem because ownership boundaries for CAOs can differ across companies. Some CAOs own data and some don't. Some include infrastructure and platforms in their scopes while others leave that work to IT. Still others sit within the strategy ranks, a testament to an analytics program's competitive mojo.
Rohit's question is succinct and very timely:
Hope all is well.
In your opinion, what would be the most appropriate set of KPIs for a CAO role?
-- Rohit, New York
Hi, Rohit. Great question, and like many of the questions I get for this column, it has several possible answers.
Typically, companies that have the vision to designate a CAO have already experienced the competitive horsepower of analytics and data. They've not only made the financial investment, they've also committed to supporting the leadership and specialized talent necessary to sustain analytics delivery.
That last word -- delivery -- is critical because at the end of the day, petabytes and platforms are a means to an end. It's the delivery of analytics projects -- and their associated tools and data -- that differentiates a successful CAO from a mere figurehead.
If I were to call out five key performance indicators for a CAO, they would be the following.
KPI #1: Application delivery
Analytics capabilities that solve business problems must be delivered across lines of business (see KPI #2 -- breadth -- below) at a regular cadence. For instance: propensity to default in the risk group; image recognition for medical screening; an executive dashboard for the C-suite.
When I judged the TDWI Best Practices awards, my primary yardstick for measuring a best practice analytics program was this:
Either new data or new functionality every 3-4 months.
What does that look like? Businesspeople regularly getting access to new data to enrich already-robust analytics capabilities or building on those analytics with new functionality. For instance, a home improvement retailer purchases external weather data so it can stock up on plywood in neighborhoods in the path of a hurricane, or a large securities firm acquires blockchain data to predict the movement and momentum of cryptocurrencies.
Hurling more data and tools at the business isn't the point. Although keeping constituents engaged is important -- and regular delivery accomplishes this -- the agile nature of a short and regular delivery cadence lends credibility to the team, cultivates a common vocabulary among business and technology groups, and keeps analytics top of mind.
KPI #2: Breadth
The surest predictor of success is regular delivery. The second related predictor is the breadth of that delivery.
It's not just how many users but how many different lines of business consider themselves beneficiaries of analytics and data. Companies that hire CAOs often do so because executives recognize analytics as an enterprise-class capability. Marketing might be at a different maturity level than finance, with each department focusing on its own quests for revenue generation and operational efficiencies, while risk and compliance teams have their own unique measures. Keeping various lines of business happy is a challenge best accomplished by a team that can introduce new features quickly (see the yardstick in KPI #1) and at scale.
KPI #3: A delivery road map
Companies that excel in analytics not only develop their own road maps -- plans that show the gradual deployment of analytics applications and data acquisition -- they make the road maps accessible and reference them often. CAOs themselves should be comfortable representing these road maps. They must be able to explain what will be deployed as well as the specific metrics used to determine delivery priorities and time frames.
KPI #4: Governance
I'm not just talking about data governance here, but governance of the entire analytics initiative. This often involves a steering committee composed of both business and technical professionals, all of whom have a stake in the company's analytics success.
The analytics governing board will be familiar with overarching corporate priorities. Its members should be involved in prioritizing projects according to the best interests of the company, followed by their own interests. In this way analytics stays strategic.
KPI #5: Strategy alignment
If analytics isn't considered strategic, then a company probably shouldn't bother with a CAO. That might sound harsh, but the reality is that analytics projects should be prioritized in a deliberate way based on business priorities and overall corporate strategy.
In my book, The New IT: How Technology Leaders Enable Business Strategy in the Digital Age (McGraw-Hill), I introduce the "strategy on a page" framework, a visual construct of a company's objectives and how various programs can map to -- indeed, enable -- strategic initiatives. These days I find that most executives realize the strategic value of a formal analytics program and are keen to apply renewed rigor to analytics investments and planning.
A Final Word
As Peter Drucker said, "Culture eats strategy for breakfast." Thus every aforementioned measure ultimately depends on your company's culture, its established processes, incumbent technologies, and entrenched orthodoxies.
At the very least, the above performance indicators may serve as an effective checklist. At worst, they can invite an honest discussion of your program's weaknesses or gaps. At best, they can validate that you're on the right track with your analytics program and that you made the right decision in becoming CAO.
Original article published on TDWI.org.