KM Conversations — Using Analytics Part 2: Metrics and Reporting

Stan Garfield

March 05, 2020
Analytics is the discovery and communication of meaningful patterns in data and text. In this second part of a three-part series, I’ll discuss metrics and reporting. There will be a companion free webinar on March 25, 2020 (subscription link at the foot of this post).

Types of Metrics

Three different kinds of metrics are typically captured and reported when following knowledge management best practices:

  • Goal-oriented measurements directly relate to employee goals and allow assessment against those goals.
  • Operational metrics are based on data captured by systems used by the initiative. For example, a knowledge sharing initiative would capture details such as web page visits, document uploads, and document downloads; threaded discussion subscribers, posts, and replies; and repository submissions, searches, and retrievals.
  • Business impact metrics attempt to determine the specific value of initiatives, and include costs saved, costs avoided, incremental revenue, improved quality, increased customer satisfaction, customer retention, new business attracted, increased market share, and revenue from innovation.

Collecting and reporting on goal-oriented measurements ensures that the organization knows how it is performing and that individuals can be held accountable for achieving their goals. Produce and distribute reports every month to track progress, reinforce good performance, and encourage improvement where needed. Reporting metrics by group within the organization, for example by region of the world or country within a region, lets each group compare its performance against other groups and creates friendly competition to excel. Reporting metrics by individual may be limited by data privacy laws; where it is allowed, send those metrics confidentially to each person's manager for use in performance coaching and appraisals.

Operational metrics can help in analyzing how an initiative's infrastructure is being used, understanding who is using it, and identifying areas for improvement. However, only so much can be inferred from data such as page hits, uploads, and downloads. These metrics don't indicate the value of any of these activities. If a user visits a web page, they may not find what they need there. If a document is uploaded, it may not be of any use to anyone. If a document is downloaded, it may not be reused. Follow these rules when deciding which operational metrics to collect and report:

  • Keep the time and effort required to a minimum, automating as much of the collection, extraction, and production as possible.
  • Ask your team which metrics will help them the most.
  • Focus on a few key metrics that relate to your overall objectives.
  • Use the metrics to improve the environment, and test for this in user surveys.
  • Communicate the metrics regularly so that they influence behavior.
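As an illustration of keeping collection automated and focused, here is a minimal sketch of summarizing operational metrics from a raw event log. The event names and record layout are assumptions for the example, not tied to any particular KM platform:

```python
from collections import Counter
from datetime import date

# Hypothetical event log: each record is (date, user, event_type).
# Event type names below are illustrative assumptions.
events = [
    (date(2020, 2, 3), "alice", "page_visit"),
    (date(2020, 2, 3), "bob", "document_upload"),
    (date(2020, 2, 10), "carol", "document_download"),
    (date(2020, 2, 11), "alice", "discussion_post"),
    (date(2020, 2, 12), "dave", "discussion_reply"),
    (date(2020, 2, 14), "carol", "search"),
]

def monthly_summary(events, year, month):
    """Count each operational metric and distinct active users for one month."""
    in_month = [e for e in events if e[0].year == year and e[0].month == month]
    counts = Counter(event_type for _, _, event_type in in_month)
    active_users = len({user for _, user, _ in in_month})
    return {"active_users": active_users, **counts}

report = monthly_summary(events, 2020, 2)
```

A summary like this can be generated automatically each month, which keeps the effort minimal and limits the report to a few counts that map to the initiative's activities.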

Business impact metrics are potentially useful for justifying the expense of a program, garnering management support, and communicating the value of spending time on recommended activities. Anecdotes and success stories can be collected and converted into numerical values. Incentive points systems can capture data about the value of an initiative. Processes can be created or modified to ask participants about the business impact of initiative tasks. But there are few definitive ways to prove that a particular business indicator was influenced solely by the initiative; there are usually multiple reasons for a specific business result, of which the initiative in question may be only one.

If there is a way for you to collect business impact metrics, then do so. They have more significance than operational metrics. But limit the effort involved to a reasonable amount.

Collect, Report, and Act on Metrics

Collecting and reporting on the measurements used in your KM program will help you communicate progress, motivate people to improve their performance, and reassure management of the initiative's value. Balance the effort this requires against your other projects, look for ways to keep streamlining the process, and review the reporting process annually to keep it relevant to the current state.

Collect metrics directly related to the objectives of your program. Report on the key activities of knowledge management: sharing, innovating, reusing, collaborating, and learning. Here are three things you can do with the metrics that you collect.

  • Take action based on what the numbers indicate. For example, if you are leading a communities initiative, report on the health of each community every month, and retire the inactive ones using a community health report.
  • Track and communicate progress against goals. For example, if you are leading a knowledge management initiative, identify the top three objectives, track and report on how the organization is doing in a monthly report, and inspect and discuss progress (or the lack thereof) in management team meetings.
  • Persuade others, answer typical questions, and refute baseless assertions. For example, I received comments such as “no one uses our Enterprise Social Network (ESN).” I refuted these by pointing out that the ESN actually had 118,652 members, 1,256,806 messages, and 144,432 files.
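The community health report mentioned above could be as simple as classifying each community by recent activity and size. This is a hypothetical sketch; the field names and thresholds are illustrative assumptions, not a prescribed method:

```python
from datetime import date

# Hypothetical per-community activity data; names and numbers are illustrative.
communities = [
    {"name": "Sales Best Practices", "members": 240, "last_post": date(2020, 2, 28)},
    {"name": "Legacy Tools",         "members": 12,  "last_post": date(2019, 6, 1)},
    {"name": "Project Delivery",     "members": 85,  "last_post": date(2020, 2, 20)},
]

def health_report(communities, today, inactive_days=90, min_members=20):
    """Flag a community to retire if it is dormant or too small; otherwise healthy."""
    rows = []
    for c in communities:
        dormant = (today - c["last_post"]).days > inactive_days
        too_small = c["members"] < min_members
        status = "Retire" if (dormant or too_small) else "Healthy"
        rows.append((c["name"], status))
    return rows

report = health_report(communities, today=date(2020, 3, 5))
```

Run monthly, a report like this turns the raw numbers into a concrete action: retire the communities it flags, and leave the healthy ones alone.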

What to Avoid

Don’t capture metrics for the sake of metrics. Many people ask for data that drives no action or insight—collecting data for data’s sake. Each metric captured and reported should have an associated action or insight it is expected to drive. Avoid collecting every random thing, sliced and diced every possible way, that someone might want to see just once with no intent of doing anything beyond saying, “Oh, that’s interesting.”

Don’t establish a long list of arcane metrics. The fewer the metrics and the simpler the reports, the better. Don’t attempt to measure knowledge using the metrics of balance sheets; conventional balance sheet metrics do not adequately measure it.

Be wary of publicizing numbers that reflect actions you don’t want to encourage. For example, if you don’t want lots of groups being created in your Enterprise Social Network, don’t promote the number of new groups.

In the third part of this series, I will discuss making evidence-based decisions.


KM expert, consultant, and author Stan Garfield will present the next in his series of KM Conversations for Lucidea on Wednesday, March 25, 2020 at 11:00 am Pacific, 2:00 pm Eastern; subscribe here to be notified. Stan has compelling information to share, based on his distinguished career as a KM practitioner. Read his posts for our Think Clearly blog, and learn about Inmagic Presto, which has powered the KM initiatives of many organizations.
