Monitoring and assessing communication and knowledge work
Communication and knowledge work are notoriously difficult to assess because their results are largely intangible. Consequently, most organizations end up monitoring only outputs, rather than the outcomes related to those outputs, let alone their impact.

Here is an attempt at unpacking how ILRI might want to assess its communication and knowledge work (IM / KM / learning). In this attempt, individual communication channels are listed, with 'events' treated as specific communication activities/outputs.

The framework

Comms and KM (encompassing information management, knowledge sharing and learning) broadly entail the following dimensions, as progressive steps towards greater impact. These can be referred to as the PRICE scale (a reminder that tracking each of these steps also comes at a cost):
  • Production: How many outputs (products and services of different kinds) are developed, produced and disseminated by the team. Activities are currently not monitored in this framework;
  • Reach: How widely and deeply the outputs above are actually seen/consumed by (un)intended audiences;
  • Interact (Engage): How the audience reached by the outputs reacts to them;
  • Change (Influence): How the engagement (or, in some cases, simply the reach) leads people and organizations to change their behaviour (their thinking, their discourse, their practice, their policies and processes);
  • Effect (towards impact): The actual stories of change linking the influence comms/KM has had to the lives of other stakeholders and actors who are related to the initiative but not directly involved in the activities undertaken and outputs developed.
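As a rough illustration only (not part of the original framework), the five PRICE dimensions could be tracked per channel with a simple record like the sketch below; the field names, channel name and counts are hypothetical.

```python
from dataclasses import dataclass

# Hypothetical sketch: one record per channel, with a count for each
# PRICE dimension. The numbers used in the example are invented.
@dataclass
class ChannelMetrics:
    channel: str
    production: int = 0  # outputs developed/disseminated (posts, tweets...)
    reach: int = 0       # views / audience members reached
    interact: int = 0    # comments, replies, retweets, likes...
    change: int = 0      # links, citations, documented behaviour change
    effect: int = 0      # testimonies / stories of change collected

    def summary(self) -> str:
        return (f"{self.channel}: P={self.production} R={self.reach} "
                f"I={self.interact} C={self.change} E={self.effect}")

blog = ChannelMetrics("Websites and blogs", production=12, reach=2000, interact=15)
print(blog.summary())
```

Such a record would make it easy to compare channels on the same five dimensions, and to spot where a channel produces much but reaches or influences little.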

The metrics

The table below plots all essential communication channels used at ILRI onto the above-mentioned framework.
Once this table of metrics is agreed and finalized, targets can be established, monitored and analyzed for a better understanding and an increasingly clear and sharp assessment of results.
For each channel, metrics are listed under the five dimensions: Production, Reach, Interaction / Engagement, Change / Influence, Effect / Impact.

Websites and blogs
  • Production: posts
  • Reach: page views
  • Interaction / Engagement: comments
  • Change / Influence: links to the website
  • Effect / Impact: testimonies / stories of change

Wiki
  • Production: edits
  • Reach: page views; visitors
  • Interaction / Engagement: new pages co-developed; simultaneous editors
  • Change / Influence: links to the wiki
  • Effect / Impact: testimonies / stories of change

CG Space
  • Production: entries
  • Reach: document views; downloads
  • Interaction / Engagement: links
  • Change / Influence: citations; links
  • Effect / Impact: testimonies / stories of change

Yammer
  • Production: posts
  • Reach: members; new members?
  • Interaction / Engagement: replies
  • Change / Influence: cooperation on specific initiatives
  • Effect / Impact: testimonies / stories of change

Slideshare / Prezi
  • Production: presentations
  • Reach: views
  • Interaction / Engagement: likes, comments, favourites
  • Change / Influence: embeds; references
  • Effect / Impact: testimonies / stories of change

FlickR
  • Production: photos
  • Reach: views
  • Interaction / Engagement: comments; favourites
  • Change / Influence: reuses
  • Effect / Impact: testimonies / stories of change

Face-to-face 'events' and engagement processes
  • Production: events
    Ways to assess: keep track of the list of events, e.g. on Maarifa or a program wiki
  • Reach: participants; online interactors (followers / participants)
    Ways to assess: keep track of participants' lists (on program wikis) and of people engaging online (e.g. via Tweetreach)
  • Interaction / Engagement:
    • online interactions (e.g. Tweetreach, plus comments and other metrics mentioned above)
    • group exercises/discussions
    • concrete results developed
    • ideally backed up by participants' feedback about:
      • satisfaction with the content (general feeling of appreciation, quality of conversations and their novelty, new insights gathered)
      • satisfaction with the process (respect in conversations, reaching objectives, reaching consensus / establishing boundaries - whether preset or emergent but important - respecting time, quality of facilitation)
      • likelihood of applying results in the future
      • side benefits registered (e.g. expanded network, information collected)
    • virtual participants' interactions with the face-to-face group
    Ways to assess: an event design that clearly specifies interactive sessions; testimonies of event participants (interviews / focus group discussions / surveys) about their appreciation of the process
  • Change / Influence: increased insights; network, connections and trust; capacities; and willingness to share / learn / change - resulting in acting upon commitments (taking action) and changed practices over time
    Ways to assess: testimonies by participants about the likelihood of applying results and the connections made, collected through surveys, focus group discussions or stories of (personal) change
  • Effect / Impact: testimonies / stories of change related to the application of concepts, new programs developed thanks to insights and connections from the event, etc.
    Ways to assess: testimonies / stories of change

Twitter
  • Production: tweets
  • Reach: Twitter users reached; Twitter followers acquired
  • Interaction / Engagement: mentions; retweets; favourites; reactions
  • Change / Influence: citations; links; associated (referrer-driven) downloads
  • Effect / Impact: testimonies / stories of change

LinkedIn
  • Production: updates posted; group discussions initiated
  • Reach: LinkedIn followers; new LinkedIn followers (conversions)
  • Interaction / Engagement: likes; comments; shares
  • Change / Influence: links; associated (referrer-driven) downloads
  • Effect / Impact: testimonies / stories of change
The case of 'effect' is special, as it has various dimensions and potential applications:
  • change to the livelihoods of beneficiaries
Given the complex nature of these possible effects, M&E methodologies such as 'Most Significant Change' seem best placed to capture them. However, it is also possible, directly after an event, to ask participants what effects of their participation they expect to see - to prompt their reflection about change in view of the after-action review that follows 6 to 12 months later.

The case of face-to-face and impact of 'events'

This is an interesting case in itself: everyone, including ILRI, tends to focus on social media tracking and monitoring, even though we organize a large number of events that deserve to be investigated in their own right.
Following the example of IFPRI and its impact assessment report on the May 2014 conference 'Building resilience for food and nutrition security', some of the effects of such face-to-face interactions at events can be the following:
  • Individual change;
  • Organizational change;
  • Change in the thematic discourse in a given field - measured through e.g. media analysis, social media (e.g. Tweetreach) tracking, web search patterns, website google analytics etc.
The changes themselves can be:
Internal (tacit)
  • New insights (knowing what/why): i.e. "were your views changed as a result of the event?";
  • New connections (knowing who);
  • New capacities (knowing how);
External (explicit)
  • A change in discourse;
  • A change in practice (actually implementing new insights and capacities in different actions).
Impact can be none, light, or durable/continuing.

The process (to make sense of it all)

How to collect the data to assess these areas of comms/KM monitoring?
Around events
  • Production metrics: collected and assessed individually by facilitator / comms team;
  • Reach: collected individually by organizing team / facilitator + possibly by social media team;
  • Engagement: collected individually by the facilitator / organizing team. Additional 'participant feedback' to be collected by facilitator (and incorporated as part of the overall event design) and/or by the social media/social reporting team;
  • Influence: collected by the facilitator through after-action reviews / follow-up emails or surveys 3 to 9 months after the event;
  • Effect: collected by the project M&E team and/or the event facilitator, 6 to 12 months (and possibly again 24-36 months) after the event, as part of e.g. MSC or another collection of 'change stories'. In addition, right at the end of the event, the facilitator may collect participants' expectations about the possible impacts of their participation on other actors.
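For instance, the collection timeline above could be kept in a simple lookup like this sketch (the groupings and labels are illustrative only, not a prescribed format):

```python
# Illustrative only: when (months after the event) and by whom each
# dimension is collected, per the process described above.
collection_schedule = {
    "production": {"months_after": (0, 0),  "by": "facilitator / comms team"},
    "reach":      {"months_after": (0, 0),  "by": "organizing team / facilitator + social media team"},
    "engagement": {"months_after": (0, 0),  "by": "facilitator / organizing team"},
    "influence":  {"months_after": (3, 9),  "by": "facilitator (after-action reviews, surveys)"},
    "effect":     {"months_after": (6, 12), "by": "project M&E team and/or facilitator (e.g. MSC)"},
}

for dim, info in collection_schedule.items():
    lo, hi = info["months_after"]
    print(f"{dim}: {lo}-{hi} months after the event, by {info['by']}")
```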

Methods to collect that information at events:
  • Simple informal assessment (either at the end of every day or at the end of the workshop)
  • Online survey (at the end of the event, a few days after it, a few months later)
Both methods can then

When to collect these metrics?
These metrics can be combined in two different 'modes':
  • On an ongoing basis, to collect statistics about all the communication channels;
  • Around specific events and 'campaigns', for which more intensive monitoring is carried out and specific metrics may be pursued.

Next steps
  • Develop a standard 'engagement impact assessment' survey to use/administer at upcoming events we organise - both before and after.
  • Pilot this at the next important event ILRI organises or contributes to (E-Learning Africa) or in a safe environment (APM 2015, CKM retreat)?

Some ideas about (social media) monitoring around the ILRI@40 campaign


Objectives of the ILRI@40 campaign - for each of the comms objectives we have specific goals for this campaign:
  • Produce: Announce anniversary events and activities across channels, with a thematic focus on the questions to be zoomed in on, report on them live, and reflect on the outcomes and implications for ILRI's research and role in the livestock sector. All outputs (presentations, reports etc.) are captured in CG Space.
  • Reach: Connect all current and former staff members with information about these events and activities; connect with the wider livestock arena and, to some extent, the agricultural arena (key development partners).
  • Interact: Have a pool of staff and partners react to the postings across social media. Offline: get the audience to share their impressions about current science and future research pointers.
  • Change: (beyond the events, to be monitored in the future) Get people exposed (participating in anniversary events) to invest in livestock options in an integrated farming system approach, to fund research efforts and to develop activities following the research priorities identified. ILRI scientists adapt their research efforts towards these priorities.
  • Effect: (beyond the events, to be monitored in the future) Better investments in livestock lead to improved livelihoods and more productive livestock developments, leading to improved nutrition, reduced food insecurity etc.

The main channels used for the campaign are the following:
  • Websites and blogs for content production and in-depth reporting
  • Events for face-to-face engagement/interaction and leading to longer term change
  • LinkedIn for online engagement/interaction
  • Twitter for online reach and engagement/interaction
  • Yammer for internal awareness-raising
  • CG Space for the collection of all final outputs (with presentations specifically on Slideshare)

Although we have set up a wiki for the campaign it is mostly used by

Specific targets are listed below.
Where no target is set, it does not mean that statistics cannot be collected; it is simply not considered a priority.
For each channel, targets are listed under Produce, Reach, Interact and Change. Effect is N/A throughout; see the overall objectives above for this information.

Websites & blogs
  • Produce:
    • ILRI@40 page regularly updated (at least once per week) (not on track)
    • at least 1 intro post for each event (on track)
    • 1 summary post (on track)
    • = a total of 12 posts (on track)
  • Reach: an average of 2000 (?) monthly page views in relation to the ILRI@40 posts? (NEEDS CAREFUL ASSESSMENT)
  • Interact: at least one comment per post overall = a total of x comments (not on track)
  • Change: 6 months after the end of the campaign, at least 2x links to the pages / outputs from the campaign (not on track)

Wiki
  • N/A across all dimensions

CG Space
  • Produce: all presentations etc. online = a total of 50 (?) outputs collected throughout the campaign (not on track)
  • Reach / Interact / Change: N/A

Yammer
  • Produce:
    • at least 1 post introducing each event (almost on track)
    • at least 2 posts covering happenings at each of the events (almost on track)
    • at least 1 post summarizing / reporting each of the events (almost on track)
    • at least 1 post summarizing activities about ILRI@40 between events (almost on track)
    • = at least 30 posts related to ILRI@40
  • Reach:
    • at least 5 new ILRI staff members joining Yammer after the end of the campaign (on track, though perhaps not related to ILRI@40 events)
    • ...of which at least 2 become semi-regular / active members (not on track)
  • Interact:
    • at least one reply from a comms person after each event post, for further reach & engagement (on track)
    • at least one reply from a non-comms person for every 3 event posts (not on track)
    • = a total of 30 posts from comms persons and 10 posts from non-comms persons
  • Change: x proposals developed on the basis of the key ideas introduced during the campaign / events, 6 months after the campaign has stopped (not on track)

Slideshare / Prezi
  • Produce: a total of 20 presentations or prezis (on track)
  • Reach / Interact / Change: N/A

FlickR
  • N/A across all dimensions

Events
  • Produce: 6 events (or event sessions) hosted / organized by ILRI (on track)
  • Reach:
    • an average of 15 (?) ILRI people participating in each anniversary event (on track, overall)
    • an average of 2x as many non-ILRI participants for every ILRI participant in ILRI sessions (NEEDS CAREFUL ASSESSMENT)
    • an average of 1.5x online participants for each ILRI participant at each campaign event (NEEDS CAREFUL ASSESSMENT)
  • Interact:
    • at least 1/3 of every event based on group interactions (or Q&A)
    • at least one set of ILRI recommendations emerging after each event
    • at least 1/3 of all possible feedback forms collected after each event (NEEDS CAREFUL ASSESSMENT)
    • an estimated 70% of participants happy or very happy with content / process at ILRI events
  • Change: 6 stories of change collected (1 for each event) 6 months after the event, indicating what will be done differently on the basis of the anniversary campaign (not on track)

Twitter (tracked via Hashtracking.com)
  • Produce: an average of 10 tweets shared around each day of ILRI@40 events = 100 #ILRI40 tweets (on track)
  • Reach:
    • an average of 300 Twitter users reached for each group of 20 ILRI people present at each event (NEEDS CAREFUL ASSESSMENT)
    • 10 new ILRI Twitter account followers gained after each event = 60 new followers after the end of the campaign (on track, but the growth is steady, i.e. seemingly not affected by ILRI@40 events)
  • Interact: (NEEDS CAREFUL ASSESSMENT)
    • an average of 1.5 mentions for each #ILRI40 tweet shared
    • an average of 0.3 retweets for each #ILRI40 tweet
    • an average of 0.025 favourites for each #ILRI40 tweet
  • Change: an average of 0.1 referrals towards our websites and blogs for each #ILRI40 tweet shared (NEEDS CAREFUL ASSESSMENT)

LinkedIn (https://www.linkedin.com/groups?gid=7490762&groupDashboard=)
  • Produce:
    • at least one post introducing each event (on track)
    • at least one post between each event (on track)
    • at least two posts during each event (on track)
    • at least two posts after each event covering results and related resources (almost on track)
  • Reach:
    • at least 25 new followers after each event = a total of 150 new followers after the campaign (on track for the overall total, almost on track for 25 new followers after each event)
    • a total of 500 LinkedIn followers at the end of the campaign (not on track)
  • Interact:
    • at least one comment (from non-organizers) and 2 comments from organizers for every event = a total of 18 comments (not on track)
    • at least two likes for each event and one like for each = a total of 12 likes overall (on track - but from the usual suspects!)
  • Change: 3 stories of change about different ways of doing livestock research/development based on pointers from the event conversations etc. (not on track)
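As an illustration only, the per-tweet ratio targets for Twitter could be checked with a short script; the raw counts below are invented, not actual ILRI@40 figures.

```python
# Hypothetical check of the Twitter ratio targets; counts are made up.
def per_tweet(count: int, tweets: int) -> float:
    """Average of a metric per tweet shared."""
    return count / tweets if tweets else 0.0

tweets = 100  # total #ILRI40 tweets (the campaign target)
counts = {"mentions": 160, "retweets": 28, "favourites": 3}
targets = {"mentions": 1.5, "retweets": 0.3, "favourites": 0.025}

for name, target in targets.items():
    actual = per_tweet(counts[name], tweets)
    status = "on track" if actual >= target else "not on track"
    print(f"{name}: {actual:.3f} per tweet vs target {target} -> {status}")
```

The same per-tweet ratio could be recomputed for any hashtag export (e.g. from Hashtracking.com) to judge whether the interaction targets are being met.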




Table template to re-use in the future

Targets            | Produce | Reach | Interact | Change | Effect
Website            |         |       |          |        |
Wiki               |         |       |          |        |
CG Space           |         |       |          |        |
Yammer             |         |       |          |        |
Slideshare / Prezi |         |       |          |        |
FlickR             |         |       |          |        |
Events             |         |       |          |        |
Twitter            |         |       |          |        |
LinkedIn           |         |       |          |        |