That’s me (on the right) in full discussion mode at the workshop. What follows is the report on the workshop. See also my previous blog post about the workshop and my thoughts on the use of K* (K-Star) for Knowledge Mobilization.
A Collaborative Workshop
March 3rd 2011
The Arboretum, Guelph
Louise Shaxson, Delta Partnership
Workshop Objectives and Summary
This is the report of a workshop jointly held by the Public Health Agency of Canada, the OMAFRA-University of Guelph Partnership, Health Canada, and Environment Canada.
The use of evidence in policy is not simply a matter of sourcing high-quality evidence and presenting it to policymakers. As important as the quality of the evidence is the quality of the various processes through which that evidence is put in order to inform decisions, at all levels, about designing, delivering, monitoring and evaluating policy. Improving the quality of these processes means drawing on the fields of science policy, knowledge management and evaluation, as well as having a detailed understanding of the internal workings of line ministries.
This workshop was designed to stimulate, inform and network people with a broad interest in the use of evidence in policy and techniques to improve the interface between science and policymaking. The workshop had three objectives:
1. To bring together people from the local community interested in aspects of evidence-informed policy making and knowledge brokering in order to share experiences, broaden networks and discuss issues of common interest.
2. To outline and discuss current thinking on evidence-informed policymaking: its key components, challenges and opportunities.
3. To draw from the experience of implementing an evidence-informed approach to policymaking in various UK Government departments and to consider some of the lessons learned.
The ‘local community’ stretched from Ottawa to Hamilton and from health to agriculture, and involved people from federal, provincial and local agencies, universities and wider stakeholders including industry and interest groups. The disciplinary alignment mattered less than the fact that they were all bonded by a shared interest in the interface between research and policy and the intricacies of knowledge brokering. The presentations drew from Canada and the UK and covered a variety of issues. Barb Marshall built on the background paper to emphasise the breadth of what ‘policymaking’ covers, and Melissa Mackay summarised the responses to the participant survey, which asked people to detail the issues facing them and their needs from the day. Phil Malcolmson spoke of his experience of policymaking in OMAFRA and outlined the successes that the OMAFRA-U of Guelph partnership has had and the factors contributing to that success. Laurent Gémar talked about the complexity of federal policymaking processes and outlined the detailed work that Health Canada has been doing internally to improve the flow of knowledge across the science-policy interface. Alex Bielak talked of the work done at Environment Canada – some of the skills and tools they developed and how they have contributed to better engagement between science and policy. Louise Shaxson gave the final presentation, with some reflections on the principles of ‘evidence-informed policymaking’, and demonstrated some tools used successfully in the UK. The presentations are summarised in the ‘Summary of the presentations’ section of this report.
There was a great deal of cross-learning; networks were built and strengthened, experiences were shared, and participants were able to benefit from lessons learned from work elsewhere in Canada and the UK. On a scale of 1 (low) to 5 (high), average scores for participant evaluations ranged from 4.1 to 5, showing a great degree of satisfaction with the day.
Table of Contents

Workshop Objectives and Summary
Acknowledgements
Agenda
Summary of the presentations
Introduction to a knowledge café
Knowledge café questions
1. Drawing evidence from a wide variety of people and organisations
2. Creating and improving the demand for evidence from policy and decision-makers
3. Defining a role for ‘knowledge brokers’ within a department
4. Improving availability and access to all types of evidence
5. Integration across departments
6. Managing upwards: convincing managers that work at the science-policy interface adds value
7. Assessing the impact of knowledge brokering and other communication activities at the science-policy interface
Themes from the day – a reflection
Evaluation of the day
Acknowledgements

The workshop was a collaborative effort between many partners, and the organising committee is grateful for the support given to them by the organisations and individuals concerned.
For organisational support and funding, we are grateful to Health Canada, the Ontario Ministry of Agriculture, Food and Rural Affairs (OMAFRA), the Public Health Agency of Canada and the University of Guelph; and to Environment Canada for organisational support.
We are grateful to the day’s speakers: Susan Read and Barb Marshall from the Public Health Agency of Canada; Melissa Mackay from the OMAFRA-University of Guelph Partnership; Phil Malcolmson from OMAFRA; Laurent Gémar from Health Canada; Alex Bielak from the United Nations University – Institute for Water, Environment and Health; Louise Shaxson from Delta Partnership; and Elin Gwyn from OMAFRA.
The organising committee consisted of Barb Marshall, Kate Thomas and Elmer Mascarenhas from the Public Health Agency of Canada; Bronwynne Wilton, Melissa Mackay and Ken Hough from the University of Guelph Agri-food and Rural Link; Elin Gwyn from OMAFRA; Shannon deGraaf, Kristin May and Jaime Dawson from Environment Canada; and Louise Shaxson from Delta Partnership.
Agenda

Using Evidence to Inform Policy – A Collaborative Workshop
March 3, 2011
The Arboretum, University of Guelph, Ontario
8:00 – 8:30    Registration & Coffee
8:30 – 8:40    Welcome – Susan Read, Public Health Agency of Canada
8:40 – 8:50    Setting the Stage – Barb Marshall, Public Health Agency of Canada
8:50 – 9:10    Knowledge Gaps & Lessons Learned – Melissa Mackay, OMAFRA-UofG Partnership
9:10 – 9:30    Linking Academia with Policy – Phil Malcolmson, OMAFRA
9:30 – 10:00   Science to Policy Interface – Laurent Gémar, Science Policy Directorate, Health Canada
10:00 – 10:30  Break & Networking
10:30 – 12:00  KT/KB Evolution – Alex Bielak, United Nations University
               Evidence-Informed Policymaking – Louise Shaxson, Delta Partnership
12:00 – 1:00   LUNCH
1:00 – 1:15    Introduction to Knowledge Café
1:15 – 3:00    Knowledge Café
3:00 – 3:15    Break
3:15 – 4:00    Report Back
4:00 – 4:15    Closing Remarks – Louise Shaxson
4:15 – 4:30    Wrap Up/Evaluation – Elin Gwyn, OMAFRA
Summary of the presentations
This section summarises the main points made in each of the presentations – the slides for which are available from Barb Marshall from PHAC (Barbara.email@example.com).
Susan Read from PHAC opened the workshop by stressing its importance, encouraging participants to think deeply about how to bridge the gap between research and policy, to be groundbreaking and to learn from others. Barb Marshall then set the scene for the workshop, outlining the three themes for the day:
• To explore current thinking on evidence-informed policymaking: key components, challenges and opportunities
• To learn from our past experiences and draw from the experience of implementing an evidence-informed approach to policymaking in the UK
• To discuss ways to improve the integration of evidence in policy: what are the main concerns and priorities with regard to the use of evidence?
She then drew from the background paper which was circulated to participants before the workshop1, to emphasise the breadth of what we mean by ‘policymaking’, the nature of public policies and the key actors involved. Stressing that policy often deals with clusters of entangled and long-term issues, she noted that it needs to involve a wide range of players; that networks and partnerships are important parts of the process of policy development and implementation.
Melissa Mackay from the OMAFRA-University of Guelph partnership then gave an overview of the results of the pre-workshop survey, which asked respondents a series of questions about what they needed to improve the links between evidence and policy, the barriers and opportunities.
The main barriers to knowledge translation and transfer were identified as:
• A lack of funding for knowledge translation and transfer to enhance evidence-informed policy
• A mutual lack of understanding between policy and research groups of what each is working on, what their needs are and who to approach
• The difficulties of ensuring that evidence is timely, responsive and accessible
• The complex nature of policy issues
• The difficulty of getting governments to do regular reviews and updates of regulations, and to base them on evidence
• Being able to frame questions which work for both research and policy
1 The background paper, Understanding and applying basic public policy concepts, is also available from Barb Marshall.
There had been some important successes, however: respondents noted that there had been successes with individual projects (and that economic evidence was compelling for decision-makers), but there could be greater impact with close and ongoing collaboration; mixing policy and technical specialists on the same committees, for example. It takes time for policy and science to define the questions, but establishing a rapport with decision-makers does not only help ‘get the question right’. It also helps at the other end of a project when results need to be disseminated: by keeping in close contact with policymakers, scientists can receive hints or suggestions as to when it is an appropriate time to push for changes (though this can only happen if all the results are well documented and ready to go at short notice). Melissa noted the different roles we all play in this, and the different tools we need to fulfil those roles.
Phil Malcolmson made a series of observations based on his 23 years with the Ontario public service, the last 15 of them with OMAFRA. He talked of the complexity of the policy environment and of the need to address problems that cross many disciplines (such as Food For Health), whilst recognising that there are always gaps in our ability to underpin advice with analysis. Working with different partners to fill those gaps is therefore an important part of policy development. Phil spoke of the need for horizontal policy processes, which mean leadership with partnership: sharing an agenda and understanding where each partner can best contribute to the joint process. This implies that policy development is less about the problems and more about the people, networks and institutional agendas. He said that there is really no substitute for both sides getting to know each other and being willing to work on a joint agenda.
OMAFRA and the University of Guelph have been working together since the 1890s, with the shared goal of producing more food in Ontario. Though there have been changes in institutions, relationships and the types of engagement over this period, during the 1990s the level of engagement between researchers and academics decreased. As a result, in 2008 the agreement between OMAFRA and the University included reinvigorating that relationship: understanding each other’s agendas and capacities, getting to know each other, and working jointly on projects. Describing it as a bit like an ‘open marriage’, he noted that the new structure includes Director Champions, a role created to ensure that ministry leaders are actively engaged in the process.
The relationship benefits OMAFRA by:
• Enabling OMAFRA to work directly on specific research projects and steer them to effectively meet its policy research needs
• Enhancing OMAFRA’s policy and economic analysis capacity
• Giving early support of “thought leaders” for government’s policies / programs
• Developing networks and communities of practice
And the relationship benefits the University of Guelph by:
• Building profile
• Providing opportunities for research funding (including student funding)
• Giving academics a better appreciation of government’s policy position on critical issues
• Enhancing a sense of inclusiveness in policy-making
• Strengthening knowledge translation and transfer (extension) opportunities through government channels
• Increasing opportunities for applied research
Phil’s final observations were that there is tremendous potential and research capacity in Canadian and Ontario academia if it is possible to identify academics who may be interested in collaborating with government. However, this sort of relationship-building requires time, a community of interest, and trust between both sides.
The next speaker was Laurent Gémar from the Science Policy Directorate at Health Canada. Laurent provided an ADM perspective from Claire Dansereau, who said that science is an essential, absolutely fundamental component for both regulation and policy; and that the science done by scientists at Health Canada must fit into either the regulatory or the policy front. Health Canada should leverage more science by collaboration, maximising the available expertise and ensuring that results are communicated effectively. The complexity of Federal Government policymaking processes makes this a challenging task, however, and Laurent outlined two definitions developed by Health Canada to take this forwards:
Science Policy Integration: A two-way process which encompasses both “policy for science” (strategies to conduct, access, assess, synthesize and disseminate quality scientific evidence) and “science for policy” (how scientific knowledge informs decision making).
Science Policy Interface: The juncture at which the technical world of describing evidence and the political world of making policy (or regulatory decisions) interact, using the best available evidence towards informed decision-making.
Science-policy integration occurs at many points in the development of policies and regulations: when information briefs are created for senior management, when preparing Notes for Questions to Parliament, when responding to Media inquiries, when specific Task Forces and Working groups are convened, during the process of developing science priorities (expertise and lab infrastructure needs) and in foresight activities. There are two other processes which encourage integration, though: the ADM Science & Technology Integration Board, and the Deputy Ministers S&T Committee.
The latter has four pilot projects underway which will demonstrate the integration of science and policy: Laurent gave a brief outline of two health-related projects (One Health – Food Safety and One Health – Emerging Infectious Diseases) before moving on to two issues where there had been demonstrated success. One of these was the re-labelling of cough and cold medicines, which set out to manage the risk of potential misuse or accidental overdose of over-the-counter medicines. Setting out the science and policy considerations, he noted that in working through them Health Canada was able to develop a very clear communication strategy whose ‘point person’ was bilingual and had high credibility with both scientists and citizens; it also made extensive use of the internet (their video was picked up within seconds by YouTube and Google).
Finally, Laurent noted Health Canada’s ongoing work to develop a more strategic approach to the science-policy interface, including the four key objectives of:
• Establishing awareness of Science Policy integration within the Health Portfolio science and policy communities (via a Community of Practice; sharing success stories, SPI events)
• Developing and making available to Health Portfolio staff the necessary tools to make evidence-informed policy decisions in an efficient and timely manner (via systematic processes, checklists and templates, performance indicators)
• Training Health Portfolio staff to integrate Science and Policy and to fully contribute in multi-disciplinary teams (via standard terminology, SPI primer course for all staff)
• Building linkages to gather intelligence, share information and inform activities occurring in relevant external organizations (via improved KT, directed funding, issue specific teams)
Following a break for networking, Alex Bielak gave an overview of the evolution of the phrase ‘K*’ – a term he recently coined to encompass the multitude of acronyms relating to the knowledge-policy interface2. The knowledge management specialist Dave Snowden has said that knowledge is most useful when it is needed – the Bielak corollary is that Federal science and technology (S&T) must be used to be useful. This means shifting from a products model to a marketplace of products tailored to specific audiences – and further than that to an iterative knowledge brokering process based on ongoing, durable relations which recognise the importance of ‘policy pull’. Alex gave a series of examples from his previous position in Environment Canada (EC) and from his current position at the United Nations University – Institute for Water, Environment and Health (UNU-INWEH). As the principal hub of environmental research in Canada, EC moved from a model in which there was a linear push from the science world to the policy world towards one which ‘funds the arrows’ between the two worlds and brings in knowledge from a wide range of sources.
2 KT, KM, KB, KMb, KTE, knowledge adoption, knowledge integration, knowledge adaptation, knowledge synthesis… while they each mean slightly different things, the phrase ‘K*’ is intended to cover all of them.
This means finding and using people who understand both worlds and who are able to convene, translate and mediate as necessary. This doesn’t happen for free, and making provisions for the cost of this sort of co-ordination is important.
Alex noted that there has been a noticeable growth of formal knowledge translation and brokering jobs and units in Canada, delivering research results to targeted audiences and decision-makers and encouraging the two-way flow of knowledge across the interfaces between science and policy. This implies a particular skill set: moving away from Big-C (centralised) communications to dedicated ‘little-c’ science communications, and developing integrated mechanisms and tools to sustain the interaction between science and policy. This sort of work is epitomised by Environment Canada’s S&T Liaison team, and also by work at the international level.
Alex showcased some of the work being done on knowledge management by UNU-INWEH, which involves:
• Data, information and knowledge storage systems (a database and database management system, user interface, QA)
• Information extraction (systematic database searches, information filtering)
• Knowledge synthesis (capturing tacit knowledge, engaging international expertise, capturing generic findings, knowledge feedback)
• Knowledge dissemination (producing policy briefs and recommendations, mobilising learning networks, targeted marketing)
Finally, Alex outlined five key issues to be addressed to move the work on K* forwards. As a community of knowledge workers working at the interface with policy, we all need to continue to build, implement and use toolkits, to legitimise the field (both in the Government of Canada and elsewhere), to build an international community of practice, to explore appropriate classifications without getting too hung up on terminology, and to set metrics so we can assess our usefulness. The plan is to address all of these, and more, at an international conference on K* to be held in 2012 in Southern Ontario… watch this space!
The day’s final speaker was Louise Shaxson from the UK’s Delta Partnership, who spoke about evidence-based policymaking and outlined some of the tools and techniques she has used to build bridges between science and policy in several UK government departments.
Her first observations were that policymakers often give three reasons for not focusing on the evidence base: first, that it takes too much time given all the other pressures they’re under; second, that politics gets in the way; and third, that anything about ‘knowledge’ is a fad which will fade away in time. She countered these by saying that politics is an integral part of the policymaking process, not some form of noise in the system that prevents us from designing and delivering the perfect policy. Politics at all levels influences how decisions are made: recognising this means recognising the importance of ensuring that all decisions are based on the best available evidence.
The implication of this is that in seeking to improve evidence-informed policymaking, we need to focus on the processes that use evidence as well as the quality of evidence itself. A broader definition of knowledge management (away from a focus on IT tools) will help us develop a deeper understanding of whose knowledge is important and how we can source and interpret it to make better policies.
She then developed a definition of evidence-based policymaking as ‘a process which uses robust methods to support decisions about how to deliver outcomes for citizens, using the best available knowledge to deliver cost-effective solutions’. While this may seem obvious, the challenge is to determine which outcomes are important and whose knowledge should be used – which depends on the characteristics of each issue. Louise outlined three basic models to make this point: a simple linear model which could be characterised as ‘policy by instruction’, a more complex relationship model of ‘policy by consensus-building’, and a network model involving many actors which she termed ‘policy as a constant process of negotiation’.
This complexity means that knowledge for policy can be defined as what is known about an issue in the context of what needs to be achieved: it takes into account the power dynamics which arise in complex actor networks and the need to ensure that different voices are heard in the policy development process. The definition also helps us understand why it needs to contain both explicit (codified) and tacit (individual, experiential) knowledge; and why the quality of the processes for using evidence in policymaking must be as robust as the quality of the evidence itself.
These processes can be roughly divided into four – all of which need to be done jointly by the providers and users of evidence:
• Scoping the question
• Assembling existing and emerging evidence
• Procuring new evidence as necessary
• Interpreting the evidence for policy development, delivery and communication
The suite of tools which support these four processes are the foundation of K*: they include tools that are more familiar (such as advisory committees, brown bag lunches and seminars, systematic reviews) but Louise noted that we need to think widely about which tools work where and develop new ones as appropriate. She outlined a few she has seen or worked with:
• Evidence maps, which help give a strategic overview of the evidence base for policy issues and which can help assess its overall strength or policy relevance
• Lines of argument, which can help develop the rationales for a policy and form the basis for scoping an evidence base for a new policy issue
• Report cards, which go beyond policy briefs to present up-to-date assessments of scientific issues based on a consensus around the confidence with which policy-relevant statements can be made
• Social frameworks, which draw out the actor network map and encourage all actors to negotiate the institutional and individual behaviour change that will be needed to deliver the intended outcomes
• Evidence strategies, a useful tool to ensure that a department’s evidence needs are closely aligned with its policy goals
• Joint work planning by both researchers and policymakers: an obvious tool that is rarely used, which leads to mistimed reports that are useless because they arrive after the decisions they were meant to inform have been taken
Finally, Louise echoed Alex’s call to improve our collective understanding of the impacts of what we do in K*, so that we can demonstrate conclusively the value of our work.
Introduction to a knowledge café
A knowledge café is a workshop technique which allows a number of people to discuss a range of issues in a structured way. It helps people get to know each other, gives them time to discuss the detail, yet also encourages them to think broadly and interact with a variety of people to learn from one another.
Before the workshop, seven questions were identified as the ‘table questions’ for the knowledge café. These were drawn from the responses to the participant survey as well as from the experience of the organisers. Guided by a facilitator at each table, participants spent 20 minutes discussing a question before moving on to another table, and another question. Being free to choose which questions they wanted to answer ensured that people mixed with others who shared their interests. This brought experience to the discussions but also encouraged networking, one of the main objectives of the day.
Facilitators took notes of the main points which emerged, ensuring in particular that issues raised repeatedly by different groups were captured. Some groups stuck to the questions posed to them at the outset, while other groups were more free-flowing and preferred to follow a thread of conversation stimulated by the written questions. Neither was necessarily right or wrong: both methods produced rich discussion which was fed back to the groups at the end of the day.
The seven questions are set out below, together with a summary of discussions.
Knowledge café questions
1. Drawing evidence from a wide variety of people and organisations
Policy development needs to draw evidence from a wide variety of people and organisations – it’s not just a case of linking researchers to policymakers. With your particular organisations in mind:
a. What are the types of organisation that we need to be sure to include?
b. Where are the links particularly strong/weak? Why is this?
c. Are there any specific examples of good practice that could be shared more widely?
d. What might be some priority steps your organisation could take to improve the breadth of evidence it works with?
We have a wide variety of stakeholders: consumers, industry, municipal, political, farmers, and different levels of government: we need to involve them meaningfully so that we can effectively incorporate all viewpoints (friendly and perhaps not so friendly). But we need to bear in mind that stakeholders can be internal or external to an organisation (including finance, statisticians, field staff) – and at the outset we need to think widely about who the stakeholders for an issue actually are. We need to involve them in action not just talk – face-to-face meetings are important and useful. We also need to think about top-down as well as bottom-up types of action, think across government and regions, and think about the co-creation of knowledge between stakeholders not just the transfer of knowledge from one to the other.
(Image caption: Setting the scene for the workshop – feeding back from the participant survey)
Language is important – we need to work to develop a shared understanding of what’s been said and what the next steps might be, considering who may be missing from the table and how to incorporate them in future. But it’s not just about involving people: we need to validate or confirm the message. After each meeting, we should confirm what we heard, or think we heard, and ensure that we share a common understanding of what everyone expects to happen.
Four key themes emerged: mess, mud, power and trust. These are messy issues, often unclear, and because of this we need to be conscious of who holds the power and what this means for trust in partnerships. Whose evidence counts? Which evidence? How much weight do we give a piece of evidence, and why?
We need to ensure that policy isn’t made on Google: we should use a wide variety of tools to support policymakers, such as linking and exchange events, workshops and working groups, social network maps, job exchanges, experts in residence, mentoring activities, lay people and advocates, informed leaders, and social media.
What are the types of organisation that we need to be sure to include?
• It’s hard to do this meaningfully if you come to it late in the process: we need to engage early and move beyond talk to action. Use communication channels that are inclusive from the beginning (explore social media), connect into the various communities, and ensure that in the co-creation of knowledge we consider those without a large voice
• Engage early, engage formally and informally: bring different perspectives to the table and stop policymaking-by-Google
• Ensure industry is included in the definition of ‘knowledge experts’, not just researchers, so that any solutions being considered are examined for how sensible they are for industry, not just policy
• Engage both top-down (national leaders and internal actors) and bottom-up (communities, NGOs, regional governments), and ensure that all layers of government are consulted
(Image caption: Facilitators captured the discussions and fed back key points at the end of the day)
Where are the links particularly strong/weak: why is this?
• Remember that interacting with external stakeholders can be scary for policymakers not used to it: it’s messy, muddy, and there are issues of power and trust involved. The Cochrane Collaboration has lay advocates – well-informed stakeholders who can begin to break down some of the barriers to stakeholder involvement
• What’s the weight of evidence that we consider – whose views count and what weight do we give them?
• It’s easier to draw in our ‘friends’ and more difficult to bring in those with contrary views, but it’s important to engage them nonetheless and to have sufficient resources to do so
• Understanding others’ drivers is important, as is building trust between stakeholders (are there data protection issues here?)
• Hard data is costly, but not having broad and long-term data gives rise to uncertainty
Any examples of good practice?
• The Cochrane Collaboration, and its use of lay advocates
• The National Collaborating Centre for Methods and Tools (NCCMT) public health decision-making tool
• Working groups (particularly within industry), policy round tables which get people to talk, inclusive policy development processes, and workshops as part of the policy drafting process
• Linkages and exchange (Great Lake to Great Lake) – engaging finance ministers is important, as are secondments, mentoring activities, job exchanges and experts in residence
What might be some priority steps your organisation could take to improve the breadth of evidence it works with?
• KT working groups, involving partners beyond government
• Staff co-learning and mentoring processes
• Checking with stakeholders that you understand what it was that they said and what they’ll do with the evidence: not assuming that what you take away from any engagement process is what they meant
• Collecting data to justify and measure the impact of K* activities
2. Creating and improving the demand for evidence from policy and decisionmakers
How can we create or at least improve the demand for research and other types of evidence by
policy and decision-makers? What can we do to ensure that policymakers actually demand
evidence rather than just responding to what is put in front of them?
a. How is ‘demand for research and other types of evidence’ actually manifest? What sorts of
processes within policy departments draw in research and evidence?
b. Where are the needs greatest?
c. What role might structures and funding play in creating demand (such as earmarked funding to purchase research to directly inform decision-making)?
d. What measures could we use to demonstrate that user demand has been strengthened?
We need to create a repository of evidence that is accessible to policymakers, but this needs funds to maintain it and to ensure it is of high quality. Researchers need to provide summaries in clear language, and also to be 'humble' in how we present our work: there's a lack of education on both sides about how to communicate effectively with each other. Training on how to communicate with a variety of audiences is useful, helping us ensure that what we say resonates with their interests. Actual examples of where science has helped policy are always beneficial, and auditing the use of evidence could help us learn lessons: was the evidence actually used in policy, and if it was used, was it used well? Case studies and examples of where evidence has been used successfully could be captured in short briefs or presentations on the difference the evidence made to the decision.
It’s important to build relationships – funding the arrows – which means getting up and walking around, and involving policymakers early on in the development of research proposals. Priority setting tools can help us build on the needs of policymakers, and researchers can link into these as they design their programmes. Thinking about what policy needs at the outset of research will help us answer the ‘so what?’ question as part of the research programme. It will also be important to recognise the different timings of policy and research, with researchers providing the long-term support.
Reporting back to participants: synthesising the main points and drawing conclusions from a fascinating series of discussions:
How is demand for research actually manifest? What sorts of processes within policy departments draw in research and evidence?
They’re not looking at us because they don’t know what we have or don’t understand it: a repository of evidence, experts, contacts would be a great help
Summarising, synthesising would all help, but we need to recognise that academics aren’t rewarded for producing 2-page policy briefs (funders could drive a change, though)
Being clear about how policy works and its timelines is a first start in improving the service research provides to policy, and for this they need to understand ‘what is policy’
Where are the needs greatest?
There’s a market for information: the needs vary but good, engaged discussion between
research and policy will help us predict future demand and build effective communication
(though we need to do more than just discuss and move to real action)
Maps, tools, repositories, training in communication & clear language can all help – it’s
important to be seen to be neutral and reputable (eg UNU has three searchable repositories
and a good reputation for neutrality)
What role might structures and funding play in creating demand?
What’s more important is transparency, referencing, synthesis methods, maintaining current
evidence, repositories/databases, and providing easy ways to access authoritative sources
Demonstrating the utility of science advice to the policy community is an important way of
stimulating demand – setting the expectation of evidence as part of the process (adding it to
policy templates) and providing plain language summaries
What measures could be used to demonstrate that user demand has been strengthened?
Feedback is important – completing the policy cycle. Post-implementation evaluation will
also demonstrate this
Checking with policymakers when submitting for funding (using a knowledge broker to make
the linkage?) will show if demand has been strengthened; as will a greater willingness to
allow science and policy people to connect (though this takes time)
3. Defining a role for ‘knowledge brokers’ within a department
It is better to focus on the role of knowledge brokers before trying to work out precisely who should
do what. The ‘brokering’ role doesn’t only happen outside policy: it could happen within an
academic institution, a think tank, or even within policy teams.
a. What are the key attributes of a knowledge broker – as an individual, and as an organisation?
b. Knowledge brokers are often thought of as organisations external to policy, but what would their roles and priorities be if they were to exist within a department?
c. How could researchers best support them from the outside?
d. Where are there good examples that others could learn from?
Knowledge brokers need to know the questions being asked of both sides, and to understand where the information gaps lie – getting away from the 'them versus us' mentality. They need to be 'the context person', able to build networks and use them to stimulate information flow. It doesn't need to be one person: brokering could be done by all of the people some of the time, and one successful model is a science-policy fellowship where a scientist sits in a policy shop.
It’s best if researchers consult with brokers at the start of a project, to help them frame questions and agendas. And it’s best if it’s not done off the side of the desk, but is recognised as a real job.
What are the key attributes of a knowledge broker (individual or organisation)?
Flexibility, to appreciate different perspectives and see the broader picture: a generalist who can question where the evidence is coming from and also ask ‘so what?’
The ability to speak different languages (science and policy) – and to do so in plain language while being taken seriously, showing the depth of understanding of both sides of the debate
Transdisciplinarity, both in skills and experience – they need to know a wide variety of information sources and be a hub for many contacts
The ability to ask questions, avoid bias, communicate well, initiate conversations, calm situations and take on the role of facilitator: they need to be neutral, diplomatic, a good listener and able to negotiate
Be matchmakers, a good relationship builder and as neutral as possible (possibly withdrawing from the situation if they are becoming advocates rather than neutral brokers)
What would their roles be if knowledge brokers were to exist within a department?
Form follows function – define what you need and what budget you have, but it depends on the issue, the type & size of organisation. If embedded within policy it’ll always be a challenge to balance K* with all the other policy work that needs doing. They need a good feel for both sides’ needs, but to work hard to remain neutral
They’d have a different role if they were identifying a champion or becoming a champion; possibly would find it a bit constraining unless they were a liaison between external stakeholders & policymakers
Is it always a formal role, or could it be taken on by each person at different times? It is a concern that if too much emphasis is placed on one or two people in the role, opportunities could be missed where others could do K* as part of their job.
How could researchers best support them from outside?
It’s a bit awkward because of the outside/inside divide: it would work better to develop a
team rather than think about us versus them. Researchers recognising the role of K* would
help, though it is important for both sides to be prepared to meet in the middle
Returning phone calls would help! It can be hard if there is no day to day contact, though –
out of sight is out of mind
Where are there good examples that others could learn from?
Science-policy fellowships are still underway, giving good exposure to scientists in policy
shops and real personal experience
ResearchImpact / York University: got on the bandwagon early, it seems to be working and
they don’t do research in silos
OMAFRA has committed resources to KTT
Environment Canada has committed staff
All of the above are doing this as part of their job description, not off the side of their desks
4. Improving availability and access to all types of evidence
Statistical data, research, stakeholder & citizen opinions, and evidence from evaluations / prior
experience: all are needed in policy though in different proportions
a. From your point of view, which types of evidence tend to be strongest/weakest in policy,
and why? Can you give specific examples?
b. Where you have seen good examples of how to supply evidence to policymakers, what made
it so good?
c. What more could be done to improve how evidence is communicated, internally and externally?
d. What would be your priorities?
The conversations did not exactly answer the questions above, but instead reflected participants’
experiences of different tools they had used and issues they had come across.
There is a bit of a paradigm shift happening in how we interact with our stakeholders because the
information in social media can be based on scientific content – YouTube can be an important first
step in getting messages out to people, but important questions remain of how we reach people
who can’t access social media, how we monitor the evidence of who we are reaching and what they
are doing with the information; and how we use social media to communicate new evidence.
Participants also discussed the importance of databases and other tools to collect validated evidence
in one place, access it and evaluate its relevance to policy. There is a real need for standards of
information to be developed to ensure consistency in a database; and a lexicon could help people
understand the relevance of information. Maintaining and updating databases can be a significant
cost if they are to remain useful. The health sector has a variety of databases which rate the
information (see http://www.healthevidence.ca and http://www.thecommunityguide.org, but also the work of the Cochrane Collaboration, the McMaster Public Health Centre, CASP, predictioncentre.org and the AGREE tool template). There are also checklists for clinical trials at http://www.reflect-statement.org and http://www.consort-statement.org which could be adapted to provide useful evidence checklists.
Assessing the quality of evidence is key – participants noted that grey literature is not necessarily
worse than peer-reviewed literature if it is of high quality, recent, relevant and formulated so that
stakeholders can make use of it (such as a World Health Organisation report). Quantitative evidence is not necessarily better than qualitative: both are needed, as many issues require us to integrate different types of knowledge (including Aboriginal knowledge, industry data and public health inspectors' data, for
example). There are methodological challenges to this, though, as different techniques give different
types of evidence and multidisciplinary approaches are not well understood by many university
academics and are often difficult to use for evaluations.
Finally, timeliness was identified as an important constraint to getting the right evidence – the
process of providing policy advice is a high-pressure one and without the time to search for the most
recent evidence there can be a tendency to rely on reports that are somewhat out of date.
Other issues the groups touched on included:
The need for librarians who can help search rationally, rather than relying on Google:
research is only one form of evidence for healthcare and it is important to reflect this in how
evidence is sought.
Talking about ‘knowledge creation’ rather than ‘knowledge generation’ gives different
emphasis to the types of knowledge/evidence that we look for, though we need to be clear
what makes good evidence and careful of anecdotal evidence (for example). Regular
updates of the evidence increase transparency and could be used to inform a ‘rolling brief’
for the Board
There may be a hierarchy of evidence quality with systematic reviews at the top, but while
systematic reviews are important they do become out of date (or are not updated) which
reduces their usefulness. Communicating risk and uncertainty is an important part of the
evidence base and this demands different tools
We need to think broadly about different tools for communicating evidence, including radio
and TV/YouTube, though the anonymity of the internet can work against our need to engage
people around an issue and we need to be careful how we use it.
5. Integration across departments
This is often problematic, partly because of the silo-ising effect of working in large organisations and partly because of the difficulties of sharing budgets
a. What are the big issues where integration would be important? Do they share any defining characteristics (risk, geographical area, vulnerability etc)?
b. What are the main barriers to, and opportunities for, integration?
c. Where have you seen good examples of integration, and what was it about them that worked?
d. How can we use evidence to improve integration?
Integration depends a lot on upper management – some 'mavericks' encourage integration, but many do not. Effectively communicating common goals is critical, but it's often hard to see these goals and to understand where value for money (VFM) can be improved through collaboration.
Areas of greatest risk emerged as areas where integration is important (eg economic risk, health risk). The inability to network informally with others is a barrier to integration – such as not being able to use social media to network and collaborate. It is also difficult to measure how effective collaboration & integration actually are, particularly when it is hard to know who does what under different mandates. Time is another major barrier: policymakers need to get approvals and have little time to work on integration. Workshops and conferences are helpful in creating networks and generating awareness of common areas of interest: building communities of practice around issues is useful and helps to sustain relationships – it's important to have a champion to help build them – and building informal networks (eg through social media) has also been helpful.
It’s important to prove that collaboration & integration has had positive outcomes (eg improved health outcomes, greater cost-effectiveness), but it’s difficult to measure the effectiveness of collaboration efforts.
What are the big areas where integration is important? Do they share defining characteristics?
It’s hard to move out of your own unit and interact with others; often difficult to know what
they’re doing and how to collaborate
It does depend on senior management encouraging it: some do, some oppose it, some have
different priorities (and it always takes time for approval: the more players there are at the
table, the more complex it becomes and the more time is required)
Different departments use different terminology, which can be a barrier
What are the main barriers to, and opportunities for, integration?
Barriers: not knowing how to create metrics to assess integration, the fact that some may
feel threatened by it, that it takes a lot of effort to overcome geographical silos because of
limits on travel. Political considerations can sink initiatives as can the size of the organisation
and the fact that people aren’t allowed to use social media at work
Opportunities: budget pressures may encourage greater collaboration rather than reducing
it, and including it in performance planning can help (though we need to define shared goals
and common outcomes). Geography matters – creating the space and opportunities to
create networks will help such as workshops, conferences and building informal networks
Where have you seen good examples of integration and what was it about them that worked?
Workshops, conferences, networks, communities of practice – the Migration & Health formal
network is a good one
Focusing on particular problems can encourage integration: emergency management
situations (eg SARS) are problem driven, and crises can create relationships which are then
maintained (though the danger is that you just end up fire-fighting)
How can we use evidence to improve integration?
Find and publicise stories of success, backed by objective scientific & economic analysis, proving that it’s more cost-effective (more bang for our buck) to collaborate with others
Use funding incentives to require demonstration of value for money, and create opportunities for integration within contracts
6. Managing upwards: convincing managers that work at the science-policy
interface adds value
It is often difficult to work at the interface between science and policy, because the job description
doesn’t resemble other, more conventional ones.
a. What is it that managers are looking for in terms of adding value? Does it differ between
academics and policymakers?
b. Where can we find examples of good practice that could be summarised and shared?
c. What might be some priority steps we could all take, and then share?
There are a number of ways to manage upwards and convince our managers of the value of work at the science-policy interface (SPI):
Figuring out what they want and how to make them look good – give them what they need
and a bit more if you can…
Taking advantage of a crisis situation, and if you do have an impact then building on it
Start small; expand on small successes and build momentum
Try to think ahead and be practical; don’t only react to situations
Build up your ‘kudos’ file or list of successes – email people who value your work and have
their response available when you need it
Sell from the outside: have someone else tell your manager about the value and success of
your work – it often has a greater impact if someone else tells them
What is it that managers are looking for in terms of adding value?
Examples of where work at the SPI has added value elsewhere; such as building trust, filling
gaps, helping policy answer questions
Being more proactive than responding to crises; getting early engagement, linking to
departmental priorities and predicting where things are going
What might be some priority steps we could all take, and then share?
Starting small, not using a lot of resources, and gaining momentum
Google ‘skunk time’ and show the value of that to your managers
Build a business plan for K* activities which shows value
Respond quickly, understand risks in an issue and build a strong business case for looking at
risks and thus meeting real needs
7. Assessing the impact of knowledge brokering and other communication activities at the science-policy interface
a. Referring to the short background document which defines public policy, where would you expect to see knowledge brokers having a direct and immediate impact?
b. Where might you have to wait to see impact, or where might the impact be less direct?
c. What sorts of measures might you look for to assess impact?
Responses varied in all four groups, as they put different interpretations on the question (it was the most asked question but the hardest to answer). It is important to describe the process, not just focus on the outcome; relationships, rapport and collaboration are all key and relatively easy to monitor, but evaluating impact takes time and resources and is difficult.
How many examples do you need to show that K* works? Should we be using hindsight to learn about K* activities? We need to have the right people in the roles to demonstrate it, but it’s always difficult to assess if and how evidence is used by policymakers (and a lack of impact is important to examine as well). Getting baseline data is important if we’re to measure impact, but K* is a fluid process so it is difficult to prove value for money. We can use networks and behavioural theory to effect change within a group, because hard numbers don’t tell the whole story.
We can’t wait until we know how to evaluate it perfectly – we need to start somewhere.
Other issues raised (sub-questions a-c were not separately addressed)
We’re producing knowledge without understanding the role of the knowledge broker, so indicators don’t fit impact as we can’t collect outcome data (economic parameters can’t be evaluated, and results may not come directly from planned knowledge brokering efforts). Network theory may help, as in health behaviour research
Determining the return on investment for knowledge brokers is difficult; we talk about process and outcome, and it’s easier to evaluate the former than the latter (impact takes time to be seen, which is a problem when government needs to justify things on an annual reporting basis)
Health Canada has two venues: the Best Brains event brings together experts on a particular topic, and it engages students who will be a new generation at the science-policy interface.
If you work with communities it’s relatively easy to track outcomes on individual levels, but it’s important to track possible unintended and negative side effects of brokering. For academia it’s hard to use measures of success that aren’t related to achieving tenure
The community can be a barrier (eg Guelph vs. Ottawa) when working in policy at a federal level: it can be difficult to form relationships and there’s a tendency to build reactive rather than collaborative ones
A better understanding of government structure would help (who to talk to, key collaborators, where information should be targeted) though turf building can limit collaboration
Setting governmental research priorities can help guide academic research and funding: there’s currently a big disconnect between the two
Themes from the day – a reflection
It is always hard to capture the detail and nuance of discussions but never more so than when a group of experienced practitioners come together to talk about issues of mutual concern. This section is therefore a personal reflection on some of the highlights of the day – drawn from listening in to the conversations at the knowledge café, hearing insights from the speakers, and putting them together with our own experience and ideas.
Mess, mud, power and trust – four big issues emerging from the first table at the knowledge café. The partnerships we are working with are messy and the issues are muddy: we often have to operate in situations that are unclear both in terms of what we are talking about and who we ought to be dealing with. This lack of clarity gives rise to issues of power and trust – we need to be aware of how power dynamics between organisations and individuals affect the interface between science and policy; and that forming relationships based on trust takes time and effort. None of this is easy to deal with, but we have to acknowledge that it’s all integral to what we do in our work at the interface between science and policy. Our actions need to recognise that they are part of the process, not ‘noise’ in the system – that while they may hamper our search for clarity and rigour they raise important questions about such things as: whose voice counts? Whose evidence should we use and how should we balance one type of evidence against another? They are complex questions without definite answers but we shouldn’t ignore them: while we may not get the clarity and rigour we would like, the process of seeking it will deliver all sorts of other benefits and explanations.
Funding the arrows, not just the boxes – the dialogue between knowledge providers and decision makers (and between different groups of knowledge providers, such as universities and industry) does not often happen of its own accord, yet it is an important determinant of what programmes of work are supported and how they articulate with policy needs. Linking science to policy involves finding and using the right tools to improve this dialogue, assessing and meeting the requirements for timely information, and particularly using (or at least having access to) the latest evidence which is well synthesized and interpreted in the context of the weight of the evidence base around an issue as well as the context of what policy is trying to achieve for citizens. Knowing that all our hard work may end up as one bullet point in a single briefing note should not put us off from striving to ensure that policy is based on the best available knowledge.
This is not without its challenges. First, we need to be a little clearer on what we mean by KMb, KT,
KTT, KI (K*, as Alex Bielak calls it) – settling the terminology and meaning so that we have a shared
and consistent understanding of the skills and tools that are needed to work at the science-policy
interface. As it is a relatively new area there are institutional challenges to proving that our work is
valuable; and while we have access to a suite of tools through the network of K* practitioners, we
need to develop sound indicators of good practice which will help us convince others of the worth of
what we do.
Collaborating with different stakeholders means working to resonate with all of their interests, and
the varied time frames to which we are all bound. And it means doing this not only for the friendly
faces, but for those with contrary views: understanding different people’s priorities is one way we
can begin to develop a good shared understanding of the outcomes we seek, the knowledge that’s
important and the various different tools we could use to help us all make progress. We need to get
in there early, thinking ahead about which issues could be important to our stakeholders in future
and planning well so that we can anticipate their future needs and be there when that need arises.
But it’s not just about setting up processes: we need to be timely throughout, engage in person if we
can, use informal and formal methods, and look for common interests and priorities.
A final thought is that while more work needs to be done to develop a repository of stories of good
practice, techniques that work and contacts who can help, we can all contribute to demonstrating
our collective value. There is nothing as good as receiving praise from someone else: while we need
to be rigorous in our analysis of what others do, we must also be generous in our praise where we
see innovation or good practice. The community of K* practitioners is still emerging and we must
all, individually, do as much as we can to nurture it.
Louise Shaxson & Elin Gwyn
March 25th , 2011
Evaluation of the day
“I wanted to tell you again how much I learned from the symposium yesterday. Thank you for thinking of me; it was directly relevant to the next piece of work I will be doing in Peel on evidence-informed decision making. The day was also very well planned and seamless. Congratulations to you and the planning committee.”
“Thank you so much for all your efforts to make the last two days so productive. I was especially impressed with the breadth of the group you invited yesterday – we don’t often get to hear those perspectives. Job very well done!”
“I just wanted to thank you for organising the workshops over the last two days. It really was very informative for me on a number of levels and all the speakers, especially Louise, were exceptional…. Even the knowledge cafe went very well, sometimes people can be quiet, but each table I was at had a lot to say. I know it is a lot of time and effort to organise these events, thank you for doing this. Again, thank you for the great workshop… I have a bunch of reading and follow-up to do.”
While overall people were very happy with the day, the two comments which stood out were the need to broaden the event out from what was perceived to be an emphasis on federal issues, and the need to be clear about objectives. Participants would also have benefited from more information in advance, particularly about the objectives of the workshop.
Scores from the evaluations (1 = strongly disagree, 5 = strongly agree)
WORKSHOP CONTENT Average Score
1. I was well informed about the objectives of this workshop. 4.1
2. This workshop lived up to my expectations. 4.7
3. The content is relevant to my job. 4.5
4. The workshop objectives were clear to me. 4.2
5. The workshop activities stimulated my learning. 4.7
6. The difficulty level of this workshop was appropriate. 4.5
7. The pace of this workshop was appropriate. 4.7
WORKSHOP INSTRUCTOR (FACILITATOR)
8. The instructor was well prepared. 4.95
9. The instructor was helpful. 5
10. I accomplished the objectives of this workshop. 4.4
11. I will be able to use what I learned in this workshop. 4.5
12. The workshop was a good way for me to learn this content. 4.6
What was most valuable about the workshop?
Learning more about policy-maker needs.
Knowledge Café was very useful.
Good to have Louise recap at the end of the workshop.
Bringing together individuals from various sectors to work together and communicate. Excellent workshop!
Great location; excellent ‘topical’ topic. Not always sure about the two different “camps” i.e. policy/research. It’s about a two-way exchange.
Talking to other people and making connections.
Bringing together speakers and participants from a variety of backgrounds and organizations, time for networking.
Rediscover what I love about my job.
Links to contacts for particular information coming from presentations.
Great mix of lecture and networking. Knowledge Café was a great way to learn about others’ work.
General understanding of policy-science interface, the tools, network of people for resources.
The morning speakers.
Networking. Applicability of information to employment and other organizations/associations I am involved with. Tools and topics presented are broad reaching and easily adaptable to many situations.
Louise was a great speaker and provided interesting content.
Networking, hearing different perspectives in the Knowledge Café.
Contacts. Very enjoyable day.
What was least valuable about this workshop?
Too focused at federal level issues. Need more local context, participants and issues.
What other improvements would you recommend for this workshop?
Record it for future viewing – the presentations anyway.
Maybe add a poster session to share KTT projects and success stories.
More time for questions with presenters after presentations.
A list of additional resources for further information and recommended readings.
Better signage to the Arboretum for people on foot.
Provide listing in materials (afterwards) of references e.g. books mentioned, website links.
An activity break.
Microphone, pointer for presenters.
More prompts or questions at breakout tables to stimulate conversation. Provide more examples of successes, KTT at work.
More Knowledge Café-type interactive activities rather than didactic presentations.