I want to turn now to another old friend, Jill Petrakowski, who is with the Biodiversity Institute at the University of Oxford. She's also working closely with C4 on an interesting initiative focused on the quality of evidence for science-policy dialogue and for constructing relevant research agendas. She is leading a collaborative project, partly coordinated by C4, that uses information management to extend stakeholder engagement mechanisms, to identify policy priorities for science, and to improve the quality of science for policy. This is a particularly resonant question for us at the World Bank. As some of you know, we have a new president; he's been in place for about a year and a half. One of the first things he said to us when he took over was: you're supporting development work, and I want to make sure it's evidence-based. So the importance of evidence, and the kind of work Jill is doing to generate evidence that informs development outcomes and policy, is hugely important and resonates strongly with my own institution. Over to you, Jill.

Thanks. As Peter says, I'm now actually in the long-term ecology and resource stewardship group, and our evidence base comes from such things as pollen cores from lake sediments and ice cores going back into the paleo record. So we really do believe in very long-term data sets, which might become apparent as I go through some of the things I'll be talking about. Thanks very much, C4, for the opportunity to participate in this session. It's great that it's so full.

Okay, I saw this slide recently; a lot of you have seen it. It's a fantastic little graphic on how to influence negotiators. All good stuff. What worried me to a certain extent was this little bit over here. It's good advice: get up to speed on data and facts, use infographics and so on.
All of that comes way before you reach the stage of actually being loud and blogging and tweeting and all the cool things John has just been talking about. The boring worry I have, I suppose, is that I would want to ask: what facts? What data? How reliable are they, and what are you doing about the bias inherent in those sources? That's the policy context, the kind of influence, that I worry about and that my research deals with: just those four questions, to a certain extent.

I see the landscape of this discussion encapsulated in this diagram: the policy context, and the knowledge we're using to influence policy. The information can be good, even great, and well collected, or it can be poor. And it can be presented effectively and engagingly, in the way that John and his team do, or it can be presented rather poorly. Bottom line: bad information presented poorly is not such a problem. Good information presented well is what we're all aiming for. What worries me, again, is the really poor information that looks rather good, looks influential, and is often the stuff that policymakers might grab at. So that's the context we're working in as we try to solve some of those challenges.

The other context, as the last few speakers have alluded to, is that policy now sits in this bigger picture again. This is a slide that Peter Holmgren, the DG of C4, has been using a lot this summer, very effectively. It describes how forestry now sits outside its own little domain; it has to interact with these other very important policy arenas. So we've got to do something that tackles poor information, or the bad use of information, perhaps outside our comfort zone, and work out how we're going to tackle that.
Now, the medics, with their enormously broad topic networks, came up twenty years ago with something they called evidence-based medicine. Some people don't like the term; evidence-informed medicine is a nicer one, and perhaps the one I prefer myself. We thought we might look at how this could work in forestry, and it's reasonably simple. You bring together the best science you possibly can; the people who are involved on the ground and have real expertise in working with communities, people like Maria and others here; and society's needs and preferences, because what science says does not necessarily feed in a linear way into what we're going to do as a society. In the middle of those three circles sits evidence-based medicine in the medical field, and there, we hope, will sit evidence-informed forestry.

So that's the model we tried to build on, and there are many long-standing, successful examples of it being used in other fields, not just medicine. We've drawn on all the great information, resources, and experience in those other areas. C4 has led a coalition, a collaboration with the partners you can see here, with generous DFID funding, and we've started to explore how evidence-based forestry and systematic reviews might play out in our field. The current aim of this programme is mainly to conduct systematic reviews, and I'll tell you a little about those in a moment. But it's also, importantly, to get people talking together in that collaborative three-circle model I described earlier, to identify good questions for reviews and the areas where good policy needs to be developed; these things work beyond just the systematic review itself. And it's to promote good practice, so that we're not constantly falling into that bottom-left-to-top-right pattern of poor information presented rather attractively.
So we are looking at very broad types of question, involving complex landscapes, and also at rather narrow questions about methods of measuring things in our traditional forest domains. All of these scales can be tackled in the programme. The systematic review itself sits in that circle of best science, but it is conducted using a collaborative approach. Although it informs good science, it is done with a lot of stakeholders. It's not just talking to your best friends and coming up with the best papers, which happen to be mine or Peter's and nobody else's. It's a real attempt to be broad, collaborative, and inclusive.

It has a very strict methodology, which I won't go into in much detail today because I've only got six minutes or so. But it's really a three-stage process. First, find the question: is it an important question that policymakers want answered? Second, do we have enough evidence, scientific papers and other sources, to address it? Third, if we do, review it, and review it in a thorough, comprehensive, transparent, reliable, and repeatable way. It's set up exactly as a scientific project would be: there's a good method, and that's peer-reviewed; then the results are examined, and that's peer-reviewed; then the conclusions are drawn, and that's peer-reviewed. So although it's a slightly dull exercise, sitting in a library looking at published literature and other forms of evidence, it is set up like a scientific experiment. In a sense, it's a scientific approach to literature review; it's almost as simple as that. Then, very importantly, particularly in the context of this session, comes the wider engagement: sending your results out, using tools like those John has available through C4 and other important, influential, relevant fora that speak to the audiences you want to reach. I won't go into great detail about the stages, but people who are interested can talk to me later.
These slides will be available anyway, and we might want to get into some discussions around our tables. But essentially, those are the steps of a systematic review. As I say, they're like a scientific experiment, where you work through everything in a very rigorous way. There are also slightly shorter versions called systematic maps, which are rather exciting and which we're starting to look at; there you concentrate on just the first six stages. These can be very powerful tools for telling policymakers how much information is out there on the question you're interested in, and of what quality, so you get a kind of landscape of the research and other evidence in your field.

We've made good progress in the twelve months we've been up and running. Lots of systematic reviews are under way. We've got a steering committee with our partners, not so much keeping us on track as trying to discover together what we might want out of this. So these are very open early days, really, and we're being guided by these other initiatives, but not bound by them. This is something we'll have to find out for ourselves in the forestry and landscape community. I'd urge you to look at the website hosted by C4; it's a nice little site with lots on it. We've just had a call for new proposals for systematic reviews. Some of you in the room may have submitted; I hope so. Some fantastic ideas came through, and we'll be choosing new topics later. These are the existing topics, which I'll leave on the slides for you to look at later. You can see just from this a very wide spread of questions: they have policy relevance, there is plenty of literature to inform them, and they're pretty interesting to do.
So those are the kinds of questions we'll look at, at different scales and of different types, informing different policy arenas. There's more work being done, and you can get involved. We'll soon be announcing something called T20Q, a collaborative, bottom-up way of asking you what you think are the most important policy and review questions. We'll be rolling that out in January or February, I think, and perhaps launching it in March if we can get up and running. And we're hoping the Delphi phase, where a group of people looks at all the exciting ideas that have come in through the internet, personal contacts, and workshops and pushes them back out to the community, will take place in Salt Lake City, connected with the IUFRO Congress, which seems like a good, broadish arena for making those decisions. This will be announced, so please do keep an eye on the C4 webpage. We'll reach you, because John knows who you are; John knows who everybody is. With his incredible databases, and those of the partner organizations, we really are interested in reaching out to as many people as we possibly can, to find out what you think the important questions are, in the way the youth session did.

So that's it; that's the webpage. As for the questions that were on the tables for you to discuss later, I truly believe the systematic review approach speaks to all of them in one way or another. So I would urge you to ask questions, throw out some ideas, and engage with this idea when we sit around the tables later. If you have any particular questions about the process, I realize I've covered it very quickly and there's a great deal of detail I haven't given, so just grab me at some point today or tomorrow. And thank you very much indeed for listening.