MODERATOR: Okay. Welcome again. Let's get started. Thanks for staying with us the whole day. I see some people drifted off to lunch, and perhaps -- welcome to our last session of our foreign ministries panel. We will be basically discussing the World Cup among the panel members.
MODERATOR: We'd better not do that. It could lead to a scuffle, perhaps. So --
PARTICIPANT: (Inaudible) the whole time, because she represents three countries.
MODERATOR: Okay. So we know what to stay away from here.
Right. Anyway, we're glad to have put this panel together. And the order of events will be: I will first introduce each panelist very quickly, even though you have their bios in the program book, just to refresh your thoughts on who they are.
And then, I will ask each person to speak briefly, five to seven minutes or whatever you like, about any particular best practices or evaluation initiatives in your country, or perhaps anything that you've been sparked to think about from the deliberations of the last day-and-a-half, whatever is on your mind.
And then we will interact with a couple of questions, and leave plenty of time to open it to the audience.
So let me first start with quick introductions. First, on my left is Ruth Wiseman, from the United Kingdom. Dr. Wiseman is head of research and evaluation in the communications directorate of the UK's Foreign and Commonwealth Office. She is an experienced research and evaluation specialist, having worked for five government departments during her civil service career. She undertook her Ph.D. research at the University of Sheffield.
And thanks for joining us for a second consecutive year -- she was here last year, as well.
And on her left is Marc Calcoen, from Belgium. Ambassador Calcoen is currently the deputy to the secretary general, or a permanent secretary of the Belgian Ministry of Foreign Affairs. He has held several positions in the foreign ministry and at overseas missions, including Tokyo, Buenos Aires, and Paris. From 2005 to 2009, he was the ambassador of Belgium to Singapore (inaudible). He studied at Catholic University in Belgium, and received his master's of laws degree from the University of Virginia.
On his left, René Dinesen from Denmark. Mr. Dinesen is the director of the strategy and policy planning unit of the Danish Ministry of Foreign Affairs, which makes policy recommendations to the minister for foreign affairs and to the minister for development cooperation. Mr. Dinesen has held many positions in the Danish foreign ministry and at various embassies. He received his M.A. and B.A. from the University of Copenhagen.
And on his left, Mr. Kentaro Fujimoto, from Japan. Mr. Fujimoto is currently the first secretary for management at the Embassy of Japan in Washington, and has served in several positions of increasing responsibility in the foreign ministry. He graduated from the law faculty at the University of Tokyo, and received his master of arts degree from the Johns Hopkins University's School of Advanced International Studies.
So thank you all, panelists, for coming to our conference and agreeing to be on this panel. So why don't we start with Dr. Wiseman?
MS. WISEMAN: Okay. Let me know if you can't hear me. I just thought, before I start, I would just give a bit of context about the foreign office and its sort of situation within our government. The foreign office does our public diplomacy work, our traditional diplomacy work, and our influencing. But we have a separate department for international development. So I'm not involved in the evaluation of those kinds of programs.
We -- Shelly yesterday talked about this sort of new system you have for high-level policy goals, and having outcome-focused indicators, and so forth. And our treasury -- we have a system that we feed into, just like the one that she described. So apologies to those of you who sort of think, "Oh, what is this that's coming to us?" But she would have spoken to the United Kingdom about that.
In terms of the foreign office and its evaluation, I can speak on behalf of the communications directorate. There are pockets of evaluation around the office. We have program offices for climate change, counter-terrorism, conflict prevention. And where there are programs in place, then there is evaluation of those. But a lot of our work is outside of the program spending. And so a lot of that isn't necessarily evaluated, and certainly not perhaps in a very robust way.
We also have something called a balanced scorecard for our posts overseas, which allows comparisons between our posts within regions, and our directors and so forth have some kind of challenge process with heads of mission for that. And that looks at not only the performance of programs or the work they're doing, but also financial performance, and whether or not they're treating their staff well, and their communications, and the links between all those areas.
So it's not just concentrating on performance, but: are you getting performance perhaps at the expense of the treatment of your staff, or perhaps you're spending too much money? So it's looking at that as a whole. So that's another way in which we evaluate our diplomatic efforts.
And also, we have a staff survey and a stakeholder survey as well, which my team also manages. So there are different ways in which we evaluate different elements of our work. But perhaps there is not a central unit; it's sort of different pockets of work.
I think -- my understanding is that my team, the research and evaluation team in the communications directorate, is the only sort of dedicated research and evaluation specialist team within the office. There are -- as I say, there are other elements of evaluation going on, but this is a team dedicated to evaluation.
As I said, I can comment more specifically on the communications directorate work. And the work that we do there is around public diplomacy, strategic campaigns, and policy implementing from the sort of soft power side of things. We also do our internal communications on our (inaudible) and digital diplomacy, as well. So that's the kind of areas of work the communications directorate does.
And my team supports all those different areas within the directorate, but also our 200-plus posts overseas, and other directorates within the office, because of the special (inaudible) that we have. So the demands on our time are huge, and there are four of us. So there is quite a lot of work to do.
And another interesting thing is -- I'm trying to remember which presentation it was now -- but there was a cycle of strategic campaigns or communications, and the target audience was sort of left out of that, and they're bringing that in through social science behavioral work. And my team actually does research up front, and the situation analysis and target audience analysis, as well as the evaluation of activities, as well.
And because of that, through quite a lot of hard work and determination, we get involved quite early on in some of the strategy and planning for a lot of our campaigns. And that has involved, you know, being quite (inaudible) about it for quite a long time, and now it's kind of, well, we'd better do it otherwise (inaudible) going to tell us off. So we'd better get her involved up front.
But that also means -- I mean there are lots of benefits to that. But it also means there is extra work, in terms of supporting people in strategy and planning, too. So there are a lot of demands on our time.
Just a little bit of context, before I describe the sort of evaluation approach that we take. We have very small budgets. I have heard a lot about tight fiscal situations, and all countries are suffering from this. And I think my counterparts have even smaller foreign ministries than we do. But, you know, it is very, very small.
Sometimes -- I've heard that the amounts of money being spent on evaluation is the money we would have for whole projects, let alone the evaluation as well. So I was just in a debate on RCTs (phonetic). Well, that's an easy decision to make, because we would just not have the money for that.
And also, we're operating in a situation where there are sort of limited skills and experience of evaluation. Evaluation is relatively new to the office, and -- as I will explain in a minute -- our evaluation approach is that we have to kind of get people to do it as part of their day jobs, because there are only four of us. And there are not the resources to go out and get other people to come in and do an evaluation, as well.
So we're starting from a situation of low budgets and low skills and expertise, on a day-to-day basis.
So our approach to evaluation, just some sort of basic principles around it, has been to look at the proportionality and appropriateness of methods for the projects that we're looking at. And we've had to be quite innovative about the sort of methods we're able to use. Sometimes we can't do surveys -- again, there are not necessarily the budgets for that.
And so, we try to use sort of local knowledge and existing ways of working to try and get people to evaluate as part of their day jobs.
There is an emphasis a lot on learning, as well as on establishing impact. Quite often, if we do an evaluation, for example, we won't just write a report and hope that, you know, the recommendations get out there -- we kind of know that's probably not going to happen; very busy people are probably not going to spend a lot of time reading the reports. Instead, we actually get them involved in the evaluation process, in terms of feeding back the findings to them in workshops, and getting them to help come up with the recommendations and the action plans going forward.
So then we know they have listened. We know they have taken it on board. And then you actually start to see that they are actually implementing some of this stuff, and they're thinking about it in a way that perhaps wouldn't happen if we were just doing big reports and things.
But the approach we take is a sort of logic model approach, a theory of change, looking at inputs, outputs, intermediate outcomes, and the link to policy goals -- although sometimes we don't actually tell our staff that's what we're doing to them, because they find that scary, so we sort of redo it in a way that perhaps is a little less scary.
So we try and help to scope out objectives at the start, very clear, specific objectives they're trying to achieve, asking questions that perhaps aren't framed in objectives and outcomes. But, "What does success actually look like? Don't worry yet about putting it in the right language or the right indicator and the right objective, but tell me. What does success look like? What will actually be happening on the ground?"
And then, looking at their specific indicators will help us to know whether or not that's happening. And again, people forget what the word "indicator" means. You know, indicator of your success is giving an indication of whether or not something is working. And sometimes explaining it a little bit �'�' people think key performance indicator is a thing that's a bit scary. But when you kind of ask the right questions, sometimes you can get there.
And then, looking at what methods might be appropriate with what we can do with our budgets.
And quite often we do need to look at the activity on the project level side of our work, in order to understand outcomes. If we just measure the outcome, or look at what the outcome is, how can we actually link up what we've done with that? So it's trying to make that link, too.
As I mentioned, quite a lot of it is helping people to have this way of thinking in their day-to-day work. And so, a lot of our work, as well, is on guidance, and developing tools and templates to help people to do this.
And there is a sort of question of when to train and when to sort of give people material, that we have to balance all the time. People are very busy. They're not going to -- I remember once somebody produced something about this thick, if you printed it all out, on how to do interviews and focus groups and things. And people are just not going to read it, and they're not going to do it.
So it's sort of, you know, when to train and when to give people material.
And I think there are lots of challenges we have, but I sort of picked out three, and they've been in the discussions throughout the conference, I think. One is sort of a reality question, one is organizational, and one is methodological.
Somebody mentioned it just in the last session I was in -- the sort of reality that we set objectives that are very, very ambitious sometimes -- or our policy colleagues do -- very ambitious. And sometimes there is the political pressure for that. You know, in two years what can you realistically achieve? If we actually go and tell our foreign secretary that, that's not going to go down very well. We have to kind of, you know, have that extra level of ambition in there, and perhaps be a bit more visionary than what we can realistically achieve in two years.
But then, when we come to evaluate that, that's very difficult. Because, actually, what people are doing doesn't marry up. You've got this very high-level, very ambitious objective, and then a few activities and events or some media coverage going on, and somehow they want to sort of evaluate to say that they have achieved this very high-level objective. So that's one key challenge we have.
Second, organizationally: it's new, it's cultural change. And there are barriers to that, and the lack of skills and expertise makes introducing this difficult -- although, in the last -- I've been working there about a year-and-a-half, and I was going to say we've won the battle on whether or not to do evaluation now, which was quite a massive breakthrough. Now it's trying to get people to know how to do it.
And, thirdly, methodologically, there is the sort of attribution-contribution debate, you know -- the cause and effect. How can we actually show that what we've done has actually impacted the outcome?
And in the brief we were given in the email that was sent out, we were allowed to ask questions, too. So I put a question to others as well: Can you have quality evaluation on a small budget with limited expertise? And if you can, is it still worth doing? I have my own views on this, but I thought it was an interesting one in terms of, you know, what is quality and what is valuable evaluation? And is it still worth doing? I shall leave it there.
MODERATOR: Okay. You have included a lot there in those few minutes. Thank you.
MS. WISEMAN: Yes, sorry.
MODERATOR: No, no. That's great. Okay, Ambassador Calcoen from Brussels.
MR. CALCOEN: Yes, thank you. Well, I am representing the Ministry of Foreign Affairs, Foreign Trade, and Development Assistance, and that's already a whole program.
And just to put things in perspective -- because being at State and listening to my colleague from the FCO (phonetic), I feel like David vis-à-vis Goliath -- not that I don't know who won, ultimately. But anyway, that's of no importance here. But just to put things in perspective --
MODERATOR: Yes, not the best analogy there.
MR. CALCOEN: Yes. Well, anyway, just to put things into perspective, our ministry employs 3,500 people: 1,500 in Brussels, 2,000 abroad. We have 137 missions. We have a budget of 2 billion Euro, of which 72 percent is for development aid, and the rest, about 560 million Euro -- I will not convert into dollars, because this changes every day, and as the Euro has gone down, it will look even worse in dollars now than it would have looked three months ago -- so about 1.5 billion Euro for development aid and 560 million Euro for foreign affairs.
In the past few years, this budget has gone up. And this might surprise you, but this budget has gone up considerably. And in the past decades, the budget for development aid has doubled. In the past two years, if you look from 2008 to 2010, you might expect, with the financial crisis, there would have been some consequence. But no. The budget of development assistance has gone up by 30 percent, 3-0.
So this is due to the fact that our government has a firm commitment to achieving 0.7 percent of GDP being spent on the third world, on development assistance. And last year we were at 0.48 percent. This year we will reach 0.7. This is a firm commitment. Nobody in Belgium is actually putting this into question. So the government is putting its money where its mouth is, and we will see in the future what will happen. But for the moment, we are spending the money in this program.
I am saying this because, if you talk about evaluation -- you might think that I am rambling on about things that have nothing to do with you -- but if you talk about evaluation, there are two places in the ministry where evaluation is going on. And one is in development aid; that's one part.
And the other part is in a program that we call peace building. And this is within the general foreign affairs ministry. I was going to talk about this -- about (inaudible) -- later on. But as for peace building, this is a quite recent development, because the service was formally put into place only in 2006. And so it's peace building, conflict prevention, and so on.
So this service intervenes all over the world in places where conflicts are about to begin, have begun, or have just ended, in order to end violence and, in the longer term, to do something about the causes and the roots of violent conflict. So this service has operational criteria -- geographic, thematic. They have all kinds of procedures in place, which are public, so that people who want to apply for such projects can know and will know and apply.
This is a budget line of about 30 million Euro, and 30 million Euro out of 567 is more than five percent. So maybe to you it's peanuts, but in the general budget of the foreign ministry -- of general foreign affairs -- it's more than five percent of the budget. So it's rather important to us.
Some of these projects can be approved by the minister. Some of these projects even have to be approved by the council of ministers.
And geographically, we are spending most of our money in Africa. That will come as no surprise to you. About two-thirds of the project money goes to Africa, and one-third goes to central Africa -- three countries with which we have historic links: Rwanda, Burundi, and the Democratic Republic of Congo -- but we also spend a lot of money in Sudan, for instance. We spent 22 percent in Asia and in the Middle East.
And if you look at what it is spent on, it's mainly spent on capacity building and the reinforcement of civil society, human rights support, and then conflict prevention and peace dialogue, law, and justice. These are normally small projects: less than one year, less than 50,000 Euro. Sometimes, if they are big projects -- more than that -- they can last two to four years.
And as far as the general development assistance is concerned, just as background for you: the evaluation office not only evaluates this 1.5 billion that I was talking about. They also evaluate all the money that is being spent and recognized as official development assistance, including state loans, debt relief, and interest subsidies. So the amount that the evaluation office can look into comes to two billion Euro. And last year it was just 1.6 billion Euro, so there, too, you see a huge increase over the years.
So we have 18 partner countries, most of them located in Africa, but a few of them located in South America and one in Asia, Vietnam.
Recently we made a few reforms in our development assistance. For example, in how we collaborate with multilateral organizations: before, we were funding programs; now we just give core funding or general budget funding. So we give a lump sum to the multilateral organization, and they can do with it as they see fit, because we will have talked about the programs and the general framework of the policy of the organization beforehand.
We have also very recently put into place the reform of the subsidies to NGOs. And this year we had the OECD peer review, the results of which are being published right now. This evaluation office makes about five to six evaluations per year, and the cost of these evaluations varies from 50,000 to 300,000 Euro per evaluation. I can talk later about how this evaluation is done and who does it. And these are evaluations that last from one to one-and-a-half years.
And that is about my introduction. Thank you.
MODERATOR: Okay. Thank you so much. Great. Mr. Dinesen from Denmark.
MR. DINESEN: Yes, thank you very much. I can promise to be relatively brief.
First of all, one of my main conclusions from being here is that size matters. My goodness. When I compare our system to the bureaucracy -- that's a word that has a somewhat negative tone -- but to the system, the administration here in the U.S., it is clear to me that when we speak about the whole-of-government approach, or interagency processes, it's much, much different.
So in many ways, I realize that it's an enormous benefit to be small and homogeneous. Obviously, there should -- I hope you are able to achieve much more, due to size. But in terms of coordination, it must be incredibly difficult.
The size of the Danish foreign service is somewhat comparable to what Marc described of the Belgian foreign service. Our tradition of evaluation has very much to do with development cooperation. Our foreign service is a comprehensive one, in the sense that it comprises foreign affairs, trade policy, export promotion, and development assistance, all in the same building, which makes it fairly easy.
Our tradition of evaluation has to do specifically and only with development assistance. And we have our own little unit, a rather small one -- five professionals and a limited budget -- but they work absolutely independently, and they do evaluation, together with the OECD, of our development assistance programs.
But, obviously, we feel the same need -- be it for public management arguments or others -- we feel the same need to be better able to justify what we do in foreign affairs as well. And this is a little bit my perspective here. I am not the expert in development cooperation or development assistance. I would love to answer questions, if you have any.
My focus is more on the foreign policy part, and the challenge it is to evaluate not foreign policy programs, meaning programs with a specific budget, but rather day-to-day actions in foreign policy. There is a tendency to say that, well, this cannot be evaluated in any way. We cannot do any counterfactual analysis. It's multi-dimensional; objectives will change fairly often. So we cannot evaluate that. This is basically what I am interested in challenging, and in trying to help bring about a culture -- at least in the Danish foreign service -- where we would be better at learning as we go along.
So my approach to evaluation is pragmatic and rather, I would say, light-traveling. It's a question of saying: of course you can evaluate classical foreign policy performance. You could look at multilateral negotiation processes, and obviously you could measure or count what kind of interventions we made, whether we made proposals that succeeded, whether we did the right kind of outreach, and so forth.
So we would be tempted to have an approach where we slowly create a culture whereby it's only natural to mainstream evaluation into all kinds of affairs.
To some extent, we have been doing evaluation on non-development assistance by chance. So, for instance, our experiences with the evacuation -- now I'm evaluating everything -- the evacuation of 8,000 Danish citizens from Thailand during the tsunami. That was obviously, you know, the unknown unknown that happened to us there, and something that called for evaluation.
So we did that, and we, I think, learned a lot from it. And the whole idea of evaluations of foreign affairs and consular affairs being something you were afraid of actually, I think, disappeared. Because a few years later we had to evacuate, I think, 3,000 Danish nationals from Lebanon during the Israel-Lebanon War. And we did that very effectively, and very clearly drew on a lot of lessons learned. So we are very pragmatic, and slowly increasing the kinds of evaluation we do.
The same with the cartoon crisis, where we had a Danish newspaper printing some cartoons that created an uproar in the Middle East. It was very obvious that this was something we had to evaluate. I mean, when we send our ambassador in Cairo to speak to the Egyptian foreign ministry to hear what they thought of the situation, and whether they could assure calm and stability in the streets of Cairo, at least surrounding the Danish embassy, they assured us that nothing would happen and they were in full control, but we saw very clearly what happened a few days later.
So obviously, that called for some kind of evaluation. And what we learned from that was obviously that, you know, just conducting diplomacy in government offices is surely not enough. Public diplomacy, meeting the imam or the priest or the university professors that are important in such affairs is very important.
So it's, to some extent, learning by doing. But with a relatively small foreign service, we are, I would say, flexible. And it's rather easy to adopt such changes.
So what I would be very much interested in is to discuss with people that have the same general ideas about evaluating foreign policy practices. I would be very interested in keeping contacts with those of you that have the same challenges, or with my fellow colleagues from other foreign services.
I think I will stop here, not to take up all the time. There should be time for questions. If there are some particular questions on development assistance, I would be happy to respond. But that should be it for now. Thank you.
MODERATOR: Okay, thank you so much. Over to Mr. Fujimoto.
MR. FUJIMOTO: Thank you very much. My name is Kentaro Fujimoto. I am working in the management and coordination section of the Japanese embassy. Thank you very much for inviting me.
As an introductory remark, I would like to describe our -- I mean Japan's -- policy evaluation system, especially on foreign policy. In Japan we had a very big administrative reform in 1997. And at that time, the introduction of policy evaluation was proposed in the Japanese Government. And in 2001 the Government Policy Evaluation Act was enacted in the Japanese parliament.
So under this law, each ministry, including the foreign ministry, is the primary body for evaluating its own policies and projects. And the foreign ministry is in charge of evaluating Japanese diplomatic efforts. In addition, in order to ensure the objectiveness of policy evaluation, the Ministry of Internal Affairs and Communications conducts a secondary evaluation in Japan.
Now, each ministry submits an annual report on its policy evaluation. And this report is publicized each year. And the Ministry of Internal Affairs and Communications conducts secondary evaluations on the basis of that report.
Moreover, the Ministry of Internal Affairs has the main responsibility for evaluating policies that cover more than one ministry. In the case of foreign affairs, official development assistance, student exchange programs, and tourism policy are the subjects of that kind of evaluation.
And under the Government Policy Evaluation Act, each ministry establishes individual guidelines to implement the law and conduct evaluation.
In concrete terms, ministries and agencies establish so-called basic plans -- plans covering three to five years -- to prescribe such issues as principles, how to conduct policy evaluations, and how to publicize information relating to the evaluation.
And additionally, ministries and agencies draw up an annual plan for policy evaluation. And in Japan, policy evaluation is linked to the budget drafting process. Ministries are required to submit a policy evaluation report by the end of August every year, because from September the budget drafting process starts for the next fiscal year. I mean, our Japanese fiscal year starts in April and ends in March -- so that we can utilize the result of the evaluation and reflect it in the next fiscal year.
And one aspect I would like to mention is how to evaluate foreign policy -- the effectiveness of foreign policy -- because, you know, in the Government Policy Evaluation Act, there are three standpoints that we should take into consideration. Those are necessity, efficiency, and effectiveness. And the law says quantitative methods should be used for evaluation. However, it is -- as you may know -- very difficult to evaluate foreign policy in terms of quantity. And we should consider that foreign policy can be successful with steady implementation over a long period of time.
So, therefore, the Japanese foreign ministry uses quantitative methods in such areas as consular matters and the improvement of our diplomatic infrastructure. At the same time, in other areas like economic aid and peace-building efforts, the ministry breaks down the targets and purposes of each policy as concretely as possible, evaluates them each year, and writes down the results gained for each target and purpose.
Lastly, I would like to say that we had a new government last year. And the new government attaches high value to policy evaluation, and tries to utilize it more. So in October last year, the Government proposed four types of reform to increase the efficiency of our budget. And one of them is more intensive use of policy evaluation. And so we are working now on the details of those kinds of principles.
Yes, I would like to stop here. Thank you very much.
MODERATOR: A wealth of information and a wealth of points of view here. Thanks for the whole panel.
We would like to move to your questions now. So as you are formulating your questions, please move to the microphones. To quote Dr. Levine, I'm sure there are a lot of thought bubbles going on above people's heads, but let's get some thoughts into questions.
But let me start with one question. I would like to ask Ruth if she can comment on whether -- or to what extent -- there is a strong evaluation culture within the FCO (phonetic), and has that -- has it been constant? Has it dipped? Has it been enhanced or strengthened recently? Can you comment on that, overall?
MS. WISEMAN: Evaluation is something that's quite new to the foreign office. We only recently had a business planning process that has become more systematized within the organization. So there hasn't been a culture of evaluation.
I would say in the last year-and-a-half there has been a move to -- a sort of increase in understanding the value of evaluation, and knowing its importance, especially as budget constraints are only going to get even tighter. I think people have begun to see that it is something they should be doing.
So I don't think that there is a culture, certainly not something that's systematic. There are a lot of skeptics because, obviously, with the kind of work we do, how can you evaluate it? Is it something tangible, and can you say what influence you have had on it?
So I wouldn't say there has been a long enough history to have sort of ebbs and flows. And I think it also depends on the area of the department. I think where I work, we've done a lot of work to really push the benefits of evaluation. And so, within our specific team, the communications directorate, especially the public diplomacy side and the strategic campaigning side, it is something people think about up front when they think about projects. And if I hear about something, and I haven't been involved in it, then I will get involved with it very quickly.
So from that point of view, it's part of our cycle -- we made sure it was included in the strategic campaigning approach. So it's become a bit more something that people think they should be doing. We've got to make that something that will continue and be sustainable, whoever is in the role that I do, and also broaden that out throughout the organization.
MODERATOR: Okay. Thanks. Let's go to the question on this side, please.
QUESTION: Thank you very much. I was just wondering, in light of tight budgets, to the extent that there are shared policies, do you ever work together to share evaluations, especially in the EU context?
MR. CALCOEN: No, not really. What we do, and especially for the development aid, of course, is -- the fact that you have the OECD -- and in that respect, people work together. That is certainly a fact.
And what we also do is -- I did not tell you anything about our system, but this evaluation office for the development aid, that's an independent evaluator appointed by Parliament. And what they do is they have public tenders for evaluation. So they put down the parameters, and then they launch a public tender. And this public tender is, of course, open to external consultants, because we work with external consultants to do the work. And it's open for Belgian external consultants, as well as for foreign external consultants.
So we have a lot of people from France, Holland, Luxembourg, and the UK especially, who do evaluation work for our development aid. So in that respect, maybe we might work a little bit together. But on the official level, because -- as (inaudible) says -- it's quite new, and this evaluation office in our case was put into place only in 2003. So I think there is indeed not a long culture, there is not a long experience.
And, apart from the OECD, there is not really a culture of collaborating in that respect, even if we do the same things, like the Danish, for instance. We also made a big evaluation. We don't call it "evaluation," but it's a lessons learned, like from the consular crisis in Thailand or from the last one with the ash cloud in Iceland. Of course we do all these things as well. We don't call it evaluation. But apart from that, no formal collaboration, I would say.
MR. DINESEN: Yes, I would maybe just add that that's correct. This is where we share that kind of analysis of our development assistance. But that is exactly limited just to the development assistance.
We are, for instance -- Denmark, for instance -- peer reviewing both Japan and the U.S. I think we just did Japan and will do the U.S. So there is a lot of cooperation there, within the OECD. But in terms of foreign affairs, there is very little, also because the concepts there are less developed.
We do, however, meet in different groups -- very seldom as the EU as a whole, all 27 members, because we are at different stages and have different priorities -- but we tend sometimes to meet in groups of five or eight countries that share an interest. For instance, on, let's say, more strategically, what could foreign ministries do in light of, you know, being pressed on budgets? What are our strategic assets? What should we focus more on in the coming years to, you know, retain our budgets, basically? What are our constituents demanding from us? And what are best practices here?
So for instance, you know, to put more weight on export promotion, consular affairs, public diplomacy, this is something that is rather obvious, but something that we could discuss among each other. At the same time, we are challenged with how to make sure that we have, let's say, a constituency for what is classical foreign policy or diplomacy work.
I mean, there is probably no one in the Danish parliament -- or among Danish journalists -- that would really cry out if we cut down 10 percent of our general diplomatic staff, you know, those doing political work. And since this is a lot of the self-conception of what foreign affairs is about, it's rather critical.
So we could meet in smaller groups to share, I would say, best practice. How do we make best use of our -- let's say, our generalists also. But it's very informal coordination.
MODERATOR: Okay. Question on the other side, please.
QUESTION: I think I have a very similar question. I work here at the State Department on humanitarian programs. And I also wanted to ask if you could speak about some ways that we might better coordinate to avoid duplication of effort, or to cut costs by sharing best practices, even maybe electronic ways.
I know in one of the sessions someone was speaking about a social networking type of tool where you could get information from people on the ground -- for instance in Haiti -- about what was going on during the crisis in Haiti. If you could, talk about any other options that we may have in that respect.
SPEAKER: For all these humanitarian crises you have these donor conferences, where people come together and share what they are doing and share what they will be doing, and share their experiences. But that's already a long time after the crisis has come to the foreground. So that probably is not an answer to your question.
I think what you could do is, indeed, put up a network of people dealing with these things. I guess each country now has a -- we call it (inaudible), which is a very multi-disciplinary team that we send out each time such a crisis happens. And they get to the place of the crisis, and they start their activities.
But indeed, this is never done in collaboration -- it's done in collaboration with the country in question, because they say, "We need this and we need that, and we don't need this and this." That is being covered. But collaboration with other donors is never covered in the first place.
I guess an informal network, or even a formal network, be it among people in Europe or with the U.S. or with Canada or with other people that have a tradition of coming to the foreground in such crises, would probably be useful, indeed.
QUESTION: Good morning to you. I am Pontus Jaeborg; I work at the Embassy of Sweden. So I understand very much what my Danish colleague is saying, and the Belgian colleague, to some extent, or the two others.
But you know, for the smaller countries, it really is different. We have the benefit of being small, and, you know, maybe that way more flexible. But we also have a lack of resources. So it has been very interesting these two days to see how difficult it is to compare, you know, the United States foreign service with, you know, the kind of service we have.
So I don't have so much of a question, but I want to say that we are just now also studying how we can become more flexible in the foreign service, using, you know, the few resources we have -- we realize we will not get more budgetary resources. How can we use what we have in a better way? Maybe by cooperating with other countries.
For instance, within the Nordic group we have a long tradition of working together. In certain cases we actually have joint embassies. And we see how we can work more there, and maybe also get to a higher degree of cooperation of administration and consular affairs and in other ways.
We also note that there is a European External Action Service being set up. But that is just now coming into effect after the Lisbon Treaty. And we don't really know how far that will go.
There is another area, within most of the European Union countries, that is the Schengen Agreement, where we could also cooperate on visas -- perhaps to have joint Schengen offices, instead of each embassy within that group, you know, working on these matters.
So I think for smaller services there is scope to cooperate. And, you know, in that way also, more on a day-to-day basis, see what works and very quickly move forward, and maybe not so much, you know, this theorizing on evaluation, because we simply don't have the resources for that. Thank you.
MODERATOR: Thank you. Any particular reaction to that -- general agreement?
DR. WISEMAN: Just a thought about sort of joining up more, which I agree with. You know, the more we can do, the better.
But when it comes to evaluating, influencing policy, influencing diplomacy, it's okay if you've got a shared understanding of where you want to be and what position you want people to take at various meetings. If you've got opposing views, you might not want to be doing shared evaluations to sort of say, "Well, this is what works and this is what doesn't work," when we are trying to convince the Danes to do something -- "What's a really good idea is to do X, Y, Z" -- because we don't want them to know that.
So the actual evaluation and monitoring of diplomacy can be a bit tricky to share in some cases, perhaps. Just a thought.
QUESTION: Good afternoon. Thank you so much for your remarks. I have two questions, and actually just one that really stands out. I think the similarity between you -- representing the EU and others -- and us is that we both talk about what our development colleagues have done, in terms of sort of leading the charge in creating metrics and evaluating what development assistance is doing.
And I'm just wondering, from your perspective, are there any instances of best practices or indicators that we can borrow from the development community that could be applied to our foreign policy? And I wonder if there is a silo -- perhaps it may be artificial; I want to get your comments on this -- in how we, as foreign policy experts, view development assistance. I mean, are we all working towards the same goals or not? And, if so, is there a way of merging what the best practice is for evaluation into what we're doing now in foreign policy? That's the first, complicated question.
The second one is pretty straightforward. I really want to applaud my colleague representing the Japanese ministry, because you mentioned something very interesting: that every year you require some sort of evaluations or metrics to come together in August, and you use that to develop your budget and your planning. I would love to hear more about what was the driver, the strategies around creating that sort of process, and any best practices that you can share with us as to how well that's functioning for you in incorporating metrics and evaluation into your budget formulation and execution. Thank you.
MODERATOR: Any takers on the first question?
SPEAKER: Well, I could add a few remarks there. I mean, I had the initial idea, when I agreed to come here, that I would propose, for instance, that the OECD convene a kind of very open-ended seminar or conference, exactly with a view to what you are saying: to make sure that best practice -- which, I think, within development assistance cooperation, is collected in and by the OECD -- to try to make sure that people who are interested in evaluating foreign policy get to know about those best practices.
But I understand that the OECD, as such, and what kind of mandate they should have, that's a political discussion in itself. So I'm not -- I'm sure it's a non-starter to suggest that the OECD should start peer-reviewing each other's foreign policy.
SPEAKER: But nonetheless, I think it's interesting, and it alludes to what you are also interested in.
So we might find another place than Paris and the OECD to meet to discuss this. But I think it's obvious, of course, that we can learn a lot.
Could I just -- before I give the word -- it's not my job to give anybody the word -- but before I let my Japanese colleague answer the specific question, I would just say that we have had very similar experiences to the Japanese, regarding that whole cycle of priority-setting and resource allocation.
So we are actually doing almost the same, in the sense that we are starting right now -- and it's my particular department that does that -- drawing up a list of, say, the plan of priorities for the coming year. And then it's approved by, you know, everybody that has to approve such a thing, and in September it's ready. And then, throughout the autumn, or the fall, you would discuss resources with departments at home and embassies and missions abroad, and this would all be, you know, looking towards that kind of plan. What can we do individually to achieve those goals?
And so we have one central document from which all resource discussions are emanating. It's a bit of a cumbersome process to make it. But when you have done that a number of years, you would slowly know how to streamline it.
And we have been working with, you know, some kind of self-evaluation, or self-performance evaluation, in that kind of work. So for instance, when our embassy in Washington is presenting to headquarters, explaining what they could do to realize these goals for the coming year, they would be asked to set up some kind of performance measures for themselves. So our embassy in Washington might have 20 or 30 of those, saying, "In this particular field, to get an A, we would do this or that. To get a B, we would do this or that. Or, if we fail and have a C, it would be because we didn't do more than this or that."
So obviously, this is a kind of shortcut to create a more permanent culture of self�'evaluating and performance measuring. But sorry for trying to answer your question.
MR. FUJIMOTO: I am not sure whether I am answering your questions directly, but as I said, in Japan each ministry and agency is responsible for its own policy evaluation. And the Ministry of Internal Affairs is a secondary body.
However, I may say that the budget drafting process may be the third round of the policy evaluation process. When Japan had a high rate of economic growth, we had, you know, abundant money and budget. But now we have a very huge budget deficit.
So from my experience, the budget drafting process involves very, very intensive negotiations between the finance ministry and each ministry. We spend almost four months -- August, September, October, November, December; five months -- until we finalize the government draft of our budget plan. And it is submitted to the parliament in January.
So we -- I mean, each ministry -- have to negotiate with the finance ministry, and the finance ministry reviews the results, or the effectiveness, of each policy. And if we want to introduce a new program or new projects, a kind of scrap-and-build concept (inaudible) is, in general, introduced.
So that's how we, you know, try to maintain or, you know, reflect the results of the evaluation in the overall government policy, as much as possible. Thank you.
QUESTION: Thank you very much.
SPEAKER: (Inaudible) Jay.
QUESTION: I think I want to start with our Japanese colleague. Does the Ministry of Internal Affairs' evaluation ever differ from the Ministry of Foreign Affairs' evaluation of its performance? And if it does, what are the implications, either within the budget discussions or within the discussion about how you're going to perform over the course of the year?
And then, a broader question for the other three of you, I guess. Well, mainly, actually -- sorry, I really don't have anything for the British contingent today -- but for the other two of you: as you think about the notion of your team that does the evaluation, that small team that does the evaluation -- particularly, I guess, on the development assistance side -- how transparent is the result?
I mean, is that then given to the country where the development assistance was actually taking place, and so on? I mean, how much -- and how involved is that government in the actual process of the evaluation?
MR. FUJIMOTO: Yes, thank you very much. When the evaluation of each ministry and the evaluation of the internal affairs ministry are different -- you know, both evaluations are publicized. So each ministry is under very similar pressure to continue their current policies.
And, as I said, those evaluations are reflected in the budget drafting process as well. So it's very hard for us to maintain the policies as they are. That's how we coordinate our two types of evaluations.
MR. CALCOEN: As to your second question, of course the first purpose of the evaluation is to see for yourself how effective you have been, and also what has been realized on the ground with the things that you have done and the monies that you have spent.
And secondly, of course, it's also important -- and that's what this independent evaluator does, working with foreign consultants, with external consultants. We have also, each time, a steering group with all stakeholders, meaning the Belgian administration, Belgian policy makers, but also the foreign people that we are making the evaluation about. And they accompany the evaluation throughout the process.
And when the evaluation report comes in, it is discussed with these stakeholders and adapted, if necessary. And then the report goes to the political and administrative management of the Belgian development assistance, as well as to parliament, because, as I told you, this evaluator has been appointed by parliament.
And it also goes, of course, to -- it is open, it is public information. It goes to the people that are concerned by all of this. And after one year, there is another step that is always taken, which means that the people that the evaluation was made about have to respond to what they have been doing about it. And they have to say what they have done with the recommendations, and this can apply both to the internal parties as well as to the external parties.
So there is certainly a way of engaging the people that we evaluate -- because we evaluate sectors, we evaluate countries. And they are also involved in this evaluation.
MR. DINESEN: I could be brief and add that the evaluations of our development assistance would always be public. They are basically meant to convince parliament that we are spending the money in the right way. So they are publicly available on our home page.
The extent to which we cooperate with the recipients, I assume and I hope that what Marc just said about the Belgian way of doing it, it's the same in Denmark. But to be honest, I don't know the extent to which we do it with them.
When it comes to foreign affairs, obviously, this is why we are much more pragmatic, and flexible, and low key, and travel light, and all the other words. Unless it was a tsunami or consular affairs, you couldn't imagine making anything public on foreign affairs. Ruth mentioned some of the reasons.
So I mean, to be modest but still modern or progressive, we would start by, you know, much more systematically doing lessons learned and self-evaluation. But all the development assistance evaluations would be public.
SPEAKER: Okay. Let me bring the session to a close by thanking the panel for being part of our conference, and being very generous with your time, and joining us and being open and frank. And we hope to extend this dialogue and stay in touch on many other issues with our panel members. Hopefully you can come back to future sessions next year and at other times.
So thank you very much, panel.
SPEAKER: If you folks wouldn't mind just staying in your seats, I just have one or two minutes of closing comments here, then we will all move on to lunch.
The past couple of days have been exciting ones for those of us who have been working on this conference for many months. We had several objectives for the conference. The principal objective, of course, was to shed light on and identify promising and effective practices in using evaluation as an essential function of performance management in the conduct of foreign affairs.
Development has had a very long and deep history of evaluation; we are now trying to move to expand that to foreign affairs. Some of the comments from the panel were just exactly on point. And hence, our title for the conference: The New Paradigm. So I think we have made some progress there.
I believe our State Department and USAID colleagues, our friends from other agencies and foreign embassies, and all others here today and yesterday helped us to achieve that objective. You all provided solid information on real and tangible tools, on the challenges and rewards in effectively linking program purpose and design to the power of information derived through evaluation.
We also discussed the application of that information in decisions pertaining to policy or resource allocation.
It's been tremendously rewarding to work with all of our presenters to sponsor an interagency and international dialogue around evaluation, and to witness the exchange of ideas and experiences during the breaks, the lunch hour, and in the workshops.
I hope that all of you leave here today with a deeper understanding of the purpose and value of program evaluation, and that this conference will serve as an impetus to increase and improve our use of evaluation as a performance management tool in our respective agencies. So thanks to all of you for what you brought to this year's conference.
In closing, my special thanks to the many people -- literally dozens -- who worked to pull the conference together, for all the logistics activities and all the substantive work to adjudicate the presentations and organize everything. Just many people, as you can imagine, were paddling furiously beneath the surface of the water, even though it didn't always show above the surface.
So I won't mention all the folks who have contributed -- it's just really many in our office, in our support offices, those who run the center here -- but just many people really helped out tremendously.
I do want to mention one person and have you recognize her, and that's our overall conference coordinator, Stephanie Cabell.
SPEAKER: She has just done a tremendous job from day one, many months of organizing, always in a very unflappable, professional manner. So we really appreciate that.
So if you wouldn't mind coming up, we have a small gift for you.
Some of you probably know that Stephanie is an avid singer, really on a professional level -- and I'm embarrassing her now, but that's okay -- she has sung in many venues, including the Kennedy Center, and places of that stature. So we picked out a gift that I think you will really appreciate, so --
MS. CABELL: Oh, thank you so much.
SPEAKER: Here you go. Okay.
MS. CABELL: Thank you. Yes, and (inaudible). I am a little teary-eyed, but it's been a lot of work, and we have had a great team in play. So they know who they are, but I will thank them personally afterwards.
And, obviously, thank you to all of you, both here at State and outside of State and USAID that attended the conference, our presenters in the workshops, our foreign media panelists, foreign ministries panelists. So it's been a very gratifying conference. Thank you very much. And thank you for this.
SPEAKER: And, lastly, the proceedings of the conference will be available on the State Department's website. It's on the back of your program book. It's an easy one: state.gov/evaluationconference. And for employees in the Department, the plenary sessions -- including this one, hopefully -- will be on BNET, once we can sort out all the technical issues there.
So thanks so much for your participation, and we hope to see you next year.