
U.S. ADVISORY COMMISSION ON PUBLIC DIPLOMACY

MINUTES AND TRANSCRIPT FROM THE QUARTERLY PUBLIC MEETING ON OPTIMIZING ENGAGEMENT IN PUBLIC DIPLOMACY

Tuesday, March 20, 2018 | 10:30 a.m.-12:00 p.m.
Rayburn House Office Building, Room 2200 (45 Independence Ave SW, Washington, DC 20515)

COMMISSION MEMBERS PRESENT:

Mr. Sim Farar, Chair

Ms. Anne Terman Wedner

COMMISSION STAFF PRESENT:

Dr. Shawn Powers, Executive Director

Ms. Jennifer Rahimi, Senior Advisor

SPEAKERS:

Dr. Gerry Power, Chief of Research, M&C Saatchi World Services

Dr. Shawn Powers, Executive Director, U.S. Advisory Commission on Public Diplomacy

TRANSCRIPT:

Sim Farar: Good morning, good morning, good morning, and thank you very much for being patient. As you can see downstairs, there’s a long line out there getting in here. If you don’t have an umbrella like myself it was very difficult, but we are here. That’s the most important thing.

Sim Farar: This is the United States Advisory Commission on Public Diplomacy. This is our first hearing for 2018. I’m Sim Farar, I’m the chairman of the commission. Thank you all for being here and a special thank you to Chairman Royce and Brian Gibel for helping the commission secure this place for our meeting.

Sim Farar: First, I'd like to tell you a little about our commission. Since 1948, the commission has represented the public interest by overseeing United States government international information, media, cultural, and educational [inaudible 00:00:58] programs, as a bipartisan and independent body created by Congress to recommend policies and programs in support of the US government's efforts to inform and influence foreign publics. It is mandated by law to assess the work of the State Department and to report its findings and recommendations to the President, Congress, the Secretary of State and, of course, the American people. I'm joined on stage today by a very distinguished colleague of mine from Chicago, Illinois, Anne Terman Wedner. She's been one of our commission members for a long time.

Sim Farar: Today we're going to focus on how to improve the State Department's capacity and approach to research, monitoring, and evaluation of its public diplomacy programs. As many of you know, the commission organized a summit on these issues last month at the U.S. Institute of Peace, and what you'll hear about today are the key takeaways from that summit.

Sim Farar: Essential to fully integrating research and assessment into our public diplomacy apparatus is building a culture of learning from what works and, more importantly, what doesn't work. The existing culture of inflating public diplomacy successes while ignoring public diplomacy failures stunts our ability to learn. We miss opportunities to build good programs into ones with lasting impact, while programs with little to no impact carry on far longer than they should. We need to build up and reward officers who are good stewards of public diplomacy resources, engage with programming with an inquisitive mind, and always seek to improve. This is why this component of the assessment process is perhaps the most underappreciated piece of the puzzle.

Sim Farar: I have the distinct honor of introducing a very special guest, a friend of the commission in today's discussion, Dr. Katherine Brown, our former executive director. My, my, what a ride she's been on. It ain't over yet, as they say. Dr. Brown's book, Dateline Kabul: Storytelling of Modern Afghanistan and America's Longest War, will be published later this year by Oxford University Press. I urge you to get a copy. Buy a copy. Most recently she's been a public policy manager at Facebook, where she was also in residence as a consultant on foreign relations and international affairs [inaudible 00:03:24].

Sim Farar: Previously, she held numerous roles in government, including assistant to the White House National Security Advisor, communications advisor for the US embassy in Kabul, and professional staff member on the Committee on Foreign Affairs of the United States House of Representatives. Katherine also served on the boards of the Center on Public Diplomacy at the University of Southern California and the Global Ties Foundation. She's a non-resident senior associate at the Center for Strategic and International Studies, and an adjunct assistant professor at Georgetown University. Without further ado, please give a nice, warm welcome to Dr. Katherine Brown.

Katherine Brown: Thank you so much Sim, and Anne, Dr. Power, and Dr. Powers, for inviting me to be here today. I can't figure out how to make that work, so hopefully you can hear me. It's very fun to be back at a commission meeting, and I'm just honored that you asked me to speak. I wanted to give some context for you all about why the commission takes this issue on.

Katherine Brown: ACPD was reauthorized in January 2013, and it was given this extraordinary reporting mandate: not just to report on where $1.9 billion of funds for public diplomacy and international broadcasting programs go, and how the money is being spent, but on how effective these programs are.

Katherine Brown: When I started in July 2013, I went to some of the staffers in different congressional offices and asked: we have a staff of two people, less than half a million dollars, and a part-time, volunteer, bipartisan commission, so how exactly do you want us to evaluate the programs? They said, well, we want you to reflect the outcomes of the evaluations that the State Department and the Broadcasting Board of Governors do for all of their programs, and just let us know how things are faring. It became very evident that the charge for the commission was going to be to help build this infrastructure for audience research, data analytics, and impact evaluations, because the capability to report the outcomes of different public diplomacy programs, which Congress wanted, did not yet exist.

Katherine Brown: We were lucky to find some extremely talented professionals at both the State Department and the Broadcasting Board of Governors who were doing this work already; it just wasn't at scale, and wasn't at the functionality that we needed. What the commission members, myself, and our former senior advisor, Chris Hensman, also saw early on in our work is that there is some tension in trying to promote research and evaluation work at the State Department. This makes sense, because relationships are inherently complex; they can take many different forms over many different years. So assessing the impact of public diplomacy activities over the very long term is elusive; it's been difficult to do.

Katherine Brown: On the other hand, when you talk about outputs, there's no shortage of public diplomacy outputs to boast of. There are more than 90, I think this is still true, 90 educational and cultural affairs programs. There are 450 US embassy websites with millions of followers. Some 700 American Spaces serve as hubs for foreign citizens to gain information about, and interact with, some dimension of the United States. And there are more than 450 expert speakers who are dispatched abroad to engage foreign audiences on various topics about the US.

Katherine Brown: There are more than one million US-sponsored exchange participants and alumni worldwide, and 485 of them, probably more now, have been heads of state. This costs just one billion dollars at State; the nine hundred million dollars I was referring to earlier is BBG. So this is less than 2% of the combined economic development budget for the US, which is a fraction of defense spending. But still, with all these outputs, there's always been a desire to know: okay, what does this all mean, what are the outcomes?

Katherine Brown: I tend to talk about the economic outcomes of this work. It's incredible the amount of money that public diplomacy brings back into the American economy. For instance, the International Visitor Leadership Program contributes more than $50 million to the American economy, and with nearly one million foreign students attending American higher education institutions, they're injecting $30.5 billion into the American economy. But again, there's this question of: what does this really all mean? How does one measure the networks and influence that public diplomacy can create?

Katherine Brown: A lot of this work is faith-based: I think a lot of us who are attracted to public diplomacy inherently believe that this work is essential for the American economy, as we just said, and for our borders to stay secure and our culture to become richer and deeper, and that we need these networks to increase prosperity and security. But regardless of where you fall, whether you think we should be investing in this work or we shouldn't, the thinking behind the commission, behind Anne, Sim, and myself from the beginning, was that there's incredible value in creating systems of research and evaluation, just to understand public diplomacy's impact and to make good decisions about where resources should be allocated.

Katherine Brown: So, in 2014 we did a report called Data-Driven Public Diplomacy, and we worked with various experts: Sean Aday, Amelia Arsenault, Matthew Baum at Harvard, Kathy Fitzpatrick, Craig Hayden, Erik Nisbet, Jay Wang from USC, Nick Cull from USC, and also Dr. Shawn Powers, who was at Georgia State at the time. We identified five areas that we felt needed change, and this was four years ago.

Katherine Brown: First, increased recognition on the part of State Department officials of the importance of research in PD. Second, movement away from risk-averse cultures at State and BBG, which we felt were negatively impacting how research data and evaluations were conceived, conducted, and reported. Third, we felt there should be more consistent strategic approaches to developing and evaluating public diplomacy and international broadcasting activities. Fourth, there should be increased training in planning, including research and evaluation. And finally, there should absolutely be more funding: we felt that at least about 4% of the overall budgets for the various PD bureaus, and for BBG, should be going towards this work.

Katherine Brown: It's been remarkable to see the changes that have been taking place at the State Department and the BBG in the last few years. I don't want the commission to get all the credit, because this work was already being done, and I like to think that we were able to shine some attention on these incredible professionals who were doing this work, and to help push for them to get more resources. Watching their impact within State and BBG grow has been gratifying, and I'm really grateful for their work and their contributions to public diplomacy, which I don't think get recognized enough.

Katherine Brown: So I'm also thrilled to see that there's been follow-on work. I am thrilled to read this report and, Dr. Power, to hear from you for a while on what you've uncovered, the recommendations, and what they are four years later; unfortunately I have to leave early. I am grateful for Shawn's stewardship of this work.

Anne Wedner: Thank you, Katherine, for joining us and fitting us into your schedule. I know I divulged that you're at Facebook right now, so you've had a little bit of a crazy couple of days. We are also utterly delighted to have Dr. Gerry Power join us today. Dr. Power is the Chief of Research at M&C Saatchi World Services. His international research career spans the not-for-profit, academic, and commercial sectors, working in international development, technology, investment, and media and communications. He has led multi-country research teams across Asia, the Middle East and North Africa, Sub-Saharan Africa, Latin America, and the Pacific. Prior to his appointment with M&C Saatchi World Services, Gerry was Director of Research and Learning at the BBC World Service Trust, where he established an award-winning global network of researchers spanning the world. I am delighted I got to talk to him a little before the meeting, and I'm looking forward to hearing his remarks today.

Gerry Power: Thank you, Anne. Yes, my name is Gerry Power, not with an S. I've been denying any association with Shawn since our engagement started about three months ago. I was at the airport at Heathrow on Sunday, and my ticket said James Powers, with an S, and I got stuck for 40 minutes, because that's not my name. So, yes, I do global research inside the evaluation team at M&C Saatchi World Services, headquartered in London.

Gerry Power: M&C Saatchi, for those of you who do not know, is the largest independently owned communication and social marketing company in the world. We've got 31 offices across 26 countries. We work with a range of both multilateral and bilateral agencies, national governments, non-governmental organizations, foundations, and commercial [inaudible 00:14:23]. These clients include the UN agencies, USAID, DFID, [inaudible 00:14:28], and the Bill and Melinda Gates Foundation, just to name a few.

Gerry Power: Core to our work is helping our partners optimize their engagement with their various target audiences. It was in the context of this portfolio of work that Shawn Powers approached me back in late 2017, with the task of conducting a program of research that would answer three questions. The first question was to understand the range of research conducted by governmental and non-governmental organizations to gain insight into how best to engage their prospective audiences.

Gerry Power: Second, to establish the range of evidence and key metrics being gathered to capture the impact of public diplomacy initiatives globally. And third, to examine how the insights and evidence are informing the target audience engagement strategies of key global actors in public diplomacy. We shared the preliminary findings of this work at the Research, Evaluation and Learning Summit earlier this month, but today my comments are informed by four different sources of evidence.

Gerry Power: First, the findings of the research. Second, my observations at the two-day summit. Third, takeaways from the digital diplomacy conference convened by the Dutch Ministry of Foreign Affairs in February. Finally, insights gathered from a range of in-house and external digital data research experts, apart from the people we consulted for the research.

Gerry Power: First of all, some background on the research. We interviewed 28 experts from 17 countries with significant experience of, or insights into, measuring the impact of public diplomacy and cultural relations initiatives. The experts we spoke to fell into four categories: senior representatives from ministries of foreign affairs, principals from cultural relations institutes, research experts at think tanks, and a range of other key actors, including digital data experts.

Gerry Power: In addition, our team reviewed over 40 specialist reports and project documents that were shared with us by the interviewees. My observations for you today fall into four broad categories: first, patterns and trends in the public diplomacy research and evaluation community. Second, how key players in the community are drawing on research to inform their public diplomacy strategies. Third, the broad range of approaches to measuring the impact of public diplomacy activities. Finally, some best practice examples of approaches to organizational learning.

Gerry Power: First, in terms of the community more generally. Unsurprisingly, the community is intrinsically competitive, yet collaboration between certain partners is securing advances in public diplomacy research practice. One example is a collaborative project between the MFA of Sweden and the MFA of the Netherlands, designed to integrate technological, data, and social media expertise with the traditional skill sets of diplomats.

Gerry Power: To date, it has organized 14 events worldwide. These bring together diplomats, entrepreneurs, tech developers, and experts from non-governmental and civil society organizations, and they have helped produce best-in-class public diplomacy campaigns such as the Swedish Midwives for All work, which I'll share a bit of detail on later in my remarks.

Gerry Power: There may be value in exploring the benefits of this kind of diplomat-focused collaboration between ministries of foreign affairs, as a way of building upon the fine work that the TechCamp initiative has already done here at the State Department.

Gerry Power: Secondly, we also learned that actors are delivering a range of programs with diverse objectives and, consequently, different measures of success. These objectives range from the more traditional, such as increasing the likelihood that foreign citizens will visit, come to study, and invest in a country,

Gerry Power: to more nuanced objectives, such as the Goethe-Institut's dedication to understanding optimal conditions for the local sustainability of an intervention's effects, and more targeted objectives, such as the Danish efforts to build knowledge of, and increase their presence within, technology and innovation sectors in key locations around the world.

Gerry Power: Third, however, there was poignant acknowledgement that politically motivated short-term goals have the potential to threaten investments in research and evaluation programs. The Swiss Federal Council's strategy for communication abroad stands out as an integrated program involving multiple partners that is supported by a long-term commitment and investment from the Swiss government.

Gerry Power: This results in a program that provides the optimum combination of robust data gathering on multiple activities aligned with long-term strategic objectives. In addition to a long-term commitment to research and evaluation, it affords the opportunity to monitor and recalibrate ongoing efforts to ensure they're aligned with the long-term vision.

Gerry Power: Fourth, it's important to mention the role of the various global indexes: the Nation Brands Index, the Portland Communications ranking, the Creative Cities Index. These were referenced frequently throughout our interviews, and they're essentially attempts to rank countries in terms of their achievements, their assets, and other countries' perceptions of them. They are part of the arsenal of the public diplomacy community, although many actors expressed reservations about their methodology. They essentially serve as proxy benchmarks of global presence.

Gerry Power: Finally, we observed a broad consensus among the experts we spoke to that the data and evidence used to inform and evaluate public diplomacy programs need greater precision and specificity. In addition, there's a strong appetite for greater standardization and sharing of approaches and methods.

Gerry Power: With regard to good practice in research, three points are worthy of mention. First, countries are targeting an increasingly diverse range of audiences. Traditionally, three audience groups dominated the PD space: foreign general publics, the media, and emerging and established opinion leaders and decision makers.

Gerry Power: Emerging audiences among the people we spoke to include domestic general publics; more inclusive and refined categories of young people, broadly defined from four years of age to 30 years of age, with specific interest shown in, for example, women and girls, those likely to travel, those affected by violence, and young activists; cross-border groups; strategic non-state actors like NGOs and CSOs; and many private sector actors as well.

Gerry Power: The British Council's Results and Evidence Framework exemplifies the increasing diversification of audiences. It identifies 25 audience clusters across its eight key results areas. This structure allows the Council to exercise greater precision in program delivery and evaluation efforts.

Gerry Power: Secondly, our research uncovered significant disharmony in the approaches to, and methods of, gathering audience data. There is little standardization in how PD actors define what is most important to know about their audiences. For example, many players are gathering basic demographic and reach data; however, organizations like the European Commission are developing methods to gather psychographic profiles of their target audiences as well.

Gerry Power: Finally, the capacity of the key actors to look beyond digital vanity metrics is extremely limited. However, the Digital Methods Initiative at the University of Amsterdam, the embryonic collaborative work on the digital index of global influence, and the Diplomacy Live program in Turkey represent examples of cutting-edge and inclusive approaches to understanding influence in the digital space.

Gerry Power: In regard to our focus on metrics and measurement, we uncovered widespread and varied efforts to capture evidence of the relative success of PD activities. However, we believe that effective evaluation of public diplomacy is being hindered by four factors. First, despite the depth of knowledge around the theory and practice of public diplomacy, there is little evidence of theories of change to inform how different metrics constitute key steps in the journey to impact.

Gerry Power: This is especially true of measures of digital reach and engagement. In other words, understanding of what constitutes thresholds of success in online reach and activity is underdeveloped. As a result, actors are relying upon simplistic equations of success with volume and frequency of interactions, without knowing whether, or how, that helps them achieve their goals.

Gerry Power: The second factor is short-term, results-driven agendas that fail to account for the long-term nature of certain influence campaigns and programs. As one of our Australian experts commented, governments want results; economic diplomacy is king, and none of that lends itself to this sort of effort. An exception to this pattern is the Swiss Federal Council, which stands out for its commitment to long-term research and evaluation.

Gerry Power: Third, actors are using different measures to evaluate the same interventions. This is particularly pronounced in the evaluation of exchange and scholarship programs. Whereas the European Commission's evaluation of their public diplomacy program prioritizes employability measures, an evaluation of the UK's programs focuses on perception changes and on financial returns. The evaluation of the Australia Awards program, meanwhile, considers all of the above and more. It includes measures of the reach and quality of outputs, and uses a wide range of indicators for measuring continuing engagement with Australia, as well as the transference of skills and perspectives.

Gerry Power: Finally, with few exceptions, the trend is not to include common strategic measures across different types of activities. Again, the British Council's approach is exemplary. Despite operating in 110 countries, across a wide range of sectors, and with many partners simultaneously, all of its evaluations collect data on the same eight results areas. Those areas in turn are aligned with the Council's five overarching strategic objectives.

Gerry Power: The most challenging objective within our research project was to identify good practice examples where public diplomacy and cultural relations organizations were deploying effective strategies for learning from their research and evaluation work. The few examples we did identify, however, constitute what we believe are robust and integrated case studies of evidence-based organizational learning.

Gerry Power: First, the Swiss Federal Administration has established an inclusive, interdepartmental working group, the IDWG, for communications abroad. This group is made up of a range of governmental department stakeholders, as well as interest representatives from across the business, travel, and culture sectors: SGE, the Swiss organization responsible for the promotion of private sector interests; Switzerland Tourism, the national marketing and sales organization; and the organization responsible for the promotion of Swiss art and culture.

Gerry Power: The IDWG is led by the secretary general of the Federal Department of Foreign Affairs and by the Federal Chancellery, the staff organization of the elected Federal Council. Beyond foreign affairs and the Chancellery, the IDWG comprises representatives from the federal departments of home affairs; of justice and police; of defense; of finance; of economic affairs, education, and research; and of the environment, transport, energy, and communications. Objectives, priorities, and target groups for communications abroad are chosen to reflect the shared interests of the IDWG.

Gerry Power: At the operational level, Presence Switzerland acts as a hub, connecting the different spokes and collaborating bilaterally and multilaterally with different actors as required. This approach is exemplary for three reasons. First, it recognizes the need for the inclusive participation of a variety of stakeholders. Second, it provides effective oversight of stakeholder activities. And third, it creates a platform for ongoing communication between the stakeholders.

Gerry Power: Second, the Goethe-Institut in Germany has embedded within its organization an impact cycle outlining the iterative development of its projects. This impact cycle consists of four stages: planning and implementing impact, recording and analyzing impact, improving impact, and communicating impact to the outside world. Consulting with various stakeholders (local communities, partners, target groups, policy makers, experts) is emphasized at each stage.

Gerry Power: At the improving impact stage, the dialogue is with project managers and staff, partners, and target groups. This approach is exemplary because, first, it encourages the building of networks for communication with diverse stakeholders, whether local, professional, or governmental. Second, it emphasizes the need to work collaboratively with target groups to improve projects. And third, it provides a common understanding within the organization of the steps for improvement.

Gerry Power: The third example of good practice in learning comes from the Swedish Ministry of Foreign Affairs, which has built a digital dashboard for optimizing the delivery and performance of diplomatic missions' digital communications. This dashboard provides an overview of the individual and collective performance of diplomatic missions. It is exemplary because, first, it uses a set of common metrics to rank and benchmark the performance of all actors. Second, it enables quick identification of potential sources of best practice within the organization. Third, it enables the agile delivery of targeted support and solutions to where they are most needed.

Gerry Power: The final example of good practice in learning from the field is the Swedish MFA-led Midwives for All campaign. The development of this campaign followed a set of principles advocating co-creation with both expert and local groups. The core theme of the campaign was sexual and reproductive health and rights for communities living in several of the target countries; however, this constituted a potentially sensitive topic. The focus on midwifery provided a way of talking about these issues indirectly.

Gerry Power: In addition, local Swedish missions were instructed to engage in a range of co-creation activities with local communities, to develop ways of promoting and celebrating midwifery that were culturally sensitive and locally adapted. The result was seven very different local campaigns, which ran in parallel to a central, global, digital campaign. This approach is exemplary because, first, it demonstrates a strong project design in terms of encouraging real-time audience participation in the design of public diplomacy activities. Second, it successfully integrates a global, online campaign with tailored delivery locally.

Gerry Power: Third, the data gathered from the local consultations informed the design and delivery of the global strategy, as well as customizing the delivery of the campaign at the local level. I conclude my remarks with a series of considerations, or recommendations, drawing directly on the primary research, but also on the consultations and meetings attended over the last three months.

Gerry Power: From a strategic perspective, I have four recommendations. First, create a central hub for insights and assessment within the US Department of State, to pool resources and optimize investment in data gathering, analytics, insight, and learning; for example, the way in which Presence Switzerland serves as a hub for the IDWG.

Gerry Power: Second, design a research, evaluation, and learning practice for the US Department of State that is future-proofed, dynamic, nimble, and incorporates multiple feedback loops; for example, the Australian government's monitoring and evaluation framework for the Australia Awards program, which monitors both short- and medium-term outcomes to identify and address barriers to long-term goals.

Gerry Power: Third, design a strategy not only to engage with the private sector and technology companies, but to ally with those that specialize in quantitative and qualitative data optimization. The foundations for this approach are exemplified by the Danish innovation centers.

Gerry Power: Finally, consider a convening role for the US Department of State, to leverage the public diplomacy community's collaborative spirit and appetite for knowledge sharing. This could provide the USG with a long-term competitive advantage in the influence space, by ensuring that it is actively informed by the development of the most effective insight and assessment tools and tactics.

Gerry Power: From an implementation perspective, I also have four recommendations. First, it's imperative to invest in building the capacity of US research teams in technology-enabled and digital research approaches and methods. There are some existing resources that could easily be drawn on, including the Digital Methods Initiative, DG, and Diplomacy Live.

Gerry Power: Second, design platforms that enable real-time audience participation in research and co-creation activities; for example, the Midwives for All program. In addition, a best-in-class example from the private sector is the "customer, or citizen, in the room" methodology, which enables real-time feedback from target audiences on communications and messaging. Building on opportunities to engage directly with audiences, there may be value in establishing research panels within diaspora communities in the US, to monitor perceptions of the US in their home countries and to identify potential routes for engagement.

Gerry Power: Third, recognizing the changing information ecosystem, there is great potential to invest in basic research to inform USG-wide programs aiming to intervene in global and regional issue agendas, commonly referred to as the ideational space. For example, again, the Digital Methods Initiative's cross-platform analysis and issue mapping would provide a basis for this research.

Gerry Power: Finally, with regard to practice, in order to leverage the benefits of data modeling, we advocate a design- and typology-based approach to PD, where PD programs and tools are tailored based on a variety of context-specific criteria. These criteria might include optimal and suboptimal conditions for engagement, general support for USG policies, cultural traditions, historic relations with the USG and its people, access to Western sources of information, and level of English language proficiency.

Gerry Power: Finally, in terms of the data themselves, I have four recommendations. First of all, to build theories of change that formalize digital engagement in processes of influence, in order to interpret digital data in a more meaningful way. There are multiple examples from the private and social sectors that have built digital metrics targets into their campaigns and communications strategies, in order to track issue resonance, brand equity, and message traction.

Gerry Power: Second, create a data audit that maps out a hierarchy of different metrics of success. This will facilitate a common nomenclature across US Department of State research teams regarding output, outcome, and impact measures. Such a hierarchy has already been created by the Goethe Institute and the British Council.

Gerry Power: Third, recognizing the dominance of visual content in popular culture, and in social media in particular, we strongly advise investing in building the capacity of US Department of State research teams in methods that gather visual data, to capture the power of imagery and video content. The practice of analyzing visual data is commonly employed in the private sector to map out the variety of cultural references familiar and favorable to target audiences.

Gerry Power: Finally, and this is finally, all of these data can only be capitalized on when supported by a robust knowledge management system that allows data to be gathered, sorted, and accessed in real time, across platforms. Thank you.

Shawn Powers: Thank you, Gerry, for the wonderful summary of the study. It's really a magnificent piece of work that the Commission is thrilled to be a part of. We're looking forward to publishing it and making it widely accessible in the near future; the exact timing will be determined, probably later today, but I would say no longer than maybe two months from now. So hopefully it will be interesting and relevant to you all. Just a few remarks before I talk about how Gerry's recommendations fit into some top-line Commission recommendations on continuing to improve our research and evaluation efforts.

Shawn Powers: I also want to go a bit farther back than Katherine did, and just mention how important the question of research and impact evaluation has been to the Advisory Commission on Public Diplomacy for a long time. If you go on our website, you'll find reports dating back to the 1950s, some of which we have, by the way, because of a wonderful previous executive director, Bruce Gregory, who's in the room and has been very generous in making sure that we had a strong archive. Those reports back to the 1950s also emphasize this exact question; in fact, our 1957 annual report to Congress concluded with the following statement:

Shawn Powers: “Evaluation of USIA, U.S. Information Agency, work remains uneven. The commission urges that greater emphasis be placed on the task of measuring the effectiveness and impact of a total USIA effort. A beginning has been made, the use of research and public opinion polls are steps in the right direction, but the commission urges that special attention be given to this crucial area.”

Shawn Powers: I mention this because it's not something that's new, and we're not trying to reinvent the wheel, but also because it's important to note how much progress has been made in the 61 years since that report was published. Within the public diplomacy family of the State Department, ECA pioneered the first evaluation office within the public diplomacy cone, and continues to invest resources, including human resources, in pursuing evaluations, with their ongoing partners, of the impact of their programs. The Office of Analytics within IIP is increasingly seen as a leader across the Department, if not across the interagency, on digital metrics and assessing the impact of our programs.

Shawn Powers: Of course, the research and evaluation units housed within the Office of Policy, Planning and Resources, under the Undersecretary for Public Diplomacy and Public Affairs, are a wonderful coordinating team of folks doing independent assessments of all types of public diplomacy programs, and also providing services to regional and functional bureaus. These efforts have come a tremendous way, not just since 2014, but of course dating way back, and I think it's important to note how much progress has been made, especially considering the constraints, both the fiscal constraints and the hiring freeze, that we have been facing in the past 18 months. It's worth mentioning how incredible those efforts have been, and for those of you who were at the summit last month, you got to hear some of the detailed, innovative projects being pursued by those teams. So, I'm thrilled to be able to champion a lot of the work that has been done.

Shawn Powers: With all of that said, of course there's a lot of work that can still be done, and I want to highlight a few things Gerry mentioned. First is to encourage my colleagues at the State Department to consider the opportunity to take leadership in this space. There is a global need for someone to stand up and play a leadership role in convening and directing the agendas for research and evaluation of public diplomacy efforts. Every time I've spoken to Gerry about his study, one of the first things he says is, "Shawn, you don't have any idea how much energy and enthusiasm there is around the world." But there's this question of, why don't we have a way to get together and talk about this? Why is this the first time we're doing a global assessment of the different practices that are going on?

Shawn Powers: What is really exciting, I think, is that across the interviews that Gerry's conducted, there's a lot of excitement in having the Department of State lead that effort. There is a broad impression that we are at the cutting edge of a lot of these research and evaluation techniques, and people want to learn from us. I think there's a lot that we can learn from them. The reason I think this is a great opportunity is that if the State Department can take the lead in this space, we can shape and form the agenda for research and evaluation around the world, which will help improve the public diplomacy efforts of allied governments.

Shawn Powers: The reason that that’s so important is it becomes an impact amplifier of our efforts. Far too often we think about ourselves as having to be everything to everyone in every part of the world. But, if we have a coalition of allied governments that are equally capable in their public diplomacy programs, with shared objectives, like open societies, free markets, democratic forms of governance, freedom of expression, we can lean on those allied governments to affect public policy programs, so we don’t feel like we have to be everywhere at all times. It’s really a unique opportunity that exists right now, that I hope we can take advantage of.

Shawn Powers: As many of you know, just last month the Department of State issued updated Foreign Affairs Manual guidance titled Design, Monitoring, and Evaluation, 18 FAM 301.4, for those who are keen on looking it up. Interestingly enough, it was published the day of our summit last month, so the timing is quite coincidental. The updated guidance includes some detailed expectations and requirements for each bureau in terms of research and evaluation moving forward, and it coincides not only with our summit, but with continued congressional interest, and interest from OMB, in improving our capacity to demonstrate that public diplomacy programs have an impact and advance our national interest.

Shawn Powers: So, the following recommendations, which will be published along with Gerry's report in the coming months, are an effort to guide how public diplomacy bureaus and leadership think about the new FAM language and the opportunity it creates for us to take research and evaluation even more seriously.

Shawn Powers: I’ll just mention four big ideas, top line ideas. And again, you’ll see some synergies between what Gerry’s mentioned, and what the Commission is proposing. First: it’s time for us to consolidate all research and evaluation efforts from across the public diplomacy family, into a single, coordinated, centralized office that is able to serve all of those functions, but also be able to more effectively coordinate across the department, and across the interagency.

Shawn Powers: This consolidated effort would exist within the Office of Policy, Planning and Resources, and report through the head of that unit to the Undersecretary, to make sure the insights gained from research and evaluations can directly shape how programs are designed and how funding decisions are made in almost real time. This consolidated office would lead and coordinate all research and assessment of Department of State public diplomacy programs and campaigns, including overseeing the establishment and testing of core metrics to assess program effectiveness vis-à-vis our foreign policy goals.

Shawn Powers: I should also mention that this recommendation goes along the lines of language already proposed in the State Department's authorization bill from the Senate Foreign Relations Committee in 2017. So it has support in Congress as well; it's a recommendation the Commission has been working on for a long time. This office would also coordinate research conducted and gathered throughout the U.S. government, including by the Broadcasting Board of Governors, INR, the Department of Defense, and the Bureau of Conflict and Stabilization Operations, which is doing some really interesting work in this space, as well as research available from private sector actors.

Shawn Powers: Eventually, I want this office to do more than just play that coordinating role. I'd like to see it create a platform for real-time sharing and presentation of data, potentially based on existing models for tracking social media accounts in real time, but also tracking all data related to public diplomacy efforts, so that if you're a public affairs officer in Zimbabwe, you can open up this database in real time and get a sense for as much information as we know about local audiences as possible, and consider that as you're putting together your program.

Shawn Powers: This office would also establish, publicize, and manage a clear, simple digital interface for public diplomacy professionals at post to request assistance with a wide range of research and assessment efforts. One of the important takeaways from the summit was that there's an incredible amount of talent and work being done, but if you're coming in from Nicaragua, for example, you have no clue who you should actually reach out to, to get the support you need. That is troubling given all the efforts that we're making, but it also makes sense: a lot of us are focused on getting the job right, and not thinking about how we communicate the value of that work and the opportunities it creates for folks throughout the State Department.

Shawn Powers: This office would also prioritize research on the value of particular metrics, in order to better understand which metrics matter, and why. One metric of importance to me in particular is the favorability of the United States. This is a metric that the Department of State has used, the DoD has used, Gallup has used, and Pew has used: studies that ask people, do you support the United States, or do you support the leadership of the United States, or do you hold a favorable opinion of the United States?

Shawn Powers: The interesting thing about this is that we presume it is of relevance to public diplomacy, but we actually don't know if it has any tangible impact on the specific public diplomacy goals that we're trying to achieve. Does favorability towards the United States mean more people will study in the United States? We don't know. Does favorability mean that more people will spend money on tourism to the United States? We don't know. So, there is some really important basic research that needs to be done to demonstrate the importance of particular metrics, so we can get a lot more focused and strategic in thinking through what we're trying to achieve, and the strategic significance of those programs.

Shawn Powers: This office should also prioritize establishing more flexible contracting mechanisms, perhaps modeled on what we've seen at the Office of Transition Initiatives within USAID, to allow for more nimble, flexible, and tailored partnerships with the private sector. We need to be able to choose implementing partners for our research projects that we trust, and that can verify the integrity of the data that's collected. Currently there are far too many constraints on that process.

Shawn Powers: This office should also embrace more technology-based approaches to data collection, including mobile-friendly applications, and allow for the casual accumulation of data. We heard a couple of really interesting examples at the summit that I wanted to highlight. First is the quick-tap survey, pioneered by a group called the IMPL Project, which allows for mobile collection of data even if there's no connectivity, in order to gather information in conflict zones or recently destabilized countries where it's difficult to have a strong wifi signal or a 4G connection. Another good example is an application developed by USAID and then implemented by the International Republican Institute, called Baldytak, a mobile surveying application that's very easy to use, and very easy to teach people how to use. It allows folks to gather tremendous amounts of information more casually. These are not formal surveys, so there are some drawbacks in terms of how they represent an entire population. However, there are tremendous insights that can be gained in near real time using some of these tools.

Shawn Powers: We also encourage this office to embrace both assessments that provide insights and assessments that demonstrate impact. One concern we have is that in the current effort to demonstrate impact, including the impact assessments called for in the recent Foreign Affairs Manual guidance, there's a lack of focus on the insights that can be learned from the process. We have to avoid digging for impacts if it comes at the expense of learning how programs can be more effective. So, I think a centralized research and evaluation hub can provide important leadership in making sure we're focused both on the impact question and the insights question.

Shawn Powers: The second top-line recommendation is for this new consolidated group to prioritize the creation of a strategic framework, which would guide the structure and decision-making process of Department of State PD research and evaluation operations. The framework could be modeled in part after USAID's program cycle and adaptive management frameworks, but it needs to be built from the ground up in order to fit the needs of public diplomacy professionals.

Shawn Powers: This strategic framework should, first, be organic and inclusive. It should be crafted through a process that includes a variety of stakeholders from across the PD bureaus, to ensure that the development of the framework represents the broad interests of public diplomacy professionals, both here in Washington, DC, and at post. It shouldn't just be folks who are experts on research and evaluation, but also the folks who are implementing these programs. We want a framework that makes sense not just to us, but to everyone who's involved in public diplomacy operations from start to finish.

Shawn Powers: The second piece is that the strategic framework has to be clear and concise. It cannot be confusing, overly complicated, dense, or difficult to understand. The framework should succinctly detail how research, monitoring and evaluation, and learning form a synchronous process that is valuable to a variety of stakeholders throughout the State Department. Getting that piece right, making it clear and concise, is always the most challenging part, but I think we can do it.

Shawn Powers: The third thing about the strategic framework is that it should prioritize adaptive management techniques; that is to say, it should prioritize gathering information about programs in real time and using that information to make our programs more effective. We have to have a feedback loop, as Gerry said, to ensure that the data we gather can be actionable in near real time. The framework should also prioritize the institutionalization of learning from every single evaluation. We think that if an evaluation does not directly connect to some type of learning mechanism, if there's not a clear sense of how we can do something better as a result of a recommendation, it simply should not be done. Period.

Shawn Powers: The framework should prioritize transparency. One of the very admirable things about USAID's research and evaluation policy is that every evaluation conducted is made public, and I think that is an incredible goal for the Department of State to pursue as well. There are going to be exceptions for security and classified-information reasons, but if that is the goal, I think it's an important one to state up front and work towards.

Shawn Powers: The framework should also prioritize the integration of best practices from academics and the private sector, and prioritize identifying which measurable indicators can be used to assess the long-term impacts of programs. Far too often, these conversations about the evaluation of public diplomacy devolve into: public diplomacy has long-term consequences, and we can't measure the long term, because we operate on 12-month budget cycles in the best of times.

Shawn Powers: What we can do, instead of having that conversation again, is focus on the short-term metrics we can measure that predict long-term impacts. Gerry has presented some interesting models from the Swiss and the British in particular that we can borrow from to try to sidestep the long-overdue resolution of a debate that I think has not been helpful for public diplomacy professionals.

Shawn Powers: The framework should also consider models from the Swiss and the British, as well as others, and it needs to be self-reflective. That is to say, the new consolidated research effort, and its strategic framework, need to have an assessment done as well. The goals of the new framework should be identified very clearly, so that it has clear metrics it can measure its success against, and then be evaluated against, over the course of a series of cycles, perhaps three or four years per evaluation effort.

Shawn Powers: The third top-line recommendation is the need to consolidate the existing knowledge management systems, that is to say the MAT, PD-RAM, and PDIP, into a single, new, user- and mobile-friendly knowledge management system that is streamlined, adaptable, and reduces the burden at post. I'm thrilled to hear that we are already starting this process, but we do have a couple of guidelines that we'd like folks to consider as they think about how to build the system to be productive moving forward.

Shawn Powers: The first is that it needs to integrate all efforts at monitoring and evaluating programs into a single template that is digitally accessible and truly mobile, which is to say any iPhone or Android phone can actually contribute to this system and use it as well. It should integrate machine learning and specific algorithmic intelligence, in order to draw from existing evaluations and research when planning and designing PD programs.

Shawn Powers: That piece needs to be increasingly automated in ways that make it easier for public affairs officers to access the system and make decisions quickly. The new knowledge management system should emphasize learning from all programs, allow for a range of data entry points and types of data, and allow data analysis to inform program design in a multi-faceted way. It needs to make data entry easy.

Shawn Powers: One example is a platform that would allow for the option of snapping a photo of a group of IVLP alumni, for example, and using simple image analysis to automatically note the number of participants in the photo. Just imagine: rather than having to enter 25 names after an event, you could snap one photo, which you've probably already snapped anyway, and use it to reduce the burden of monitoring and evaluation.

Shawn Powers: Another example is an application, the same application of course, that has a simple record-now button, which allows you to easily record an interview with an alum of a program, or a participant in an ongoing program, so you can get real-time feedback as to how it's going. The recording would automatically be added to the system, so it's not just helpful for you and your colleagues, but also helps the folks back in Washington, DC.

Shawn Powers: These are just some ideas, but I want to give you a sense of the kind of system we are hoping to see. The fourth recommendation, coming from the summit in particular and from the larger body of work the Commission has invested in this space, is that we are going to officially launch the ACPD Research and Evaluation Subcommittee, in an effort to build on the momentum created in the past three months and make sure we are offering as much support as possible to research and evaluation professionals at the Department of State. The subcommittee's objective would be to provide feedback early in the research process, and to ensure that methodologies are rigorous and the findings that result are rich.

Shawn Powers: The subcommittee would also help establish a set of achievable goals for this research, emphasize the employment of new research methods, and serve as a validity check to ensure that the quality of the research itself is obvious and clear. That makes the research that results from this effort more credible to actors outside the Department of State.

Shawn Powers: The subcommittee will review select State Department and BBG research agendas, methodologies, and interpretations at least twice a year, perhaps more if funding is available. It would be comprised of select academics, market researchers, and research professionals from private organizations who are deeply interested in these questions and understand the constraints my Department of State colleagues operate under. So, not just folks who know how to do research in a perfect sense, or a perfect environment, but people who understand that doing research when you have resource and time constraints, and difficult post contexts, creates a different set of challenges. Folks who get that can be quite helpful.

Shawn Powers: This subcommittee will also assess the overall state of Department of State public diplomacy research and evaluation efforts every three years, a state-of-the-research report, in order to make sure that this consolidated office and the strategic framework that's been established are being productive and helpful, and that the resulting efforts are being measured against very discrete, observable metrics. So, it will provide an additional check on the new consolidated effort to make sure it is moving in the right direction.

Shawn Powers: As I mentioned, the final details of these recommendations are being fleshed out, but I wanted to give you a sense of the top-line recommendations that are going to come out of the reports. At this point, I'd really like to open things up for a question-and-answer session with the commission members and, of course, the audience. We've got about 18 minutes left, so I'm hoping the presentations have sparked some questions for you all. Thank you.

Sim Farar: I have a question for Dr. Power. Of the 17 countries that you have been looking at, which one do you think has been the most innovative and helpful in this space, and what can the State Department learn the most from that country, of the 17?

Gerry Power: I think it’s important to recognize that the examples we cited, are doing different things, really, really well. For example, I think the organizational structure of the Swiss example is more akin to the challenge for the state department, because it’s operating with multiple departments, and sub units, and sub groups. I think the Swiss are really, really strong there. The British Council is really strong, although it’s one cultural institution, in terms of the complexity, and the integration, and consistency, of its results and evidence framework. Then I think thirdly, the Germans particularly with the work at the Goethe Institute, has been powerful in terms of building in the stakeholders, and target audiences into their project design. I think those three examples, because of what they’re doing in very, very, unique ways. There’s a lot to be cherry picked from different countries, rather than one standing out as being very good at everything.

Anne Wedner: I think, you guys, this work is incredible. Gerry, I really appreciate your presentation. Yours too, Shawn. Just to step back for a second, what we're saying in essence is that we are contextualizing ourselves at this point and saying, it's a very competitive marketplace of ideas, and some of our allies are doing it well, but you guys didn't even mention what some of our ideological competitors are doing, like Russia and China. It seems like some of the Russia stuff is really sophisticated, because it would seem to be sort of effective, but on the other hand maybe it's not, and maybe it was just luck. So, I'm just curious: would we consider thinking about ideological competitors' behavior in the context of evaluating where we think they're at on this?

Gerry Power: Just a footnote response on that: we spoke to two Russian experts, both representing different Russian NGOs. We spoke to two Chinese experts, one an academic and one a broadcaster. Then we spoke, and I'm putting this fifth one in the same category, to a Brazilian representative of the MFA. Because it was only five interviews, and because of the nature of the questions, we weren't able to probe, or did not probe, beyond the set questions we had for everybody else. But I got a very distinct sense from those interviews that they are thinking about their challenge in the world very, very differently. Their approaches, and the party line that they're operating with, are not the paradigm that's more familiar to you and to our European friends, and the Australians and Canadians. It's much more global. It's much more about relationships rather than influence.

Gerry Power: Really interestingly, with the two Russian interviews, they didn't know each other and didn't know we were interviewing the other, and when we asked them, what is your optimal measure of success in your public diplomacy efforts, they both said: keeping people at the table. Maintaining the dialogue. I thought that was really interesting, because no one else had said that. It hints at different ways of thinking about the nature of the challenge. Whether it's right or wrong, good or bad, is less relevant. I just think there's a body of thinking out there that's not aligned with the traditional culture of public diplomacy. I also believe very strongly that there's a lot to be gained from engaging actively with the allies, and I also think there's a lot to be learned from engaging in a different capacity with the ideological folks. So, I would strongly encourage that, for different reasons. That was a long footnote, sorry.

Shawn Powers: Do we have questions in the audience? If you wouldn’t mind introducing yourself, and your organization, that’d be helpful.

Daniel Valley: Daniel Valley, with Global Ties US. Shawn, I'm very struck by the multiple mentions of USAID. I've spent a fair percentage of my career within the AID program, and one of the things I've seen over the last 20, 25 years inside USAID is a cultural shift. Everybody in AID that I knew when I first got started as a non-profit worker was former Peace Corps, going-to-save-the-world humanitarian, and then monitoring and evaluation came to town. It shifted the culture over time. I'm thinking about how it's one thing to articulate very clearly, this needs to happen, this needs to happen, this needs to happen. But the cultural change that has to take place inside State, and inside any organization that has to go from what it started out to do as a mission to an evaluative state where it can actually prove it, is a fairly big shift. I'm wondering how you create incentives within the structure of the organization itself, so that you have not just one or two champions, but eventually grow an entire generation of people who see this as amplification from within, so to speak.

Shawn Powers: No, it’s a great comment. There’s a lot to be learned from USAID, both in terms of good practices, but also in terms of mistakes to avoid. I think in a lot of ways, I talked to a number of USAID folks that work in the monitoring and evaluation office, and there’s a general sense that it’s too systematic. It’s too rigorous, and doesn’t allow for flexibility that is oftentimes needed, in particular in the field, or particularly maybe with pilot programs, just in different contexts, and it has not been innovative enough. I think we can benefit from the fact that USAID did a lot of work in this space already, and we can take away what’s worked best, and what we should try to avoid.

Shawn Powers: Fortunately, I think the cultural shift you've mentioned is already starting to take place. I don't want to overstate the appreciation for research and evaluation at the Department of State, but even in my year and a half at the State Department, I've seen a change in the ways people think about integrating research and evaluation into their processes, their programs, and their bureaus. I think there's still considerable work to be done, but what I'm really encouraged by are folks who are willing to make key financial decisions based on a demonstration of a rigorous monitoring and evaluation scheme. The Bureau of East Asian and Pacific Affairs, the EAP bureau, is increasingly funding all of its public diplomacy programs around this idea that nothing gets funded unless there is a clear demonstration of a monitoring and evaluation framework.

Shawn Powers: Hopefully, if that pilot goes right, there's a lot to be learned across the State Department, but it is certainly a meeting-by-meeting, coffee-by-coffee effort to make sure that people can increasingly appreciate this. For me, the crucial component is talking about the value of this not as helping Congress understand why this is a good return on their investment, but instead as helping practitioners who run public diplomacy programs at post understand how they can be more effective by using these tools, and how they can make sure their programs actually have a greater impact. That's where you see a culture shift happening.

Brian Gibel: Hi, Brian Gibel, House Foreign Affairs Committee. Thank you both for your presentations, and thank you to the Commission for all your good work, especially the job that I guess maybe started with Dr. Brown, but that you've certainly continued and taken much farther, Shawn. That's really helpful, so we can look and see how much we're spending on every program around the world, which is a great way for us to start to see what's being effective and what's not.

Brian Gibel: Getting back to that cultural change, I’m also a Foreign Service officer at State, and I’ll return there next summer, and I’ve seen this cultural shift already taking place as a PD-cone officer. One thing I’ve been a little frustrated by, and maybe this has already started to change over the past year while I’ve been working in the House, is this overreliance on data from social media. Having spent a lot of time in China, where a lot of our social media tools are not available to us, I’ve been very frustrated by the overemphasis on digital benchmarks at the expense of tested, verifiable metrics.

Brian Gibel: The response I got back many times was, well, that’s one country. But it’s a country that many recent administrations have called one of the most important relationships in the world. 1.3 billion people. They’re all over Africa, Latin America, Europe, and they have their own very effective, weaponized public diplomacy tool kit. It’s obviously much more important than just one country.

Brian Gibel: Now, I think maybe we’ve done a lot more work since then, but getting back to that idea of overreliance on social media: are we being too reliant on social media, because we can take a look at these hits and decide, oh, this is how many people have seen this, and whatnot? Or how do we work more effectively to gather the data that we need in countries like China, where it’s harder for us to get those numbers? Thank you.

Shawn Powers: I don’t want to answer for Gerry if he wants to chime in, but we’ve been very critical of what we call vanity metrics. Vanity digital metrics, which are off-the-shelf, remarkably easy data points to get with the right software and tools, and of course remarkably appealing to your principal. Part of the challenge is that any number of the folks who make decisions don’t have the time to get deep into the evaluation process, and if you can give them some percentages and some basic numbers, it’s really helpful.

Shawn Powers: We’ve been pushing back on that for a long time now, including in the 2017 commission report, Can Public Diplomacy Survive the Internet?, which called into question the validity of the vast majority of the metrics that are provided by commercial operators. One of the things I’ve been really happy to see over the course of the last 18 months is people using those metrics less as proof to their principal about the success of a tweet or a Facebook post, and instead using those metrics to inform how we think about the next post or the next tweet.

Shawn Powers: So, it’s less about demonstrating that we have an audience size of x number of people, and more about, oh, these videos worked remarkably better compared to those videos. What can we discern from these videos and their successes, versus those failures, to think about how we can make our content more strategically aligned with the audience that we’re trying to reach? I think that’s where the metrics can actually still be quite helpful, as long as we’re always skeptical of the overall quote-unquote impact that a number of retweets can actually show. Luke, I don’t know if you want to chime in on this. We’ve got the head of the Office of Analytics at IIP here, who focuses quite a bit on this question, so if you’ve got anything you want to add, that may be helpful.

Luke Peterson: Sure. Yeah, so speaking specifically about China, we’ve done a lot of work in the last year, and I think one of the more important ways that we look at social is in the audience research realm. By looking comprehensively at what people are saying on different issues, or what kinds of things we’re seeing in aggregate on WeChat, which is another important platform, we can inform the kind of programming we’re running.

Luke Peterson: Maybe the size of your social following isn’t as important as knowing what the topic of the day might be, and therefore how Beijing might want to insert itself into the conversation. I think your broader point is that we’ve looked at China as a post for a long time; now we look at it as one-fifth of the earth, and we’re appropriately organizing ourselves to solve those problems.

Luke Peterson: I have a question also, if that’s okay. I heard a lot about programs and campaigns, and how our peers are able to measure those efforts. One thing I didn’t hear a lot about is press. A lot of PD folks in the field think of themselves as press, comms, media folks. So, in trying to figure out how to apply measurement frameworks to those kinds of programs, I’d be really interested in hearing who’s doing that well.

Gerry Power: It’s a great question, because interestingly, and surprisingly, there was little reference to media in any of our discussions. I was surprised by that. I was expecting a much greater awareness and sensitivity around media coverage and media monitoring, but that wasn’t the case. That may be because of the profile of the people we spoke to: they tended to be research and evaluation people and digital experts rather than press officers, so I might get a very different sort of response from a different profile. But it certainly didn’t come up naturally.

William Ogborn: William Ogborn, with Motif, one quick question. How much of this is going to integrate research metrics from the Department of Defense into the State Department? They had a lot of integration in Afghanistan and Iraq with the psy-ops community, and of course they’re doing more gray and black hat work, but a lot of it crossed over. Of course, there were also sometimes divergent goals, because they weren’t talking to each other, and some of those needed to be ironed out as well. Thank you.

Shawn Powers: No, this is a really important point, William, so thank you for raising it. All too often, government agencies don’t talk to each other about the research they have, let alone the strategies that research is informing. That’s a broader issue that we’d love to see addressed, but we have to take it one step at a time. One of the specific recommendations we’re making, though, is that the consolidated research office establish a specific person whose job it is to coordinate research, not just within the State Department but also across the interagency: someone whose job it is to be consistently aware of all the different assets that are available, and able to make those assets available to folks at post as quickly as possible.

Shawn Powers: There are clearance issues, and levels of classification are remarkably complex, but the goal is to get someone who can do this in an analog, real-person kind of way early on, and eventually to build a system that allows real-time sharing of these different data sets as much as possible, so that we don’t even have to have a coordinator emailing, hey, take a look at this PDF, or go to this website, but actually a platform that allows people to do that on their own, with the coordinator then just making sure that the data going into that platform is good and reliable. So, that’s a medium-term recommendation.

Shawn Powers: We’re about out of time, and I want to respect the fact that we only reserved the room till noon, which we’re very grateful for. I’d like to invite Anne Wedner to offer some concluding remarks.

Anne Wedner: Alright, well. It worked out, I guess we hit our deadline here. I just want to thank everyone on behalf of the commission today. I think this was a really interesting beginning of a conversation. I am so excited to see the reports that are coming out, and to really be able to study them and figure out what the eight results and the five strategic research objectives are for the British. I hope you disclose that in the report, along with all the other lists of numbers that you alluded to, Gerry. So, from our perspective, we’re really trying to take up and push forward some of the ideas that Shawn is working through the organization, and hopefully that will all become much clearer, and a process that everyone in this room will be involved in adopting and adapting as we go forward.

Anne Wedner: So, one last note: our next meeting will be on May 8th, and we like holding these meetings as a way to have these discussions. I think it’s one of the only venues where you can really try to think through ideas on public diplomacy. On May 8th we’re going to talk about the future of American Spaces, and with that, their value in furthering American interests and values around the world. I think that will be a really interesting discussion as well, and I thank you guys for coming. Please come back and see us in May.

Sim Farar: Thank you very much.
