SAQMMA15F3863

Executive Summary of the Comprehensive Evaluation of the Bureau of Counterterrorism (CT) Antiterrorism Assistance (ATA) Program

Submitted to:
U.S. Department of State
Bureau of Counterterrorism (CT)
Bureau of Diplomatic Security (DS) Office of Antiterrorism Assistance

Submitted by:
DevTech Systems, Inc.
1700 N. Moore Street, Suite 1720
Arlington, VA 22209
Tel: 703-312-6038
Email: cmcali@devtechsys.com

July 26, 2016

TABLE OF CONTENTS

ACRONYMS
1.0 EXECUTIVE SUMMARY
2.0 ATA PROGRAM BACKGROUND
3.0 EVALUATION METHODS & LIMITATIONS
4.0 EVALUATION ACTIVITIES
5.0 EVALUATION QUESTIONS, FINDINGS, AND CONCLUSIONS
6.0 CONCLUSIONS AND RECOMMENDATIONS

ACRONYMS

ARE — Assessment, Review, & Evaluation
AARs — After Action Reviews
ATA — Antiterrorism Assistance Program
CAP — Country Assistance Plans
CBRN — Chemical, Biological, Radiological, and Nuclear
CIP — Country Implementation Plan
COR — Contracting Officer’s Representative
CRT — Crisis Response Team
CT — Bureau of Counterterrorism
DGD — Directed Group Discussion
DoD — U.S. Department of Defense
DoJ — U.S. Department of Justice
DoS — U.S. Department of State
DS — Diplomatic Security
DS/T/ATA — Office of Antiterrorism Assistance
ECRs — End-of-Course Reports
FBI — Federal Bureau of Investigation
FDG — Focus Group Discussions
GIPN — National Police Intervention Group
ICITAP — International Criminal Investigative Training Assistance Program
INL — Bureau for International Narcotics and Law Enforcement Affairs
IT — Information Technology
JIPTC — Jordanian International Police Training Center
KII — Key Informant Interviews
M&E — Monitoring and Evaluation
MSP — Mission Security Plan
NDAA — National Defense Authorization Act
OIG — Office of Inspector General
PM — Program Manager
PN — Partner Nation
POC — Point of Contact
PSD — Public Security Directorate
RSO — Regional Security Officer
SME — Subject Matter Expert
SOW — Statement of Work
TOC — Theory of Change
USG — United States Government

1.0 Executive Summary

The purpose of the ATA Evaluation was to examine the ATA program in three distinct phases:

  • Phase 1: Theory of Change: An examination of the ATA program’s role in addressing policy priorities, its underlying assumptions, and its theory of change, as well as the overarching strategic objectives that guide ATA’s planning of program activities and targeted focus areas;
  • Phase 2: Results and Impact: An evaluation of ATA’s programmatic results and impacts, which seeks to identify and understand the results of ATA programming;
  • Phase 3: Cost-Effectiveness: An analysis of the cost-effectiveness and administrative efficiency of the ATA program.

Phase I examined the ATA program across the security assistance sector as well as the effectiveness of ATA programming in serving larger policy objectives. The evaluation team assessed underlying assumptions, the ATA theory of change, and the larger objectives that direct ATA focus areas and planning across various regions and partner countries. Evaluators examined how the ATA program conducts assessments and the methodology for ATA’s development of country implementation plans. The evaluation team worked to identify areas of opportunity to leverage ATA’s strengths and, conversely, sought to find areas of weakness for improvement.

Phase II evaluated the results and impact of the ATA program to ascertain the impacts of ATA interventions, attempting to determine attribution, with the end result of providing CT and DS with recommendations to improve the design of ATA programs and policies. During Phase II, the evaluation team traveled to the three case study countries – Jordan, Indonesia, and Niger – conducting field work and data collection activities.

Phase III examined the management and administration of the ATA program, measuring cost-effectiveness and program efficiency across all aspects of the ATA program. Evaluators worked with DS and CT staff to analyze program operations and determine the most and least cost-effective program activities.

The DevTech evaluation team was tasked with using three partner nations (Jordan, Niger, and Indonesia) as case studies for all three phases. The evaluation’s case study model allowed evaluators to understand how differing budgets and programmatic efforts affected effectiveness across different beneficiary nations. The three phases were conducted sequentially, with DevTech evaluators traveling to each of the three partner nations to observe ATA outputs and activities in different contexts.

2.0 ATA Program Background

A strategic partnership between the Bureau of Counterterrorism (CT) and the Bureau of Diplomatic Security (DS), the ATA Program serves as a primary provider of U.S. government antiterrorism training and equipment to 53 active partner nations, building capacity to investigate, detect, deter, and disrupt terrorist activities while bolstering foreign civilian law enforcement counterterrorism skills. Through a blend of training, equipping, mentoring, advising, and consulting with partner nations, ATA has delivered services to more than 100,000 law enforcement personnel from 154 countries and offers antiterrorism courses divided into 11 main disciplines, including: Police Operations and Law Enforcement Management; Maritime & Police Training; Protection of National Leaders; Police Tactical Training and Infrastructure Security; Chemical, Biological, Radiological, and Nuclear (CBRN) & Mass Casualty Training; Senior Investigative & Crisis Management; Human Rights, Trends, and Cyber; Explosives; and Homeland Security.

Created in 1983 through a Congressional amendment of Part II of the Foreign Assistance Act (FAA) of 1961, the ATA program was designed to assist U.S. partner nations in developing the capabilities needed to detect, deter, and investigate terrorism. The ATA program’s strategic guidance, policy formation, and oversight are managed by CT, while program administration and implementation of antiterrorism training are the responsibility of DS. Both CT and DS work with regional bureaus and overseas posts to ensure ATA activities are targeted toward key focus areas that address multiple converging issues relevant to partner nations, including the threat of terrorism, individual country-level operational needs, and the advancement of U.S. national security interests.

3.0 Evaluation Methods and Limitations

With the exception of financial management records, budget documents, and selected outputs from the Assessments, Reviews, and Evaluation (ARE) division, the evaluation used qualitative sources of information. A major contributor to this approach was the absence of a performance management system that tracks ATA activities and achievements. However, the qualitative data collected by the evaluation team provided a rich contextual understanding of the motivations and circumstances that guided the ATA leadership’s decision-making processes over the evaluation period.

A logistical adjustment to interchange Phases II and III in the evaluation timeline was needed because the country teams were not prepared to work with the evaluation team on the originally planned schedule. The change did not affect the evaluation outcomes in any way.

An additional limitation, which resulted in a modification to the scope of work, was evaluator access to financial data and documentation: the original scope of work required Diplomatic Security to produce financial documentation and data from a period beyond that covered by Diplomatic Security’s current financial reporting system.

4.0 Evaluation Activities

Phases I and II Evaluation Activities

The evaluation methods applied in Phases I and II were largely qualitative approaches, which included the following:

  • Document reviews of Country Implementation Plans (CIPs), country assessments, external reports and assessments, End-of-Course Reports (ECRs), After Action Reviews (AARs), offer cables, and other program documents to understand the strategic approach and management processes of the program. Where available, quantitative data such as in-country training statistics and budget information were also analyzed.
  • Individual and group interviews with various ATA stakeholders. The DevTech team interviewed a total of 48 key informants. In Washington, DC, the team met with the ATA program management team in the CT bureau to gain an understanding of the processes and motivations that guide the program’s strategic planning process. The team also met with senior management of the DS bureau to obtain an overview of the management and administration approaches of the ATA program. Also within the DS bureau, the team conducted interviews with various staff from TMD, TCD, TDD, and ARE to understand the processes of providing the various ATA training programs, from CC assessments and curriculum design to training delivery.
  • Following the Washington-based interviews, the evaluation team visited the case study countries of Indonesia, Jordan, and Niger to collect field-based information on ATA’s training activities. At the country program level, interviews were held with the country teams including the resident program managers (RPMs), deputy RPMs, and various FSN staff to understand the training and general day-to-day operations of the teams. The evaluation team also met with leaders, key staff, and past trainees of the national police units that received ATA training in the three countries. The meetings aimed to gather perceptions on: 1) how ATA training has built antiterrorism capacities in the national police units, 2) the specific skills gained from the program and its application after training, and 3) expectations of greater collaboration with the ATA program in the future.
  • Site visits. The evaluation team visited training facilities in the US and the case study countries to assess the venues, equipment, and other resources used in conducting the hard and soft skills courses. In the country visits, the evaluation team also toured the various national police facilities that have received ATA training and equipment. Using site observation records, the team collected qualitative information on current conditions, maintenance and daily operations, and external factors (e.g. availability of power, infrastructure conditions) to assess the management and sustainability of ATA assistance at the PN level. The complete list of PN facilities visited is provided in the Phase II report.

Phase III Evaluation Activities

Phase III evaluation activities and tasks included:

  • Interviews with CT and DS staff, including senior officials responsible for financial management, budget oversight and reporting, training curriculum design and delivery.
  • Reviews of planning, funding, and reporting documents concerning ATA, including relevant reports from State’s Office of Inspector General (OIG) on the management and implementation of ATA; DS/T/ATA budget information for fiscal years 2008 to 2015; a memorandum of agreement (MOA) delineating CT’s and DS/T/ATA’s roles and responsibilities for ATA; relevant sections of State’s Foreign Affairs Manual summarizing roles and responsibilities for ATA and U.S. embassies in the countries of focus regarding the ATA program implementation; and DS/T/ATA’s Country Assistance Plans (CAPs) and Country Implementation Plans (CIPs) for fiscal years 2009 to 2015.
  • Reviews of State strategic planning and performance reporting documents related to ATA for fiscal years 2008 to 2015, including State budget justifications; State Performance Plans; State Performance Summaries; Bureau Performance Plans; and Mission Performance Plans for Niger, Indonesia and Jordan.

5.0 Evaluation Questions, Findings, and Conclusions

1. Phase I: Theory of Change: A review of the ATA program’s role in serving policy priorities, and the underlying assumptions, theory of change, and overarching strategic objectives that guide ATA’s planning of program activities and focus areas.

Evaluation Questions

  • In what ways could the ATA country strategies and Country Assistance Plans be better designed and executed to improve success? How well does the CT Bureau provide strategic and policy guidance to shape the focus areas of the ATA program? What methodology or methodologies have been employed in the research and development of U.S. long-term counterterrorism goals for each country?

Findings:

There should be a clearer chain of logic from regional goals to country-specific goals through to the programmatic goals and the activities proposed to achieve those nested goals. What is missing is a sense of “how” these training courses support programmatic goals that feed into regional goals. Assumptions need to be made more explicit, timelines need to be more specific, and milestones need to be set along that timeline. Importantly, Performance Targets and Performance Indicators need to be made measurable. Indicators of Potential Impacts must be developed and tied to specific instances of assessment and evaluation (e.g., capstone exercises, mentor/SME-centric assessments). These practices could eventually mature into a full-scale monitoring and evaluation framework incorporated as an intrinsic part of the CIP.

CT’s policy guidance is presently too broad given its limited resources and toolkit. CT needs to evaluate and prioritize what goals are fundamental to success (which needs to be specifically defined from country to country) and then refine and correct that analysis based on resources – including the political will of the partner nation – and the tools (courses) available to achieve them.

There are several assessments that occur within the ATA Program, each with a specific focus. The Program Review assessment helps shape and re-evaluate the resources and focus of specific ATA country programs.

DS/T/ATA/ARE has developed a model called “Building a Capacity: From Launch to Sustainment,” which is a sound starting point for the type of methodological approach necessary to develop, implement, monitor, and evaluate ATA efforts. DS/T/ATA/ARE also conducts technical assessments and opportunity analyses, both of which should be integrated to a greater degree into decision making and planning.

Notwithstanding the number and quality of these assessments, there is no overarching methodology that captures the results of these assessments and integrates them in a standardized fashion into the planning, implementation, and justification of programs.

Evaluation Questions

  • How effectively have U.S. long-term counterterrorism goals been translated into operational-level programmatic objectives for each country, especially with regard to realistic expectations and being proportionate to authorized funding levels?

Findings:

There is a great deal of information collected; however, it has not been collated in a manner that would lend itself to the type of critical analysis necessary to effectively demonstrate how long-term policy goals are being met. A 2012 OIG inspection of the ATA program indicated, on pages six through eight, that despite the existence of a policy framework, the type of policy guidance necessary to formulate sharp programmatic objectives was still obscured by poor working lines of communication and cooperation between CT and DS. The evaluation team observed the same roadblock to effective and efficient communication and cooperation in the present evaluation, although there appears to be a genuine attempt to bridge the gap.

Evaluation Questions

  • Where does ATA programming fit within the broader Department of State and interagency security sector capacity building structure and related planning processes? What is the ATA program’s comparative advantage in the U.S. Government’s security sector capacity building framework and tool set?

Findings:

By law, CT serves as the Presidentially delegated managing authority for the ATA program and is tasked with coordinating and de-conflicting with interagency partners both at the strategic planning level and during field implementation activities. Given its successes to date, and its size relative to other organizations, ATA should be the premier organization in providing assistance in the areas of training, assessments, and curriculum development. However, ATA could be more active in its interagency coordination functions in some of its country programs. ATA is actively involved in various working groups, but the program has not fully leveraged its working group participation and the relationships developed with the local law enforcement community to explore synergies among organizations in antiterrorism activities. In one partner nation, for example, a USG agency is active in countering violent extremism (CVE), focusing on anti-recidivism. At the same time, the PN law enforcement sector is also conducting CVE activities, focusing on communications and outreach. The common element between the USG agency and the PN law enforcement sector is ATA, but very little coordination was being done by the program.

There are three comparative advantages of the ATA program over similar programs. These largely reside in the work of DS/T/ATA. The first is a high-quality curriculum and courseware production unit that is suited not just to training but to providing education that could truly build capacity within the administrative and bureaucratic segments of the partner nation. This capability could also be expanded to help coordinate the training efforts of other implementers in a partner nation. The second advantage comes in the form of an experienced assessment unit that could be expanded to conduct critical assessments and evaluations not only for ATA but for the larger interagency players involved in security assistance. The third advantage is that DS/T/ATA represents a deep reservoir of law enforcement expertise at the pointy end of the counterterrorism spear. This experience is an invaluable tool for building bilateral relationships with host country law enforcement agencies.

Evaluation Question:

  • How do the core assumptions, theory of change, and overarching goals guide the ATA program’s areas of focus and planning in specific regions and countries?

Findings:

The ATA Program mission is multifaceted, with constituent parts that are not prioritized and not fully delineated. As a consequence, it is difficult to align the mission with resources and capabilities in a manner that meets goals, intents, and expectations. In addition, CT and DS have very different understandings of what constitutes the ATA Program’s mission success. In some areas, mission focus is provided by a clear external actor or threat that both camps can agree on (such as ISIS). It may be that in these instances the presence of a tangible external actor whose impact in the environment is palpable provides a clear foil against which to measure the impact of ATA’s training efforts. In the absence of such external actors, a clearer methodological approach to the development, implementation, and justification of projects is necessary. In a general sense, CT has a top-down strategic view of ATA goals and objectives, while DS has an in-depth, bottom-up knowledge of PN capabilities and needs. In many cases, the perspectives of these two bureaus are not fully aligned.

Evaluation Questions:

  • What does the ATA program view as its core competencies and how does that shape its planning processes and assumptions? How does the original mission for ATA align with current threats and gaps in assistance?

Findings:

The ATA program provides high-quality training in protecting critical infrastructure and the national leadership, responding to terrorism incidents, and managing critical terrorist incidents with national-level implications. ATA’s early emphasis on tactical-level training to enhance counterterrorism capability continues to greatly influence DS/T/ATA’s curriculum development.

Both CT and DS are deeply aware of the ever-evolving national and regional threat conditions. Both bureaus also have the mechanisms to respond to these conditions: CT with strategic formation and DS with training delivery. For the ATA program to be more responsive, both bureaus should be more engaged with each other (and with other USG agencies).

2. Phase II: Results and Impact: A performance evaluation seeking to identify the results of ATA programming.

Evaluation Question:

  • What effect, intended and unintended, have the country programs’ activities had on the areas/offices/units/etc. in which the activities were targeted?

Findings:

The overwhelmingly consistent response from respondents in the PN law enforcement units was that the ATA program and its activities had a significantly positive effect on trainees and overall operations. The various training courses enhanced the hard and soft skills of beneficiary officers and built the capacity of the local law enforcement sector in antiterrorism activities. At the individual level, past trainees interviewed by the evaluation team stated that their skills and knowledge improved significantly because ATA courses are at the advanced level. Interviewees also stated that the contents of the courses and the means of instruction were very detailed. In PN law enforcement units that had several trainees, unit commanders stated that the aggregated effect of the program on their capabilities was significant. A significant contributor to this outcome was ATA-trained staff sharing skills, either formally or informally, with colleagues in their units.

Evaluation Question:

  • What degree of confidence is there that the outcomes can be attributed directly to the country program?

Findings:

Despite the overwhelmingly positive feedback on the effects of the ATA program on the capacities of the PN national police units, some considerations have to be made regarding the level of attribution that can be given to the ATA program. First, despite various proclamations from respondents about how past trainees were involved in responding to or preventing incidents, it is impossible to make the causal link that the training led to the positive outcome, since all data are anecdotal in form. Second, the program did not establish a performance monitoring plan prior to implementation so that reported outcomes could be documented and mapped to ATA activities. In the strict application of M&E principles, attribution can be made only if all other factors can be controlled for such that the effect of an intervention can be observed. These confounding effects have to be “held constant” to accurately estimate the effect of the ATA program. Unfortunately, the program does not have a performance management system that: 1) captures that type of information, and 2) has an analytical model that controls for such effects.

Evaluation Question:

  • What environmental factors (such as the recipient country’s resource base) have impacted achievement of the program goals and to what degree?

Findings:

There is a clear positive correlation between specific environmental factors in the PNs and the abilities of the local law enforcement ATA beneficiaries to achieve and sustain program goals. The evaluation team observed two significant factors. The first is the broader condition of economic drivers in the country, specifically the information technology (IT) and infrastructure sectors. Antiterrorism and general law enforcement operations are more responsive if the local IT environment is more developed. In Jordan, for example, the Jordan Command and Control Center (JCCC) is a public security and emergency response facility established in 2009 as a partnership between the Jordan PSD and the ATA program. Within a central facility, the JCCC operates a computer-aided dispatch (CAD) system similar to the 911 response system of North America, a geographic information system (GIS) covering 568,000 landmarks throughout Amman, a city-wide closed-circuit TV (CCTV) network, an Automated Number Plate Recognition (ANPR) system to read license plates captured on CCTV, a Mobile Video Recorder System, and an Automated Vehicle Locator (AVL) to track the location of police vehicles. The JCCC serves as a national crisis center that supports the PSD and coordinates responses among the various ministries and civil defense offices during an event. The effective operation of all the center’s systems depends significantly on the presence of a developed and reliable IT environment around Amman.

The second factor that has affected a PN’s achievement of goals is more programmatic in nature. Specifically, the practice of rotating law enforcement personnel to other units has affected the continuity and sustainability of ATA training assistance. The reasons for the rotations and reassignments can be operational in nature. In some national police units visited by the evaluation team, there is a rotation policy for officers above a certain rank. However, there are also non-programmatic reasons for reassignments. Unfortunately, there is a culture in some countries where a person’s initiatives for human capital development are seen by colleagues as a threat to their own advancement. Thus, to eliminate that threat, the person can be reassigned, sometimes to remote positions or to functions where the ATA skills learned cannot be fully applied.

Evaluation Question:

  • How effective have these country programs been in comparison with alternative approaches/interventions in reasonably comparable situations?

Findings:

As antiterrorism has become global in geographic focus, PNs have also received assistance from antiterrorism programs offered by countries other than the United States. The general feedback from interview respondents who participate in other antiterrorism programs was that: 1) the contents of courses given by other countries are complementary to ATA, and 2) in some cases ATA courses are designed as more long-term engagements. Respondents stated that the clear advantage of the ATA program is the CRT course because it is an intensive tactical training course to develop hard skills in responding to a variety of crisis operations.

3. Phase III: Cost-Effectiveness: A review of the cost-effectiveness and administration of the ATA program.

Evaluation Questions

  • What criteria and methods are used for operation and control of the processes?

Findings:

Based on ATA budget execution and the formation criteria and methodologies used for operation and control of the processes in managing and administering the ATA program, DS is sufficiently following DoS standard procedures and has adequate internal controls in place to ensure that budgetary, financial, and operational management is efficient and under control. Monthly financial reporting and general responsiveness regarding financial and/or budgetary inquiries need improvement; this is an area for operational improvement but presents a low risk to ATA Program effectiveness.

Evaluation Questions

  • How well or not do procurement processes and program delivery align and interact?

Findings:

ARE and DS’s B&F conduct the needs assessments (NA) of the capacity and capabilities of PNs to ensure proper planning and coordination of assessments and the generation of reports with technical analysis to support country and regional implementation planning. TCD and TDD develop and/or oversee the development of curriculum for ATA-sponsored training courses to ensure the inclusion and development of best practices (i.e., law enforcement and antiterrorism methodologies). Lastly, TMD and TDD conduct trainings and provide equipment to build PN law enforcement agencies’ capacity to detect, deter, disrupt, and investigate terrorist activities and suspects (see Annex C). This process ensures successful management and oversight of in-country training programs and staff. The procurement processes are followed in compliance with State Department standards and guidelines. ATA’s departments and division teams work collaboratively to deliver each process effectively.

Evaluation Questions

  • What are the costs associated with the operational portion of the ATA program?

Findings:

The operations processes are well structured to work harmoniously toward effective delivery of the ATA training courses and complement one another. However, despite efficient reporting and constant communication between the top management of both bureaus, members of ATA’s divisions do not always seem to be on the same page with regard to current and future changes in staffing and implementation of operations.

Evaluation Questions

  • What changes could be made to the management and administration of the program to ensure maximum effectiveness? How well does the management structure facilitate program delivery?

Findings:

The ATA program can implement changes and adjustments to ensure maximum effectiveness in the following areas:

Monthly Financial Reporting and Responsiveness: DS/T/ATA should create, document, communicate, and carry out timely submission of monthly financial and programmatic reports to CT.

Management Structures and Program Delivery: The current management structure appears to perform well, coordination between the divisions appears to be efficient, and their efforts complement each other. Regular communication between country teams and ATA Washington (divisions) could be improved in some cases.

Current and Ongoing Cost-Cutting Approaches: DS initiated two major cost-effectiveness initiatives in FY 2016. The first is a regional focus on training delivery: ATA identified regional training centers in PN host countries of strategic importance and geographical proximity to areas of conflict and/or areas with ongoing or potential risk of terrorist activity. This approach is expected to cut costs significantly and save on international travel, accommodation, and staff man-hours, as PN authority delegates will receive training in-country or travel to a nearby regional center.

Second, ATA is changing its staffing structure to convert qualified external contractors to Personal Service Contractors (PSCs) across all of the program divisions. Based on the number of positions ATA is converting across the whole program and the loaded rates for the current contractor positions, savings are estimated at $2.5 million.

Evaluation Questions

  • How are individual course requirements determined, and how is equipment procured?

Findings:

The cycle for ATA course design and curriculum development and the ATA equipment procurement process includes a needs assessment, training course curriculum development, equipment procurement, and program reviews. The needs assessments are conducted by ATA because of USG interest in the antiterrorism, general law enforcement, and response capabilities of the PN government authorities of interest.

The training curriculum development phase oversees the design of ATA-sponsored training courses. The successful completion of this process includes the piloting of newly developed and revised curriculum. The Curriculum Management Branch (CMB) ensures conformance with ATA standards in developing new scopes of work, revisions and re-writes, and independent evaluations. The CMB also ensures that developments include best practices in law enforcement, security, and antiterrorism methodologies.

The Curriculum Project Manager (CPM) leads the development team through the formulation of the SOW, Program of Instruction (POI), Lesson Topic Development (LTD), the walk-through, and the pilot project. The CPM also ensures the instructional integrity of materials, positive learning experiences for participants, and necessary updates to the courses and trainings. The CPM assists the ARE Division by providing subject matter expertise during in-country assessments and the TMD by providing course specifics for offer cables and guidance on course content and equipment. Additionally, the CPM assists the Training Delivery Division (TDD) by providing course-specific instructor qualifications and course-specific equipment lists.

The Instructional Systems Design Branch (ISDB) comprises Instructional Systems Designers (ISD), Senior Curriculum Editors (SCE), and Visual Information Specialists (VIS) who create the frameworks for instructional methods in the Design, Development, Implementation, and Evaluation phases. The ISDB works with the curriculum development team to develop sound instructional goals and objectives appropriate to adult learners and foreign audiences, and to apply systems design methodology and instructional concepts for all materials and instructional activities.

Evaluation Questions

  • What alternative practices could be adopted in order to improve program efficiency?

Findings:

The regional focus on training and the shift in the staffing structure to PSC status are significant initiatives in terms of new and alternative practices. ATA should conduct a comprehensive review in two years to assess whether cost-effectiveness targets have been met or exceeded and whether there are any unintended consequences that resulted in negative effects on the program (e.g., a possible decline in training delivery quality).

6.0 Conclusions and Recommendations

Based on the findings obtained from the data collection activities to answer the evaluation questions, the DevTech team concludes that:

  • ATA is an effective program for building the antiterrorism capacities of partner nations. The participation of numerous PN law enforcement personnel in ATA courses has resulted in more effective operational units in combating terrorism and conducting general law enforcement activities.
  • Beyond building capacity, the ATA program helped establish strong relationships between the Embassy and the PN law enforcement community. More active coordination may be necessary to further improve relationships, but the links that have been forged since the start of the country programs have made collaboration and coordination easier between the national police units and the Embassy, specifically the RSO shop.
  • The effectiveness of the ATA program is evident in the positive feedback from the local law enforcement community. However, the exact level of attribution that can be given to the program cannot be determined because a performance management system was not established at the start of the program.
  • At the country level, there are some factors that affect the long-term sustainability of the ATA program. Countries that are more economically endowed or developed are more capable of supporting the program with management and operational innovations. The rotation policies within the national police of the PNs can also negatively affect the sustainability of the assistance provided by the ATA.
  • At the management level, CT and DS should be more engaged at all levels of the program’s operations to make the ATA more efficient. Historically, the two bureaus have had different perspectives on goals and objectives, which led to differences over the specific activities to be conducted. However, both bureaus have acknowledged this shortcoming, and the current leadership of CT and DS is committed to better coordination and communication moving forward.
  • Internal controls ensuring that budgetary, financial, and operational management follows the State Department’s standard procedures were sufficient; however, the evaluation team identified a low-risk area where controls could be improved.
  • DS should communicate clearly with its procurement staff regarding the upcoming changes to purchasing items both CONUS and locally at the PN level.

Based on the findings discussed in the previous section, the DevTech evaluation team recommends the following courses of action for the ATA program:

  • The delivery of training courses should remain with DS/T/ATA. From a beneficiary perspective, the subject matter content, instructors, logistics, and equipment provision that cover all ATA courses are well-planned and result in trainees gaining significant skills in antiterrorism capabilities.
  • Maintaining training delivery with DS/T/ATA will also ensure that the relationships established between the Embassy and the national police will remain. Interviewees from the PN law enforcement community, ATA, and other USG agencies all generally stated that relationships forged among all parties would be set back if, for any reason, training delivery were changed from its current structure.
  • CT and DS should revive the Strategic Planning Working Group, or at least its M&E component, to establish performance monitoring plans (PMPs) for country programs. Developing a PMP or some form of performance management system will allow program leadership to accurately monitor the effectiveness of program activities and whether strategic goals are being met. The system will not only result in improved efficiency but also help achieve greater accountability. Beyond the need to track program performance, the system will also respond to higher-level initiatives such as the Department of State Evaluation Policy, the Government Performance and Results Act (GPRA), and the GPRA Modernization Act of 2010.
  • In line with the revival of the Strategic Planning Working Group, DS and CT should start the performance monitoring initiatives within the Master Trainer Courses that will be conducted. From a management perspective, leveraging the courses will minimize the reporting burdens of Washington- and country-based ATA personnel and will help build institutional buy-in. From a technical perspective, leveraging the courses will help obtain more accurate assessments of program outcomes and will more accurately estimate the level of attribution that can be given to ATA. The courses are also ideal for piloting a performance management system that can be scaled up later to other training courses so that a comprehensive system can be built.
  • ATA should create, document, communicate, and submit timely monthly financial reports to CT. ATA should communicate with CT to explain the reasons for delays in submitting reports and anticipated submittal timeframes.
  • ATA should communicate clearly and constantly with its staff regarding planned and upcoming changes in operations and budgets and explain the benefits of these changes (e.g., the long-term positive impact of the cost-cutting approaches and methods).
  • ATA should review the outcomes of the regional focus on training delivery and the shift in the staffing structure to PSC status within the next two years to assess whether objectives were met and to identify unintended consequences, if any.
