CONFERENCE PAPER

Making evaluation work for effective policy reform and revision

Authors: Sophie Davies; Christopher Nelson

Contact: Sophie.davies@ausaid.gov.au (02 6206 4000); Christopher.nelson@ausaid.gov.au (02 6206 4000)

ABSTRACT

AusAID values the use of quality evaluation to improve its aid effectiveness and its contribution to global poverty reduction, including the achievement of the eight Millennium Development Goals by their 2015 target date. This paper describes a policy review designed to strengthen the use of evaluation information in program management decision making, and to reinforce a performance management and evaluation culture within an aid donor agency. The approach taken was driven by a desire for effective policy reform. Behind this, the drivers were: 1) to ensure senior management were engaged; 2) to encourage a stronger performance management culture; and 3) to take an evaluation approach, which included a participatory process and analysis based on evidence. Section One sets out the context and background for the policy review. Section Two describes the policy review drivers in more detail. Section Three describes the review process and findings. Section Four concludes the paper with suggested ways forward following the release of the independent aid effectiveness review in July 2011.

Section 1: Context and Background

This paper describes a review of AusAID's Performance Management and Evaluation Policy (PMEP) and its underlying system. AusAID's performance management system applies to all aid spending delivered by the Australian Government.
Australia's current commitment to aid is $4.8bn in 2010-11 (0.35% of Gross National Income (GNI)). The majority of this (89%) is spent through AusAID, with the remainder delivered by agencies such as the Australian Federal Police (AFP), ACIAR and DIAC. Australia is on track to increase its spending to around $8bn by 2015 (0.5% of GNI), which will place it near the current OECD average of aid spending.

The increasing scrutiny of aid spending and the complexity of the aid environment mean that AusAID takes a careful and conscious approach to continuous improvement of its work, as informed by the performance management system. As stated on the AusAID website, the main objective of the aid program is "to assist developing countries reduce poverty and achieve sustainable development, in line with Australia's national interest". AusAID is strongly committed to evaluating and improving Australia's aid program and to collecting, analysing and publishing development data and other information. AusAID's development effort is guided by the internationally agreed Millennium Development Goals to reduce global poverty by 2015.

The Australian Government launched an independent review into the effectiveness of the aid program in November 2010. This review was led by Sandy Hollway and had the objective to "examine the effectiveness and efficiency of the Australian aid program and make recommendations to improve its structure and delivery". Its conclusions will influence the further improvement and development of the performance management system.
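As a rough consistency check on the budget figures above, the GNI implied by each budget/share pair can be recovered with simple division. This is a minimal sketch: the dollar amounts and percentages come from the text, while the derived GNI values are illustrative arithmetic only, not official statistics.

```python
# Arithmetic check on the aid budget figures quoted in the text:
# $4.8bn at 0.35% of GNI (2010-11) and ~$8bn at 0.5% of GNI (2015).
# Derived GNI values are illustrative only, not official statistics.

def implied_gni(aid_budget_bn: float, share_of_gni: float) -> float:
    """Return the GNI (in $bn) implied by an aid budget and its GNI share."""
    return aid_budget_bn / share_of_gni

gni_2010 = implied_gni(4.8, 0.0035)   # roughly $1,371bn
gni_2015 = implied_gni(8.0, 0.005)    # $1,600bn

print(f"Implied GNI 2010-11: ${gni_2010:,.0f}bn")
print(f"Implied GNI 2015:    ${gni_2015:,.0f}bn")
```

The two implied GNI figures differ because the 2015 projection assumes continued economic growth alongside the rising aid share.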
AusAID has an internationally recognised system for improving the quality of aid delivery and assessing its performance, encompassing: internal performance management systems; AusAID's internal audit function (recently strengthened through the establishment of a Chief Auditor and the appointment of an independent chair of the Audit Committee); and the analysis and evaluations prepared by the Office of Development Effectiveness (ODE) on all Australian aid expenditure. The ODE reports to the Director General of AusAID and is guided by the cross-government Development Effectiveness Steering Committee. There are also a number of external measures in place which scrutinise the aid program's performance: OECD Development Assistance Committee (DAC) Peer Reviews are conducted every four years, and the Australian National Audit Office (ANAO) conducts regular independent audits.

AusAID's internal performance management system is governed by its Performance Management and Evaluation Policy (PMEP), which encompasses a quality reporting system at activity, country program and thematic levels. The PMEP replaced the Performance Assessment Framework in December 2007 and has now been reviewed twice in accordance with a biennial policy review requirement. The PMEP established the framework for a new approach to performance management, including the Quality Reporting System, which had been trialled across the aid program in April-June 2007. The main aims of the PMEP are to improve the effectiveness of the aid program, articulate results, increase transparency and meet increased accountability demands. This is achieved through regular quality assurance at the activity level: during design, annually during implementation, and upon completion. In addition, regular assessment and review at the country, thematic and strategic level is conducted annually and reported in Annual Program Performance Reports (APPRs) and Annual Thematic Performance Reports (ATPRs).
Section 2: Intent and drivers behind the PMEP review

A biennial review of the PMEP was identified as an important part of ensuring that the AusAID approach to performance and quality reflected the changing priorities and imperatives of the agency. This regular review cycle was necessitated by the political dimension of government departments and the evolving nature of the international development sector. Importantly, the review was committed to an iterative approach that investigated and interrogated the policy to ensure it was fit for purpose and improved development practice. The PMEP and the quality reporting system that sits behind it are a unique feature of the agency, but there are elements of the approach requiring ongoing change, and the review offered an opportunity to identify and act on these changes.

There were three key drivers behind the PMEP review. The first was to further develop and drive senior management engagement with the performance and quality agenda. While the early years of the PMEP had been pivotal in changing the way desk officers engaged with evaluation and the monitoring of their programs, a positive response from management had been patchy at best. There were clear links between the system's instruments and the way desk officers worked, but management was slow to see the potential for the regular and methodical collection of data to inform their decision making. AusAID had quickly grown from a fairly small, informal agency built on person-to-person contacts to a large, diverse organisation with increasingly demanding delivery imperatives. Changes in the management approach had not kept pace with this shift, and the systems were not being used as efficiently as had been hoped. Improved guidance had some impact, but a participatory review was seen as a useful way to engage with the reasons why management was still underutilising the PMEP and the agency's performance instruments.
The PMEP had been developed in response to the recommendations of the 2006 White Paper and was originally owned by the Office of Development Effectiveness (ODE). While the policy was always intended to influence improved development practice, the early years were largely focused on accountability, and there was limited buy-in from senior management on the utility of the approach. With the shift in ownership of the PMEP to the Operations and Performance Systems (OPS) Branch in 2009, the focus on using the instruments for better program delivery gained traction, but management was slow to respond. Finding a way to make the system more useful, appropriate and engaging was necessary to ensure that management recognised the importance of the PMEP. A root-and-branch review of the policy was seen as a useful way of making this happen. Therefore, considerable thought was applied to making the review process visible and in line with management priorities.

The second driver for the review was to situate the PMEP at the heart of an emerging performance management culture. Since the policy had shifted from ODE in 2009, a number of sections in the agency had begun to introduce well-funded, well-resourced performance and quality units responsible for collecting and disseminating program data and review findings to their teams to improve practice. These teams had been built around performance nodes, encouraged by enthusiastic and supportive managers. They had found approaches and protocols that were working, and engaged technical assistance to develop the necessary skills for evaluation to flourish. The PMEP review was seen as a way of building on and highlighting the principles of these teams and disseminating their experiences across the agency. By engaging staff in a participatory review of the system, it was hoped that these successful strategies could be built into changes to the PMEP and would institutionalise a more vibrant performance culture.
The third driver behind the review was to establish the PMEP as a participatory process based on evidence, able to provide the foundation for greater flexibility and strategy in the delivery of the aid program. In response to ANAO Audit Report No. 15 2009-10, AusAID's Management of the Expanding Aid Program, new guidance for country strategy architecture was released in 2010. This guidance had a strong focus on portfolio planning and looked to data and evidence as the cornerstones of decision making in the aid program. To institute this guidance, staff required a new set of skills focused on strategy rather than simple program management. A range of new mechanisms was available to deliver aid (including working in partner government systems), and there was support for more inventive ways to make sustainable change in recipient countries. These mechanisms required more flexibility in funding flows and a range of options other than the contractor management approach. Increasingly, AusAID was working through highly complex and diverse dispersal mechanisms that required performance management systems that were fit for purpose. The PMEP needed to acknowledge this diversity and ensure that participants were informed by the right performance information. A participatory review of the policy would ensure that agency staff were able to adjust to this new imperative and support the use of evidence to drive the agency's results agenda. Working through portfolios was a way to ensure managers were making decisions based on the performance of their aid allocations. The PMEP needed to reflect this shift and respond with a results focus built on useful performance instruments.

Section 3: Process and Findings

AusAID's performance management system influences all aspects of how AusAID undertakes its work and is guided by the Performance Management and Evaluation Policy (PMEP).
It is supported by in-house project management software (AidWorks), which works in tandem with the quality system to ensure that attention to performance and evaluation is part of normal aid management practice. As part of a regular cycle of revision, the PMEP is reviewed every two years to take account of the changing policy and operational context of aid delivery and of developments in evaluation theory and good practice. The most recent review took place over three months in 2010 and looked at ways to further improve transparency, accountability and communication on achievements of the aid program, and to take account of new ways of delivering aid.

The PMEP review took a participatory, utilisation-focused approach (Patton 2002) aimed at intended use by intended users. It therefore engaged a range of stakeholders, both internal and external, and sought their input. An internal Policy Reference Group (PRG) was formed, made up of representatives from country programs, thematic groups, corporate communications, AidWorks, corporate reform, Whole of Government and the Office of Development Effectiveness (ODE). To overcome potential turnover during the course of the review, PRG members were asked to commit to the full review process and to ensure they had senior management agreement to their engagement. The PRG was divided into four focus groups to consider different key evaluation questions relating to the performance management system. These focus groups were presented with a range of data in guided discussions over seven meetings, and recommendations were made.
The data discussed included:

- a consultant report on AusAID Monitoring & Evaluation (M&E) systems for new modes of aid delivery;
- a consultant report on aid budget measures;
- an internal report on the strengths, weaknesses and opportunities for improvement of the performance management system;
- a summary of reflections from external M&E Panel members on the performance management system;
- a consultant report on performance management capacity gaps and strengths at AusAID;
- an internal report on the management and work of the M&E Panel of 29 experts; and
- a presentation from the internal budget section on new budget processes.

Each focus group's recommendations were tested across the broader PRG for agreement, and initial findings and recommendations were made. These were discussed with an external stakeholder group including representatives from NGOs, contractors, Whole of Government and M&E experts. From this, a final report of findings and recommendations was presented to the independent aid effectiveness review in early 2011.

The review process encouraged frank and open discussion among staff in AusAID's overseas offices and central office in Australia, to ensure that both a field and a desk perspective were obtained. It also encouraged external stakeholder engagement and a broader understanding of the agency's key challenges by testing the main findings and recommendations with this group. The external stakeholders represented the main groups influenced by how the policy is framed and practised.

The key findings from the PMEP review were that:

- Good practice exists in the agency and needs to be used to drive improvement. This includes capacity building by particular programs, the Performance and Quality network, the Monitoring & Evaluation (M&E) Panel, and resources such as the M&E for civil society paper.
- Capacity development for M&E and performance management must be a core function of staff at all levels of the agency.
- M&E information is not being efficiently used.
This means information is created in response to each request rather than existing information being used for multiple purposes. Finally, there is uncertainty about where accountability resides in shifting to a results-based approach to aid delivery.

From these findings, four recommendations were made:

1. The revised policy must be tailored to new ways of working, designed to suit different operational contexts, and subject to continuous improvement.
2. The revised policy must take a strategic approach to information collection, use and re-use, and improve feedback loops.
3. Business processes must be aligned with the revised policy so that current systems (AidWorks, business planning, performance management, etc.) support results-based decision making.
4. Implementation of the policy must be supported by greater accountability for program performance management, incentives, and staff capacity building and guidance.

Specific actions were recommended to support the implementation of each. Given the review's participatory, utilisation-focused approach, the findings have high levels of ownership by the PRG and are more likely to be understood and implemented when they are applied in the revised policy.

Section 4: Conclusion

The review findings will underpin key changes to the PMEP and supporting guidance across the agency. In particular, feedback loops between evaluation information and decisions about spending, program design and implementation, and reporting on results will be strengthened. It is also expected that the participatory approach will promote a more prominent culture of transparency and learning. Perhaps the most important outcome of the PMEP review has been the ability of the Program Effectiveness and Performance Division (the replacement for OPS) to respond efficiently and effectively to the findings of the aid effectiveness review, Making a real difference – Delivering real results.
While the initial planning for the PMEP review pre-dated the aid effectiveness review, its recommendations and conclusions provided an important input to the larger independent review of Australia's aid effectiveness. The emphasis on results and a streamlined performance system were key recommendations of the effectiveness review. Both were identified in the PMEP review, and preliminary work was already underway to institute these changes when the Aid Review was announced. Not only did the PMEP review provide important analysis to inform the Review Panel's work, it also identified the mechanisms and instruments that would support the broad move to a stronger results focus in the aid program. These were important outcomes for the program in a rapidly changing environment. The PMEP review has prepared the agency to initiate and institute the three-tier Results Framework recommended by the Review Panel. It will ensure AusAID is able to act swiftly when expected results are not being achieved, and it means the agency is better able to articulate what it wants to achieve in delivering its programs.

The PMEP review was built around three drivers, which have broadly been supported by the wider review of aid effectiveness. Senior management buy-in is now secured by a Review that acknowledges the importance of focusing on results. Similarly, the emphasis on a changing performance culture and flexibility in the program has been recognised in the need for clarity of purpose and a clear strategy for implementing programs. The Review recommends fewer, larger programs in fewer sectors, with a determination to change course and cease programs on the basis of poor progress. These findings are consistent with where the agency is heading and with the findings of the PMEP review, which puts the agency in a good position to build on the increasingly enthusiastic evaluative culture that is emerging.
Notes

1. Paper presented at the Conference, Sydney, Australia, 29 August – 2 September 2011.
2. The views in this paper are those of the authors and not of AusAID.
3. http://www.ausaid.gov.au/about/default.cfm
4. Australian Aid: Promoting Growth and Stability – White Paper on the Australian Government's overseas aid program, June 2006.
5. Patton, M.Q. (2002): Utilization-Focused Evaluation (U-FE) Checklist, Western Michigan University. http://evaluation.wmich.edu/evalctr/checklists/ufe.pdf
6. An Effective Aid Program for Australia: Making a real difference – Delivering real results, 2011, page 23.