Public Safety Canada Internal Audit of Performance Measurement
March 16, 2016


Executive Summary

Background

Performance measurement is defined as the “process and systems of selection, development and on-going use of performance measures to guide decision-making”.Footnote1 Performance measurement is a results-based management tool that federal organizations are required to use to strengthen decision-making, program improvement and reporting, and it is becoming increasingly important in an era of open government and accountability.

Considering the importance of performance measurement, the Deputy Minister approved the Audit of Performance Measurement as part of the 2012-13 – 2014-15 Risk-Based Audit Plan.

Audit Objective and Scope

The objective of this audit was to provide reasonable assurance that the departmental performance measurement processes and practices were adequate, effective, and aligned with Treasury Board (TB) requirements.

The scope focused on how performance indicators and measures were developed and how the resulting data was collected and used by the Department. The audit also examined conformance with TB and departmental policies and with the departmental audit and evaluation committee charters.

The audit sampled performance measurement information within all departmental activities including Grants and Contributions programs, policy and research activities, and internal support functions up until September 2015.

Summary of Findings

As per the TB and Public Safety (PS) policies and guidelines, a comprehensive performance measurement process includes:

Through the audit, we found that the Department has some fundamental controls and processes in place. These include:

Based on the documentation review, interviews and sample tests and analysis, we found that PS performance measurement processes are incomplete:

Audit Opinion

Improvements are requiredFootnote2 to the performance measurement processes and practices to increase their compliance with TB requirements, as well as their adequacy and effectiveness. There are opportunities to improve the performance measurement processes and to strengthen the collection and use of performance measurement information in support of management and oversight activities.

Statement of Conformance and Assurance

Sufficient and appropriate audit procedures were conducted and evidence gathered to support the accuracy of the opinion provided and contained in this report. The opinion is based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria that were agreed upon with management. The opinion is applicable only to the entity examined and within the scope described herein. The evidence was gathered in compliance with the Treasury Board Policy and Directive on Internal Audit. The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the Quality Assurance and Improvement Program. The procedures used meet the professional standards of the Institute of Internal Auditors. The evidence gathered is sufficient to provide Senior Management with proof of the opinion derived from the internal audit.

Recommendations

  1. The Assistant Deputy Minister of PACB should ensure that:
    • Indicators in the PMF, the RPP, and the DPR are aligned.
    • The Chief Audit and Evaluation Executive is supported to review and advise on the PMF in accordance with the TBS Directive on the Evaluation Function.
  2. Each Assistant Deputy Minister should:
    • Identify resources assigned to development and implementation of PMSs.
    • Ensure linkages between the PMF and PMSs.
  3. Each Assistant Deputy Minister should:
    • Ensure collection of accurate data against the PMF and PMS indicators to support timely decision-making and reporting.
    • Develop agreements for third-party data collection, where applicable.

Management Response

Management accepts the recommendations of Internal Audit. The recommendations recognize that the accountability for performance measurement is shared by all Departmental Managers.

The key actions to be taken by management to address the recommendations and findings and their timing can be found in the Findings, Recommendations and Management Response section of the report.

CAE Signature

____________________________

Audit Team Members

Deborah Duhn
Sonja Mitrovic
Kyle Abonasara

Acknowledgements

Internal Audit would like to thank all those who provided advice and assistance during the audit.

1. Introduction

1.1 Background

Performance measurement is defined as “process and systems of selection, development and on-going use of performance measures to guide decision-making”.Footnote3 Performance measurement is a results-based management tool that federal organizations are required to use to strengthen decision-making and reporting.

More specifically, performance measurement supports Public Safety (PS) in executing its leadership role to bring coherence to the activities of the Portfolio departments and agencies. It also provides levers to influence policy coherence and provides insight into horizontal issues involving many federal partners, through the delivery of the following program activities: National Security; Border Strategies; Countering Crime; and Emergency Management.

The performance measurement processes ensure that relevant, accurate and timely information is available for sound decision-making and reporting. Ongoing performance measurement forms a strong foundation for results-based management; and is becoming increasingly important in an era of open government and accountability.

Considering the importance of the performance measurement in the Department's activities, the Deputy Minister approved the Audit of Performance Measurement as part of the 2012-13 – 2014-15 Risk-Based Audit Plan.

1.2 Legislative Framework

Performance measurement is defined through a set of Treasury Board (TB) policies, directives, and guidelines. TB policies include the Policy on Management, Resources and Results Structures (MRRS), Policy on Evaluation, and Policy on Transfer Payments. In particular, the Policy on MRRS requires that “the Government and Parliament receive integrated financial and non-financial program performance information for use to support improved allocation and reallocation decisions in individual Departments and across the Government.”Footnote4

1.3 Roles and Responsibilities

The Deputy Minister's, Branch Heads', and Chief Audit and Evaluation Executive's (CAEE) roles and responsibilities in relation to performance measurement are defined in both TB and PS policies, which include the TB Policy on MRRS (2010), the TB and PS Policy on Evaluation (2009), the TBS Directive on the Evaluation Function (2009), and the TB Policy on Transfer Payments (2008). The following is a summary of key responsibilities from select TB and PS policies and directives.

The Deputy Head's main responsibilities are to ensure that:

The Departmental Evaluation Committee (DEC):

Branch Heads' main responsibilities include:

As Head of Evaluation, the CAEE's main responsibilities include:

Program Managers' responsibilities include:

1.4 Audit Objective

The objective of this audit was to provide reasonable assurance that the systems and practices in place to support performance measurement were adequate, effective, and aligned with Treasury Board requirements.

1.5 Scope and Approach

The scope of the audit focused on the manner in which performance indicators and measures were developed and how the associated performance information was collected and used by the Department. The audit also focused on conformance with the TB Policy on Management, Resources and Results Structures (MRRS), the TB Policy on Transfer Payments, the TB Policy on Evaluation, TBS Directive on the Evaluation Function, the Department's Policy on Evaluation, DEC Charter, planning and reporting cycle, as well as the degree to which performance information was integrated into key decisions.

The audit sampled performance measurement information within all departmental activities including Grants and Contributions programs, policy and research activities, and internal support functions up until September 2015.

Exceptions:
This audit did not provide assurance in regard to:

1.6 Risk Assessment

The risk assessment conducted in the planning phase of the audit informed the development of the audit scope and criteria. See Annexes C and D for details.

The Department's business activities were distinctly different, ranging from transfer payment programs, to policy development, to operations. This diversity underscored the need for a strong PMF to clearly understand the results achieved in each activity and to align resources accordingly. The nature of some PS activities, such as policy development, creates challenges in defining performance indicators and data sources.

Program and policy areas also placed significant reliance on external stakeholders' data collection processes, some of which were less established, resulting in a higher degree of inherent risk to data integrity and validity. Further, in some cases the dedication of resources to performance measurement activities competed with other priorities, which may have resulted in incomplete processes.

Many of the Department's activities are horizontal in nature and therefore include the participation of other departments and agencies from within and outside the portfolio. This horizontality added another layer of complexity in coordinating acceptance, commitment, and reporting of performance measurement by all parties involved.

1.7 Audit Opinion

Improvements are requiredFootnote12 to the performance measurement processes and practices to increase their compliance with TB requirements, as well as their adequacy and effectiveness. There are opportunities to improve the performance measurement processes and to strengthen the collection and use of performance measurement information in support of management and oversight activities.

1.8 Statement of Conformance and Assurance

Sufficient and appropriate audit procedures were conducted and evidence gathered to support the accuracy of the opinion provided and contained in this report. The opinion is based on a comparison of the conditions, as they existed at the time, against pre-established audit criteria that were agreed upon with management. The opinion is applicable only to the entity examined and within the scope described herein. The evidence was gathered in compliance with the Treasury Board Policy and Directive on Internal Audit. The audit conforms to the Internal Auditing Standards for the Government of Canada, as supported by the results of the Quality Assurance and Improvement Program. The procedures used meet the professional standards of the Institute of Internal Auditors. The evidence gathered is sufficient to provide Senior Management with proof of the opinion derived from the internal audit.

2. Findings, Recommendations and management responses

2.1 Public Safety Performance Measurement Processes

Through the review of various policies and guidelines pertaining to the management of performance measurement, we identified the following as mandatory items for comprehensive performance measurement processes:

Public Safety performance measurement processes are incomplete.

Policy, Guidance and Training

Many policies and guidelines exist to provide guidance on performance measurement. Treasury Board approved performance measurement policies and directives include the Policy on MRRS, the Policy on Evaluation, the Directive on the Evaluation Function, and the Policy on Transfer Payments. In addition to these external policies, Public Safety has also developed and approved a Policy on Evaluation.

To support departments in the development of performance measurement tools, Treasury Board also issued the Guideline on Performance Measurement Strategy under the Policy on Transfer Payments, as well as Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies. To complement the available guidance and to support program and policy areas, PS has also developed a guide that focuses on the development of logic models: Performance Measurement Strategy Guide – Information on Logic Models.

Although policies and guidelines are available, they focus mainly on performance measurement for program areas and provide limited guidance for policy activities. Interviewees indicated that further guidance is still needed on the unique role of PS as a coordinating and policy department.

To support implementation of and compliance with the listed performance measurement policies, directives and guidelines, specialized evaluation programs and general management courses, such as G110, G126 and G226, are offered through the Canada School of Public Service.

Furthermore, the Strategic Planning Division (SPD) in PACB has also developed and delivered a number of targeted presentations to various branches and directorates, including a series of workshops in 2012. While attendance at these workshops was limited, 50% of interviewees indicated a need for training and guidance, particularly around the development of indicators for the policy function and tools to capture the required data. Subsequent to the audit, the Evaluation unit developed and delivered a training session on performance measurement indicators for policy activities to address the identified need.

Roles, Responsibilities, Accountabilities and Oversight

Document reviews and interviews indicated that generally there were clear roles, responsibilities and accountabilities and that they were in line with the policies, directives and guidance mentioned above.

The Directive on the Evaluation Function states that the Head of Evaluation is responsible for “reviewing and providing advice on the performance measurement framework embedded in the organization's Management, Resources and Results Structure”.Footnote13 We followed up with TB; it did not provide a clear interpretation of whether “advice” implies any form of accountability or how this “advice” should be enforced. This left each department to interpret the requirement and develop its own methodology to ensure its needs were addressed and that it complied with the policy. While the PS and TBS policies assigned “advisory” and “review” roles and responsibilities for both the PMSs and the PMF, the PS Evaluation Function did not have a formal process for reviewing all PMSs or the annual PMF. To date, the Evaluation Function has not been actively engaged in the PMF review as required by the TBS Directive on the Evaluation Function.

In addition to the Evaluation Function's advisory and review role, the SPD has also supported the Department in the management of performance measurement. As formally documented in its Business Plan, the SPD is committed to “Promoting the stability of the PAA while amending it where necessary”.Footnote14 More specifically, SPD is responsible for:

The reviewed 2014-15 Performance Management Agreements for each executive head also indicated management commitment toward performance measurement by including the following criterion:

“Significant progress has been made in achieving the expected results based on the executive's areas of responsibility in the PMF and mitigation strategies are in place to address key risks.”

While the roles, responsibilities and accountabilities are clearly outlined, interviewees indicated that there has been confusion between the roles of SPD and the Evaluation unit, specifically in the development and monitoring of performance measurement activities.

The TBS Directive on the Evaluation Function required the Head of Evaluation to present an Annual State of Performance Measurement report to the DEC. Between 2009 and 2013, this report was not completed. However, at the June 15, 2010 DEC meeting, performance measurement was discussed, and the Director General (DG), Evaluation tabled “Resourcing the Evaluation Function”. The presentation identified priority activities and recommended those that could be deferred. The Annual State of Performance Measurement report was identified as one of the activities that could be deferred, given that it would have limited value because the state of performance measurement throughout the Department was already known. DEC accepted the recommended prioritization of activities.

The first PS Annual State of Performance Measurement report was presented to the DEC in February 2014 (a summary of findings is available in Annex E). Its purpose was to provide PM information required to move the Department forward in terms of results-based management. Support for results-based management is intended to:

This report was a catalyst for drawing attention to and improving oversight of performance measurement by the DEC. The report recommended that all PMSs had to be approved by the responsible ADM. The development of the report and adoption of the recommendations was a major factor in reinforcing the role of the Evaluation Unit in the coordination of these activities. Subsequent to the audit period, the Evaluation Unit began working with both SPD and the Grants & Contributions (G&C) Centre of Excellence to clarify roles and develop tracking tools and inventory lists.

The oversight governance for performance measurement is composed of various committees at different levels of the organization, namely the DEC, Departmental Audit Committee (DAC), Departmental Management Committee (DMC) and Branch Management Committees.

As per the TB Policy on Evaluation and the PS DEC Charter, the DEC has a number of specific and clearly defined responsibilities concerning PM. More specifically, the DEC is responsible for reviewing “the adequacy of resources allocated to performance measurement activities as they relate to evaluation, and to recommend to the deputy head an adequate level of resources for these activities”.Footnote15 Further to the findings and recommendations outlined in the PS Annual State of Performance Measurement report and subsequent to the audit period, the PS DEC charter was revised to include the additional responsibilities pertaining to “the periodic reviews [of] Assistant Deputy Minister (ADM) approved PMSs, and reviews [of] the departmental PMF annually”.Footnote16

Through the review of committee records of decisions, we found that, in line with the TB Policy on Internal Audit, the DAC received information concerning the MRRS implementation and had reviewed key performance reports such as the RPP, the DPR and the 2014-15 PMF. However, we also noted that the PS Annual State of Performance Measurement report was not shared with DAC members.

As per the DMC Terms of Reference, performance information is to be presented at mid-year and year-end for review. Specifically pertaining to the PMF, the DMC is required to review the initial PMF and then review its results at mid-year and year-end. During the audit period, we found that the PMF was reviewed bilaterally with ADMs rather than at DMC, as would have been expected. An ADM meeting was held to discuss the final PMF. Furthermore, due to timing constraints, the 2013-14 PMF mid-year results were not presented at DMC, as they were considered no longer relevant. In 2014-15, data for only a select number of indicators was presented at the mid-year review and year-end discussions. The committee records of decisions indicated further action required for the ADM, PACB to review the existing indicators and ensure that those chosen for presentation identified issues requiring appropriate and timely course corrections.

Finally, we conclude that there is inconsistency in the performance information presented to and used by the various committees throughout the Department, which may limit the committees' effectiveness.

Procedures and Practices for the Development and Approval of the PMF

A departmental PMF sets out the expected results and the performance measures at the PAA level. The indicators in the PMF are limited in number and focus on supporting departmental monitoring and reporting.

The development of the departmental PMF has improved, with better indicators and targets and greater integration into departmental business planning. We found that the PMF was aligned with the TB Guidelines for MRRS; it identified indicators for each sub-sub PAA activity, and it was individually signed off by all ADMs.

Interviewees indicated that the SPD unit took a consultative, one-on-one approach with individual program areas. These consultations covered topics such as TBS-requested changes, general PMF development processes, and the implementation of indicators. Although interviewees acknowledged generally good participation during the PMF development consultations, the expectations for and use of the PMF remained unclear. It should be noted that some interviewees indicated that the PMF indicators were of little value to them because they did not provide the information necessary to guide their programs.

Although SPD organized meetings with branch coordinators and program managers to facilitate the PMF amendment process, we found no evidence of an established methodology to support requests to amend the PMF. Interviewees also stated that the Evaluation Function had not been consulted to support its advisory and review role.

TBS provided feedback on the PS PMF, and SPD retained a consultant in 2012-13 to review the PMF. Both TBS and the consultant made observations, most notably that there was:

Although SPD agreed with some of the observations, we found no evidence that these observations were communicated to senior management.

While SPD led the PMF development process and ensured compliance with MRRS as well as alignment with the PAA, we found that the PMF still needs to be reviewed to ensure its clarity and completeness.

Procedures and Practices for the Development and Approval of PMSs 

PMSs are used to identify and plan how performance information is to be collected to support ongoing monitoring of a program and its evaluation. They are intended to effectively support both day-to-day program monitoring and delivery, and the eventual evaluation of the program. There are no imposed limits on the number of indicators, expected results or outputs that can be included; however, their “successful implementation […] is more likely if indicators are kept to a reasonable number”.Footnote17

We found that there was a good understanding of how to develop a comprehensive PMS, including Logic Models, stakeholder discussions and indicator identification. More specifically, 10 out of 11 interviewees indicated that they held collaborative discussions with program stakeholders. Program areas have made efforts to ensure awareness and relevancy, which are particularly important given the Department's reliance on input from partners.

A judgemental sample of approximately 30% of all PMSs was assessed to evaluate their progress and the integrity of their indicators. The results revealed areas for improvement, mainly pertaining to the integrity and identification of indicators. More specifically, we found:

Though there are still improvements to be made, the development of PMSs has progressed since the 2013 PS Annual State of Performance Measurement report.

This report also increased senior management's engagement by requiring that the responsible ADMs approve PMSs. More specifically, the Deputy Minister requested that those PMSs assessed as “Needs Improvement” be updated and approved by each applicable ADM by September 30, 2014. The following chart presents the status of PMSs as of September 2015.

Figure 1: Status of PMS Development Departmentally

Image description: This pie chart, entitled “Status of PMS Development Departmentally”, is divided into three sections totalling 28 PMSs. The largest section, 18 of 28, is labelled “PMSs that are developed and ADM approved”; the second largest, 8 of 28, is labelled “PMSs that are developed but are not yet ADM approved”; and the smallest, 2 of 28, is labelled “outstanding”.


The two outstanding PMSs are in the policy areas.
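
For illustration, the departmental totals and shares cited in this report (for example, the 64% of PMSs noted in section 2.2 as developed and ADM approved) can be derived directly from the branch-level counts in Annex B. The following minimal Python sketch is illustrative only; it uses only the figures reported in Annex B, tallying the three status categories and computing each category's share of the 28 PMSs.

    # Illustrative only: tallies PMS development status using the per-branch
    # counts reported in Annex B, and derives the departmental shares
    # (e.g., 18 of 28 PMSs, or about 64%, developed and ADM approved).

    # Per-branch counts: (ADM approved, developed but not approved, outstanding)
    pms_status_by_branch = {
        "National and Cyber Security": (3, 2, 0),
        "Portfolio Affairs and Communications": (2, 1, 0),
        "Community Safety and Countering Crime": (9, 4, 2),
        "Emergency Management and Programs": (4, 1, 0),
    }

    approved = sum(c[0] for c in pms_status_by_branch.values())
    unapproved = sum(c[1] for c in pms_status_by_branch.values())
    outstanding = sum(c[2] for c in pms_status_by_branch.values())
    total = approved + unapproved + outstanding

    print(f"Total PMSs: {total}")                                                   # 28
    print(f"Developed and ADM approved: {approved} ({approved / total:.0%})")       # 18 (64%)
    print(f"Developed, not ADM approved: {unapproved} ({unapproved / total:.0%})")  # 8 (29%)
    print(f"Outstanding: {outstanding} ({outstanding / total:.0%})")                # 2 (7%)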

PMF, PMS and Other Business Document Alignment

The alignment between PMF, PMSs and other business documents, such as business plans, enables the use of the performance measurement information for sound management.

Through the review of 2014-15 branch business plans, we found alignment with the PMF at the indicator and target levels. This alignment also carried through to work plans, which reflected a general commitment to improving performance measurement activities.

However, interviewees indicated a disconnect between the PMF and PMSs. Through the documentation review, we found only a few examples of deliberate alignment of individual PMS indicators to the PMF. Considering the PMF and PMS development processes, further improvements are needed to ensure alignment between the PMF and PMSs.

Although many efforts were made to improve the Department's performance measurement processes, we found that there are opportunities to improve the development processes and strengthen the alignment between PMSs and the PMF. The lack of alignment between PMSs and the PMF presents risks of inconsistency that may result in a weak PMF and PMSs, and a misrepresentation of the Department's performance.

2.2 Public Safety Performance Data Collection, Reporting and Decision-Making

As indicated by the policies and guidelines, we identified the following as the required elements for performance data collection and use:

PS is collecting limited data against the indicators in the PMF and PMSs; consequently, little data is available to support decision-making and reporting.

PMF Data Collection, Reporting and Decision-Making

The Department has demonstrated many improvements within its performance measurement processes. While efforts were made to refine and adjust the PAA structure and align its PMF indicators, we found that the structure has not been stable enough over the past year to permit data collection and analysis of results.

The documentation review indicated that data for approximately 25% of the 2012-13 PMF indicators was not collected and reported in 2013-14. Furthermore, through the analysis of the 2013-14 PMF, we found that a number of indicators had no identified data sources; these indicators were therefore determined not to be measurable. The review of the Report on Plans and Priorities (RPP) and the Departmental Performance Report (DPR) revealed divergence between the indicators set out in the plan and those against which results were reported. We found no evidence of a rationale to support the changes made to some RPP indicators in that year's DPR. The review also revealed a lack of consistency in indicators between the PMF development and reporting stages. Because of these inconsistencies, it was difficult to confirm the integrity of the reporting process and the true status of performance measurement within the Department.
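
The alignment check called for in Recommendation #1 can be expressed mechanically: compare the set of indicators planned in the RPP against the set reported in the DPR and flag any divergence in either direction. The following Python sketch is purely illustrative; the function and the sample indicator names are invented for this example and are not drawn from the audit.

    # Hypothetical illustration of an RPP/DPR indicator alignment check.
    # Indicator names below are invented examples, not actual PS indicators.

    def check_alignment(planned: set, reported: set) -> None:
        """Flag indicators planned but never reported on, and indicators
        reported on without having been planned."""
        dropped = planned - reported  # in the RPP, absent from the DPR
        added = reported - planned    # in the DPR, absent from the RPP
        if not dropped and not added:
            print("RPP and DPR indicators are aligned.")
            return
        for indicator in sorted(dropped):
            print(f"Planned but not reported: {indicator}")
        for indicator in sorted(added):
            print(f"Reported without documented rationale: {indicator}")

    rpp_indicators = {"% of stakeholders engaged", "Number of exercises held"}
    dpr_indicators = {"Number of exercises held", "% of plans updated"}
    check_alignment(rpp_indicators, dpr_indicators)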

The review of the 2014-15 PMF indicated that only a selection of indicators was presented to the DMC for the review of mid-year results. We found no evidence of a formal selection process or a rationale for their selection. Based on the performance documentation analysis and the review of committee agendas and records of decisions, we found that there was no status update on the complete PMF. Furthermore, although performance information was occasionally presented at committees, PMF concerns and issues were not fully disclosed and discussed to ensure that corrective actions were taken.

At the branch management level, interviewees stated that regular meetings had been established; however, formal documentation was limited. Many interviewees indicated that there was little demand from senior management for performance measurement data or results. Most performance measurement discussions focused on establishing indicators and identifying data sources.

PMS Data Collection, Reporting and Decision-Making

Although there was a good understanding of the PMS development processes, and 64% of PMSs have been developed and approved by the responsible ADM, the PMSs are still at an early stage of implementation.

Efforts were made to collect performance measurement information where possible; however, we noted that data was not collected on a regular basis, but rather on an ad hoc or as-needed basis. The review of a sample of PMSs indicated that four of 10 programs, namely the Crime Prevention Program, First Nations Policing Program, Government Operations Centre and Critical Infrastructure, were collecting and using performance measurement information to support aspects of program management.

The review of PMSs and interviews indicated that, in some cases, data collection and use were limited by the need to rely on third-party information. This reliance requires ongoing and active engagement of stakeholders, which has made timely data gathering challenging. Furthermore, we noted that the reliance on third parties was not clearly defined, and program areas were at the early stages of determining processes for third-party data collection and storage. Finally, the PMS review sample indicated limited methodology for data analysis and interpretation.

Considering the expected use of performance measurement information, we found that the Department is still at an early stage of data collection, which limits the information available for reporting and sound decision-making. The audit observations identified opportunities to promote the use of performance measurement information and to improve the identification of data sources, regular data collection, analysis, and reporting. Because the Department has collected limited performance data, there is a risk that insufficient information will be available for sound decision-making.

2.3 Recommendations

2.4 Management Response and Action Plan

Management accepts the recommendations of Internal Audit. The recommendations recognize that the accountability for performance measurement is shared by all Departmental Managers.

Actions Planned

Planned Completion Date

Recommendation #1:
The Assistant Deputy Minister of PACB should ensure that:

  • Indicators in the PMF, the RPP, and the DPR are aligned.
  • The Chief Audit and Evaluation Executive is supported to review and advise on the PMF in accordance with the TBS Directive on the Evaluation Function.

PACB Contribution:

  • PACB will continue to ensure alignment with the MRRS Policy by ensuring that indicators in the PMF, the RPP and the DPR are aligned.
  • PACB will provide the CAE with the opportunity to review and advise on the PMF prior to its final approval.

November 30, 2017

Recommendation #2:
Each Assistant Deputy Minister should:

  • Identify resources assigned to development and implementation of PMSs.
  • Ensure linkages between the PMF and PMSs.

PACB Contribution:

  • PACB will continue to support Branches in the development and implementation of PMSs. PACB commits to using key planning documents (such as the RPP and Business Plans) to identify programs in the Branch that require the development and implementation of PMSs.
  • PACB will continue to ensure that linkages are made between the PMF and PMSs through the review and use of PMS indicators to inform the MRRS Amendment Process. PACB Program Managers will review their PMSs following amendments made to the PMF during the annual MRRS Amendment Process, and make changes where applicable.

April 30, 2017

EMPB Contribution:
The Branch already has a number of resources assigned to the implementation of performance measurement. Further, EMPB expects managers to develop, collect and use performance measures as part of sound program administration.
To ensure managers are well supported, training on using performance measurement data for program planning and administration will be provided, and the Branch will increase its focus on using performance data for program management.

November 30, 2016

CMB Contribution:
a) The ADM, CMB will identify resources assigned to the development and implementation of PMSs.
b) The ADM, CMB will ensure linkages between the PMF and PMS are in place.

March 31, 2018

CSCCB Contribution:

  • CSCCB managers will designate appropriate resources to develop and implement PMSs, as required.
  • CSCCB managers will ensure that PMSs under their purview include linkages to relevant PMF elements, either as new PMSs are developed or as existing ones are reviewed/updated as required (e.g., following changes in program, function or policy direction, including changes to expected results, outcomes or objectives), in collaboration with PACB, which is responsible for MRRS/PMF review and coordination across the Department.

March 31, 2017.

NCSB Contribution:

  • PMSs will be assigned and reassigned annually as part of the Performance Management Process. Important planning documents such as the Business Plan and RPP will be used to pinpoint areas in the Branch that could benefit from the development and implementation of PMSs.
  • NCSB will continue to ensure linkages are made between the PMF and PMSs through the review of PMS indicators to inform the Management, Resources and Results Structure (MRRS) Amendment Process.

March 31, 2017

Recommendation #3:
Each Assistant Deputy Minister should:

  • Ensure collection of accurate data against the PMF and PMS indicators to support timely decision-making and reporting.
  • Develop agreements for third-party data collection, where applicable.

PACB Contribution:

  • PACB will ensure the collection of accurate data against their PMF and PMS indicators (as applicable) and will continue to support Branches in their collection of data through the development of comprehensive MRRS and PMS templates, as well as through exercises such as the Mid- and End-of-Year Reviews.
  • Upon the review or development of a PMS, PACB Program Managers will assess the need for the development of an agreement for third-party data collection, and if applicable, develop an agreement.

August 31, 2017

EMPB Contribution:
EMPB management will focus attention on the use of performance data. The mid-year review cycle will be leveraged to encourage the tracking of performance data, and the collection and use of performance measures will be included as part of the employee Performance Management process.
Executive Leadership meetings as well as middle manager meetings with EMPB executive cadre will be leveraged to focus attention on collection of performance data.

March 31, 2017

CMB Contribution:

  • The ADM, CMB will ensure that data against the PMF and PMS indicators is collected and available to support timely decision-making and reporting.
  • Upon the review or development of a PMS, Directors General of directorates within CMB will assess the need for an agreement for third-party data collection and, if applicable, develop one.

March 31, 2018.

CSCCB Contribution:

  • Data collection is incorporated as a responsibility linked to the branch business planning process, with data collected according to the frequencies stated in the respective PMSs.
  • Upon the review or development of a PMS, program managers will assess the need for an agreement for third-party data collection and, if applicable, develop one.

March 31, 2017.

NCSB Contribution:

  • NCSB is currently developing its 2016-17 business plan and will ensure that it includes a clearly defined data collection schedule for the PMF indicators and identifies the lead manager responsible for data collection.
  • Directorate-wide validation of the required third-party agreements is ongoing. The agreements and protocols will be developed, if required.

April 30, 2017

Annex A: Internal Audit and Evaluation Directorate Opinion Scale

The following is the Internal Audit and Evaluation Directorate audit opinion scale, by which the significance of the audit's collective findings and conclusions is assessed.


Well Controlled

  • well managed, no material weaknesses noted; and
  • effective

Minor Improvement

  • well managed, but minor improvements are needed; and
  • effective

Improvements Required

Improvements are required (at least one of the following two criteria is met):

  • control weaknesses, but exposure is limited because likelihood of the risk occurring is not high;
  • control weaknesses, but exposure is limited because impact of the risk is not high;

Significant Improvements Required

Significant improvements are required (at least one of the following criteria is met):

  • Financial adjustments material to line item or area or to the department;
  • Control deficiencies represent serious exposure;
  • Major deficiencies in overall control structure;

Annex B: Status of PMS Development by Branch

Branch | Developed (ADM Approved) | Developed (Not ADM Approved) | Outstanding | Total
National and Cyber Security | 3 | 2 | 0 | 5
Portfolio Affairs and Communications | 2 | 1 | 0 | 3
Community Safety and Countering Crime | 9 | 4 | 2 | 15
Emergency Management and Programs | 4 | 1 | 0 | 5
Departmental Total | 18 | 8 | 2 | 28

Annex C: Audit Criteria


Criterion 1:

The Department had formally defined and communicated adequate policy and guidance to ensure the effective implementation of performance measurement.

Criterion 2: 

Individual programs and activities had developed Performance Measurement Strategies (including engagement of appropriate stakeholders), identified the data source(s), and implemented data collection and reporting processes.

Criterion 3:

The PMF process engaged the appropriate stakeholders to ensure well-defined and measurable performance indicators that were aligned to departmental objectives.

Criterion 4:

The PMF was informed by and aligned to lower level PMS processes. PMS processes and activities were further aligned to branch business plans and work plan activities to ensure the appropriate focus on the achievement of results.

Criterion 5:

Performance measurement information was appropriately integrated into decision-making.

Criterion 6:

The Department was able to report to Canadians, Parliamentarians, and Central Agencies on its achievement of results in a manner that was consistent with its objectives and program outcomes.

Criterion 7:

The Department had performance measurement accountability controls.

Annex D: Preliminary Audit Risks

The following is a summary of the key risks to which PS is exposed in relation to Performance Measurement.

Key Area | Risk Statement
Culture | Risk that Performance Measurement methodologies and practices are not instilled in business activities.
Engagement | Risk that key personnel are not appropriately engaged in all aspects of Performance Measurement.
Integration | Risk that key performance measurement documents (e.g., PMF, PMS, business plans) may not be aligned, and that outcomes and indicators may not be appropriate.
Performance Data | Risk that program and policy areas' data monitoring activities may not be sufficient to challenge or validate the external information required to ensure the integrity of the performance indicators.
Use | Risk that performance information is not utilized by key personnel to guide decision-making, e.g., resource allocations or re-alignment of project objectives.
Reporting | Risk that the Department is unable to provide a comprehensive story in relation to its activities to Canadians, Parliamentarians, and Central Agencies.

Annex E: Summary of the 2013 Annual State of Performance Measurement Report

The main purpose of the 2013 Annual State of Performance Measurement report was to assess the existence and completeness of PMSs for the identified program and policy activities. It evaluated the completeness of the PMSs as described by TB guidance, which identified key components including Logic Models, indicators, and data collection. The report set a departmental baseline for future comparison and summarized performance measurement gaps.

The report did not attempt to conclude whether the programs that had PMSs were implementing them and integrating them into their program management cycles. That analysis was the purpose of this audit.

The main findings of the Annual Report were:

Footnotes

  1. TB Results-Based Management Lexicon
  2. Audit opinion assessment scale can be found in Annex A.
  3. TB Results-Based Management Lexicon
  4. TB Policy on Management, Resources and Results Structures
  5. TB Policy on Evaluation
  6. TB Policy on Management, Resources and Results Structures
  7. TB Policy on Transfer Payments
  8. TB Policy on Evaluation
  9. PS Policy on Evaluation
  10. TBS Directive on the Evaluation Function
  11. TB Directive on the Evaluation Function
  12. Audit opinion assessment scale can be found in Annex A.
  13. TB Policy on Evaluation
  14. SPD Business Plan
  15. TB Policy on Evaluation
  16. PS DEC Charter
  17. TB Supporting Effective Evaluations: A Guide to Developing Performance Measurement Strategies
