Evaluation of the Encouraging Better Practice in Aged Care (EBPAC) Initiative Final Report




2.2 Evaluation approach


While the overall EBPAC initiative sought to have an impact at each of the three levels identified in the framework, this was not necessarily the case for individual projects: a given project may have aimed to have an impact at one, two or all three of these levels, so not every cell within the CHSD Evaluation Framework was relevant in every instance. However, the discipline of reviewing each cell of the matrix ensured that we explored all potential project and program impacts and outcomes.

For several projects, the evaluation of impacts and outcomes focused primarily on the provider and system levels. In our experience, it is difficult to measure the impacts of time-limited initiatives on individuals with a progressive illness or disease trajectory, and to attribute those impacts to a particular intervention, given the multi-factorial nature of ageing clients' circumstances.

In developing our evaluation framework, we identified a set of data collection requirements that were implemented to support the national evaluation. We aimed to balance minimising the data collection burden placed on individual projects with ensuring that sufficient information was available to produce a robust national evaluation. In some cases, we asked projects (or participating aged care services) to complete specific tools at key points during the evaluation. In other cases, we collected information directly through site visits, online questionnaires, surveys or stakeholder interviews.

Our Evaluation Framework document, developed in February 2013, comprised a suite of seven evaluation tools. Some were sourced (or adapted) from the published literature, whilst others were developed specifically for this evaluation. The seven tools are described below.


Tool 1 Project six-monthly report


The progress report was designed to collect data in a systematic way, both to meet the requirements for reporting to the Department and to inform the program evaluation. Projects completed it at six-monthly intervals; each project completed four progress reports over the project period.

Tool 2 Project expenditure breakdown report


The expenditure breakdown tool was conceived to identify the key cost implications of the EBPAC initiative. However, the Evaluation Team decided in October 2014 not to pursue its development: similar tools we have used previously, and others we have reviewed, have proved either too complex and detailed, or too simplistic, to support program-wide conclusions given the variety and nature of the projects funded. Instead, we drew on the financial information provided in each project's four progress reports, together with the information provided in the final audited statement of receipts and expenditure.

Tool 3 Project dissemination log


The dissemination log was designed to record details of any public dissemination of project outputs. This information assisted us in answering a range of evaluation questions across several domains of inquiry.

Tool 4 Training materials evaluation questionnaire


Many EBPAC projects developed or refined training materials or resources for delivery across the aged care sector using a variety of delivery models. Some materials were targeted at staff of aged care organisations, whilst others were delivered directly to aged care consumers. The purpose of this questionnaire was to support our evaluation of the materials developed across EBPAC projects.

Tool 5 Project workshop log


The purpose of this tool was to assist in answering evaluation questions, particularly those relating to the reach of workshops. It was used only to record workshop activity conducted by the two national roll-out projects.

Tool 6 Project workshop notification/recruitment tool


Again, this tool was intended only for use by the two national roll-out projects. It captured information on the strategies employed in planning, convening and reviewing workshops, and assisted in assessing the different strategies employed by these two projects.

Tool 7 Stakeholder interviews


The evaluation team conducted a series of stakeholder interviews to collect information on the views and experiences of EBPAC stakeholders. A semi-structured interview format was adopted to allow issues to be explored in more depth than is possible through questionnaires.

Tools 3, 4, 5 and 6 were supported by Word or Excel templates that were emailed to the projects.


Other evaluation activities


In addition to information collected through these formal evaluation tools, we also undertook a range of related activities to support the EBPAC projects and our evaluation method. A brief outline of these is provided below.

National workshops


The evaluation team facilitated two national workshops during the evaluation, in CHSD's role as the EBPAC national evaluator. The first workshop was held in Canberra on 11 October 2012; the second was held at the Park Royal Hotel, Melbourne Airport, on 22 July 2014.

The first (orientation) workshop was held prior to the commencement of data collection activities in order to introduce the evaluation team to the relevant project contacts and generate their support for and involvement in the evaluation. The workshop also sought feedback about the proposed EBPAC evaluation approach, e.g. data collection tools, local evaluation activities, communication strategies, ethics and EBPAC project progress reports.

The second workshop provided an opportunity for the evaluation team to discuss emerging evaluation issues with projects and to offer any support that may be required regarding evaluation, report writing or dissemination of results.

Feedback from participants indicated that both workshops were extremely well received. They provided an excellent opportunity for projects to network and share ideas and strategies. Detailed workshop evaluation reports are available in Evaluation Progress Reports 1 and 2.


Site visits


Our evaluation plan allowed for two site visits to each project over the course of the evaluation. At the commencement of the evaluation, primary and secondary evaluation team members were allocated to each EBPAC project, and both team members conducted the site visits.

The initial round of evaluation site visits occurred between November 2012 and February 2013. The visits represented a critical source of information that fed into the development of the evaluation framework. They also provided an important opportunity for the evaluation team to familiarise itself with the details of each project and meet the key people involved.

The second site visit provided an opportunity to collect data from members of the project team relating to project implementation and project governance. Key evaluation questions were explored relating to project delivery, stakeholder engagement, costs and funding, evaluation, sustainability, generalisability, capacity building and enablers and barriers to project implementation.

A small number of additional site visits were also conducted to support projects that were experiencing difficulties.


Ethics approval


Ethics approval for the program evaluation was granted by the University of Wollongong / Illawarra Area Health Service Human Research Ethics Committee on 22 March 2013 (HE13/107).

Each EBPAC project was responsible for determining its own ethics approval requirements and for submitting the required applications to the relevant ethics committee. In most cases, projects determined at an early stage that ethics approval would be required and submitted their applications without involvement from the evaluation team. For a small number of projects, the national evaluation team provided support and advice on the need for, and process of obtaining, the relevant ethics approval. Three projects did not obtain ethics approval.


