In 2013, members of the evaluation team undertook a targeted literature review to identify the factors considered important in enabling those working in the community care sector to access and implement evidence-based practice.
Literature published since 2000 was considered, and several databases were searched, including PubMed, the Cochrane Database, Medline and CINAHL. In addition, a snowball approach was used: references in key articles were reviewed, and websites known to contain information on the subject were searched, e.g. those of the Canadian Health Services Research Foundation, the Department of Health and Ageing and the US Centers for Disease Control and Prevention.
The review highlighted the need for community services to work in partnership and collaboration, for the alignment of philosophical ideas and policies, for organisational design that addresses administrative and clinical factors, and for coordination and boundary-spanning linkage mechanisms. The review is available in Appendix 1.
2 EVALUATION METHODOLOGY
Our evaluation strategy was designed to allow the evaluation team to form a judgement as to how successfully the EBPAC initiatives have been implemented, whether the desired results have been achieved, and what lessons have been learnt that will lay the groundwork for the sustained use of evidence-based practice in residential and community aged care.
The evaluation assesses the outcomes of the projects funded under the EBPAC initiatives and identifies critical success factors to inform future national rollout or wider promulgation of evidence-based materials and resources from the successful projects. Key information for this was obtained from a set of evaluation tools described in section 2.2.
Our evaluation of the EBPAC projects drew extensively on the aggregated findings from the project evaluations provided in their final reports, supplemented by data gathered through site visits and stakeholder interviews, as summarised in Table . The project final reports and the records of the site visits and stakeholder interviews were loaded into NVivo software to facilitate data analysis.
The program evaluation drew extensively on the aggregate findings of the project evaluations, constituting a ‘meta-evaluation’ of project achievements, constraints and successes. Given the diversity of the projects, there were no common clinical outcomes; improvements in clinical care were therefore identified only through the project-level evaluations.
This was supplemented by the knowledge gained from our experience in conducting the national evaluation of Rounds 1 and 2 of EBPAC, and supported by a targeted literature review focusing on the evidence relating to implementing evidence-based practice within a community care context (see Appendix 1).
2.1 The CHSD Evaluation Framework
The foundation of our evaluation is a framework, referred to as ‘the CHSD evaluation framework’. It is represented by a matrix with three levels of analysis on the vertical axis, ensuring that we explore the impacts on, and outcomes for, consumers (including carers, their families and friends), providers and the broader aged care sector (refer to Figure ).
Across the horizontal axis of the matrix are six key issues that a comprehensive evaluation should address: program delivery, program impact, sustainability, capacity building, generalisability and dissemination. By systematically exploring each of these six issues, where possible at each level of the framework, we address both the formative and summative requirements of this evaluation.
Figure : CHSD Evaluation Framework

| EVALUATION HIERARCHY | What did you do? PROGRAM / PROJECT DELIVERY | How did it go? PROGRAM / PROJECT IMPACT | Can you keep going? PROGRAM / PROJECT SUSTAINABILITY | What has been learnt? PROGRAM / PROJECT CAPACITY BUILDING | Are your lessons useful for someone else? PROGRAM / PROJECT GENERALISABILITY | Who did you tell? DISSEMINATION |
|---|---|---|---|---|---|---|
| Level 1: Impact on, and outcomes for, consumers (clients, families, carers, friends, communities) | Describe what was implemented and, if necessary, contrast to what was planned | Impact on consumers and carers | Sustainability assessment | Capacity building assessment | Generalisability assessment | Dissemination log |
| Level 2: Impact on, and outcomes for, providers (staff, formal carers, professionals, volunteers, organisations) | Describe what was implemented and, if necessary, contrast to what was planned | Impact on professionals, volunteers, organisations | Sustainability assessment | Capacity building assessment | Generalisability assessment | Dissemination log |
| Level 3: Impact on, and outcomes for, the system (structures, processes, networks, relationships) | Describe what was implemented and, if necessary, contrast to what was planned | System level impacts, including external relationships | Sustainability assessment | Capacity building assessment | Generalisability assessment | Dissemination log |

Note: outcomes, indicators and measures are to be developed for each cell as relevant.