DHIS2 Quick Start
LogicalOutcomes
About DHIS2
About LogicalOutcomes
The problems with monitoring and evaluation

Not all evaluation is …
Monitoring and evaluation systems often fail: they tend to go wildly over budget, run over schedule, or don't deliver what they promised, or all three. Even when they are implemented correctly, there is little evidence that they improve program effectiveness.* Yet funders expect nonprofits - even small ones - to evaluate their programs as though it's a simple task. Why are M&E systems so difficult to implement? And how can we make them less expensive and more useful?
In our experience, drawn from 25 years of working with funders and agencies:
Agencies create logic models that are uninformed by research because they don’t
have the resources to review the research literature.
Funders require agencies to design evaluation plans but don’t have the expertise to
assess them for feasibility or usefulness.
Evaluators tend to select indicators that are technically weak, and in any case,
agencies don’t have the capacity to collect the data.
Even when agencies collect service data, they do not have the capacity to test its
quality, aggregate it and report it to users in a way that supports decision-making.
*Powers, L.C. (2009). A framework for evaluating the effectiveness of performance measurement systems. RealWorld Systems Research Series 2009:1.
Available at SSRN: http://ssrn.com/abstract=1371158
**E.g., Ethical breakdowns (2011) Bazerman and Tenbrunsel, Harvard Business Review, https://hbr.org/2011/04/ethical-breakdowns
Difficulties in implementing M&E systems

LogicalOutcomes interviewed over 40 staff and consultants at nonprofits in Canada, the U.S., Europe, Asia and Africa, and several software vendors*. We also reviewed the research literature on M&E software implementations for nonprofits using Google Scholar, and reviewed web forums and news groups devoted to monitoring and evaluation. There was broad consensus that large scale M&E implementations are very difficult to manage, regardless of the software.

Most monitoring and evaluation (M&E) implementations go over budget, over schedule, or don’t deliver what they promised
M&E systems are difficult to implement because of the lack of standards around outcomes and indicators. Every funder and nonprofit uses different definitions, and each must define its indicators from scratch.

Organization-wide data aggregation requires sophisticated meta-data management & data models
It is relatively simple to collect data for a single project and a single funder. As soon as a nonprofit needs to report to multiple funders or combine data across different programs, it is a completely different challenge. And most nonprofits do not have the expertise to manage the added complexity.

Nonprofits move too quickly to software implementation and get paralyzed
Nonprofits tend to select a software program with the mistaken idea that it will solve their evaluation needs. In fact, most of the work involves the definition of indicators, reports, user permissions and other elements that don’t depend on any particular software. Software developers don’t have the expertise to define indicators, and the project gets stuck.

Neither nonprofits nor vendors are satisfied with the implementation of M&E systems
Because nonprofits have an unrealistic concept of the complexity of M&E systems, implementers feel frustrated and unappreciated. Project managers report that their own managers don’t appreciate their efforts, vendors report that nonprofits expect unreasonable deliverables for the budget, and nonprofits report that they sink vast amounts of money with unsatisfactory returns.

* Much of this research was done for SNV - Kerr, G. 2015. PME software and functions after 2015. Unpublished report, SNV.
Summary of comments from informants for monitoring and evaluation

Difficulty of M&E implementation
Organization-wide M&E systems are comparable to Enterprise Resource Planning (ERP) implementations. In some ways they are more difficult …

Selecting software
M&E requirements are so complex that no single software program can meet all of them. Every software program will … them, including platforms like SalesForce, SharePoint and Microsoft CRM, as well as specialized programs like DevResults and ActivityInfo.

Data collection
Reporting
Cost

See the summary of the requirements in the next three pages. Only one software program satisfied all of them: District Health Information Software (DHIS).
Software requirements.1

Monitoring software is complex, so we assume three levels of expertise at the agency:
Power-users are agency staff who are familiar with the software. They don’t need to be software programmers.
Project managers are agency staff who are given 3 to 4 hours of training, mostly to create reports.
Basic users just enter data or view dashboards.

Capture theories of change and indicators for each program
Can power-users create logic models and evaluation frameworks during …?
Can basic users easily enter and process data on a mobile device (smartphone) or web form?
Can basic users collect information about individual service users and/or events, or qualitative information, or rating scales?
Can power-users design data entry forms with indicators disaggregated by different categories (e.g., age, location, program, etc.) based on funder requirements?

Report information
Can power-users build automated monthly reports that meet agency needs?
Can project managers quickly design customized reports for individual funders to meet their changing reporting requirements?
Can project managers generate and tailor attractive reports, defining various combinations of indicators and time frames, aggregating on many variables, and exporting in PDF or spreadsheet formats?
Can project managers easily get information out of the system in flexible formats once it is put into the system, aggregating by program, client type and/or sector?
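Flexible aggregation of this kind is typically driven by an analytics query. As a rough sketch only (the base URL and UIDs below are placeholders; DHIS2 exposes an /api/analytics endpoint of broadly this shape, though the exact parameters an agency needs will differ), a report query that aggregates indicators by period and organisation unit might be composed like this:

```python
# Sketch: composing a DHIS2-style analytics query URL.
# The server address and all UIDs are placeholders, not real identifiers.
from urllib.parse import urlencode

def analytics_url(base, indicators, periods, org_units, fmt="json"):
    """Build an /api/analytics request that aggregates the given
    indicators (dx) by period (pe) and organisation unit (ou)."""
    params = [
        ("dimension", "dx:" + ";".join(indicators)),
        ("dimension", "pe:" + ";".join(periods)),
        ("dimension", "ou:" + ";".join(org_units)),
    ]
    return "{}/api/analytics.{}?{}".format(base, fmt, urlencode(params))

url = analytics_url(
    "https://example.org/dhis",        # placeholder server
    indicators=["Uvn6LCg7dVU"],        # placeholder indicator UID
    periods=["LAST_12_MONTHS"],
    org_units=["ImspTQPwCqd"],         # placeholder org unit UID
)
print(url)
```

Changing the period or organisation unit dimensions is all it takes to re-slice the same indicators for a different funder's report, which is the kind of flexibility the questions above are probing for.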
Software requirements.2

Implement and roll-out
Does the software system provide good updated documentation and training materials (e.g. video tutorials)?
Can the software run on popular web browsers on all major operating systems?
Can basic users view, enter or download data even when an internet connection is not available?

Manage and protect data
Does the software protect data integrity from corruption, e.g., when internet connectivity is disrupted?
Does the software employ security protocols when transferring data and when data is at rest? Does it follow good practices for protecting confidential information?
Software requirements.3

Build community capacity and knowledge
Can the software measure key elements (e.g., core values, success factors) that are important to the agency and its communities?
Can agencies use and adapt the software freely without limitation? Does the software use open standards for importing, exporting and communicating data to support the work of partners?

What is the annual cost per basic user and per project manager, including the expected level of technical support and hosting?
How long will it take to train for each role (basic user, project manager, power-user)?
How long does it take to create new templates, indicators and elaborate data entry forms?
12 key software requirements

In summary, nonprofits seem to want software that is infinitely flexible, inexpensive to configure and implement, and extremely easy to use. This is not an unusual set of requests for enterprise software, but it is difficult to achieve. It requires a complex, flexible software platform that supports a variety of user roles and the capacity to develop and share templates. That, in turn, led to a strong preference for open source software that would not be locked down by a vendor.

Ability to create complex indicators
Ability to store, import, export data
Ability to create on-demand attractive and flexible reports
Specifically designed for M&E; does not require extensive customization
Open source and ability to share templates
Large community of developers (to prevent vendor lock-in)
Frequent revisions of the software (to prevent obsolescence)
Posted development roadmap
Software comparison

We identified about 35 software programs through searches on the web, discussion forums and recommendations from nonprofits, and narrowed them down to 24 after an initial review. Where possible we requested information from their respective vendors; not all of our comparison questions were answered, so there are many gaps in the table.

Only 2 programs satisfied all criteria, and only one of them (DHIS2) was well-tested and mature.

This table is a simplified summary of our analysis. We used a set of clearly defined tasks to test each software platform. In some cases a platform passed some elements of the test but not others. We rejected software once we established that it failed at least one of the 12 key requirements.
It’s not entirely about the software

M&E software can be divided into four categories. All of them have been used successfully in some organizations, and have failed in others (as defined by being over budget, over schedule, or not providing the expected functionality).

Typically there is a trade-off between flexibility and ease of use. Software that is quick and easy to configure has less capability in terms of monitoring and evaluation functions.

Even the most expensive software requires a large staff investment from organizations to define outcomes, indicators and data models. As one informant stated, “90% of our work would have been exactly the same if we had chosen another software program”.
About DHIS2
What DHIS2 does
DHIS2 is an open source program that has been in development for over 20 years.
It emerged in post-apartheid South Africa in 1994 as a collaboration between local
public health activists and Scandinavian action researchers. Its mission: To build the
capacity of local communities while contributing to an effective national health
system. It is now used in many applications beyond health.
DHIS2 releases new versions every three months. It is supported by the University of
Oslo, plus an international network of experts and consultants. It is funded by
NORAD, PEPFAR, the University of Oslo, the Global Fund, CDC, the Gates Foundation and others, and is accompanied by detailed documentation, video tutorials and training materials. A free online Academy is being launched in early 2016.
Resilient
DHIS2 is designed to handle intermittent internet connections and low cost data
collection. Agencies can collect data offline with free phone apps or light-weight
feature-phone browsers and upload it when the internet is up. They can download
their own data and work with it, syncing when they wish.
Flexible
DHIS2 is designed to aggregate data that is gathered in multiple formats and
locations. It can import and export data through csv files or a web API. It also
provides built-in data collection apps for individual client tracking.
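To make the import/export path concrete, here is a minimal sketch of assembling an aggregate data payload for DHIS2's Web API. The dataset, organisation unit and data element UIDs are invented placeholders, and the upload call is shown only as a comment; the real endpoint (/api/dataValueSets) accepts payloads of roughly this shape:

```python
# Sketch: building a dataValueSet payload for upload to a DHIS2 instance.
# All UIDs and the server details below are placeholders, not real ones.
import json

def build_data_value_set(data_set, period, org_unit, values):
    """Assemble an aggregate payload in the shape DHIS2's
    /api/dataValueSets endpoint expects."""
    return {
        "dataSet": data_set,
        "period": period,            # e.g. "201601" for January 2016
        "orgUnit": org_unit,
        "dataValues": [
            {"dataElement": de, "value": str(v)} for de, v in values.items()
        ],
    }

payload = build_data_value_set(
    data_set="pBOMPrpg1QX",          # placeholder dataset UID
    period="201601",
    org_unit="DiszpKrYNg8",          # placeholder org unit UID
    values={"FTRrcoaog83": 12, "eY5ehpbEsB7": 30},
)

# An agency would then POST this with any HTTP client, e.g.:
#   requests.post(base_url + "/api/dataValueSets",
#                 json=payload, auth=(user, password))
print(json.dumps(payload, indent=2))
```

Because the payload is plain JSON keyed by UIDs, data collected offline or in another system can be reshaped into this format and synced whenever a connection is available.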
Decentralized
DHIS2 is designed to be independent of any one organization. Expert nodes have
been set up in India, Vietnam, Malawi, Namibia, South Africa and several other
countries to ensure that local expertise can develop. The University of Oslo has
supported dozens of graduate students from developing nations to carry out research on health systems using DHIS2.
Implemented in 47 countries

National standard for Health Management Information Systems in 20 countries. The uses go well beyond health, though:

(Map legend: Partners, Pilot/early phase, Scaling up, Nation-wide rollout)

Implemented by service providers and global initiatives
World Health Organization: program dashboards, Malaria, DQA tool
Population Services International (PSI)
Africare
dozens of NGOs

From http://www.slideshare.net/dhis2/global-citizen2
Committed to training and support

In January 2016, University of Oslo HISP is launching an online DHIS2 Academy. It will use Open edX, a MOOC platform developed by MIT.
From https://launchpad.net/dhis2/+milestone/2.21
A simple Client Management System

The University of Oslo is investing heavily in the ability to track the progress of individual clients. The ‘tracker’ can be used to create a basic client management system.
Elements of effective monitoring and measurement systems:
Defining effective programs that are informed by evidence and meet local
needs and priorities.
Selecting valid and useful indicators that will actually change the behaviour of
managers and funders.
Defining indicators and measures clearly enough that they can be shared and
aggregated across jurisdictions, using standard formats.
Collecting data securely using tools that do not incur an unreasonable cost
burden on front line workers and agencies.
Combining, cleaning and aggregating data from many sources to meet the
needs of multiple users.
Phases: Define Requirements; Define Indicators and Reports; Set up DHIS2

1. Launch project
Hold kick-off with team, client working group.
Finalize project charter and workplan.
Deliverable: Project Charter

2. Define DHIS2 requirements
Identify decision-makers and user groups.
Interview users and staff regarding M&E needs.
Assess business processes and existing IT system.
Define organizational units, user roles, datasets, reports.
Deliverable: DHIS2 Requirements Worksheet

3. Define program indicators
Review and clarify existing program indicators.
Define data elements, disaggregations, option sets.
Validate evaluation framework with users.
Deliverable: List of Program Indicators and Data Elements

4. Design reports
Design reports using sample data.
Define format of import/export tables and APIs.
Validate report designs with users.
Deliverable: DHIS2 Report Worksheet

5. Set up DHIS2 system
Configure DHIS2 instance from worksheets.
Test data collection and reporting with internal users.
Pilot test DHIS2 system with selected users.
Collect user feedback & incorporate changes.
Deliverable: Beta version of DHIS2 system

6. Transition to maintenance phase
Deliver training to client staff.
Transfer project to ongoing hosting and maintenance plan.
Deliverable: Working DHIS2 system
Schedule
This schedule would be customized for each project.
Activities by week (weeks 1 to 28):
1. Launch project
Hold kick-off with team, client working group
Finalize project charter and workplan
Deliverable: Project Charter
2. Define DHIS2 requirements
Identify decision-makers and user groups
Interview users and staff regarding M&E needs
Assess business processes and existing IT system
Define organizational units, user roles, datasets, reports
Deliverable: DHIS2 Requirements Worksheet
3. Define program indicators
Review and clarify existing program indicators
Identify relevant indicators from external sources if applicable
Define data elements, disaggregations, option sets
Validate evaluation framework with users
Deliverable: List of Program Indicators and Data Elements
4. Design reports
Design reports using sample data
Define format of import/export tables and APIs
Validate report designs with users
Deliverable: DHIS2 Report Worksheet
5. Set up DHIS2 system
Configure DHIS2 instance from worksheets
Test data collection and reporting with internal users
Pilot test DHIS2 system with selected users
Collect user feedback & incorporate changes
Deliverable: Beta version of DHIS2 system
6. Transition to maintenance phase
Deliver training to client staff
Transfer project to ongoing hosting and maintenance plan
Deliverable: Working DHIS2 system
7. Close project
Finalize project tasks & debrief
Archive or delete files
Document lessons learned
M&E requires a system, not just a software program

Indicators: indicators are defined using international metadata standards.
Report templates and dashboards: standard data visualizations, reports and dashboards can be selected from templates.
Data warehouse: service data are stored in encrypted databases on secure servers with data integrity safeguards. Data can be imported from other systems, combined, and exported in various formats.
Standard disaggregations
Client trackers, to track the registration and progress of individual clients (DHIS2 can provide a simple Client Management System)
Existing data that must be imported and combined with new data
Number of users and user roles
Graphic and UX design of data collection forms
Validation rules and skip logic for data import and collection
A web portal that dynamically reports on selected indicators for a public audience
Enhanced reports that combine multiple data sources and indicators to communicate trends in service delivery
Survey bank of questions to assess staff, partner and participant engagement
Facility checklists to track program fidelity and quality
Registry of all service locations, including services and catchment areas. Facility registries can support flexible reporting by service type, location and so on
Geographic mapping of indicators and services
Program fidelity checklists and rules engines to track the quality of program delivery based on defined milestones and attributes
Training and development instances of DHIS2
Online training material customized to your agency
Ongoing coaching through the implementation phase
Use spreadsheets to define system requirements
A ‘Quick Start’ approach uses a configuration spreadsheet with
individual worksheets including:
Indicators
Data elements
Categories
Option sets
Indicator groups
Data element groups
Indicator group sets
Organizational units
Datasets
Report types
Organizational roles
Users
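As an illustration of how such worksheets can drive configuration (the column names and the "Data elements" worksheet below are invented for the example; DHIS2's metadata importer accepts JSON of broadly this shape, but a real configuration would carry more fields), a worksheet exported as CSV can be converted into a metadata payload:

```python
# Sketch: turning a configuration worksheet (exported as CSV) into a
# DHIS2-style metadata payload. Column names here are assumptions.
import csv, io, json

def worksheet_to_metadata(csv_text):
    """Read rows with name/shortName/valueType columns and return a
    payload shaped for a metadata import."""
    rows = csv.DictReader(io.StringIO(csv_text))
    elements = [
        {
            "name": r["name"],
            "shortName": r["shortName"],
            "valueType": r["valueType"],
            "domainType": "AGGREGATE",
            "aggregationType": "SUM",
        }
        for r in rows
    ]
    return {"dataElements": elements}

sample = """name,shortName,valueType
Clients served,Clients,INTEGER
Workshops delivered,Workshops,INTEGER
"""

payload = worksheet_to_metadata(sample)
print(json.dumps(payload, indent=2))
```

Keeping the requirements in worksheets means non-programmers can review and correct the configuration before it is loaded into the system.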
Define feasible objectives

Invite (don’t force) teams to participate in pilots of monitoring and evaluation tools, and select projects that can tolerate ambiguity and the frustrations that are part of early adoption. Pilots should be championed by critical and knowledgeable project managers.

Focus on user needs. For example, who is actually using the information? When do they need it, and how do they want to report it? Include corporate users (like business development) as well as the project managers.

What are the minimum reports necessary to achieve user objectives? You don’t need to solve everything at once. Aim for quick wins and build excitement across the organization by delivering products that work.
*https://scholar.google.ca/scholar?hl=en&as_sdt=0,5&q=DHIS2 and
http://www.mn.uio.no/ifi/english/research/networks/hisp/Research%20Library/Recent%20Publications and
http://www.mn.uio.no/ifi/english/research/networks/hisp/Research%20Library/phd-thesis-list.html
Phase 2. Define indicators and design reports
Defining outcomes
International experience has shown that shared measurement systems
should get to the level of indicators as quickly as possible. They should be …
Millennium Development Goals http://www.undp.org/
Canadian Index of Wellbeing https://uwaterloo.ca/canadian-index-wellbeing
Characteristics of a good indicator: examples
From Harmonized Reproductive Health Registries (hRHR) Working Group at www.fhi.no/dokumenter/1d23cd1b4e.pdf
Characteristics of a good indicator: summary*

1. Action Focused
“It is clear what needs to be done to improve outcomes associated with this indicator (e.g., immunise to reduce neonatal tetanus)”

2. Important
“The indicator and the data generated will make a relevant and significant contribution to determining how to effectively respond to the problem”

3. Operational
“The indicator is quantifiable; definitions are precise and reference standards are developed and tested or it is feasible to do so”

4. Feasible
“It is feasible to collect data required for indicator in the relevant setting”

6. Open access
“The indicator is available at no cost and can be shared freely.”

*Adapted from the Harmonized Reproductive Health Registries (hRHR) Working Group at www.fhi.no/dokumenter/1d23cd1b4e.pdf
Example indicator – GAVI Alliance
From http://www.pepfar.gov/documents/organization/240108.pdf
Example indicator – social services
Draft Indicator Reference Sheet from working document, Prosper Canada and Canadian Bankers Association, 2015.
Indicators will be …

IATI (http://iatistandard.org/)
HXL (Humanitarian Exchange Language, http://hxlstandard.org/)
ADX (Aggregate Data Exchange, www.dhis2.org/doc/snapshot/en/developer/html/ch01s12.html)

The challenge
The indicator itself will have its own license and authorship (e.g., Statistics Canada uses the Open Government Licence – Canada).
Zenodo, as the publisher, will maintain accessibility of the indicator(s) even if the original dataset is taken down.

DOI - http://blog.apastyle.org/apastyle/2014/07/how-to-use-the-new-doi-format-in-apa-style.html
ORCID - https://orcid.org/organizations/funders and http://orcid.org/content/initiative
FUNDREF - http://www.crossref.org/fundref/index.html
ZENODO - https://zenodo.org/features
Open Government License – Canada - http://open.canada.ca/en/open-government-licence-canada?_ga=1.156660539.1898951134.1438269552
Generic logic model

We are using a standard 8-step logic model to provide consistency for
coding indicators into the evaluation system. At the top level of the data
dictionary are Indicator Group Sets divided into four outcome groups and
four output groups.
OUTCOMES
1. Impact – covering all timeframes from immediate to long term, and that
refer to the impact on the intended beneficiary groups. Examples:
employment, income, housing status, etc.
OUTPUTS
6. Reach – the extent to which the program reaches the targeted number
and type of participants or audience
This is a real chart with real data. From http://junkcharts.typepad.com/junk_charts/2010/08/
Designing reports that lead to action

In this example, vaccinations have fallen in the Bird District. The organization needs to find out why vaccinations have been declining, especially in late 2014, and how to increase the rate again.
From Trainingland – DHIS2 training site - to be launched in early 2016 on www.dhis2.org
GAVI Vaccine Alliance
From Gavi Vaccine Alliance at http://www.gavi.org/results/goal-level-indicators/. Gavi has begun to use DHIS2 to track and report on indicators.
Use PowerPoint to create mockups of desired reports

Use PowerPoint or Excel to create prototypes of desired reports using dummy data. Then consult with key stakeholders and decision-makers. Is this what they want?

We will have a few basic reports that are part of the template, including participant satisfaction and number of people served.
Phase 3. Set up a functioning system with DHIS2
DHIS2 reports information in various formats

Dashboards can be created for individual users and funders. They can be posted on the integrated web portal or shared privately.
Use DHIS2 to test mockups of dashboards
From www.dhis2.org
Defining logic model
Each group set (see below) is linked to multiple indicator groups.
Assigning indicators
Group Sets (e.g., REACH) are linked to Indicator Groups.
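The two-level linkage (group sets containing indicator groups, which in turn contain indicators) can be pictured as a nested dictionary. The group and indicator names below are invented for illustration; only the REACH/IMPACT structure comes from the logic model described earlier:

```python
# Sketch: the group set -> indicator group -> indicator hierarchy used to
# code the generic logic model. All names below are illustrative only.
logic_model = {
    "REACH": {                          # indicator group set (output)
        "Participants enrolled": ["Number of clients registered"],
        "Audience reached": ["Workshop attendance"],
    },
    "IMPACT": {                         # indicator group set (outcome)
        "Employment": ["Clients employed at 6 months"],
        "Housing": ["Clients stably housed"],
    },
}

def indicators_in(group_set):
    """Flatten all indicators that sit under one group set."""
    return [i for group in logic_model[group_set].values() for i in group]

print(indicators_in("REACH"))
```

Coding every indicator into exactly one group, and every group into exactly one group set, is what makes organization-wide roll-ups by outcome or output category possible later.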
DHIS2 use cases

Managing malaria in Kenya

To improve malaria reporting in Kenya, the Ministry of Health in 2010 approved the use of DHIS2 to report on malaria commodities at the sub-national level.*

With support from USAID, Kenya’s Malaria Control Unit transitioned its reporting system to DHIS2 in October 2012. Use of DHIS2 improved reporting rates from about 45 percent to 70 percent in the months after its implementation (see figure to the left).

The Health Information Systems unit of the Ministry of Health and staff from the HIV, TB, malaria, reproductive health and family programs participated in a conference on the impact of DHIS2, facilitated by USAID and Ministry staff.

From http://www.unicef.org/search/search.php?q_en=dhis&go.x=0&go.y=0
National implementations of DHIS

DHIS2 is being used or in the process of adoption by over 50 countries so far. Here is a somewhat out-of-date list:

Afghanistan
Algeria
Bangladesh
Benin
Bhutan
Burkina Faso
Burundi
Cameroon
Colombia
Congo Brazzaville
Cote d'Ivoire
DRC
Ghana
Guinea Bissau
India (Bihar, Orissa, Maharashtra, Kerala, Punjab, Haryana, H Pradesh)
Kenya
Laos
Liberia
Malawi
Mexico
Mozambique
Myanmar

The University of Oslo’s DHIS2 program trains …
The Global Fund is a heavy user of data standards, and promotes the use of DHIS2 to track
health status. In fact, it funds DHIS2 implementations as part of its ‘Health
Systems Strengthening’ initiative, and most of its national partners use DHIS2
to collect and report on health data. In November 2014, Global Fund reported
that:
“Strengthened country data systems are crucial to making robust plans and
measuring and evaluating impact. Data needed for results reporting and
impact assessments require country-based data systems and structures … Of
the high impact countries, 17 out of 23 are using DHIS2 as a reporting
platform, with funding from grants going to support rollout and training.”*
The Global Fund’s entire web site provides a model for good funding practices and
resources. They use indicators that have been defined within DHIS2, including
PEPFAR’s, and show examples of how to build in workplan deliverables and
milestones.
From http://www.theglobalfund.org/documents/fundingmodel/progressupdate/FundingModel_2014-12-Progress_Update_en/
Data linked to national data systems.
Dynamic and verified results instead of static results.
About
LogicalOutcomes
About us
LogicalOutcomes is a nonprofit organization based in Toronto, Canada.
Gillian Kerr, Ph.D., C.Psych. Martha McGuire, M.S.W., CE Neil Price, M.A.
The DHIS2 Team

LogicalOutcomes has an international network of analysts and contractors. We work with Canadian and international analysts, software developers, writers and evaluators. For DHIS2 implementations, we work with HISP India and HISP Uganda, two of the international hubs for DHIS2 development, as well as several independent experts.

The DHIS2 Network
The University of Oslo, the NonProfit Organizations Knowledge Initiative (NPOKI), Metrics for Management, Population Services International (PSI), The Global Fund and many other organizations are building a community of practice to create shared measurement systems for nonprofits across the world.