Lecture by Professor Omar Hasan Kasule Sr. for Year
3 medical students on March 4, 2013, Faculty of Medicine, King Fahad Medical City.
Summary
·
Analysis of adverse events demonstrates that
multiple factors are usually involved in their causation.
·
Therefore, a systems approach to considering the
situation—as distinct from a person approach—will have a greater chance of
setting in place strategies to decrease the likelihood of recurrence.
Learning objective:
overall
·
Understand how systems thinking can improve
health care and minimize patient adverse events.
Learning outcomes:
knowledge
·
Explain what is meant by the terms “system” and
“complex system” as they relate to health care;
·
Explain why a systems approach to patient safety
is superior to the traditional approach.
Learning outcomes:
performance requirements
·
Describe the term HRO and the elements of a safe
health-care delivery system.
Keywords:
·
System
·
Complex system
·
High reliability organization (HRO).
Why systems thinking
underpins
patient safety
·
Health care provision is rarely carried out by
single individuals.
·
Safe and effective patient care is dependent
on how the workers work together in the particular work environment/organization.
·
Patients’ safety depends on a system (many people
doing the right thing at the right time).
The concept of system
·
The word system is a broad term that is used to
describe any collection of two or more interacting parts, or “an interdependent
group of items forming a unified whole” like biological and organic systems.
·
Systems are in a continuous state of information exchange, both internally
and externally: an ongoing process of inputs, internal transformation,
output and feedback.
Complex system
·
A complex system is one in which there
are so many interacting parts that it is difficult, if not impossible, to
predict the behaviour of the system based on knowledge of its component parts.
·
The delivery of health care fits this definition
of a complex system, especially in a hospital setting.
·
Hospitals are made up of many interacting parts,
including humans (patients and staff), infrastructure, technology and
therapeutic agents; the ways these parts interact with one another and act
collectively are highly complex and variable.
·
An understanding of the health system requires thinking
beyond the individualized service.
Understanding
complexity
·
Knowledge about the complexity of health care will
enable health-care professionals to understand how the organizational structure
and the work processes can contribute to the overall quality of patient care.
·
Systems thinking helps us make sense of complex
organizations by enabling us to look at health care as a whole system with all
its complexity and interdependence. It removes the focus from the individual to
the organization. It forces us to move away from a blame culture towards a
systems approach.
·
Using a systems approach enables us to
examine the organizational factors that underpin dysfunctional health care and accidents/errors
(poor processes, poor designs, poor teamwork, financial restraints and
institutional factors) rather than focus on the people who are associated with
or blamed for the blunders or negligence;
·
Move away from blaming to understanding; improve
the transparency of the processes of care rather than focus solely on the
single act of care.
The traditional
approach: blame and shame
·
In a complex environment it is no surprise that
many things go wrong on a regular basis.
·
When something does go wrong, the traditional
approach is to blame the health-care worker most directly involved in the
patient’s care at the time, often the nurse or junior doctor.
·
While the tendency to blame an individual (the
“person approach”) is a strong one—and a very natural one—it is unhelpful, and
actually counterproductive
·
Whatever role the “blamed” health-care worker
may have had in the evolution of the incident, it is very unlikely that their
course of action was deliberately aimed at patient harm (if the action was
deliberate, it is termed a violation).
·
Most health-care workers involved in an adverse event
want to avoid punishment and will limit their reporting of the event.
·
If a blame “culture” is allowed to persist, a
health-care organization will have great difficulty in decreasing the rate of
similar adverse incidents occurring in the future.
A systems approach
·
Adopting a systems approach to errors and adverse
events does not mean that students and health professionals are not
professionally responsible for their actions.
·
Most circumstances surrounding adverse events
are complicated, so it is best to use a systems approach to understand what
happened and why, and then make decisions about personal accountability.
·
Accountability is a professional obligation, and
no one thinks that individuals should not be held accountable.
·
System accountability requires that the system look
at itself; for too long the system has passed on mistakes and errors in the
system of health care to the individual health-care workers.
The new approach
·
Therefore, the main response to an error should
be to try to change the system through a “systems approach”
·
A systems approach to errors in health care, therefore,
requires an understanding of the multiple factors that are involved in each of
the areas that make up the health-care system:
·
Patient and provider factors, Task factors, Technology
and tool factors, Team factors, Environmental factors, and Organizational
factors
The Swiss cheese model
·
“Active failures” are errors made by workers
that have an immediate adverse effect.
·
Latent conditions are usually the result of poor
decision-making, poor design and poor protocols by other humans who do not work
at the front line. These conditions are often set in place long before the event
in question.
·
The Swiss cheese model explains how faults in
the different layers of the system lead to incidents.
·
The Swiss cheese model shows that a fault in one layer
of the system of care is usually not enough to cause an accident.
·
Adverse events usually occur when a number of
faults occur in a number of layers.
·
To prevent these adverse events,
multiple “defences” in the form of
successive layers of “protection”
(understanding, awareness, alarms and warnings, restoration of systems, safety barriers,
containment, elimination, evacuation, escape and rescue) are designed to guard
against the failure of the underlying layer.
·
The advantage of the systems approach to
investigating situations is that this approach considers all the layers to see
if there are ways that any of them can be improved.
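The protective logic of layered defences can be sketched numerically. As a hypothetical illustration (the failure rates below are invented for the example, not taken from the source), if each defensive layer fails independently, the chance that a fault penetrates every layer falls multiplicatively with each layer added:

```python
# Hypothetical sketch of the Swiss cheese model: assuming each defensive
# layer fails independently, the probability that a fault passes through
# every layer is the product of the per-layer failure probabilities,
# so each added layer sharply reduces the overall risk.

def breach_probability(layer_failure_rates):
    """Probability that the 'holes' in every layer line up at once."""
    p = 1.0
    for rate in layer_failure_rates:
        p *= rate
    return p

# A single safeguard that fails 10% of the time lets 1 fault in 10 through.
one_layer = breach_probability([0.10])

# Four independent safeguards, each failing 10% of the time, let only
# about 1 fault in 10 000 through.
four_layers = breach_probability([0.10, 0.10, 0.10, 0.10])

print(f"one layer:   {one_layer:.4f}")
print(f"four layers: {four_layers:.6f}")
```

The caveat, reflected in the bullet points above, is that real layers are rarely independent: a single latent condition such as chronic understaffing can widen the “holes” in several layers at once, which is why the systems approach examines every layer rather than the frontline act alone.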
High reliability organization
(HRO)
·
The term HRO refers to organizations that operate
in hazardous conditions but manage to function at a level of safety that is
almost completely “failure free”; that is, they have very few adverse events.
·
Examples of HROs are: air traffic control
systems, nuclear power plants and naval aircraft carriers.
·
Health care can be an HRO, achieving consistently
safe and effective performance despite high levels of complexity and unpredictability
in the work environment.
·
HROs demonstrate to health-care organizations
that they too can improve safety by focusing on the system.
·
Health care currently differs from HROs: health-care workers do not routinely
expect that care will fail, are not mindful of miscommunication, do not
consider fatigue and poor writing, and do not consider system-wide errors.
Characteristics of
high reliability organizations
·
preoccupation with failure: acknowledge
and plan for the possibility of “failure” because of the high-risk, error-prone
nature of their activities;
·
commitment to resilience: proactively
seek out unexpected threats and contain them before they cause harm;
·
sensitivity to operations: pay close
attention to the issues facing the workers at the frontline;
·
a culture of safety in which individuals
feel comfortable drawing attention to potential
hazards or actual failures without fear of criticism.
·
Health-care organizations can learn from HROs even
though HROs operate in industries very different from health care.
Case study #1
The anaesthetist and the
surgeon discussed the preoperative antibiotics required for the laparoscopic
cholecystectomy that was about to begin. The anaesthetist informed the surgeon
of the patient’s allergy to penicillin and the surgeon suggested clindamycin as
an alternative preoperative antibiotic. The anaesthetist went into the sterile
corridor to retrieve the antibiotics but returned and explained to the
circulating nurse that he could not find any suitable antibiotics in the sterile
corridor. The circulating nurse got on the phone to request the preoperative
antibiotics. The anaesthetist explained that he could not order them because
there were no order forms (he looked through a file folder of forms). The circulating
nurse confirmed that the requested antibiotics “are coming”. The surgical
incision was performed. Six minutes later the antibiotics were delivered to the
OR and immediately injected into the patient. This injection happened after the
time of incision, which was counter to protocol that requires antibiotics to be
administered prior to the surgical incision in order to avoid surgical site
infections.
Source: WHO Patient
Safety Curriculum Guide for Medical Schools working group.
Case #2:
Jacqui had an exploratory
procedure called an endoscopic retrograde cholangiopancreatography at a large
teaching hospital for a suspected disorder of her gallbladder. Under general
anaesthetic, an endoscope was inserted into her mouth and was guided through
the oesophagus to the duodenum. Cannulas were inserted through the endoscope
into the common bile duct and a contrast medium injected so an X-ray could be
taken. Two months later, Jacqui was told she was one of 28 patients who had
been injected with contrast medium containing a corrosive substance, phenol.
Normally, the pharmacy department ordered 20 ml vials of “Conray 280”. However,
for a period of approximately five months they incorrectly ordered and supplied
to theatre 5 ml vials of 60% “Conray 280” with 10% phenol in which the label
clearly stated “use under strict supervision—caustic substance” and “single
dose vial”. A nurse finally picked up the mistake, which had been missed by the
pharmacy department and many teams of theatre and surgical staff.
Report on an investigation of incidents in the operating theatre at
Canterbury Hospital, 8 February to 7 June 1999.
Source: http://www.hccc.nsw.gov.au/downloads/canterbu.pdf, accessed April 2008.
Case #3
Neurosurgeon A was
performing a craniotomy on a child called Jim. The flap was made on the right
side in preparation for the removal of a suspected meningioma. The surgeon
paused to recall the history of the patient. He was puzzled, as he recalled
that the meningioma was on the left side, not the right. The neurosurgeon
re-checked the computed tomography (CT) scans. The scans showed that the
lesion was in the right frontal lobe. The neurosurgeon checked his own notes on
Jim, and saw that he had written a diagnosis of a left-sided cerebral lesion.
Seeing, however, that the CT scan showed the lesion to be on the right side, he
went ahead with the surgery. To his surprise, there was no evidence of any tumour.
The neurosurgeon closed up the flap and sent the boy to recovery. The next day,
Jim was sent for a second CT scan. The second set of scans showed that the
lesion was indeed on the left, as he had remembered. The following errors had
occurred: • the CT scan had been mislabelled; the marker for “R” (right) had been
placed incorrectly; • a mistake was made in the booking of the operating theatre,
which should have stated the site of the procedure; • the neurosurgeon did not
double-check the CT scan against his notes prior to surgery.
Source: WHO Patient Safety Curriculum Guide for
Medical Schools working group.
Supplied by Ranjit De
Alwis, International Medical University, Kuala Lumpur, Malaysia.
Case #4:
A patient arrived in the
operating room for an inguinal hernia repair. Although the procedure had been
booked as a general anaesthesia case, the anaesthetist discussed a local
anaesthetic with the patient. During his pre-operative anaesthesia consultation,
it had been established that the patient would receive a local anaesthetic. When
the surgeon entered the room several minutes later, the patient told him that
he wanted to have a local anaesthetic. The surgeon examined the hernia and
reported that the hernia was too big for a local anaesthetic and would require
either a spinal or general anaesthesia. The surgeon was irritated and said
that, “if (the anaesthetist who did the pre-op consult) wants to do the
procedure under a local that’s fine, but I do not”. The patient and the
anaesthetist discussed the side-effects of a spinal and the patient asked the
surgeon which one he would recommend. The surgeon suggested general anaesthesia
and the patient agreed to this.
After the patient had
been induced and intubated the surgeon asked the anaesthetist to tell the other
anaesthetists that they should not speak to patients in pre-admit about local
versus general anaesthesia because they had not examined the patient. It had
happened three or four times that the pre-admit anaesthetists had told
patients something different in their pre-op consult from what the surgeon had
recommended. The anaesthetist agreed to speak to his colleagues and the chief
of anaesthesia.
Source: WHO Patient
Safety Curriculum Guide for Medical Schools working group. Supplied by Lorelei
Lingard, University of Toronto, Toronto, Canada.
Tools and resources
IHI clinical microsystem assessment tool
(http://www.ihi.org/IHI/Topics/Improvement/ImprovementMethods/Tools/ClinicalMicrosystemAssessmentTool.htm).
Learning to improve
complex systems of care
Headrick LA. Learning to
improve complex
systems of care. In: Collaborative
education to
ensure patient safety.
Washington, DC,
HRSA/Bureau of Health
Professions, 2000, 75–88
(http://www.ihi.org/NR/rdonlyres/15FB8A41-D6B0-4804-A588-6EC429D326E9/0/final11700version.pdf).
Organization strategy
Runciman B, Merry A,
Walton M. Safety and
ethics in health care: a
guide to getting it right, 1st
ed. Aldershot, UK,
Ashgate Publishing Ltd, 2007.
Kohn LT, Corrigan JM,
Donaldson MS, eds. To err
is human: building a
safer health system.
Washington, DC, Committee
on Quality of Health
Care in America,
Institute of Medicine, National
Academy Press, 1999
(http://psnet.ahrq.gov/resource.aspx?resourceID=1579).