Discussion 2

  


This week builds on that foundational awareness with a focus on the application of evidence-based practice models as a strategy to improve patient safety and other quality dimensions. In this Discussion, consider how these strategies can sustain practice changes.

To prepare:

Read: Newhouse, R. P. (2007). Diffusing confusion among evidence-based practice, quality improvement, and research. JONA, 37(10), 432-435. (see attached PDF)

Read: Melnyk, B. M., Gallagher-Ford, L., Long, L. E., & Fineout-Overholt, E. (2014). The establishment of evidence-based practice competencies for practicing registered nurses and advanced practice nurses in real-world clinical settings: Proficiencies to improve healthcare quality, reliability, patient outcomes, and costs. Worldviews on Evidence-Based Nursing, 11(1), 5-15. (see attached PDF)


Research is the highest form of reliable evidence; when implemented in an organization, it provides the basis for an evidence-based practice. FIND a research study published in a peer-reviewed journal related to a clinical practice problem that is of interest to you and that would ultimately contribute to an evidence-based practice (see the attached article, "TeamSTEPPS Improves Operating Room Efficiency and Patient Safety").

The discussion assignment:

ANSWER the following discussion questions. Include the research study in your post (attach it to the post and post it in doc sharing). See page 11, #15 in Melnyk, Gallagher-Ford, Long, and Fineout-Overholt. In this assignment, you are being asked to critically appraise a single research study for its relevance to a QI practice problem.

Discussion questions:

1) Briefly summarize the study's findings, conclusions, and recommendations. Do you agree with them?

2) As a DNP-prepared nurse, would you recommend a change in nursing practice based on the study? Defend and/or justify your decision based on research evaluation principles. In other words, does the evidence generated by this research article signify a need to change nursing practice? If not, why not; if so, why?

Due by tomorrow, Friday, February 02, 2018, at 12 pm, America/New_York time.

American Journal of Medical Quality
2016, Vol. 31(5) 408-414

© The Author(s) 2015
Reprints and permissions:
sagepub.com/journalsPermissions.nav
DOI: 10.1177/1062860615583671
ajmq.sagepub.com

Article

A 1999 Institute of Medicine (IOM) report, To Err Is
Human, disclosed that up to 98 000 deaths occurred annu-
ally in US hospitals because of medical errors.1 Since that
initial report, many organizations, including the World
Health Organization, the Joint Commission, the Institute
for Healthcare Improvement, and the Accreditation
Council for Graduate Medical Education (ACGME),
have concurred with the critical need for improved com-
munication and teamwork to prevent medical errors and
promote patient safety.2-5 The IOM advocated training in
team behavior, leadership, and communication, which
has been adapted from programs in the aviation industry
called crew resource management.6 The concept pro-
motes a team approach in the operating room by opening
channels of communication during formal preoperative
and postoperative briefings among health care profes-
sionals to improve patient safety.

In 2006, the collaborative efforts of the Agency for
Healthcare Research and Quality and the Department of
Defense produced an evidence-based resource for team
training in health care.7,8 Team Strategies and Tools to
Enhance Performance and Patient Safety (TeamSTEPPS)
adopts crew resource management strategies by imple-
menting preoperative and postoperative briefings and
encourages situational awareness and communication by
all members of the health care team.9,10

As a quality improvement and patient safety project,
urology residents at the study medical center participated
in leading the preoperative and postoperative briefings
during the past year when TeamSTEPPS was imple-
mented in the operating rooms throughout the Department
of Surgery, including all surgical subspecialties. The pur-
pose of this project was to evaluate the operating room
efficiency and patient safety level within the urology ser-
vice related to improved channels of communication
among operating room personnel during the first year of
implementation of TeamSTEPPS.

Methods

The study organization’s institutional review board
approved this study as a performance improvement proj-
ect. All health care personnel in the operating room
accomplished TeamSTEPPS training, which included
didactic-based modules on the 4 core competencies:
leadership, situation monitoring, mutual support, and


1San Antonio Military Medical Center, Fort Sam Houston, TX

Corresponding Author:
Thomas E. Novak, MD, MCHE-SDU, San Antonio Military Medical
Center, 3551 Roger Brooke Drive, Fort Sam Houston, TX 78234.
Email: Thomas.e.novak.mil@mail.mil

TeamSTEPPS Improves Operating
Room Efficiency and Patient Safety

Lancaster R. Weld1, Matthew T. Stringer, DO1, James S. Ebertowski, MD1,
Timothy S. Baumgartner, MD1, Matthew C. Kasprenski, MD1,
Jeremy C. Kelley, DO1, Doug S. Cho, MD1, Erwin A. Tieva, MD1,

and Thomas E. Novak, MD1

Abstract
The objective was to evaluate the effect of TeamSTEPPS on operating room efficiency and patient safety. TeamSTEPPS
consisted of briefings attended by all health care personnel assigned to the specific operating room to discuss issues
unique to each case scheduled for that day. The operative times, on-time start rates, and turnover times of all cases
performed by the urology service during the initial year with TeamSTEPPS were compared to the prior year. Patient
safety issues identified during postoperative briefings were analyzed. The mean case time was 12.7 minutes less with
TeamSTEPPS (P < .001). The on-time first-start rate improved by 21% with TeamSTEPPS (P < .001). The mean room turnover time did not change. Patient safety issues declined from an initial rate of 16% to 6% at midyear and remained stable (P < .001). TeamSTEPPS was associated with improved operating room efficiency and diminished patient safety issues in the operating room.

Keywords
TeamSTEPPS, medical errors, communication, operative briefings



communication. TeamSTEPPS was implemented
throughout the Department of Surgery in the operating
rooms in November 2013. For each operating room, a
preoperative briefing was conducted in the operating
room 30 minutes prior to the planned start time for that
room. The briefings typically lasted 5 to 10 minutes and
covered pertinent aspects of all cases planned for the
day (Table 1). The attending and resident urologists,
anesthesiologist or nurse anesthetist, circulating nurse,
surgical technician, and various other trainees attended
the briefings. The team also attended a postoperative
briefing (Table 1) following each case before leaving the
operating room, usually during patient arousal; these
briefings lasted less than 5 minutes. During the “con-
cerns” section of the postoperative briefing, health care
team members identified any potential or realized patient
safety issues. Issues were annotated and reported to the
department chief to identify root causes and trends, and
to administer corrective actions. The briefings were led
by the resident urologist assigned to the operating room
under direct supervision of the attending urological
surgeon.

Patient movement from the pre-anesthesia unit through
the operating room and back to the postanesthesia unit
was timed and documented in the anesthesia record. The
“anesthesia start to in-room time” was the time that the
anesthesia representative took from first seeing the
patient in the pre-anesthesia unit until the patient entered
the operating room. The “in-room to turn-over-to-surgeon
time” was the time that the anesthesia representative
needed to induce anesthesia and allow the surgical team
to begin positioning and preparing the patient. The “turn-
over-to-surgeon to surgical start time” encompassed the
time from when the anesthesia team turned the case over
to the surgeon until the surgeon actually started the case.
The “surgical time” only included the time from proce-
dure start to finish and did not include anesthesia induc-
tion, patient positioning and preparation, or arousal from
anesthesia. The “case time” is the sum of all previously
described times starting when the anesthesia team first
saw the patient in the pre-anesthesia unit until the case
completed. An on-time start was defined as the patient
entering the operating room with an anesthesiologist or
nurse anesthetist prior to or at the planned start time, typi-
cally 7:30 am. An operating room turnover time was
defined as the time between the patient leaving the oper-
ating room and the next patient entering the room.
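The interval definitions above can be sketched as arithmetic over recorded timestamps. This is a minimal illustration; the field names and times are hypothetical, not taken from the study's anesthesia records:

```python
from datetime import datetime, time

# Hypothetical timestamps for one case, in the order defined above.
stamps = {
    "anesthesia_start":    datetime(2014, 1, 6, 7, 5),   # first seen in pre-anesthesia unit
    "in_room":             datetime(2014, 1, 6, 7, 25),  # patient enters the operating room
    "turnover_to_surgeon": datetime(2014, 1, 6, 7, 40),  # anesthesia hands the case to the surgeon
    "surgical_start":      datetime(2014, 1, 6, 7, 55),
    "case_complete":       datetime(2014, 1, 6, 9, 10),
}

def minutes(a, b):
    """Elapsed minutes between two named timestamps."""
    return (stamps[b] - stamps[a]).total_seconds() / 60

# "Surgical time" covers only procedure start to finish.
surgical_time = minutes("surgical_start", "case_complete")
# "Case time" is the sum of all intervals, i.e. anesthesia start to completion.
case_time = minutes("anesthesia_start", "case_complete")
# On-time start: patient in the room at or before the planned 7:30 am start.
on_time = stamps["in_room"].time() <= time(7, 30)

print(surgical_time, case_time, on_time)
```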

A database was retrospectively constructed of all per-
tinent times for all urology cases from November 2012
to October 2013, prior to TeamSTEPPS implementation,
to serve as a baseline cohort. Operative times also were
recorded for all urology cases after implementation of
TeamSTEPPS from November 2013 to October 2014.
Each postoperative debriefing that raised patient safety

issues was categorized as being personnel, instrument/
equipment, or support service (pharmacy/radiology/lab-
oratory) related. Personnel-related incidents included
issues such as unfamiliarity with the procedure or equip-
ment that affected patient safety. Instrument and equip-
ment issues typically involved incorrect instruments or
instrument sets and equipment malfunctions. For exam-
ple, some of these issues resulted in potential or realized
patient safety issues related to visibility during endo-
scopic procedures or control of hemorrhage. Support
service issues generally involved slow responses to
requests during surgery resulting in prolonged anesthesia

Table 1. Briefing Checklists.

Preoperative Briefing
Team introductions
Surgeon:
• Procedures and plan for the day
• Instruments/supplies not normally used
• Expected specimens/implant verification
• Critical moments of the case (eg, no counting, lunch

breaks)
• Potential complications/blood loss
• Special requests (eg, X-ray, equipment representatives)
• Postoperative plan (eg, PACU, ICU)
• Concerns
Anesthesia:
• Antibiotics
• Allergies
• Anesthesia plan
• Blood availability
• Concerns
Nurse/technician:
• Equipment/instrument/supplies/implants
• Contact precautions
• Correct bed
• Positioning
• Concerns

Postoperative Debrief
Technician/nurse:
• Counts correct
• Wound classification
• Medications
• Concerns
Anesthesia:
• Verify postoperative plan
• Concerns
Surgeon:
• Procedures performed
• Verify specimens
• Verify implants
• What went right/wrong
• Concerns

Abbreviations: ICU, intensive care unit; PACU, postanesthesia care unit.


time, which exposed the patient to unnecessary risks.
Additionally, each identified patient safety issue was fur-
ther categorized as potential or realized. Potential issues
were defined as events that occurred with no resultant
adverse patient outcome but that could cause complica-
tions under different clinical circumstances. Realized
issues were defined as events directly related to adverse
patient outcomes.

To evaluate operating room efficiency data, statistical
comparisons with the appropriate t test or χ2 test were
made between cases with TeamSTEPPS versus before
TeamSTEPPS. For patient safety data, a χ2 test was per-
formed comparing patient safety issues from the first 6
months of the year during TeamSTEPPS implementation
to issues during the second 6 months. The result of a χ2
test performed to compare patient safety issues during the
first month of TeamSTEPPS implementation (November
2013) to issues at midyear in May 2014 is reported in
Figure 1.
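The 2x2 chi-square comparison described above can be sketched in pure Python. The on-time counts below are reconstructed approximately from the reported rates and first-start case numbers (48.9% of 479 vs. 69.8% of 457), not taken from the study's raw data, and no continuity correction is applied:

```python
def chi2_2x2(a, b, c, d):
    """Pearson chi-square statistic (1 df, no continuity correction)
    for the 2x2 contingency table [[a, b], [c, d]]."""
    n = a + b + c + d
    rows = (a + b, c + d)
    cols = (a + c, b + d)
    stat = 0.0
    for obs, r, col in ((a, 0, 0), (b, 0, 1), (c, 1, 0), (d, 1, 1)):
        exp = rows[r] * cols[col] / n  # expected count under independence
        stat += (obs - exp) ** 2 / exp
    return stat

# On-time first starts, reconstructed from the reported rates:
# before TeamSTEPPS ~234 of 479 on time; with TeamSTEPPS ~319 of 457.
stat = chi2_2x2(234, 479 - 234, 319, 457 - 319)
print(round(stat, 1), stat > 10.83)  # 10.83 = critical value for P = .001 at 1 df
```

A statistic well above 10.83 is consistent with the P < .001 the authors report for the on-time first-start comparison.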

Results

A total of 1481 cases with TeamSTEPPS and 1513 cases
before TeamSTEPPS were compared. Table 2 shows the
cases performed and categorized by procedure type and
ACGME category with and before TeamSTEPPS. Also,
cases were categorized by first-start cases, turnover cases,
and add-on urgent cases. The distribution of cases in all

categories is similar between years with and before
TeamSTEPPS.

Table 3 shows the operating room efficiency data with
and before TeamSTEPPS. The mean in-room to turn-
over-to-surgeon time, mean turn-over-to-surgeon to sur-
gical start time, mean surgical time, and mean case time
were significantly shorter with TeamSTEPPS. The mean
case time including anesthesia and surgical time decreased
by 10.1% with TeamSTEPPS. The on-time first-start rate
was significantly higher with TeamSTEPPS. The mean
late interval for first-start cases and mean turnover time
was similar before and with TeamSTEPPS.
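The reported percentage changes follow directly from the Table 3 means; a quick arithmetic check:

```python
# Mean case time in minutes, from Table 3.
before_case, with_case = 125.16, 112.47
reduction = (before_case - with_case) / before_case * 100
print(round(reduction, 1))  # 10.1% decrease in mean case time

# On-time first-start rate, in percent, from Table 3.
before_rate, with_rate = 48.9, 69.8
print(round(with_rate - before_rate, 1))  # 20.9 percentage-point improvement
```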

Table 4 shows the patient safety data categorized by
potential versus realized issues and by source of issue
(personnel, instrument/equipment, or support service
related). The data are divided into issues that occurred
during the first 6 months of TeamSTEPPS implementa-
tion and issues during the second 6 months. The inci-
dence of patient safety issues (combined potential and
realized) and the incidence of realized issues decreased
significantly. The incidence of potential issues was statis-
tically similar.

Figure 1 illustrates the percentage of cases during
which a potential or realized patient safety issue was
identified by month. The numerator is the number of
operative cases with one or more potential or realized
issues reported during the respective month, and the
denominator is the total number of operative cases for

[Figure 1 plot: percentage of cases with patient safety issues with TeamSTEPPS (0%-18%, y-axis) by month, November through October, with series for Personnel, Instruments/Equipment, and Pharmacy/Radiology/Lab]

Figure 1. Percentage of potential or realized patient safety issues by source category identified at TeamSTEPPS postoperative
briefings by month.
Abbreviation: TeamSTEPPS, Team Strategies and Tools to Enhance Performance and Patient Safety.


Table 2. Surgical Case Distribution.

Number of Cases: Before TeamSTEPPS / With TeamSTEPPS / Total

Procedure type
Endoscopic/ESWL 855 839 1694
Laparoscopic/robotic 114 103 217
Open 544 539 1083
ACGME category
ESWL 24 20 44
Female 50 75 125
General 44 65 109
Male urethra 43 32 75
Pediatric major 26 25 51
Pediatric minor 171 167 338
Pelvic oncology 70 62 132
Penis 35 36 71
Percutaneous 30 15 45
Prostate biopsy 27 26 53
Retroperitoneal oncology 75 59 134
Scrotal/inguinal 164 153 317
Transurethral 447 396 843
Ureteroscopy 307 350 657
Totals 1513 1481 2994
First-start cases 479 457 936
Turnover cases 861 851 1712
Add-on urgent cases 173 173 346

Abbreviations: ACGME, Accreditation Council for Graduate Medical Education; ESWL, extracorporeal shock wave lithotripsy; TeamSTEPPS,
Team Strategies and Tools to Enhance Performance and Patient Safety.

Table 3. Operating Room Efficiency Data.

Before TeamSTEPPS With TeamSTEPPS P Value

Mean anesthesia start to in-room time (minutes) 10.97 11.30 .200
Mean in-room to turnover-to-surgeon time (minutes) 14.45 13.75 .017a

Mean turnover-to-surgeon to surgical start time (minutes) 16.29 15.19 .004a

Mean surgical time (minutes) 83.45 72.23 <.001a

Mean case time (minutes) 125.16 112.47 <.001a

On-time first-start rate 48.9% 69.8% <.001a

Mean late interval for first-start cases (minutes) 12.46 14.54 .212
Mean turnover time (minutes) 41.48 40.49 .193

Abbreviation: TeamSTEPPS, Team Strategies and Tools to Enhance Performance and Patient Safety.
aStatistically significant.

that month. In the initial month of TeamSTEPPS, the
overall rate of patient safety issues was 15.8%. The rate
declined to 6.2% at midyear and remained near that rate
for the remainder of the year (P < .001). The most substantial improvement was realized in the instrument/equipment category.
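The monthly rate described above (cases with at least one potential or realized issue over total cases that month) can be computed as in this sketch. The per-case flags are invented to reproduce the quoted 15.8% and 6.2% rates; they are not the study's actual monthly counts:

```python
# Hypothetical per-case flags, month -> list of booleans
# (True = one or more potential/realized issues reported at the debriefing).
monthly_cases = {
    "Nov": [True] * 19 + [False] * 101,  # 19 of 120 cases -> 15.8%
    "May": [True] * 8 + [False] * 121,   # 8 of 129 cases  -> 6.2%
}

# Rate = cases with >= 1 reported issue / total cases that month, in percent.
rates = {m: 100 * sum(flags) / len(flags) for m, flags in monthly_cases.items()}
for month, rate in rates.items():
    print(f"{month}: {rate:.1f}%")
```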

Discussion

Barriers to the implementation of TeamSTEPPS in the
operating room include concerns about time requirements

for the briefings. The preoperative briefings took place in
the operating room 30 minutes prior to the planned start
time for that room and lasted 5 to 10 minutes. The post-
operative briefings occurred after each case during patient
arousal from anesthesia and lasted less than 5 minutes.
Therefore, additional provider time was only required for
the preoperative briefings.

Physicians, nurses, and technicians are trained in sepa-
rate, diverse educational programs that teach attention to
details regarding their specific roles. Given the interdisci-
plinary nature of health care, communication is critical to


ensure patient safety. Teams make fewer mistakes than
individual members when each team member accepts his
or her responsibilities, knows the responsibilities of other
team members, and feels comfortable communicating to
those other members.11,12 TeamSTEPPS provides a
framework to foster communication about patient care
issues regardless of role or position in the operating room.
Nonetheless, as with pilots leading crew resource man-
agement strategies among crew members, the surgeon
remains the leader of the operating room and is respon-
sible for establishing an open environment conducive to
communication by all operating room health care mem-
bers.13 This leadership skill involving communication is
essential to patient safety and can be taught to surgical
residents simultaneously with the technical skills required
for surgery.

Operating room efficiency is increasingly used as a
marker of quality of surgical care. Operating room time is
estimated to cost $15 per minute and constitutes approxi-
mately 40% of hospital revenue.14 As a result, efforts to
improve quality in the hospital setting are often focused
on reducing preventable delays and increasing effi-
ciency.15 Nundy et al associated operating room briefings
with a 31% reduction in operating room delays.16 The cur-
rent study showed an average reduction of 10% in overall
case time with TeamSTEPPS, and the on-time first-start
rate increased from 49% to 70% with TeamSTEPPS.
Turnover time did not improve with TeamSTEPPS possi-
bly because of minimal impact of improved team com-
munication on the processes involved in room turnover,
such as housekeeping and instrument sterilization, which
require a fixed interval of time. Improved efficiency and
capacity allows more operations to be performed during
the daytime, when personnel familiar with the scheduled
cases are readily available. Fewer operations are per-
formed at night, when teams unfamiliar with one another

are more likely to work together. Thus, TeamSTEPPS has
the potential to improve both quality and safety while
increasing efficiency and creating a more predictable
work environment.

The implementation of TeamSTEPPS is the most likely
factor resulting in the improvement in operating effi-
ciency. The Department of Urology experienced minimal
turnover in surgical staff, with all subspecialties of urol-
ogy offered throughout the 2-year study period. Also, the
beneficiaries of medical care at the military medical cen-
ter study site consist of a relatively fixed patient popula-
tion. Additionally, there were no other significant changes
in facilities, administrative policies, or services offered to
the patient population to account for a change in effi-
ciency. To further analyze the impact of TeamSTEPPS on
operating room efficiency, operating room data from
another comparable service were evaluated. The most
comparable service to urology in terms of numbers of sur-
geons and operative case load at the study medical center
is the Department of Orthopedics. The orthopedic mean surgery time the year prior to TeamSTEPPS was 115 minutes compared to 96 minutes the year TeamSTEPPS was implemented (P < .01). This 19-minute improvement on average per orthopedic case was similar to the 11-minute improvement realized by the urology service. In terms of on-time first-start rates, the orthopedic rate improved from 63% to 75% of first cases starting on time with TeamSTEPPS implementation (P = .01). By comparison, the urology service experienced an improvement from 49% to 70% of first cases starting on time. These similar improvements in the Department of Orthopedics operative data at the medical center further support the impact of TeamSTEPPS on operating room efficiency.

Literature supporting the patient safety benefits of
operative briefings is plentiful. Preoperative briefings
have been shown to reduce medical errors.17-19 Multiple

Table 4. Patient Safety Data.

First 6 Months With TeamSTEPPS Second 6 Months With TeamSTEPPS P Value

Total number of cases 699 782
Patient safety issues 73 (10.4%) 43 (5.4%) .022a

Potential issues 34 (4.9%) 27 (3.5%) .610
Personnel 12 11
Instrument/equipment 13 13
Support services 9 3
Realized issues 39 (5.6%) 16 (2.0%) .004a

Personnel 16 9
Instrument/equipment 19 7
Support services 4 0

Abbreviation: TeamSTEPPS, Team Strategies and Tools to Enhance Performance and Patient Safety.
aStatistically significant.


studies show that briefings improve team communica-
tion, cohesion, and interprofessional insight.20,21
Operating room staff report increased confidence to voice
a concern about potential patient safety issues when brief-
ings are held.16 Postoperative briefings offer benefit
through reflective learning, deliberate practice, and
immediate feedback.22 The current study demonstrated a
reduction in overall patient safety issues and realized
issues over the course of the first year since TeamSTEPPS
implementation. The rate of potential patient safety issues
was unchanged from the first half of the year to the sec-
ond. Patient safety issues related to surgical instruments
and equipment accounted for 51% of all reported patient
safety issues, which is comparable to a recent article
reporting 63% of issues related to instruments and equip-
ment.22 The current study realized the most substantial
improvement within the surgical instruments and equip-
ment category. Surgeons utilized the postoperative brief-
ing reports to elevate instrument or equipment problems
to hospital leadership for prompt attention. Ironically, as
surgery adopts more technically advanced tools such as
fiber optics and robotics, the present study suggests that a
basic proficiency of communication among operating
room personnel about those instruments and equipment
actually improves patient safety.

TeamSTEPPS promotes a culture of safety by encour-
aging communication. A growing confidence to commu-
nicate among health care personnel signals that patient
safety is truly valued.18,21,23,24 Characteristics of a strong
culture of safety include a commitment to discuss and
learn from patient safety issues and incorporation of a
nonpunitive system for reporting and analyzing issues.

This study was limited to cases performed at a large
teaching medical center. As such, dedicated urology
nurses and technicians are not available for every urology
case. The findings of increased operating room efficiency
with improved communication may not be applicable to
institutions that always have dedicated personnel for spe-
cific urology procedures. Despite minimal surgeon turn-
over and similar case volumes and types during the 2
years of this study, individual case complexity variations
may potentially introduce confounding factors when
comparing between years. Also, surgeon experience is
gained over the course of the study, potentially influenc-
ing operative times. The patient safety issues in this study
were reported by the individual operating room teams.
Complacency in reporting or fear of reprisal may have
affected reporting patterns.

Conclusion

TeamSTEPPS provided a framework for urology resi-
dents to open channels of communication among inter-
disciplinary health care team members in the operating

room. This process improved operating room efficiency
by reducing mean case time by 10.1% and increasing on-
time first-start rates by 20.9%. Also, patient safety
improved, with the rate of reported patient safety issues
declining by more than half within the first 6 months of
TeamSTEPPS implementation.

Authors’ Note

The authors prepared this work within the scope of their
employment with the United States Army Medical Corps and
United States Air Force Medical Corps.

Declaration of Conflicting Interests

The authors declared no potential conflicts of interest with respect
to the research, authorship, and/or publication of this article.

Funding

The authors received no financial support for the research,
authorship, and/or publication of this article.

References

1. Kohn LT, Corrigan JM, Donaldson MS. To Err Is Human:
Building a Safer Health System. Washington, DC: National
Academies Press; 1999.

2. World Alliance for Patient Safety. WHO Guidelines for Safe
Surgery. Geneva, Switzerland: World Health Organization;
2008.

3. Institute for Healthcare Improvement. Safety briefings. http://www.wsha.org/files/82/SafetyBriefings. Accessed September 12, 2014.

4. The Joint Commission. Health care at the crossroads: strategies for improving the medical liability system and preventing patient injury. www.jointcommission.org/assets/1/18/Medical_Liability. Accessed September 12, 2014.

5. Riebschleger M, Bohl J. New standards for teamwork: dis-
cussion and justification. In: Philibert I, Amis S, eds. The
ACGME 2011 Duty Hour Standards: Enhancing Quality of
Care, Supervision, and Resident Professional Development.
Chicago, IL: Accreditation Council for Graduate Medical
Education; 2011:53-56.

6. Zeltser MV, Nash DB. Approaching the evidence basis for
aviation-derived teamwork training in medicine. Am J Med
Qual. 2010;25:13-23.

7. Baker DP, Beaubien JM, Holtzman AK. DoD Medical
Team Training Programs: An Independent Case Study
Analysis. Washington, DC: American Institutes for
Research;2003.

8. Baker DP, Gustafson S, Beaubien JM, Salas E, Barach
P. Medical Teamwork and Patient Safety: The Evidence-
based Relation. Washington, DC: American Institutes for
Research; 2003.

9. Alonso A, Baker D, Holtzman A, et al. Reducing medical
error in the military health system: how can team training
help? Hum Resour Manage Rev. 2006;16:396-415.

10. Clancy CM. TeamSTEPPS: optimizing teamwork in the
perioperative setting. AORN J. 2007;86:18-22.



11. Volpe CE, Cannon-Bowers JA, Salas E, Specter PE. The
impact of cross-training on team functioning: an empirical
investigation. Hum Factors. 1996;38:87-100.

12. Smith-Jentsch KA, Salas E, Baker DP. Training team
performance-related assertiveness. Pers Psychol. 1996;49:
909-936.

13. France DJ, Leming-Lee S, Jackson T, Feistritzer NR, Higgins
MS. An observational analysis of surgical team compliance
with perioperative safety practices after crew resource man-
agement training. Am J Surg. 2008;195:546-553.

14. Bacchetta MD, Girardi LN, Southard EJ, et al. Comparison
of open versus bed-side percutaneous dilatational tracheos-
tomy in the cardiothoracic surgical patient: outcomes and
financial analysis. Ann Thorac Surg. 2005;79:1879-1885.

15. Rutter T, Brown A. Contemporary operating room man-
agement. Adv Anesth. 1994;11:173-214.

16. Nundy S, Mukherjee A, Sexton JB, et al. Impact of preop-
erative briefings on operating room delays: a preliminary
report. Arch Surg. 2008;143:1068-1072.

17. Lingard L, Regehr G. Evaluation of a preoperative check-
list and team briefing among surgeons, nurses, and anesthe-
siologists to reduce failures in communication. Arch Surg.
2008;143:12-17.

18. Defontes J, Surbida S. Preoperative safety briefing project.
Perm J. 2003;8:21-27.

19. Makary MA, Mukjerjee A, Sexton BJ, et al. Operating
room briefings and wrong-site surgery. J Am Coll Surg.
2006;10:236-243.

20. Lingard L, Espin S, Rubin B, et al. Getting teams to talk:
development and pilot implementation of a checklist to
promote interprofessional communication in the OR. Qual
Saf Health Care. 2005;14:340-346.

21. Leonard M, Graham S, Bonacum D. The human factor:
the critical importance of effective teamwork and com-
munication in providing safe care. Qual Saf Health Care.
2004;13:85-90.

22. Papaspyros SC, Javangula KC, Adluri RKP, O’Regan DJ.
Briefing and debriefing in the cardiac operating room.
Analysis of impact on theatre team attitude and patient
safety. Interact Cardiovasc Thorac Surg. 2010;10:43-47.

23. Bleakley A. A common body of care: the ethics and politics
of teamwork in the operating theater are inseparable. J Med
Philos. 2006;31:305-322.

24. Allard J, Bleakley A, Hobbs A, Coombes L. Pre-surgery
briefings and safety climate in the operating theatre. BMJ
Qual Saf. 2011;20:711-717.

Copyright @ Lippincott Williams & Wilkins. Unauthorized reproduction of this article is prohibited.

JONA
Volume 37, Number 10, pp 432-435
Copyright © 2007 Wolters Kluwer Health |
Lippincott Williams & Wilkins

Diffusing Confusion
Among Evidence-Based
Practice, Quality Improvement,
and Research
Robin Purdy Newhouse, PhD, RN, CNA, CNOR

In this department, hot topics in
nursing outcomes, research, and
evidence-based practice relevant
to the nurse administrator are
highlighted. The goal is to dis-
cuss the practical implications for
nurse leaders in diverse health-
care settings. Content includes
evidence-based projects and deci-
sion making, locating measurement
tools for quality improvement
and safety projects, using outcome
measures to evaluate quality, prac-
tice implications of administrative
research, and exemplars of proj-
ects that demonstrate innova-
tive approaches to organizational
problems.

In a recent evidence-based practice
(EBP) workshop, a nurse executive
asked: ‘‘What is the difference
between EBP and quality improve-
ment (QI) and benchmarking?’’ In
a different workshop, another
asked: ‘‘Do I need an institutional

review board approval for my
EBP project?’’ It becomes confus-
ing when organizational EBP, QI,
and research activities are all re-
ferred to as EBP. The issue is that
these activities often overlap.
This column assesses the unique
and overlapping relationships
among EBP, QI, and research. Defi-
nitions are provided in Figure 1.
Using an organizational problem
of increased pressure ulcer rates,
examples of each approach are
provided in Figure 2.

Research

Research is a systematic investigation, including research development, testing, and evaluation, designed to develop or contribute to generalizable knowledge.1 Because nursing research is underdeveloped in a number of areas, scientific evidence (research) is often not available to inform practice when a problem emerges or questions are raised about nursing processes included in organizational policies.

The research process includes identification of the problem; selection of a conceptual framework or theoretical model that describes the relationships between study variables; generation of hypotheses or research questions; and a plan for the study design and method. The design and method are based on the state of knowledge of the problem and the gap in the evidence.

The design frames the appropriate research approach (experimental, quasi-experimental, or nonexperimental). The sample consists of the number and type of subjects needed to identify a statistically significant difference if one exists. The method includes appropriate controls, including measures or instruments with adequate estimates of reliability and validity. Standard research procedures are established that include a plan for interventions, measurement, data collection, and statistical analysis. Institutional review board approval is obtained before implementation of the research protocol.

The design and methods of research seek to control as many variables as possible so that a link is established between the intervention (or concept of interest) and effect (or outcome). Using a well-planned and implemented research approach to solve a clinical, administrative, or education problem informs decisions in healthcare organizations, extending beyond lessons learned in one organization to generalizable knowledge that can be applied in similar settings.

Evidence and the Executive

Author Affiliation: Associate Professor and Assistant Dean, Doctor of Nursing Practice, University of Maryland, School of Nursing, Baltimore, Maryland.

Correspondence: University of Maryland, School of Nursing, 655 W. Lombard Street, Room 516B, Baltimore, MD 21201-1579 (newhouse@son.umaryland.edu).

432 JONA, Vol. 37, No. 10, October 2007

Copyright © Lippincott Williams & Wilkins. Unauthorized reproduction of this article is prohibited.

Quality Improvement

Quality improvement is a process by which individuals work together to improve systems and processes with the intention to improve outcomes.2 An alternative definition is that QI is a data-driven systematic approach to improving care locally.3 The distinction between research and QI has been recently reviewed, defined, and debated.3-5

One familiar framework to guide the QI process is plan-do-study-act.6 Examples of approaches to data presentation from QI efforts include control, radar, and Pareto charts, and cause-and-effect diagrams.7 Although approaches to QI have evolved toward greater systematic rigor, publications of results are usually limited to lessons learned instead of generalizable results. In addition, there has been an increase in investigators who conduct health services research with their research activities focused on QI interventions. These investigators intend to generalize results and approach the organizational improvement intervention as a research study.

Evidence-Based Practice

An often-cited landmark definition of EBP is: "Evidence-based medicine is the conscientious, explicit, and judicious use of current best evidence in making decisions about the care of individual patients. The practice of evidence-based medicine means integrating individual clinical expertise with the best available external clinical evidence from systematic research."8(p71)

This definition is appropriate for nursing research utilization, but insufficient for EBP because the best evidence available to address nursing problems is often not research. In addition, nursing practice is nested within organizations, and appropriate organizational infrastructure fosters system and individual uptake and use of evidence. The definition of EBP can be expanded to the following: EBP is a problem-solving approach to clinical decision making in a healthcare organization that integrates the best available scientific evidence with the best available experiential (patient and practitioner) evidence, considers internal and external influences on practice, and encourages critical thinking in the judicious application of such evidence to the care of the individual patient, patient population, or system.9 Note that this approach uses the best available evidence, not one source of evidence that supports current practice. A rigorous search strategy is used, followed by retrieval and review of evidence that includes grading its strength and quality, and then applying the results through implementation and evaluation of the recommendations. This definition includes the organization's experience.

Experiential evidence extends beyond the individual provider or patient to activities such as QI, benchmarking, or organizational or program outcome monitoring. Rycroft-Malone et al10 call this organizational evidence "local context" and suggest that far more work is needed to understand how this type of evidence is collected and incorporated with other types of evidence to inform healthcare decisions.

Figure 1. Definitions.


The Overlapping Relationships

Research and QI (as a form of experiential evidence) both inform EBP. Research provides a higher level of evidence than QI and is the major source of evidence in the medical discipline. Quality improvement provides real-life experience and descriptive data within the context of the organization, making the rapid-cycle approach and evaluation of outcomes very actionable.

However, there are 2 major problems with using QI data as a source of evidence.11 First, the QI process usually does not meet fundamental standards for the conduct or publication of research. Second, the interventions used in QI processes often are not based on theory that predicts their success. These deficiencies produce results that are not transferable to other organizations (generalizable), do not measure variables or data that are needed to explain the results, rely on designs that lack the ability to draw causal inferences, and carry a number of additional weaknesses (threats to internal validity).

Research and EBP processes both inform QI. When developing strategies to improve outcomes in QI initiatives, research evidence is reviewed, and an intervention or interventions are selected to improve the likelihood of success for the change. Individual research studies may be used to inform QI action, as well as the recommendations from an EBP evidence review. The evidence review may contain scientific (such as experimental studies) or experiential (such as consensus or expert opinion) sources. Scientific evidence (research) provides a higher level of generalizability, or application to similar settings, than experiential evidence.

Evidence-based practice and QI both inform opportunities for research. As the team evaluates the QI outcomes and lessons learned in their rapid-cycle improvements, they may identify descriptive improvements in areas where there are gaps in the evidence to support the need for research to test a new intervention. Likewise, during the evidence review and synthesis phase of the EBP process, gaps in knowledge are identified. These gaps provide the opportunity to generate research questions or hypotheses and design a research study to measure the association or differences between variables.

Figure 2. Examples of research, QI, and EBP.

Conclusion

Major forces drive the need for nurses to demonstrate basic and advanced competency in EBP, QI, and research. These forces include disparities and deficits in quality of care for patients, increasing evidence to support the effectiveness of interventions, national efforts to standardize performance measures, and a focus on improving healthcare work environments.

Efforts to improve work environments necessitate that we apply evidence to healthcare delivery, align payment policies with QI, and prepare the workforce.12 Applying evidence to practice requires that we apply scientific knowledge systematically, build infrastructure to support decision making, set goals for improvement, and develop measures to assess quality.12 Preparing the workforce involves developing competencies in QI, EBP, informatics, patient-centered care, and interdisciplinary collaboration.13

To advance quality, an interdisciplinary common vision, language, and processes are required. Research, QI, and EBP are tools to identify and describe problems, explain relationships between factors of interest, and implement interventions or strategies with a clear rationale. Nurse executives have an important role in diffusing the confusion among EBP, QI, and research; building collaborative relationships; and establishing organizational infrastructure to support continued improvements in healthcare quality.14,15 A precursor to leading is understanding the distinct differences, yet overlapping associations, among these 3 important activities.

REFERENCES

1. Department of Health and Human Services. Code of Federal Regulations. Title 45. Public Welfare. Part 46: Protection of Human Subjects (45 CFR 46.102(d)). Washington, DC: Department of Health and Human Services; 2002.
2. Committee on Assessing the System for Protecting Human Research Participants. Responsible Research: A Systems Approach to Protecting Research Participants. Washington, DC: The National Academies Press; 2002.
3. Baily MA, Bottrell M, Lynn J, Jennings B. The Ethics of Using QI Methods to Improve Health Care Quality and Safety: A Hastings Center Special Report. Garrison, NY: The Hastings Center; 2006.
4. Newhouse RP, Pettit JC, Poe S, Rocco L. The slippery slope: differentiating between quality improvement and research. J Nurs Adm. 2006;36(4):211-219.
5. Wise LC. Ethical issues surrounding quality improvement activities: a review. J Nurs Adm. 2007;37(6):272-278.
6. Langley GJ, Nolan KM, Nolan TW, Norman CL, Provost LP. The Improvement Guide: A Practical Approach to Enhancing Organizational Performance. San Francisco, CA: Jossey-Bass; 1996.
7. Brassard M, Ritter D. The Memory Jogger: A Pocket Guide of Tools for Continuous Improvement and Effective Planning. Salem, NH: GOAL/QPC; 1994.
8. Sackett DL, Rosenberg WM, Gray JA, Haynes RB, Richardson WS. Evidence based medicine: what it is and what it isn't. BMJ. 1996;312:71-72.
9. Newhouse R, Dearholt S, Poe S, Pugh LC, White K. The Johns Hopkins Nursing Evidence-Based Practice Model. Baltimore, MD: The Johns Hopkins Hospital, Johns Hopkins University School of Nursing; 2005.
10. Rycroft-Malone J, Seers K, Titchen A, Harvey G, Kitson A, McCormack B. What counts as evidence in evidence-based practice? J Adv Nurs. 2004;47(1):81-90.
11. Shojania KG, Grimshaw JM. Evidence-based quality improvement: the state of the science. Health Aff (Millwood). 2005;24(1):138-150.
12. Committee on Quality of Health Care in America, Institute of Medicine. Crossing the Quality Chasm: A New Health System for the 21st Century. Washington, DC: National Academy Press; 2001.
13. Committee on the Health Professions Education Summit, Board on Health Care Services. In: Greiner AC, Knebel E, eds. Health Professions Education: A Bridge to Quality. Washington, DC: The National Academies Press; 2003.
14. American Nurses Association. Scope and Standards for Nurse Administrators. 2nd ed. Washington, DC: Nursebooks; 2004.
15. American Nurses Association. Nursing: Scope and Standards of Practice. Washington, DC: American Nurses Association; 2004.


Original Article

The Establishment of Evidence-Based
Practice Competencies for Practicing
Registered Nurses and Advanced Practice
Nurses in Real-World Clinical Settings:
Proficiencies to Improve Healthcare Quality,
Reliability, Patient Outcomes, and Costs
Bernadette Mazurek Melnyk, RN, PhD, CPNP/PMHNP, FNAP, FAANP, FAAN •
Lynn Gallagher-Ford, RN, PhD, DPFNAP, NE-BC • Lisa English Long, RN, MSN, CNS •
Ellen Fineout-Overholt, RN, PhD, FAAN

Keywords: evidence-based practice, competencies, healthcare quality

ABSTRACT

Background: Although it is widely known that evidence-based practice (EBP) improves healthcare quality, reliability, and patient outcomes as well as reduces variations in care and costs, it is still not the standard of care delivered by practicing clinicians across the globe. Adoption of specific EBP competencies for nurses and advanced practice nurses (APNs) who practice in real-world healthcare settings can assist institutions in achieving high-value, low-cost evidence-based health care.

Aim: The aim of this study was to develop a set of clear EBP competencies for both practicing registered nurses and APNs in clinical settings that can be used by healthcare institutions in their quest to achieve high performing systems that consistently implement and sustain EBP.

Methods: Seven national EBP leaders developed an initial set of competencies for practicing registered nurses and APNs through a consensus building process. Next, a Delphi survey was conducted with 80 EBP mentors across the United States to determine consensus and clarity around the competencies.

Findings: Two rounds of the Delphi survey resulted in total consensus by the EBP mentors, resulting in a final set of 13 competencies for practicing registered nurses and 11 additional competencies for APNs.

Linking Evidence to Action: Incorporation of these competencies into healthcare system expectations, orientations, job descriptions, performance appraisals, and clinical ladder promotion processes could drive higher quality, reliability, and consistency of healthcare as well as reduce costs. Research is now needed to develop valid and reliable tools for assessing these competencies as well as linking them to clinician and patient outcomes.

BACKGROUND

Evidence-based practice (EBP) is a life-long problem-solving approach to the delivery of health care that integrates the best evidence from well-designed studies (i.e., external evidence) with a patient's preferences and values and a clinician's expertise, which includes internal evidence gathered from patient data. When EBP is delivered in a context of caring and a culture, ecosystem, and environment that support it, the best clinical decisions are made, yielding positive patient outcomes (see Figure 1; Melnyk & Fineout-Overholt, 2011).

Research supports that EBP promotes high-value health care, including enhancing the quality and reliability of health care, improving health outcomes, and reducing variations in care and costs (McGinty & Anderson, 2008; Melnyk, Fineout-Overholt, Gallagher-Ford, & Kaplan, 2012; Pravikoff, Pierce, & Tanner, 2005). Even with its tremendous benefits, EBP is not the standard of care practiced consistently by clinicians throughout the United States and the globe (Fink, Thompson, & Bonnes, 2005; Melnyk, Grossman, et al., 2012). Tremendously long lag times continue to exist between the generation of research findings and their implementation in real-world clinical settings to improve care and outcomes due to multiple barriers, including: (a) misperceptions by clinicians that it takes too much time, (b) inadequate EBP knowledge and skills, (c) academic programs that continue to teach the rigorous process of how to conduct research instead of an evidence-based approach to care, (d) organizational cultures that do not support it, (e) lack of EBP mentors and appropriate resources, and (f) resistance by colleagues, managers or leaders, and physicians (Ely, Osheroff, Chambliss, Ebell, & Rosenbaum, 2005; Estabrooks, O'Leary, Ricker, & Humphrey, 2003; Jennings & Loan, 2001; Melnyk, Fineout-Overholt, Feinstein, et al., 2004; Melnyk, Fineout-Overholt, et al., 2012; Titler, 2009).

Worldviews on Evidence-Based Nursing, 2014; 11:1, 5-15.
© 2014 Sigma Theta Tau International

Figure 1. The merging of science and art: EBP within a context of caring and an EBP culture and environment results in the highest quality of healthcare and patient outcomes. Reprinted from Melnyk, B. M., & Fineout-Overholt, E. (2011). Evidence-based practice in nursing and healthcare: A guide to best practice. Philadelphia: Lippincott Williams & Wilkins. Reprinted with permission.

The Seven-Step EBP Process and Facilitating Factors

The seven steps of EBP start with cultivating a spirit of inquiry and an EBP culture and environment because, without these elements, clinicians will not routinely ask clinical questions about their practices (see Table 1). After a clinician asks a clinical question and searches for the best evidence, critical appraisal of the evidence for validity, reliability, and applicability to practice is essential for integrating that evidence with a clinician's expertise and patient preferences to determine whether a current practice should be changed. Once a practice change is made based on this process, evaluating the outcomes of that change is imperative to determine its impact. Finally, dissemination of the process and outcomes of the EBP change is key so that others may learn of practices that produce the best results.

Table 1. The Seven Steps of Evidence-Based Practice

Step 0: Cultivate a spirit of inquiry along with an EBP culture and environment
Step 1: Ask the PICO(T) question
Step 2: Search for the best evidence
Step 3: Critically appraise the evidence
Step 4: Integrate the evidence with clinical expertise and patient preferences to make the best clinical decision
Step 5: Evaluate the outcome(s) of the EBP practice change
Step 6: Disseminate the outcome(s) (Melnyk & Fineout-Overholt, 2011)

The systematic seven-step process of EBP provides a platform for facilitating the best clinical decisions and ensuring the best patient outcomes. However, consistent implementation of the EBP process and use of evidence by practicing clinicians is challenging. Typical barriers to EBP cited by clinicians include: time limitations; an organizational culture and philosophy of "that is the way we have always done it here"; inadequate EBP knowledge or education; lack of access to databases that enable searching for best evidence; manager and leader resistance; heavy workloads; resistance from nursing and physician colleagues; uncertainty about where to look for information and how to critically appraise evidence; and limited access to resources that facilitate EBP (Gerrish & Clayton, 2004; Melnyk, Fineout-Overholt, et al., 2012; Pravikoff, Pierce, & Tanner, 2005; Retsas, 2000; Rycroft-Malone et al., 2004).

There are also factors that facilitate EBP, including: beliefs in the value of EBP and the ability to implement it; EBP mentors who work with direct care clinicians to implement best practices; supportive EBP contexts or environments and cultures; administrative support; and assistance by librarians through multifaceted education programs (Melnyk et al., 2004; Melnyk & Fineout-Overholt, 2011; Melnyk, Fineout-Overholt, & Mays, 2008; Newhouse, Dearholt, Poe, Pugh, & White, 2007; Rycroft-Malone, 2004). The concept of healthcare context (i.e., the environment or setting in which people receive healthcare services), specifically organizational context, is becoming an increasingly important factor in the implementation of evidence at the point of care (Estabrooks, Squires, Cummings, Birdsell, & Norton, 2009; Rycroft-Malone, 2004). Strategies to enhance system-wide implementation and sustainability of evidence-based care need to be multipronged and target: (a) the enhancement of individual clinician and healthcare leader EBP knowledge and skills; (b) cultivation of a context and culture that supports EBP, including the availability of resources and EBP mentors; (c) development of healthcare leaders who can spearhead teams that create an exciting vision, mission, and strategic goals for system-wide implementation of EBP; (d) sufficient time, resources, mentors, and tools for clinicians to engage in EBP; (e) clear expectations of the role of clinicians and advanced practice nurses (APNs) in implementing and sustaining evidence-based care; (f) facilitator characteristics and approach; and (g) a recognition or reward system for those who are fully engaged in the effort (Dogherty, Harrison, Graham, Vandyk, & Keeping-Burke, 2013; Melnyk, 2007; Melnyk, Fineout-Overholt, et al., 2012).

Competencies for Nurses

Although there is a general expectation of healthcare systems globally for nurses to engage in EBP, much uncertainty exists about what exactly that level of engagement encompasses. Lack of clarity about EBP expectations, and about the specific EBP competencies that nurses and APNs who practice in real-world healthcare settings should meet, impedes institutions from attaining high-value, low-cost evidence-based health care. The development of EBP competencies should be aligned with the EBP process in continual evaluation across the span of the nurses' practice, including technical skills in searching and appraising literature, clinical reasoning as patient and family preferences are considered in decision making, problem-solving skills in making recommendations for practice changes, and the ability to adapt to changing environments (Burns, 2009).

Competence is defined as the ability to do something well; the quality or state of being competent (Merriam-Webster Dictionary, 2012). Competencies are a mechanism that supports health professionals in providing high-quality, safe care. The construct of nursing competency "attempts to capture the myriad of personal characteristics or attributes that underlie competent performance of a professional person." Competencies are holistic entities that are carried out within clinical contexts and are composed of multiple attributes, including knowledge, psychomotor skills, and affective skills. Dunn and colleagues contend that competency is not a "skill or task to be done, but characteristics required in order to act effectively in the nursing setting." Although a particular competency "cannot exist without scientific knowledge, clinical skills, and humanistic values" (Dunn et al., 2000, p. 341), the actual competency transcends each of the individual components. The measurement of nurses' competencies related to various patient care activities is a standard ongoing activity in a multitude of healthcare organizations across the globe; however, measurement of competencies related to the critical issue of how practicing nurses approach decision making (e.g., whether it is evidence-based vs. tradition-based) is limited and needs further research.

Recently, work has been conducted to establish general competencies for nursing by the Quality and Safety Education for Nurses (QSEN) Project, a global nursing initiative whose purpose was to develop competencies that would "prepare future nurses who would have the knowledge, skills, and attitudes (KSAs) necessary to continuously improve the quality and safety of the healthcare systems within which they work" (QSEN, 2013). This project has developed competency recommendations that address the following practice areas:

• Patient-centered care
• Teamwork and collaboration
• Evidence-based practice
• Quality improvement
• Safety
• Informatics

Further work in competency development has been spearheaded by the American Association of Critical-Care Nurses, which developed the Synergy Model. The goal of the model was to assist practicing nurses in decision making; an example of the model in action would be its use by charge nurses in their decisions to match patients and nurses to achieve the best outcomes of evidence-based care processes promulgated by the American Association of Critical-Care Nurses (2013). Kring (2008) wrote about how clinical nurse specialists, when competent in EBP, can leverage their unique roles as expert practitioners, researchers, consultants, educators, and leaders to promote and support EBP in their organizations.


In addition, competencies related to the academic setting have been developed. The National League for Nursing (NLN) developed competencies for program levels within nursing education. Definitions, guides to curricular development, and criteria for use in developing certification and continuing education programs are a focus for faculty and administrators in academic settings (NLN, 2013).

Stevens and colleagues defined essential competencies for EBP to be incorporated into nursing education programs to serve as a helpful guide to faculty in teaching and preparing students for EBP and to "provide a basis for professional competencies in clinical practice" (Stevens, 2009, p. 8). However, to our knowledge, there has never been a systematic research-based process used to develop contemporary EBP competencies for practicing registered professional nurses and APNs who are delivering care in real-world clinical settings, defined by the leaders and mentors responsible for facilitating and sustaining evidence-based care in today's healthcare systems.

AIM

The aim of this study was to develop a clear set of competencies for both practicing registered nurses and APNs in clinical settings. These competencies can be used by healthcare institutions in their quest to achieve high performing systems that consistently implement and sustain evidence-based care.

METHODOLOGY

The first step in formulating the competencies involved seven national experts from both clinical and academic settings across the United States, who were identified and invited to participate in developing EBP competencies through a consensus building process. These experts were chosen because they were recognized national experts in EBP, having influenced the field or been widely published in the area. Through the consensus building process, the EBP expert panel produced two lists of essential EBP competencies: one set for practicing registered nurses and one for APNs. For registered nurses, the experts identified 12 essential EBP competencies; for APNs, there were 11 additional essential EBP competencies (23 total).

The next step in developing the competencies involved utilizing the Delphi survey technique, which seeks to obtain consensus on the opinions of experts through a series of structured rounds. The Delphi technique is an iterative multistage process designed to transform opinion into group consensus. Studies employing the Delphi technique make use of individuals who have knowledge of the topic being investigated and who are identified as "experts" selected for the purpose of applying their knowledge to a particular issue or problem. The literature reflects that an adequate number of rounds must be employed in a Delphi study in order to find the balance between producing meaningful results and avoiding sample fatigue. Recommendations for the Delphi technique suggest that two or three rounds are preferred to achieve this balance (Hasson, Keeney, & McKenna, 2000).

Inclusion Criteria

The expert participants for this Delphi survey of EBP competencies were individuals who had attended an intensive continuing education course or program in EBP at the first author's academic institution within the last 7 years and who identified themselves as EBP mentors. The EBP mentors were nurses with in-depth knowledge and skills in EBP, along with skills in organizational and individual behavior change, who work directly with clinicians to facilitate the rapid translation of research findings into healthcare systems to improve healthcare quality and patient outcomes. EBP mentors guide others to consistently implement evidence-based care by educating and role modeling the use of evidence in decision making and the advancement of best practice (Melnyk, 2007).

An important design element of a Delphi study is that the investigators must determine the definition of consensus in relation to the study's findings prior to the data collection phase (Williams & Webb, 1994). Although there is no universal standard for the proportion of participant agreement that equates with consensus, recommendations range from 51% to 80% agreement for the items on the survey (Green, Jones, Hughes, & Williams, 2002; Sumsion, 1998). Data analysis involves management of both qualitative and quantitative information gathered from the survey. Qualitative data from the first round are grouped into similar items in an attempt to create a universal description. Subsequent rounds involve quantitative data collected to ascertain collective opinion, reported using descriptive and inferential statistics.

In preparation for the Delphi survey of EBP mentors across the United States, the study was submitted to the first author's institutional review board and was deemed exempt. Prior to the survey being disseminated electronically to the EBP mentors for review, the study team determined the parameters of consensus. The EBP mentors were asked to rate each competency for: (a) the clarity of the written quality of the competency and (b) how essential the competency was for practicing nurses and APNs. The criterion for agreement was that 70% of the EBP mentor respondents would rate the EBP competency (e.g., "Questions clinical practices for the purpose of improving the quality of care"; "Searches for external evidence to answer focused clinical questions") between 4.5 and 5 on a five-point Likert scale that ranged from 1, not at all, to 5, very much so. The study team also decided that competencies which EBP mentors identified as not clearly written would be reworded, taking their feedback into consideration, and resent to the participants in a second round of the Delphi survey. The essential EBP competencies were sent via e-mail to the EBP mentors for review, rating, and feedback in July 2012.
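The preset agreement rule is simple to check mechanically. The sketch below is illustrative only; the function name, threshold handling, and sample ratings are our own, not the study's. It simply encodes the published criterion that at least 70% of respondents rate a competency between 4.5 and 5:

```python
# Illustrative sketch only: encodes the study's stated consensus rule
# (>= 70% of respondents rate a competency in the 4.5-5 range).
# The ratings below are hypothetical, not study data.

def meets_consensus(ratings, threshold=0.70, lo=4.5, hi=5.0):
    """Return True if the share of ratings in [lo, hi] meets the threshold."""
    in_range = sum(1 for r in ratings if lo <= r <= hi)
    return in_range / len(ratings) >= threshold

hypothetical = [5, 5, 4.5, 4, 5, 5, 4.5, 3, 5, 5]
print(meets_consensus(hypothetical))  # 8 of 10 ratings in range -> True
```

A competency failing this check would, per the study's process, be reviewed for clarity rather than automatically discarded.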

Each EBP mentor participant was contacted through e-mail and invited to participate in the anonymous Delphi survey. An introduction to the study and its parameters was included in the introductory e-mail, along with the planned timeline for the study. The survey consisted of three sections: (a) demographic data, (b) rating of essential EBP competencies for practicing registered nurses, and (c) rating of essential EBP competencies for practicing APNs. The survey was open for 2 weeks from the first contact date. A reminder e-mail was sent 1 week following the first contact, and a second reminder was issued a day before the survey closed. Consent was obtained by virtue of the participant completing the survey.

Table 2. Participant Characteristics (N = 80)

                                      Mean  Median  Max  Min
Age                                    52     54     70   25
Years in active clinical practice      26     29     43    1
Years as an advanced practice nurse     9      5     40    0
Number of years as an EBP mentor        3      3     15    0

The EBP mentors were asked to respond to two questions about each of the EBP competencies on the survey using a five-point Likert scale with 1 = Not at all, 2 = A little, 3 = Somewhat, 4 = Moderately so, and 5 = Very much so. The first question addressed how essential the competency was for nurses and APNs and was stated as, "To what extent do you believe the above EBP competency is essential for practicing registered professional nurses?" The second question focused on the clarity of the competency and was stated as, "Is the competency statement clearly written?" If participants answered "no" in response to whether the statement was clearly written, they were asked how they would rewrite it. Only the EBP mentors who identified themselves as APNs were permitted to rate the APN competencies.

FINDINGS
Of the 315 EBP mentors originally contacted to participate in
the survey, 80 responded, yielding a 25% response rate. Demographic
data collected reflected that all 80 participants were
female, with a mean age of 52 years and an average of 26 years in
clinical practice. Fifty of the 80 respondents were self-reported
as APNs and the average number of years as an EBP mentor
was reported as 3 (see Table 2). The majority of the participants
had a Master’s or higher educational degree and were
currently serving in an EBP mentor role. The participants
reported holding both clinical positions and academic positions
(see Table 3). There was a relatively even distribution of partic-
ipants who worked in Magnet (n = 36; 45%) and non-Magnet
institutions (n = 44; 55%). The sample represented a variety of
primary work settings (see Table 4).

In the competency rating section of round 1 of the survey,
all of the practicing registered nurse and APN competencies
achieved consensus as an essential competency, based on the
preset criteria. However, in the clarity portion of the rating
section, there was feedback provided by participants regarding
refining the wording of four of the competencies. Each of these

Table 3. Race, Ethnicity, Education, and Role (N = 80)

                                                  n
Race
  White                                          75
  Black or African American                       2
  Native Hawaiian or other Pacific Islander       1
  Asian                                           2
Ethnicity
  Not Hispanic or Latino                         79
  Hispanic or Latino                              1
Education
  Bachelor’s                                      9
  Master’s                                       48
  PhD                                            18
  DNP                                             4
  Other                                           1
Current position
  Staff nurse                                     5
  Nurse practitioner                              2
  Clinical nurse specialist                      12
  Clinical nurse leader                           0
  Nurse educator                                 18
  Nurse manager/administrator                     8
  Academic faculty                               10
  Academic administration                         3
  Other                                          22
Currently serving in an EBP mentor role
  Yes                                            63
  No                                             17

four competencies was reworded and included in a second
round of the Delphi study. None of the competencies were
eliminated (see Tables 5 and 6).

Based on the feedback received from the participants in
round 1 related to the clarity of the competencies, the following
process was operationalized. In the single case where clarity
feedback was related simply to consistency in terminology, the
competency was reworded to incorporate the feedback and was
included in round 2 for the reviewers to see that their feed-
back had been integrated. However, they were not asked to
revote on the competency. In the cases where the clarity feed-
back was related to the action described (such as Formulates a
PICOT question vs. Participates in formulating a PICOT ques-
tion), the competencies were reworded and included in round


EBP Competencies for Practice

Table 4. Organization (N = 80)

                                      n
Type of primary work setting
  Community hospital                 21
  Academic medical center            33
  Academic institution               21
  Primary care practice               1
  Community health setting            0
  Other                               4
Work in a Magnet designated institution
  Yes                                36
  No                                 44

Table 5. Round 1 Registered Nurse (RN) Competencies (N = 80)

Competency   Consensus Mean ± SD   Reword (Yes–No)   Revote (Yes–No)
 1           4.9 ± 0.3             No                No
 2           4.7 ± 0.5             No                No
 3           4.7 ± 0.5             Yes               Yes
 4           4.8 ± 0.4             No                No
 5           4.6 ± 0.5             Yes               Yes
 6           4.6 ± 0.5             Yes*              Yes*
 7           4.7 ± 0.5             No                No
 8           4.7 ± 0.5             No                No
 9           4.8 ± 0.4             No                No
10           4.7 ± 0.4             No                No
11           4.7 ± 0.5             No                No
12           4.8 ± 0.4             No                No

Note. *Competency 6 was split into two separate competency statements
based on round 1 feedback.

2 for the reviewers to see that their feedback had been integrated,
and they were asked to revote on whether the revised
competency still rated as an essential EBP competency. Only
registered nurse competencies received feedback that required
revoting. All of the APN competencies reached consensus
with only minor clarifications in terminology needed.

Table 6. Round 1 APN Competencies (N = 50)

Competency   Consensus Mean ± SD   Reword (Yes–No)   Revote (Yes–No)
 1           4.8 ± 0.4             No                No
 2           4.9 ± 0.3             No                No
 3           4.9 ± 0.3             No                No
 4           4.9 ± 0.3             No                No
 5           4.9 ± 0.2             No                No
 6           5.0 ± 0.2             No                No
 7           4.9 ± 0.3             No                No
 8           4.9 ± 0.3             No                No
 9           4.9 ± 0.3             No                No
10           4.9 ± 0.2             No                No
11           5.0 ± 0.2             No                No

Three registered nurse competencies required rewriting
and revoting. Two competencies (#3, #5) required rewording
and one competency (#6) required splitting into two separate
competencies. Competency 3, formulates focused clinical ques-
tions in PICOT (i.e., Patient population; Intervention or area of
interest; Comparison intervention or group; Outcome; Time), was
revised to be: participates in the formulation of clinical ques-
tions using PICOT* format (*PICOT = Patient population;
Intervention or area of interest; Comparison intervention or
group; Outcome; Time). Competency 5, conducts rapid critical
appraisal of preappraised evidence and clinical practice guidelines
to determine their applicability to clinical practice, was revised to
be: participates in critical appraisal of preappraised evidence
(such as clinical practice guidelines, evidence-based policies
and procedures, and evidence syntheses).

The EBP mentor responses and feedback resulted in the
number of competency statements being increased when com-
petency 6 was split into two separate competency statements.
The additional competency statement was generated based on
feedback related to the clarity of the competency, which re-
flected that more than one idea or action was expressed in
the single competency statement. Competency 6, participates
in critical appraisal (i.e., rapid critical appraisal, evaluation, and
synthesis of published research studies) to determine the strength and
worth of evidence as well as its applicability to clinical practice, was
reworded as new competency 6, participates in the critical ap-
praisal of published research studies to determine their strength and
applicability to clinical practice, and new competency 7, partici-
pates in the evaluation and synthesis of a body of evidence gathered
to determine its strength and applicability to clinical practice.

Table 7. EBP Competencies

Evidence-based practice competencies for practicing registered professional nurses

1. Questions clinical practices for the purpose of improving the quality of care.

2. Describes clinical problems using internal evidence.* (*internal evidence = evidence generated internally within a clinical setting, such as patient assessment data, outcomes management, and quality improvement data)

3. Participates in the formulation of clinical questions using PICOT* format. (*PICOT = Patient population; Intervention or area of interest; Comparison intervention or group; Outcome; Time)

4. Searches for external evidence* to answer focused clinical questions. (*external evidence = evidence generated from research)

5. Participates in critical appraisal of preappraised evidence (such as clinical practice guidelines, evidence-based policies and procedures, and evidence syntheses).

6. Participates in the critical appraisal of published research studies to determine their strength and applicability to clinical practice.

7. Participates in the evaluation and synthesis of a body of evidence gathered to determine its strength and applicability to clinical practice.

8. Collects practice data (e.g., individual patient data, quality improvement data) systematically as internal evidence for clinical decision making in the care of individuals, groups, and populations.

9. Integrates evidence gathered from external and internal sources in order to plan evidence-based practice changes.

10. Implements practice changes based on evidence and clinical expertise and patient preferences to improve care processes and patient outcomes.

11. Evaluates outcomes of evidence-based decisions and practice changes for individuals, groups, and populations to determine best practices.

12. Disseminates best practices supported by evidence to improve quality of care and patient outcomes.

13. Participates in strategies to sustain an evidence-based practice culture.

Evidence-based practice competencies for practicing advanced practice nurses
All competencies of practicing registered professional nurses plus:

14. Systematically conducts an exhaustive search for external evidence* to answer clinical questions. (*external evidence = evidence generated from research)

15. Critically appraises relevant preappraised evidence (i.e., clinical guidelines, summaries, synopses, syntheses of relevant external evidence) and primary studies, including evaluation and synthesis.

16. Integrates a body of external evidence from nursing and related fields with internal evidence* in making decisions about patient care. (*internal evidence = evidence generated internally within a clinical setting, such as patient assessment data, outcomes management, and quality improvement data)

17. Leads transdisciplinary teams in applying synthesized evidence to initiate clinical decisions and practice changes to improve the health of individuals, groups, and populations.

18. Generates internal evidence through outcomes management and EBP implementation projects for the purpose of integrating best practices.

19. Measures processes and outcomes of evidence-based clinical decisions.

20. Formulates evidence-based policies and procedures.

21. Participates in the generation of external evidence with other healthcare professionals.

22. Mentors others in evidence-based decision making and the EBP process.

23. Implements strategies to sustain an EBP culture.

24. Communicates best evidence to individuals, groups, colleagues, and policy makers.

Copyright: Melnyk, Gallagher-Ford, and Fineout-Overholt (2013).


Table 8. Round 2 Registered Nurse (RN) Competencies (N = 59)

Competency   Consensus Mean ± SD   Consensus Met (Yes–No)
3            4.6 ± 0.5             Yes
5            4.6 ± 0.5             Yes
6            4.6 ± 0.5             Yes
7            4.5 ± 0.5             Yes

This process rendered a revised set of EBP competencies
that included 13 competencies for registered nurses and an
additional 11 EBP competencies (for a total of 24) for APNs
(see Table 7).

In October 2012, the second round of the Delphi study was
conducted: the revised set of EBP competencies was e-mailed
to the EBP mentors who had responded in the first round of the
study. The round 2 survey provided feedback to
the EBP mentors about the process that had been conducted by
the study team to render the revised competencies and asked
them to rate the three revised and the two new (split) EBP
competency statements using the same five-point Likert rank-
ing scale used in round 1. Fifty-nine of the 80 original EBP
mentors responded to the second round of the study (74%)
by the response deadline. In round 2 of the study, each of the
13 registered nurse competencies achieved consensus (based
on the preset criteria) as an essential EBP competency (see
Table 8). Throughout the process, none of the EBP mentors
articulated additional competencies, indicating a high level of
consensus about the completeness of the list of EBP compe-
tencies identified in the study. The final list of consensus-built
EBP competencies is included in Table 7.

DISCUSSION
Competencies are a mechanism that supports health profes-
sionals in providing high-quality, safe care (Dunn et al., 2000).
The issue of nursing competence in implementing EBP is im-
portant for individual nurses, APNs, nurse educators, nurse
executives, and healthcare organizations. Regardless of the sys-
tem, the culture and context or environment in which nurses
practice impact the success of engagement in and sustainabil-
ity of EBP. Therefore, it is imperative for nurse executives and
leaders to invest in creating a culture and environment to sup-
port EBP (Melnyk, Fineout-Overholt, et al., 2012). One action
toward investment in a culture of EBP is to provide a mecha-
nism for clarity in expectations for evidence-based care. Devel-
opment of evidence-based competencies provides a key mech-
anism for engagement in EBP and the delivery of high-quality
health care. Through a Delphi survey process, EBP competen-
cies were developed by EBP experts working in a variety of

settings, for registered professional nurses and APNs practic-
ing in real-world healthcare settings. These EBP competencies
can be used by healthcare systems to succinctly establish ex-
pectations regarding level of performance related to EBP by
registered professional nurses and APNs.

Multiple strategies can be used to incorporate competen-
cies into healthcare systems to improve healthcare quality, re-
liability, and patient outcomes as well as reduce variations in
care and costs. These strategies range from implementation of
competencies developed by the AACN, NLN, QSEN, and the
Institute of Medicine (IOM) from an organizational perspective

LINKING EVIDENCE TO ACTION

• Practice: Incorporation of EBP competencies into
healthcare system expectations and operations
can drive higher quality, reliability, and consistency
of healthcare as well as reduce costs. Support
systems in healthcare institutions, including educational
and skills building programs along with
availability of EBP mentors, should be provided
to assist practicing nurses and APNs in achieving
the EBP competencies.

• Research is needed to develop valid and reliable
instruments for assessing these competencies. Although
the Fresno tool has been developed as
a valid and reliable tool for assessing EBP competence
in medicine (Ramos, Schafer, & Tracz,
2003), it has not been tested with nursing or allied
health professionals. Future research should
also determine the relationship between implementation
of these EBP competencies and both
clinician and patient outcomes.

• Policy: Organizations that set standards for practice
should embrace and endorse the EBP competencies
as a tool to build and sustain acquisition of
EBP knowledge, development of EBP skills, and
incorporation of a positive attitude toward EBP to
promote best practices.

• Management: Nursing leaders should integrate
EBP competencies into multiple processes that
impact nurses across their clinical lifespan, including
interview questions, onboarding/orientation,
job descriptions, performance appraisals, and
clinical ladder promotion programs.

• Education: EBP competencies should be integrated
into both academic and clinical education
programs to establish and continuously reinforce
EBP as the foundation of practice.

Table 9. Strategies for Integration of the EBP Competencies

Promote a culture and context or environment that supports EBP

  Organizational strategies:
  • Assess the organization’s and employees’ readiness for implementation of EBP competencies prior to implementation to promote development of an effective strategic plan for their integration.
  • Include EBP competency language in the mission and vision statements for nursing as well as shared governance council charters.
  • Provide systems and resources that support the integration and use of EBP competencies, such as a critical mass of EBP mentors, access to library services including a dedicated librarian, and availability of a PhD-prepared nurse scientist.
  • Provide educational and skills building programs to support clinicians’ attainment of the EBP competencies.
  • Support the development of EBP mentors, who meet/exceed the EBP competencies, to support practicing nurses and APNs in EBP projects.

  Individual strategies:
  • Be an evidence-based clinician by integrating EBP competencies into daily practice to deliver the best care possible to patients and families.
  • Be a role model for others by making decisions based on evidence every day.
  • Include EBP competencies in role expectations of nurse leaders to support the implementation of EBP in all aspects of care.

Establish EBP performance expectations for all nurse leaders and clinicians

  Organizational strategies:
  • Include EBP-competency-related questions in interview processes.
  • Design onboarding/orientation programs that specifically align with EBP competencies.
  • Rewrite job descriptions to include the EBP competencies.

  Individual strategies:
  • Expect evidence-based decision making from others to promote a work environment where the best care is possible.

Sustain EBP activities and culture

  Organizational strategies:
  • Include EBP competencies in performance appraisals and clinical ladder programs.
  • Imbed EBP competencies in practice policy and guideline development processes.

  Individual strategies:
  • Become an EBP mentor and help others to develop and integrate the EBP competencies into their daily practice.

to actions and decisions made by point of care nurses (AACN,
2013; NLN, 2013; IOM, 2003). In addition, strategies can be
developed to integrate the scientifically derived, specific EBP
competencies developed in this study. EBP competencies can
be used as tools to guide the development of individuals
and organizations. Strategies for integration of the competen-
cies require both organizational and individual actions (see
Table 9).

LIMITATIONS
The main limitation of this study is that it used a convenience
sample of nurses who attended an EBP immersion workshop
at the first author’s institution, which may have biased the re-
search findings. In addition, some of the respondents were
not currently in an EBP mentorship role in practice settings.
Despite these limitations, the use of an expert EBP leader-
ship panel to first draft the competencies along with a Delphi


survey technique with individuals who had EBP mentorship
experience in real-world practice settings was a strength in the
development of this set of contemporary EBP competencies for
practicing nurses and APNs.

SUMMARY
A national consensus process and Delphi study was conducted
to establish contemporary EBP competencies for practicing
registered nurses and APNs. Incorporation of these EBP com-
petencies into healthcare systems should lead to higher quality
of care, greater reliability, improved patient outcomes, and re-
duced costs.

ACKNOWLEDGMENTS
The authors would like to thank the following national expert
panel who participated in the first phase of achieving consensus
in the development of these EBP competencies: Dr. Karen Bal-
akas, Dr. Ellen Fineout-Overholt, Dr. Anna Gawlinski, Dr. Mar-
ilyn Hockenberry, Dr. Rona F. Levin, Dr. Bernadette Mazurek
Melnyk, and Dr. Teri Wurmser. WVN

Author information

Bernadette Mazurek Melnyk, Associate Vice President for
Health Promotion, University Chief Wellness Officer, Dean
and Professor, College of Nursing, Professor of Pediatrics and
Psychiatry, College of Medicine, The Ohio State University,
Columbus, OH; Lynn Gallagher-Ford, Clinical Associate Pro-
fessor and Director, Center for Transdisciplinary Evidence-
based Practice, College of Nursing, The Ohio State Univer-
sity, Columbus, OH; Lisa English Long, Expert Evidence-based
Practice Mentor, Clinical Instructor, College of Nursing, The
Ohio State University, Columbus, OH; Ellen Fineout-Overholt,
Dean and Professor, Groner School of Professional Studies,
Chair, Department of Nursing, East Texas Baptist University,
Marshall, TX.
Address correspondence to Dr. Bernadette Mazurek Melnyk,
College of Nursing, The Ohio State University, 1585 Neil Av-
enue, Columbus, OH 43210, USA; Melnyk.15@osu.edu

Accepted 28 October 2013
Copyright C© 2014, Sigma Theta Tau International

References
American Association of Critical-Care Nurses (AACN). (2013).

Nurse Competencies of Concern to Patients, Clinical
Units and Systems. Retrieved from http://www.aacn.org/wd/
certifications/content/synpract2.pcms?menu

Burns, B. (2009). Continuing competency: What’s ahead? Journal
of Perinatal & Neonatal Nursing, 23(3), 218–227.

Dogherty, E. J., Harrison, M. B., Graham, I. D., Vandyk, A. D.,
& Keeping-Burke, L. (2013). Turning knowledge into action at
the point-of-care: The collective experience of nurses facilitating
the implementation of evidence-based practice. Worldviews on
Evidence-Based Nursing, 10(3), 129–139.

Dunn, S. V., Lawson, D., Robertson, S., Underwood, M., Clark,
R., Valentine, T., & Herewane, D. (2000). The development of

competency standards for specialist critical care nurses. Journal
of Advanced Nursing, 31(2), 339–346.

Ely, J. W., Osheroff, J. A., Chambliss, M. L., Ebell, M. H., & Rosenbaum,
M. E. (2005). Answering physicians’ clinical questions:
Obstacles and potential solutions. Journal of the American Medical
Informatics Association, 12(2), 217–224.

Estabrooks, C. A., O’Leary, K. A., Ricker, K. L., & Humphrey, C.
K. (2003). The Internet and access to evidence: How are nurses
positioned? Journal of Advanced Nursing, 42(1), 73–81.

Estabrooks, C. A., Squires, J. E., Cummings, G. G., Birdsell, J.
M., & Norton, P. G. (2009). Development and assessment of
the Alberta Context Tool. BMC Health Services Research, 9(234),
1–12.

Fink, R., Thompson, C., & Bonnes, D. (2005). Overcoming barriers
and promoting the use of research in practice. Journal of Nursing
Administration, 35(3), 121–129.

Gerrish, K., & Clayton, J. (2004). Promoting evidence-based prac-
tice: An organizational approach. Journal of Nursing Management,
12(2), 114–123.

Gonzi, A., Hager, P., & Athanasou, J. (1993). The development of
competency-based assessment strategies for the professions. National
Office of Overseas Skills Recognition Research Paper No. 8.
Canberra, Australia: Australian Government Publishing
Service.

Green, B., Jones, M., Hughes, D., & Williams, A. (2002). Apply-
ing the Delphi technique in a study of GPs’ information re-
quirements. Health & Social Care in the Community, 7(3), 198–
205.

Hasson, F., Keeney, S., & McKenna, H. (2000). Research guide-
lines for the Delphi survey technique. Journal of Advanced Nurs-
ing, 32(4), 1008–1015.

Institute of Medicine (IOM) (US), Committee on Assuring the
Health of the Public in the 21st Century. (2003). The future of
the public’s health in the 21st century. Washington, DC: National
Academies Press.

Jennings, B. M., & Loan, L. A. (2001). Misconceptions among
nurses about evidence-based practice. Journal of Nursing Schol-
arship, 33(2), 121–127.

Kring, D. L. (2008). Clinical nurse specialist practice domains and
evidence-based practice competencies: A matrix of influence.
Clinical Nurse Specialist, 22(4), 179–183.

McGinty, J., & Anderson, G. (2008). Predictors of physician com-
pliance with American Heart Association guidelines for acute
myocardial infarction. Critical Care Nursing Quarterly, 31(2), 161–
172.

Melnyk, B. M. (2007). The evidence-based practice mentor: A
promising strategy for implementing and sustaining EBP in
healthcare systems [editorial]. Worldviews on Evidence-Based
Nursing, 4(3), 123–125.

Melnyk, B. M., & Fineout-Overholt, E. (2011). Evidence-based prac-
tice in nursing and healthcare. A guide to best practice. Philadelphia,
PA: Lippincott, Williams, & Wilkins.

Melnyk, B. M., Fineout-Overholt, E., Feinstein, N., Li, H. S., Small,
L., Wilcox, L., & Kraus, R. (2004). Nurses’ perceived knowledge,
beliefs, skills, and needs regarding evidence-based practice: Im-
plications for accelerating the paradigm shift. Worldviews on
Evidence-Based Nursing, 1(3), 185–193.

Melnyk, B. M., Fineout-Overholt, E., Gallagher-Ford, L., & Kaplan,
L. (2012). The state of evidence-based practice in US nurses:

Critical implications for nurse leaders and educators. Journal of
Nursing Administration, 42(9), 410–417.

Melnyk, B. M., Fineout-Overholt, E., & Mays, M. Z. (2008). The
evidence-based practice beliefs and implementation scales: Psy-
chometric properties of two new instruments. Worldviews on
Evidence-Based Nursing, 5(4), 208–216.

Melnyk, B. M., & Gallagher-Ford, L. (2013). Evidence-based practice
competencies for registered practicing nurses and advanced practice
nurses. Columbus, OH: The Ohio State University College of
Nursing Center for Transdisciplinary Evidence-Based Practice.

Melnyk, B. M., Grossman, D., Chou, R., Mabry-Hernandez, I.,
Nicholson, W., Dewitt, T., . . . US Preventive Services Task
Force. (2012). USPSTF perspective on evidence-based preven-
tive recommendations for children. Pediatrics, 130(2), e399–
e407. DOI: 10.1542/peds.2011-2087

National League for Nursing (NLN). (2013). Competencies
for Nursing Education. Retrieved from: http://www.nln.org/
facultyprograms/competencies/graduates_competencies.htm

Newhouse, R. P., Dearholt, S., Poe, S., Pugh, L. C., & White,
K. M. (2007). Organizational change strategies for evidence-
based practice. Journal of Nursing Administration, 37(12), 552–
557.

Pravikoff, D. S., Pierce, S. T., & Tanner, A. (2005). Evidence-based
practice readiness study supported by academy nursing infor-
matics expert panel. Nursing Outlook, 53(1), 49–50.

Quality and Safety Education for Nurses (QSEN). (2013).
The Evolution of the Quality and Safety Education for
Nurses (QSEN) Initiative. Retrieved from: http://qsen.org/
about-qsen/project-overview/

Ramos, K. D., Schafer, S., & Tracz, S. M. (2003). Validation of the
Fresno test of competence in evidence-based medicine. British
Medical Journal, 326, 319–321.

Retsas, A. (2000). Barriers to using research evidence in nursing
practice. Journal of Advanced Nursing, 31(3), 599–606.

Rycroft-Malone, J. (2004). The PARIHS framework—A frame-
work for guiding the implementation of evidence-based practice.
Journal of Nursing Care Quality, 19(4), 297–304.

Rycroft-Malone, J., Harvey, G., Seers, K., Kitson, A., McCormack,
B., & Titchen, A. (2004). An exploration of the factors that in-
fluence the implementation of evidence into practice. Journal of
Clinical Nursing, 13(8), 913–924.

Stevens, K. R. (2009). Essential competencies for evidence-based prac-
tice in nursing (2nd ed.). San Antonio, TX: Academic Center
for Evidence-Based Practice, University of Texas Health Science
Center at San Antonio.

Sumsion, T. (1998). The Delphi technique: An adaptive research
tool. British Journal of Occupational Therapy, 61(4), 153–156.

Titler, M. G. (2009). Developing an evidence-based practice. In G.
LoBiondo-Wood & J. Haber (Eds.), Nursing research: Methods and
critical appraisal for evidence-based practice (7th ed., pp. 385–437).
St Louis, MO: Mosby.

Williams, P. L., & Webb, C. (1994). The Delphi technique: A
methodological discussion. Journal of Advanced Nursing, 19(1),
180–186.

doi 10.1111/wvn.12021
WVN 2014;11:5–15


