Evidence-Based Practice for Nurses

Phase I:
Preparation

Phase II:
Validation

Phase III: Comparative
evaluation/decision
making

Phase IV:
Translation/application

Phase V:
Evaluation

Purpose, context, & sources of
research evidence:

Credibility of findings & potential for/
detailed qualifiers of application:

Synthesis & decisions/
recommendations per criteria of
applicability:

Operational definition of use/actions
for change:

Alternative types of evaluation:


  • Potential Issues/Catalysts
    = a problem, including
    unexplained variations or
    less-than-best practice;
    or routine update of
    knowledge; or validation/
    routine revision of procedure, policy,
    etc.; or innovative program goal

  • Affirm perceived problems with
    internal evidence

  • Focus on high-priority issues

  • Decide if need to form a team or
    involve formal "structures"/key
    stakeholders

  • Consider other influential internal
    and external factors, such as
    beliefs, resources, or
    timelines

  • Define desired, measurable
    outcome/s

  • Seek out systematic reviews

  • Determine need for an explicit type
    of research evidence, if
    relevant

  • Select research sources with
    conceptual fit

    • Critique & synopsize essential
      components, operational details,
      and other qualifying factors, per source




    • See instructions for use of
      utilization-focused review tables to
      facilitate this task; fill in the tables for
      group decision making or
      potential future synthesis



  • Critique systematic reviews

  • Reassess fit of individual sources

  • Rate the level & quality of each
    evidence source per a "table of
    evidence"

  • Differentiate statistical and
    clinical significance

  • Eliminate noncredible sources

  • End the process if there is no
    evidence or if there is clearly
    insufficient credible research evidence
    that meets your need


See Stetler et al. (1998) for noted
tables, reviews, & synthesis process.


  • Synthesize the cumulative findings:


    • Logically organize & display
      the similarities and differences across multiple
      findings, per common aspects or sub-elements
      of the topic under review

    • Evaluate degree of substantiation of each
      aspect/sub-element; reference any qualifying
      conditions



  • Evaluate degree & nature of other criteria:
    feasibility (r,r,r = risk, resources, readiness);
    pragmatic fit; & current practice

  • Make a decision whether/what to use:


    • Can be a personal practitioner-level decision or
      a recommendation to others

    • Judge the strength of this decision; indicate if primarily
      "research-based" or, per use of supplemental information,
      "evidence-based"; qualify the related level of strength
      of decision/recommendations per related table

    • For formal recommendations, determine degree of
      stakeholder consensus



  • If decision = "Not use" research findings:


    • May conduct own research or delay use until additional
      research is done by others

    • If still decide to act now, e.g., on evidence of consensus or
      another basis for practice, STOP use of model but
      consider need for planned change and evaluation



  • If decision = "Use/Consider Use," can mean a
    recommendation for or against a specific practice

    • Types = cognitive, symbolic &/or instrumental

    • Methods = informal or formal; direct or indirect

    • Levels = individual, group, or department/organization

    • Direct instrumental use: change individual behavior (vis-à-vis assessment; plan/
      intervention options; implementation details; &/or evaluation); or change policy,
      procedure, protocol, algorithm, program components, etc.

    • Cognitive use: validate current practice; change personal way of thinking; increase
      awareness; better understand or appreciate condition/s or experience/s

    • Symbolic use: develop position paper or proposal for change; persuade others
      regarding a way of thinking

      CAUTION: Assess whether translation/product or use goes beyond actual
      findings/evidence:




    • Research evidence may or may not provide various details for a complete
      policy, procedure, etc.; indicate this fact to users, and note differential
      levels of evidence therein



  • Formal dissemination & change strategies should be planned
    per relevant research (include Dx analysis):


    • Simple, passive education is rarely effective as an isolated strategy.
      Consider multiple strategies, e.g., interactive education, opinion leaders,
      educational outreach, audit, etc.

    • Consider implementation models (e.g., Kitson or PRECEDE)



  • Consider need for appropriate, reasoned variation

  • WITH B, where made a decision to use in the setting:


    • With formal use, may need a dynamic evaluation to effectively implement &
      continuously improve/refine use of best available evidence



  • WITH B', where made a decision to consider use & thus obtain additional,
    pragmatic information before a final decision


    • With formal consideration, need a pilot project

    • With a pilot project, must assess if need IRB review, per relevant
      institutional criteria



  • Evaluation can be formal or
    informal, individual or
    institutional

  • Consider cost-benefit of various
    evaluation efforts

  • Use RU as a process to
    enhance credibility
    of evaluation data

  • For both dynamic & pilot
    evaluations, include two
    types of evaluative
    information:


    • formative, regarding
      actual implementation
      & goal progress

    • summative, regarding
      Phase I outcomes and
      goal results
NOTE: Model applies to all forms
of practice, i.e., educational,
clinical, managerial, or
other
FIGURE 16-1 Continued
16.1 Evidence-Based Practice Models to Overcome Barriers 429
