Using The Supported Employment Fidelity Scale: An Introduction for Practitioners
duration: 1 min. 53 sec.
slide 16
Factors Increasing the Reliability and Validity of Ratings

The research shows that independent evaluators using multiple sources of information make the most valid ratings. A central feature of the fidelity assessment is talking to the individuals involved in the practice. Those typically interviewed include the program leader, the practitioners or staff providing the services, consumers receiving the service, and family members.

How you interview is very important. The fidelity scale protocol gives sample questions and guidelines for asking questions, but assessors should also be alert for incidental data that may reveal how the program actually works: for example, what practitioners say and how they say it, or what they DON’T say or do when there is an opportunity. Skillful fidelity assessors know to follow up with additional questions when they perceive a discrepancy.

An adequate fidelity assessment includes observations in multiple practice settings. Try to capture the natural flow of work, including one or more regularly scheduled meetings, and make sure you schedule the visit on a day when those meetings take place. Observations of team meetings, reviews of charts, and observation of interventions are also important sources of information. A day-long site visit is the optimal method for acquiring information, and interviewers should be familiar with the evidence-based principles being rated.

Although outside raters are recommended, fidelity scales can also be used by program managers to conduct self-ratings. The validity of self-ratings, or any ratings for that matter, depends on the knowledge and objectivity of the person making the ratings, as well as on access to accurate information pertaining to the ratings. Self-ratings are encouraged with appropriate caveats regarding the potential biases introduced by raters who are invested in seeing a program “look good” or who do not fully understand the evidence-based principles.