RCA Background

Pretest

The pretest, which took place from March 8 to March 18, 2014, tested the system that would deliver the RCA online. Thirty-four registrants who volunteered to participate completed a shortened version of the RCA and were then asked for feedback on navigation, instructions and other issues related to the test format.

Pilot

The pilot test, which took place on November 4, 5, and 8, 2014, was a full-length version of the RCA and primarily tested the specific content – i.e., the cases and questions. From an initial sample of 348 full registrants, 173 booked sessions for the pilot and 169 actually wrote the pilot test.

Similar to the planned process for the actual RCA, registrants had to identify a proctor (someone to supervise them while they completed the RCA) and book a time slot to complete the pilot test. Pilot test participants were also asked to provide feedback on the navigation and instructions of the test. This larger sample size allowed for the analysis of statistics for each item (case and question).

In addition, the pilot test marked the first full meeting of the Board of Examiners, who were appointed in November 2013 by the CPTBC Board and trained in March 2014. The Board of Examiners met on January 17, 2015 to review the validity and reliability of the pilot test and to set the cut scores. The Board of Examiners recommended that individual results not be reported from the pilot because the reliability of each test form was below the threshold recommended for this type of assessment.
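For readers interested in what a test-form reliability estimate looks like in practice, the sketch below computes one common internal-consistency index, Cronbach’s alpha, from a small made-up item-response matrix. The data, the helper name cronbach_alpha, and the 0.80 cut-off are illustrative assumptions only; they are not the pilot data or the threshold used by the Board of Examiners.

```python
import numpy as np

def cronbach_alpha(scores: np.ndarray) -> float:
    """Estimate internal-consistency reliability (Cronbach's alpha).

    scores: 2-D array, rows = examinees, columns = items (e.g., 0/1 scoring).
    """
    n_items = scores.shape[1]
    item_variances = scores.var(axis=0, ddof=1)      # variance of each item
    total_variance = scores.sum(axis=1).var(ddof=1)  # variance of total scores
    return (n_items / (n_items - 1)) * (1 - item_variances.sum() / total_variance)

# Hypothetical item-response matrix: 5 examinees x 4 items (1 = correct, 0 = incorrect)
responses = np.array([
    [1, 1, 0, 1],
    [1, 0, 0, 1],
    [0, 1, 1, 1],
    [1, 1, 1, 1],
    [0, 0, 0, 1],
])

alpha = cronbach_alpha(responses)
THRESHOLD = 0.80  # illustrative cut-off only; the actual recommended threshold is not stated here
print(f"alpha = {alpha:.2f}; report individual results: {alpha >= THRESHOLD}")
```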

The Board of Examiners further recommended that each RCA consist of a single practice context in order to maximize reliability, keep the RCA within a reasonable time limit so as not to over-burden registrants, and ensure large enough sample sizes for proper statistical analyses.

The rationale for this recommendation is as follows: for the pilot test, registrants were invited to complete the RCA in either one or two of the four practice contexts. Those who chose two contexts did half of the RCA in one context and half in the other. This approach raised some important concerns in the results. First, projections based on the results revealed that the reliability of RCA forms containing two practice contexts may be too low to assign scores or make pass/fail decisions about either of the two practice contexts with the degree of statistical confidence recommended for this type of examination. According to established standards for test development, an exam containing two different practice contexts should be treated as two separate exams – one for each practice context. Therefore, scores and reliability should be calculated and reported for each of the two contexts separately. The pilot test results also strongly indicated that, in order to keep the total testing time within a desirable limit, it is not possible to include enough cases and questions in each of two contexts to reach the degree of reliability necessary for each.
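One standard way to project how reliability changes with test length is the Spearman-Brown prophecy formula, which relates the reliability of a full-length form to that of a shortened or lengthened one. The sketch below is a minimal illustration of that relationship; the starting reliability of 0.80 and the halving/doubling factors are assumptions for demonstration, not figures from the pilot analysis.

```python
def spearman_brown(reliability: float, length_factor: float) -> float:
    """Project reliability when test length changes by `length_factor`
    (e.g., 0.5 = half as many items, 2.0 = twice as many)."""
    return (length_factor * reliability) / (1 + (length_factor - 1) * reliability)

# Illustrative numbers only (not pilot figures): a full-length single-context
# form with reliability 0.80, split so each practice context gets half the items.
full_form_reliability = 0.80
per_context = spearman_brown(full_form_reliability, 0.5)   # ~0.67 for each half-length context
doubled = spearman_brown(full_form_reliability, 2.0)       # ~0.89 if the form were twice as long
print(f"half-length form: {per_context:.2f}, double-length form: {doubled:.2f}")
```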

In addition, the number of registrants who chose to complete the pilot test in two practice contexts was low. Projections based on these numbers indicated that there would likely not be enough registrants completing the RCA in two contexts at each sitting to provide the sample sizes required for the essential statistical analyses.

The Spring 2015 newsletter expands on this and can be found here.

Background and Development Documents

Quality Assurance Program Blueprint – This document provides a detailed outline of the purpose, definition, and standards of the RCA. The blueprint covers three dimensions: the contexts of physical therapy practice, the key regulatory topic areas, and the essential competencies to be addressed.

Quality Assurance Program Backgrounder Document – This document lists the assumptions, guiding principles, and design features that were used as a guideline to make decisions about the program.

Note: indicates that the document has a clause or clauses discussing how a registrant could have a choice of two practice contexts (i.e., half of the questions in one context and half in a second context) for the RCA. This option will not be offered due to concerns with reliability; registrants writing in 2015 and onward are therefore required to demonstrate continuing competence in one of the four practice contexts.

Reasons for Choosing an Online Written Exam Format

Following a 2007 change to the provincial legislation (the Health Professions Act), the College is now expected to provide the public with evidence of the competence of its registrants.

Periodic assessment of 3,000 physical therapists in a valid, reliable manner that allows for some objective ‘evidence’ of continuing competence is a challenging enterprise.

Over a two-year period, a comprehensive review of the educational and assessment literature was completed by College committees and staff, with the assistance of experts in performance assessment. This review process included exploring current best practices in assessing continuing competence across Canada and internationally. Many assessment approaches were considered, including a variety of written testing formats, clinical tests, portfolios and on-site assessments. The QAP Backgrounder lists the assumptions, guiding principles and design features that were used as a guideline to make decisions about the program. It also explains other aspects considered in decision-making.

Upon careful consideration, a written test was selected as the best approach for the College to meet the regulatory requirement for a consistent, valid and reliable assessment of physical therapists’ continuing competence – their knowledge, knowledge application, clinical reasoning and decision-making skills – in a cost-effective way. The written test uses a flexible question-and-answer format, called ‘Key Features’, to assess approaches to common situations encountered in everyday physical therapy practice.

The Quality Assurance Committee recommended a written test using a Key Features case-based format due to its strong validity and reliability, and also because this approach is economically sustainable and administratively feasible. The questions can be designed to test the application of knowledge to practice – i.e., what clinicians actually ‘do’ in their area of practice.

Although there is no perfect tool for measuring continuing competence, the College is confident that a written exam satisfies legislative requirements and is also fair to British Columbian physical therapists. The Board approved the decision to deliver the RCA online to best address the logistical, administrative, and cost implications for both registrants and the College.

Assessment Tools Considered Before Choosing a Written Exam

Following an exhaustive research process of environmental scanning and review of key literature to understand the issues, options and alternatives for best practices in continuing competence, the College established a framework for the Quality Assurance Program, overseen by a committee of physical therapists, other health professionals and public members.

The approved framework acknowledged that there is no perfect tool for assessing continuing competence. The Quality Assurance Program Advisory Team explored many assessment tools in detail before recommending a written test as a cost-effective, efficient baseline tool for the periodic assessment of all physical therapists in British Columbia.

Assessment tools that were carefully considered are outlined below, along with a ‘top line’ summary of the limitations that made them unsuitable:

  • Portfolios – A portfolio is a collection of documents, reflections and experiences in hard copy or electronic form that reflects the practitioner’s career and demonstrates evidence of ongoing learning and the maintenance of competence. The assessment literature concerning portfolios suggests that, although useful for development and feedback, their limitations include subjective and often cumbersome scoring when there are multiple raters/scorers. Operationally, the administration and tracking of portfolios is very costly (i.e., much more costly per person than a written test).
  • Continuing Education Credits – While collecting Continuing Education credits has some appeal, their use as an assessment tool for competence was not supported by the assessment literature. They are a ‘support’ to competence but not a tool to assess it.
  • Clinical Examinations – Simulated case examinations, such as Objective Structured Clinical Examinations, have very positive features for accurate assessment. An important factor, however, is that the cost per person for the clinical examination approach is 2-3 times the cost of a written test. An additional consideration was that many physical therapists in British Columbia have not had experience with such a format. Administering this form of assessment across a large province such as British Columbia would not be cost effective, requiring either travel to many small locations by assessors or travel to one central location by registrants. For administrative reasons and because of anticipated acceptability challenges among registrants, this format was deemed unsuitable by the College.
  • On-Site Visits – The feasibility of on-site visits was explored, as well as the experiences of the College of Physiotherapists of Ontario. The assessment literature regarding on-site visits is not well developed, but many features similar to those of the portfolio approach have been noted (e.g., subjectivity of scoring, administrative complexity, costs for large numbers of assessments). Administering assessments across a large province such as British Columbia would be very costly (i.e., more costly per person than a written test).
  • Written Assessment – A written assessment option was explored and the assessment and administrative features proved to be very appealing. While it was noted that the written assessment would only effectively evaluate ‘some’ of the facets of a physical therapist’s practice, it was viewed as a reasonable approach to regular screening of important aspects of practice – physical therapy knowledge, application of knowledge, clinical reasoning and decision-making.

Other B.C. Health Regulators’ Quality Assurance Programs

A variety of approaches appear to be in use across the health professions’ quality assurance programs. Most use multiple approaches to assessment, including regular reporting or self-assessment of some aspects of competence (i.e., as is done with the College’s Annual Self Report) and a more thorough objective assessment within a defined period (e.g., 5, 6 or 10 years). Two other colleges use written tests (the College of Dental Hygienists of British Columbia and the College of Occupational Therapists of British Columbia).

Published Protocols Guiding the Development of the RCA

There are established standards for developing and administering assessments. Writing questions (also known as ‘items’) is both an art and a science. The College has established consistent examination standards and best practices. These include (but are not limited to):

• Following a defined blueprint;

• Identifying appropriate clinical scenarios that are realistic and relevant to physical therapy in British Columbia;

• Identifying specific key features that are to be addressed in the scenario (both regulatory and clinical);

• Following a 12-step process to write the items (cases, questions, answers): Case Development Steps, Roles and Responsibilities;

• Conducting key validation after each administration to ensure items are functioning appropriately; and

• Reviewing and revising items and the cut score using data from each administration (a brief illustrative sketch of this kind of item review follows this list).
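As a rough illustration of the kind of post-administration key validation and item review referred to in the last two points, the sketch below computes two classical item statistics – difficulty (proportion correct) and point-biserial discrimination – from a hypothetical response matrix. The function name, data, and any implied thresholds are assumptions for illustration only; the College’s actual validation procedures are not specified here.

```python
import numpy as np

def item_statistics(responses: np.ndarray):
    """Classical item analysis: difficulty (proportion correct) and
    point-biserial discrimination (item vs. rest-of-test correlation).

    responses: 2-D array, rows = examinees, columns = items, values 0/1.
    """
    stats = []
    total = responses.sum(axis=1)
    for j in range(responses.shape[1]):
        item = responses[:, j]
        difficulty = item.mean()                        # proportion answering correctly
        rest = total - item                             # total score excluding this item
        discrimination = np.corrcoef(item, rest)[0, 1]  # point-biserial with rest score
        stats.append((j, difficulty, discrimination))
    return stats

# Hypothetical 0/1 response matrix: 6 examinees x 3 items
responses = np.array([
    [1, 1, 0],
    [1, 0, 1],
    [1, 1, 1],
    [0, 0, 0],
    [1, 1, 1],
    [0, 0, 1],
])

for j, p, r in item_statistics(responses):
    print(f"item {j}: difficulty={p:.2f}, discrimination={r:.2f}")
```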

Registrants’ Feedback About Sample Cases

The Quality Assurance Program and RCA were presented at the Annual General Meeting in 2010 and, subsequently, at a number of educational sessions around the province. The educational sessions included a presentation as well as workshop activities where physical therapists had an opportunity to ‘test drive’ three sample cases. The following section briefly summarizes the central messages received from attending physical therapists who submitted formal feedback forms.

In total, 263 feedback forms were submitted. Approximately 608 individuals attended the various presentations between April and June 2010, resulting in a response rate of approximately 43%. This is a high response rate for this type of feedback report.

The demographic information indicates that the respondents were similar to the registrants of the College in most categories: gender, age, years of practice, practice settings and clinical practice areas. Respondents tended to work with adults, seniors or children, while almost half of College members report working primarily with all ages. Given the high response rate and the strong representation across the demographics of the College membership, the results of the feedback can be accepted as representative of the membership.

Three sample cases were presented: Musculoskeletal (MSK, geriatrics), Neurology and Cardiorespiratory. Using a five-point Likert scale from Completely Disagree to Completely Agree, respondents rated the cases on realism, whether they reflected current practice and whether they were suitable cases for assessing physical therapy practice.

All three cases were rated highly with the majority of respondents Agreeing or Completely Agreeing that the cases were realistic (84-88%), reflected current practice (72-75%) and were appropriate for the assessment of physical therapy practice (67-72%).

For each specific question, respondents were asked to evaluate its difficulty on a three-point scale: Too Difficult (beyond essential practice), Fair (part of essential practice) or Too Easy (below essential practice). Questions were almost universally rated as Fair (81-98%).

Other feedback on the RCA from the consultations with registrants:

  • Overall, the data is highly positive regarding the Key Feature cases.
  • The data indicated that the physical therapists who answered the sample questions found them to be quite fair.
  • Most of the registrants’ questions and comments indicated that they wanted more information about how the program would roll out. This desire for more information has been a College focus and has resulted in additional presentations at the Annual General Meeting, as well as updated and expanded information on the College website and more RCA-dedicated information in the UPDATE newsletter.