Quality Assurance and Monitoring

When a state decides to pursue a quality rating and improvement system (QRIS), it is important to engage providers, partners, and other stakeholders in a strategic process to determine appropriate policies and procedures for accountability and monitoring. This section addresses documenting compliance with standards and criteria, determining rating levels, deciding how frequently rating levels will be determined, choosing which assessment tools to use, monitoring ratings, and facing possible loss or reduction of rating levels.


Documenting Compliance

The compliance criteria for each standard define what a program must do to achieve a particular level, move to the next level, or earn points in a specific category. Documentation for meeting QRIS standards can take the form of a checklist, a self-report or self-assessment, a presentation of materials, or an observation or assessment. It is very important that each criterion, and the forms of documentation accepted for it, be clearly defined. Interviews and conversations with providers and interested stakeholders during the design phase will help identify requirements and processes that are not clear or sufficiently defined.

New Hampshire and New York are among the states with checklists on their websites that list required sources of evidence.

  • New Hampshire’s Licensed Plus Quality Rating System Option 1 Standards includes a column that specifies the type of documentation that is required to verify compliance with the standard.
  • New York’s QRIS standards include a documentation checklist for each standard. New York has developed an online Resource Guide that provides details about documentation requirements and includes samples of acceptable documents. Upon clicking on the standard, additional information on that standard becomes available, including clarifications, key documents, and links to additional reading.

Many states have glossaries, or definition pages, to define and explain the criteria more thoroughly. They also have companion pieces, such as an application manual (Maine) or a program guide (Delaware) that help providers and other interested individuals better understand the QRIS. See the Standards and Criteria section of the National Center on Early Childhood Quality Assurance’s QRIS Resource Guide for additional information.

As QRIS evolves in a state, documentation requirements may change or need clarification. Any changes need to be communicated to all stakeholders. As participation in the QRIS increases, the capacity of the documentation and assessment system must increase accordingly. The goal remains to make accurate verification and timely rating decisions.

 

Some states permit multiple methods to demonstrate compliance with QRIS standards. One area where states frequently accept equivalencies is educational qualifications and attainment.

National accreditation is another standard that is often used as an equivalent measure in a QRIS. States that incorporate national accreditation systems into their QRIS generally do so as equivalent to, or a requirement for, higher levels of quality. The National Center on Early Childhood Quality Assurance provides an online National Program Standards Crosswalk Tool that includes the following standards: several sets of national accreditation standards; Head Start Program Performance Standards; Caring for Our Children Basics; Caring for Our Children, 3rd Edition; Stepping Stones to Caring for Our Children; Department of Defense Instruction and Effective Rating and Improvement System; and Child Care and Development Fund Final Rule. The tool allows users to compare their state standards to national standards.

States may decide to include standards in addition to national accreditation if they determine that certain standards are not sufficiently incorporated in the accreditation system or are not monitored frequently enough. An example of this is the requirement for program assessments, such as the environment rating scales (ERS).

When states are considering multiple ways to demonstrate compliance, they can consider the following questions:

  • If there is another way to document compliance, is it equivalent to the competencies required in the QRIS?
  • Do providers have access to programs and supports that will help them demonstrate compliance? If not, does the state have the capacity to make them available?
  • If providers can seek validation from an outside group, association, or system to document compliance, does the outside entity have the capacity to meet the provider requests in a timely manner?
  • Are there financial implications for the state or the provider involved with alternate pathways?

Several factors determine how often programs will be monitored:

  • Available financial resources
  • Availability of staff with appropriate skills, knowledge, and time to perform functions
  • Validity and integrity of data collection
  • Connections to other systems and their monitoring and compliance processes

The method and frequency of monitoring may vary by standard. Some standards, such as current staff qualifications, may only need to be verified one time as long as the staff and their qualifications remain unchanged. Other standards, such as professional development requirements for ongoing training, need to be checked annually. Monitoring or verification may also be triggered by certain circumstances, such as staff changes (particularly a change of director) or serious licensing violations.

More information about how often programs are monitored can be found in the Quality Compendium.

Several states have found it helpful to have programs prepare for their QRIS evaluations by completing rating readiness tools. Washington has an Interactive Rating Readiness Tool (IRRT) to help facilities plan for their onsite evaluation visits. Additional information about the process can be found in the Early Achievers Participant Operating Guidelines (see page 30).

The IRRT is a checklist that includes the following:

  • Classroom schedules and general facility information
  • Documentation, including signed parent consent forms and the location of files for review
  • Identification of which standard components the facility plans to demonstrate that they are meeting

Determining the Rating Level

Identifying the entity(ies) with capacity to effectively administer a QRIS over time is a central issue to consider in the design phase. Most statewide QRIS are administered by a state agency in partnership with private sector entities. The QRIS lead agency typically oversees several basic functions, including

  • Initially assessing program quality and assigning QRIS levels;
  • Monitoring compliance to ensure system integrity;
  • Conducting classroom assessments (using an ERS, the Classroom Assessment Scoring System, or another instrument);
  • Providing training and technical assistance; and
  • Managing system planning, engagement, and outreach (data collection and analysis, Web design and upkeep, marketing development and public information dissemination, etc.).

In most cases, each of these functions is the responsibility of different staff members, many of whom may be with contracted agencies or privately funded partners. Several states use state agency employees for assigning initial ratings and monitoring compliance, and contract with outside entities for conducting classroom assessments and providing training and technical assistance. However, staffing patterns vary and are often influenced by available funding and current staffing needs and resources.

For validity of the system, it is important to separate the functions of conducting assessments and providing technical assistance. In other words, technical assistance providers should not also be responsible for assessing programs.

Roles

This section will refer to monitoring staff and technical assistance providers when describing roles. Generally, monitoring staff include raters, assessors, data collectors, and onsite evaluators. Technical assistance providers include coaches and consultants.

Licensing staff in North Carolina, Oklahoma, and Tennessee monitor QRIS criteria, and a separate team of assessors conducts environment rating scale assessments. In Oklahoma, the ERS assessors are contracted through the University of Oklahoma’s Center for Early Childhood Professional Development. Arizona’s Quality First QRIS is administered by First Things First (a governmental agency funded with tobacco tax). Participation in Quality First begins with an initial assessment, during which a Quality First assessor visits the program to observe classrooms and interview teachers. All programs enrolled in Quality First receive a coach, who visits the program on a regular basis and supports programs with technical assistance. The coach reviews scores from program assessments and helps the program create a plan for improvement.

Washington’s Department of Early Learning partners with the University of Washington (UW) to administer Early Achievers QRIS. The university is the lead agency for evaluation, assessment, and rating assignment. Data collectors from UW conduct facility onsite evaluation visits. The university is also responsible for the development of the Early Achievers Coach Framework. Washington designates several roles responsible for monitoring and supporting providers in the QRIS:

  • Regional coordinator: Approves or denies programs’ requests to be rated
  • Community liaison: A member of the UW evaluation team who supports facilities and the data collectors so visits are successful
  • Coach: A member of the UW evaluation team who participates in ongoing professional development and consultation with programs
  • Technical assistance (TA) specialist: With the local lead agency, the TA specialist works with programs to develop work plans and timelines
  • Data collector: A member of the UW evaluation team who collects data through observations, interviews, and reviews of records and documentation; this person also administers an ERS and the Classroom Assessment Scoring System

In many states, child care resource and referral (CCR&R) agencies play a key role in QRIS administration and often coordinate QRIS training and technical assistance. Institutions of higher education are also important partners and frequently assume responsibility for classroom assessment as well as help with data collection. Public-private partnerships, such as early and school-age care and education advisory committees, are often charged with planning, engagement, and outreach functions. In short, QRIS implementation is often a team effort.

State experience suggests that a strategic way to build on and expand current investments and maximize all available early and school-age care and education dollars is to use state licensing or other staff to assign ratings. Other outside entities, such as CCR&R agencies, institutions of higher education, and cooperative extension services, may assist with training and technical assistance.

Monitor Competencies

Given that monitors should maintain a positive working relationship with all providers, and providers’ characteristics and needs vary greatly, agencies should consider the desired knowledge, skills, and abilities of both monitors and coaches. Competencies can inform job descriptions, interview questions, selection of candidates, training, and evaluations.

Relationship-building and communication skills are at the center of training for monitors. Effective training takes into account the varying needs of centers and homes, urban and rural providers, different age groups, and cultural and language differences.

Inter-Rater Reliability

The thoughtful process of hiring qualified monitors, training with fidelity, and implementing an ongoing inter-rater reliability protocol encourages provider and stakeholder confidence in the QRIS process.

To ensure the integrity of the monitoring process, states select reliable assessment instruments, provide ongoing training and supports to monitors, and maintain a monitor calibration process to ensure that the instrument is being consistently used at all times by all monitors in all settings.

Texas Rising Star has a section in the Child Care Provider Certification Guidelines dedicated to assessor protocol. The protocol includes best practices to ensure that the certification process is reliable and credible. The guide includes an introduction to the process, details about what to do prior to an assessment, how to conduct an assessment, and how to document and report the results.

Continuous Quality Improvement of the Monitoring System

To maintain and improve on the quality of the monitoring process, some states extend to programs the opportunity to provide feedback about the assessment experience. This feedback is usually shared immediately following the assessment visit and prior to the level designation or a request to appeal the rating.

Nevada Silver State Stars has a QRIS Feedback Form that allows programs to give feedback and share concerns regarding assessment experiences.

States may conduct annual stakeholder surveys to get feedback on providers’ experiences.

 

Licensing and QRIS are a part of the early care and education system and both are designed to support children’s development. A large majority of states require licensure for QRIS enrollment, and many include licensing as the first QRIS level. However, states diverge from this starting point in their level of coordination. It is essential that QRIS administrators include licensing from the beginning and maintain strong and consistent communication with licensing staff. The next part will outline four areas in which licensing and QRIS can coordinate: enforcement, standards, technical assistance, and monitoring.

Enforcement

Licensing and QRIS will need to communicate regarding how a program’s licensing status affects its QRIS level and accompanying support. Most states require that programs participating in the QRIS be licensed and in good standing. States define “good standing” in different ways and may have additional expectations for programs specific to maintaining licensing status after receiving ratings. If a program fails to meet these expectations, then it could lose its rating designation and financial incentives and other supports.

Issues that states have considered when determining how licensing compliance affects a program’s QRIS rating include the following:

  • How far back does the review of licensing compliance go? Does a 3-year-old violation carry the same weight as a more recent incident?
  • Should mitigating factors be taken into consideration when serious noncompliance occurs? In license revocation decisions, factors such as the program’s documented efforts to prevent the noncompliance from occurring and its subsequent response to the situation are often considered.
  • Is there a different expectation for a program first applying to participate and a program that is already rated? In other words, a serious noncompliance might keep an applicant from attaining a higher level, but it might not cause a reduction in rating for an already rated program.
  • Should a higher level of compliance be required for higher rating levels, or should one compliance threshold apply to all rated programs?

Information about state QRIS participation policies and supports can be found in the Quality Compendium and in the Provider Incentives and Support section of the QRIS Resource Guide.

Arkansas’s and Vermont’s approaches to determining the effect of licensing status on QRIS ratings are described next.

Arkansas

  • Arkansas Better Beginnings Rule Book (See page 10.)
  • Rating assignment method: Building blocks
  • Licensing compliance: All facilities must be in good standing with the Department of Human Services. A facility in good standing is not currently debarred, defunded, excluded, or under adverse licensing action.
  • Effect of licensing status on QRIS ratings: Better Beginnings certification is valid for 36 months unless the facility becomes ineligible. Certified status can be denied, suspended, reduced, or removed if the facility is not in good standing (as defined in the previous bullet), substantiated complaints are received by the office, or the facility fails to correct deficiencies within a reasonable time period. Facilities that have their Better Beginnings certification removed are eligible to reapply in 12 months, unless otherwise notified.

Vermont

  • Vermont Step Ahead Recognition System Standards (See page 3.)
  • Rating assignment method: Points
  • Licensing compliance and effect of licensing status on QRIS ratings: Points in the Regulatory Compliance arena shall be awarded in accordance with the following criteria:

In Compliance means that the program is in compliance with all DCF [Department for Children and Families]/CDD [Child Development Division] regulations, a DCF licensor has conducted an onsite inspection within the last two years and any substantiated violations have been corrected.

  • 1 Point: The program is in compliance as defined above and within the past year has not had any substantiated violations resulting in a parental notification, and has not had any substantiated violations of the same nature or exhibited a general pattern of regulatory noncompliance.
  • 2 Points: The program is in compliance as defined above and within the past three years has not had any substantiated violations resulting in a parental notification, and has not had any repeated, substantiated violations of the same nature or exhibited a general pattern of regulatory noncompliance.
  • 3 Points: The program is in compliance as defined above and within the past five years has not had any substantiated violations resulting in a parental notification, and has not had any repeated, substantiated violations of the same nature or exhibited a general pattern of regulatory noncompliance. (p. 3).
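The Vermont criteria above amount to a lookback rule: the longer a program’s violation-free history, the more points it earns. As a rough illustration only (the field names, and the simplified treatment of repeated violations across all three tiers, are assumptions rather than part of the Step Ahead rules), the logic can be sketched in Python:

```python
from dataclasses import dataclass

@dataclass
class ComplianceHistory:
    in_compliance: bool            # meets all DCF/CDD regulations; onsite inspection
                                   # within the last two years; violations corrected
    years_since_violation: float   # years since the most recent substantiated
                                   # violation resulting in parental notification
    repeated_same_violation: bool  # repeated, substantiated violations of the same nature
    pattern_of_noncompliance: bool # general pattern of regulatory noncompliance

def regulatory_compliance_points(h: ComplianceHistory) -> int:
    """Return 0-3 points per the Vermont criteria paraphrased above."""
    if (not h.in_compliance or h.repeated_same_violation
            or h.pattern_of_noncompliance):
        return 0
    # 5 clean years earns 3 points, 3 years earns 2, 1 year earns 1.
    for points, clean_years in ((3, 5), (2, 3), (1, 1)):
        if h.years_since_violation >= clean_years:
            return points
    return 0
```

Under this sketch, a program in compliance with a four-year violation-free history would earn 2 points: long enough for the three-year tier but short of the five-year tier.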

Monitoring

Frequently, the QRIS is monitored by the licensing agency alone or in partnership with other agency staff or a private entity. Using licensors who are already funded to make periodic visits to programs makes good fiscal sense. Strategically linking QRIS to licensing could provide an opportunity to increase the number of licensing staff, reduce caseloads, and broaden staff’s roles. For example, the Oklahoma Department of Human Services added 27 licensing staff members when it became responsible for monitoring QRIS compliance. However, an assessment must be made to determine whether the licensing system can adequately support this new responsibility. If a licensing program is unable to adequately monitor child care, or if its sole focus is enforcement, it will face greater challenges in monitoring a QRIS. Including licensing managers early in the QRIS planning process allows them to make valuable contributions to the discussion.

States have policies and procedures for renewing rating levels, and several states also set a time limit on how long a provider can remain at one rating level. During renewal, providers generally can earn higher or lower ratings, based on the standards they meet, or they can keep their current rating levels.

When discussing QRIS ratings, it is important to differentiate between two separate, but interrelated, functions: assigning a rating and conducting a classroom or home assessment. Most states use classroom or home assessments, such as the ERS, as one tool, but not the only one, to assess compliance with QRIS criteria in the area of learning environments. These two functions can occur on the same cycle, such as annually, or they can occur at different points in time. Most commonly, states assign ratings and conduct classroom assessments annually.

  • In Delaware, a program accepted for enrollment has one year to complete the steps required for Star Level 2 or the program will be terminated and must reapply for enrollment in Delaware Stars. After a program achieves Star Level 2, it must re-verify its current Star Level every two years while maintaining that level or actively working to achieve a higher one.
  • Illinois requires renewal every 3 years, including reassessment on ERS, BAS, PAS, or CLASS. Rated programs must provide an annual report including licensing compliance, staffing requirements, etc. to maintain the rating.
  • North Carolina assigns ratings every 3 years and monitors annually for maintenance of ratings. A reassessment of the rating may also be conducted before the 3-year time period if the annual monitoring identified certain indicators, e.g., high staff turnover, a new director, or serious licensing violations. A program may also request a rating reassessment once a year if it anticipates its rating will improve.
  • In Ohio, providers must renew their ratings on an annual basis. At that time they can apply for higher ratings if they meet the standards, maintain their current ratings, or reduce their ratings if standards for higher ratings are no longer met. A rating will expire if a provider does not apply to have its rating renewed. Prior to the rating expiration date, providers receive written notification and instructions for renewing their ratings.
  • In Oklahoma, the license and star status are nonexpiring, based on documented compliance at monitoring visits that occur at least three times per year. Providers can remain at the One Star Plus level as long as they are working toward an education component needed to achieve higher standards and move to the Star Two level.
  • Maine assigns ratings every 3 years but only requires classroom assessment in sites that are selected to participate in the QRIS evaluation.
  • Providers in Rhode Island agree to participate in BrightStars for a 3-year period. At the end of that period, providers choose to continue and be reassessed or terminate participation without penalty. If a provider chooses to terminate participation during the 3-year period, the provider will not be allowed to participate again for 24 months from the time of termination. To maintain their rating, providers submit annual reports to BrightStars and may request an adjustment to their rating once a year.
  • Ratings in Vermont are valid for 3 years if the annual report documents maintaining the standards for this rating. If a provider is unable to maintain the standards of its current star level, no action is taken until the provider submits the annual report. Providers request points based on the standards they can demonstrate in the report. Participating providers renew their certificate for the number of points they can verify (may be higher, same, or lower than previously earned points). Reminders to submit the annual report are automatically sent to participating programs and must be returned in order to maintain participation and star level.

Decisions regarding how often QRIS ratings are assigned, as well as how frequently classroom assessments are conducted, will be influenced by available resources. Conducting reliable, valid classroom assessments can have a significant financial impact. In addition to the time it takes to actually conduct an assessment, write up the results, and travel among multiple sites, time and funding must be made available to ensure that raters receive appropriate training and that inter-rater reliability is assessed on a regular basis. Additional information is provided under the question, “What assessment tools will be used? How will they be used?”

As noted earlier, QRIS compliance is typically based on a number of factors, only some of which are determined by classroom assessments. Additional information is available in QRIS Compendium Fact Sheet: Use of Observational Tools in QRIS (2017) by the National Center on Early Childhood Quality Assurance.

Most of the states that require a classroom assessment to evaluate program quality currently use environment rating scales developed by the Frank Porter Graham Child Development Institute at the University of North Carolina at Chapel Hill. These scales include the following:

  • Early Childhood Environment Rating Scale–Revised (ECERS-R)
  • Infant/Toddler Environment Rating Scale–Revised (ITERS-R)
  • School-Age Care Environment Rating Scale (SACERS)
  • Family Child Care Environment Rating Scale–Revised (FCCERS-R)[1]

Quality in Early Childhood Care and Education Settings: A Compendium of Measures (Halle, Vick Whittaker, & Anderson, 2010) provides detailed information about program assessment measures, including measure purpose, intended ages and settings, administration, and reliability and validity.

Some states are also using more focused assessment tools that measure interactions, classroom practices, and administrative practices in addition to or in lieu of measures of global quality.

  • Massachusetts requires assessments with the Program Administration Scale (PAS) for child care centers and the Business Administration Scale (BAS) for family child care providers.
  • Oklahoma recognizes the Child Caregiver Interaction Scale, the Arnett Caregiver Interaction Scale, the Early Language and Literacy Classroom Observation (ELLCO), the PAS, and the Classroom Assessment Scoring System (CLASS).
  • In Ohio, self-assessments are required, but programs can use an ERS, the ELLCO, or other assessment tool, and scores are not tied to ratings.
  • In Rhode Island, CLASS scores are collected from a random sample of 33 percent of preschool classrooms. Scores were not used in the rating process during the first year of implementation.

In some cases, classroom assessment tools are required and the scores are used to help determine ratings. Other states have made assessment tools optional—as one way to accumulate QRIS points—or require tools for programs seeking higher star levels only. Some states require programs to be assessed with environment rating scales but do not tie particular scores to the ratings. Information about the program assessment tools used by state QRIS is available in QRIS Compendium Fact Sheet: Use of Observational Tools in QRIS (National Center on Early Childhood Quality Assurance, 2017).

When determining what percentage of classrooms to assess using a classroom quality measurement tool, states have to balance financial resources with assessment validity. The authors of classroom measurement tools can advise on the minimum number of classrooms to assess so that the resulting average is an accurate measure of overall program quality. The Office of Planning, Research and Evaluation publication Best Practices for Conducting Program Observations as Part of Quality Rating and Improvement Systems recommends observing at least one classroom in each age range and observing 50 percent of the classrooms in each program (Hamre & Maxwell, 2011). The authors add that “weighing the costs, it is not recommended that QRIS observe every classroom in programs if the purpose is solely to determine the program’s rating. However, it is clear that observing every classroom may be useful for other purposes such as providing technical assistance.” (p. 9)
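The Hamre and Maxwell guidance above combines two minimums: at least one classroom per age range, and at least half of all classrooms in the program. A small sketch (a hypothetical helper, not part of any state’s actual tooling) shows how the two constraints interact:

```python
import math
from collections import Counter

def classrooms_to_observe(classrooms):
    """classrooms: list of (classroom_id, age_group) pairs for one program.

    Returns the minimum number of observations under the guidance above:
    at least one classroom per age range, and at least 50 percent of all
    classrooms in the program, whichever is larger."""
    age_groups = Counter(age for _, age in classrooms)
    return max(len(age_groups),                    # one per age range
               math.ceil(len(classrooms) / 2))     # 50 percent overall
```

For example, a program with five classrooms in two age groups needs three observations (the 50 percent rule dominates), while a program with one classroom in each of three age groups also needs three (the one-per-age-range rule dominates).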

 

[1] The environment rating scale for family child care homes was revised in 2007. Some states still refer to the older version, i.e., the Family Day Care Rating Scale (FDCRS).

A key step in QRIS design is to examine the current early and school-age care and education landscape and infrastructure to determine how to integrate various functions or subsystems. It is important to identify where there are services already in place that might be expanded or included in the QRIS structure. In most states, there are a host of resources that can be accessed.

  • North Carolina, Oklahoma, and Tennessee, among others, use state licensing staff to gather and validate the information needed to assign ratings.
  • Ohio’s Step Up to Quality program includes dedicated staff in each licensing field office whose sole responsibility is QRIS administration.
  • In Colorado, CCR&R staff, who are private sector employees who receive both public and private funding, conduct ratings.
  • In Illinois, assessments are conducted by staff at the McCormick Center for Early Childhood Leadership. Scores are sent to the Illinois Network of Child Care Resource and Referral Agencies (INCCRRA), the QRIS application contractor, to be combined with other criteria for rating generation.

It is also important to ensure that the assessor conducting the assessments has the appropriate background, credentials, and training related to the age group for each assessment scale. For example, the ITERS-R assessor should have knowledge of infants and toddlers. Likewise, the SACERS assessor should be knowledgeable about the care and education of school-age children.

Monitoring the Rating

The policies and procedures for monitoring ratings should be clearly articulated to all involved. As providers submit documentation and QRIS staff conduct interviews, observations, and assessments, it is important that all acceptable sources of evidence are consistently defined and interpreted. Whether a state implements a building block approach, a point approach, or a combination of the two, it must have a sound monitoring process in place.

Just as it is important for early and school-age care and education programs to be aware of any benefits for achieving a level, they also need to understand what they must do to maintain a designated level and the consequences for noncompliance. The policy should specify when a reduction of status becomes effective, what the process is to restore a level, and if there are any appeal rights. States have developed administrative policies for situations when a program no longer meets one or more of the standards in its current designation level. The process to be followed for noncompliance should be clearly written and communicated to programs.

Many states include a program improvement plan as part of the QRIS process. Typically based on a provider’s self-assessment, observation, or rating, this plan identifies strengths and weaknesses and suggests ways to make improvements. Many QRIS use the results of an assessment tool, like an ERS, as a starting point for developing this plan.

Maryland requires a program improvement plan for programs that are seeking a Check Level 3 rating and have any ERS subscale scores below 4.0 on the program’s self-assessment. For Check Levels 4 and 5, a program improvement plan is required for a program that receives an outside ERS assessment with any subscale score below 4.5 or 5.0, respectively. Programs may use a variety of additional tools or assessments to create the improvement plan, such as accreditation self-study or validation results, school-readiness goals and objectives for their jurisdictions, and program-specific goals and objectives for continuous quality improvement.
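Read together with the Level 3 rule, the Maryland thresholds appear to follow a single pattern: an improvement plan is triggered when any ERS subscale falls below a cutoff that rises with the Check Level sought. The following sketch assumes that reading (the function name and the below-the-cutoff interpretation for Levels 4 and 5 are assumptions, not the official rule text):

```python
def improvement_plan_required(target_level, subscale_scores):
    """Hypothetical sketch of the Maryland thresholds described above.

    target_level: the Check Level the program is seeking (3, 4, or 5).
    subscale_scores: ERS subscale scores from the relevant assessment
    (self-assessment for Level 3; outside assessment for Levels 4 and 5)."""
    cutoffs = {3: 4.0, 4: 4.5, 5: 5.0}
    cutoff = cutoffs.get(target_level)
    if cutoff is None:
        return False  # no ERS-based trigger described at other levels
    return any(score < cutoff for score in subscale_scores)
```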

Washington has extensive procedures on how licensing status affects QRIS participation, both at registration and during participation. For example, if a facility is operating under a probationary license, it has 6 months to regain full licensure. During this time, it may continue to work with a coach or technical assistance specialist, but it cannot be evaluated for a rating. If the full license is not reinstated within 6 months, its participation in Early Achievers will be terminated. In 2015, the Early Childhood Education and Assistance Program (ECEAP), Washington’s state prekindergarten program, was required to participate in Early Achievers per House Bill 1723. Washington enhanced the data system and built the Early Achievers Participation Monitoring Report to track the progress of providers required to participate in Early Achievers (non-school-age providers receiving state subsidies and those serving ECEAP).

Providers may wish to challenge both an assessment score as well as the overall rating assigned to their programs. Some states have developed guidelines to follow if a program disagrees with its quality rating, although not all have a formal appeals process. Clear communication and training to help providers better understand the rating process may help to reduce the number of appeals.

In Stair Steps to Quality: A Guide for States and Communities Developing Quality Rating Systems for Early Care and Education, Anne Mitchell (2005) makes the following statement about the implications of accountability policies:

A key accountability issue in a quality rating system (QRS) [sic] is the accuracy of quality ratings. A well-designed and implemented accountability system, bolstered by clear communication about the structure and operation of the QRS, should minimize disagreements. A concern that has been raised about rating systems, especially those connected with licensing, is whether rating the quality of programs will result in challenges to ratings and an increase in requests for hearings. Anticipating that some programs may not agree with the rating they receive, an appeals process should be designed in advance. Administrators of statewide QRS report that although quality ratings do change, there are relatively few challenges and little or no increase in hearing requests. (p. 36)

The guidelines developed by each state vary. In Colorado, a program may initiate a Technical Review of its Qualistar rating within 30 calendar days of receiving its Qualistar Rating Consultation. It may also initiate a Dispute Resolution Process within the same time period. In North Carolina, programs can appeal the evaluation of staff qualifications to the Education Unit, and can appeal the environment rating scale results first to the assessors at the University of North Carolina at Greensboro and then to the Office of Administrative Hearings. In Oklahoma, if a program’s star level is reduced, it can appeal or propose an alternative settlement, but it cannot reapply for 6 months if the reduction is due to noncompliance. Wisconsin’s YoungStar Policy Guide stipulates that the “local YoungStar office discuss[es] the rating with the provider before it is published on the YoungStar Public Search website.” In an effort to minimize the number of reconsiderations, YoungStar has established “clear documentation and justification of the rationale for a program’s rating.” Most of the guides, workbooks, and toolkits referenced in the Provider Incentives and Support section of the QRIS Resource Guide include information on the appeals process.

In Arkansas, upon receipt of the request for appeal, the Better Beginnings coordinator will conduct an internal review to ensure that the appropriate processes were followed and determine the validity of the original decision. According to the Better Beginnings Rule Book, the Better Beginnings coordinator will review the findings with the division director and will transmit the findings of the internal review to the facility within 30 days of the receipt of the request to appeal. If the outcome of the internal review is unsatisfactory to the facility, it has 10 days to ask for further review by the Better Beginnings Appeal Review Committee.


As states integrate services across systems and align program standards in QRIS, a reduction or loss of rating level can have a significant financial impact on programs. Examples include the following:

  • Lack of or reduced access to free or low-cost training opportunities (Teacher Education and Compensation Helps [T.E.A.C.H.] Early Childhood® Project scholarships, training vouchers, Child Development Associate courses, credentialing programs, etc.).
  • Reduction or loss of financial rewards or bonuses for attaining and maintaining higher levels within the QRIS. These awards can be directed to the program or to individual staff within the program.
  • Reduced tiered reimbursement payments for subsidized child care.
  • Limited access to supportive services, such as technical assistance, consultation, and ERS assessments.
  • Inability to market a program at a higher level. This may reduce a program’s ability to remain competitive with other programs and may affect parents’ decisions regarding placement of their children in care.

Any partnering agency or service within the state system that advertises rating levels to the public needs to be notified of rating changes so that parents have access to the most current information. This includes both increases and decreases in levels. Local child care resource and referral (CCR&R) agencies commonly maintain and distribute rating information to parents, and their listings must be accurate. If the licensing or subsidy agency is not the same agency that administers the QRIS, each of these agencies will need separate notification. When tiered reimbursement payments are involved, the subsidy agency must be notified. If prekindergarten programs are rated, or if eligibility for funding depends on a specific quality rating, then the education department must be notified.

Early and school-age care and education providers should be advised not to market themselves at an incorrect level. Some states supply participating programs with materials, such as banners, window clings, and posters, to use in marketing their QRIS rating to parents. If these materials advertise a level that no longer applies, they should be updated accordingly.

Programs in Fresno County, CA, Uploaded Evidence into QRIS Database

In California, the Fresno County QRIS, Fresno County Early Stars, used Mosaic as its QRIS database. Early Stars programs uploaded required information to the database, such as lead teacher and director transcripts, copies of degrees or permits, and professional development certificates. Early Stars administrators reviewed and verified all program data in the database. A rating was generated based on this information, as well as on information verified during an onsite visit.

Oklahoma Responded to Unintended Consequences in Its QRIS

Oklahoma’s Reaching for the Stars QRIS policy and procedures were specific and detailed so that staff and providers understood the process. This understanding was essential because of the significant financial consequences of star status on tiered reimbursement rates. Because it was difficult to evaluate a program when it first opened, the QRIS policy initially stated that a program could not apply for a higher star level until it had a full license, generally achieved after 6 months of operation. This requirement imposed a hardship on new programs as well as on existing child care centers, particularly if there was a change of ownership. Under new ownership, the tiered rates dropped dramatically, jeopardizing the continued quality of the center. As a result, the policy was changed to allow new programs with an initial permit to participate.

Tennessee's Appeals Process

Tennessee tried to anticipate situations that might lead to an appeal by making post-assessment calls to all providers participating in the Child Care Report Card and Star-Quality Program. These calls, which were handled by child care resource and referral (CCR&R) agency specialists, helped keep the number of disagreements low. Following each call, a provider received a copy of the assessor’s notes and a profile sheet that summarized all of its scores. If a provider disputed its assessment results, it had 20 business days to file an appeal. The level 1 appeal was handled by the local unit, which worked with the CCR&R staff. The level 2 appeal was conducted by contract staff. If a provider completed both levels of the appeals process and still had an issue, it could then request an administrative hearing.

Vermont's Grievance Process

Applicants or program participants had the right to appeal rejection of their application materials or other adverse decisions related to the Step Ahead Recognition System (STARS) program, such as the suspension or revocation of a STARS certificate, in connection with enforcement of licensing regulations, subsidy regulations, or the STARS standards.

Appeals had to be in writing and received by the Department of Children and Families (DCF) Commissioner within 30 days of the date of rejection or other adverse decision. If the appeal was from a school-operated prekindergarten program, the Commissioner of the Department of Education joined the Commissioner of the DCF in deciding the appeal.

The applicant or grievant had the opportunity to present the appeal to a STARS grievance committee, which was appointed by one or both commissioners and consisted of at least three members, including one from the regulated provider community. The committee provided a recommendation to the commissioner or commissioners, who made a final decision on the grievance and provided the grievant with a written decision. The grievant could appeal the final decision to the Human Services Board within 30 days of the final decision date.

Financial incentives were not paid while an appeal was pending. If a successful final appeal resulted in the determination that a STARS program participant was due a financial incentive or maintenance payment, DCF awarded payment in full within 60 days.

Additional information is available in STARS Standards (September 2019).

Shared Management Approach in Virginia's QRIS Pilot

Virginia Quality was a public-private partnership between the Virginia Department of Social Services (VDSS) and Virginia Early Childhood Foundation (VECF). The administrative partnership of these two entities was referred to as “the Hub,” while each partner maintained clear responsibilities. The Virginia Department of Social Services’ responsibilities included the following:

  • Provide resources to fund the implementation of Virginia Quality
  • Facilitate selection and training of raters
  • Set criteria and training standards for mentors
  • Review applications for each participating program
  • Conduct documentation reviews
  • Maintain the QRIS online database
  • Provide technical assistance to regional and local coordinators, raters, mentors, and participating providers
  • Conduct strategic decision-making and planning for the system
  • Manage proposals and monitor contract implementation and compliance

The Virginia Early Childhood Foundation’s responsibilities included the following:

  • Maintain the quality and integrity of the standard and rating process:
    • Oversee the inter-rater reliability process
    • Review Program Rating Summary Reports
    • Award ratings
    • Manage the appeal process
  • Provide resources to help regional and local coalitions implement QRIS
  • Provide technical assistance to regional and local coordinators, raters, mentors, and participating providers

  • Share responsibility with VDSS for strategic decision-making for the initiative

Halle, T., Vick Whittaker, J. E., & Anderson, R. (2010). Quality in early childhood care and education settings: A compendium of measures, 2nd edition. Retrieved from http://www.acf.hhs.gov/sites/default/files/opre/complete_compendium_full.pdf

Hamre, B. K., & Maxwell, K. L. (2011). Best practices for conducting program observations as part of quality rating and improvement systems (Research-to-Policy, Research-to-Practice Brief OPRE 2011-11b). Retrieved from http://www.acf.hhs.gov/programs/opre/resource/best-practices-for-conducting-program-observations-as-part-of-quality

Maxwell, K. L., Sosinsky, L., & Tout, K. (2016). Mapping the early care and education monitoring landscape. OPRE Research Brief #2016-20. Retrieved from https://www.acf.hhs.gov/sites/default/files/documents/opre/mapping_the_early_care_and_education_monitoring_landscape_508final.pdf

Maxwell, K. L., Sosinsky, L., Tout, K., & Hegseth, D. (2016). Coordinated monitoring systems for early care and education. OPRE Research Brief #2016-19. Retrieved from https://www.acf.hhs.gov/sites/default/files/opre/coordinated_monitoring_systems_in_early_care_and_education.pdf

Mitchell, A. W. (2005). Stair steps to quality: A guide for states and communities developing quality rating systems for early care and education. Retrieved from https://www.researchconnections.org/childcare/resources/7180

National Center on Early Childhood Quality Assurance. (2017). QRIS compendium fact sheet: Rating structures and processes. Retrieved from https://childcareta.acf.hhs.gov/resource/qris-compendium-fact-sheet-rating-structures-and-processes

National Center on Early Childhood Quality Assurance. (2017). QRIS compendium fact sheet: Use of observational tools in QRIS. Retrieved from https://childcareta.acf.hhs.gov/resource/qris-compendium-fact-sheet-use-observational-tools-qris

National Center on Early Childhood Quality Assurance. (2015). Trends in child care center licensing regulations and policies for 2014. Retrieved from https://childcareta.acf.hhs.gov/resource/research-brief-1-trends-child-care-center-licensing-regulations-and-policies-2014

State Capacity Building Center and National Center on Early Childhood Quality Assurance. (2017). Mapping the early care and education monitoring landscape: Tool. Retrieved from https://childcareta.acf.hhs.gov/resource/mapping-early-care-and-education-monitoring-landscape

Tout, K., Zaslow, M., Halle, T., & Forry, N. (2009). Issues for the next decade of quality rating and improvement systems (Publication No. 2009-14, OPRE Issue Brief No. 3). Retrieved from http://www.acf.hhs.gov/programs/opre/resource/issues-for-the-next-decade-of-quality-rating-and-improvement-systems

Arizona Quality First. http://qualityfirstaz.com/providers/

Caronongan, P., Kirby, G., Malone, L., & Boller, K. (2011). Defining and measuring quality: An in-depth study of five child care quality rating and improvement systems. OPRE Report #2011-29. Washington, DC: U.S. Department of Health and Human Services, Administration for Children and Families, Office of Planning, Research, and Evaluation. http://www.acf.hhs.gov/programs/opre/cc/childcare_quality/five_childcare/five_childcare.pdf

Child Care Aware of Kansas. (n.d.). Am I Ready for KQRIS? Salina, KS: Author. http://www.ks.childcareaware.org/PDFs/Am%20I%20Ready%20for%20KQRIS.rev%202012.pdf

Early Achievers, Washington’s Quality Rating and Improvement System. http://www.del.wa.gov/care/qris/

Great Start Early Learning Advisory Council. (2010). Recommendations for Michigan’s quality assurance system for child care and early education. http://greatstartforkids.org/sites/default/files/file/QRIS/ELAC%20QRIS%20QDC%20Design%20Project%20Recommendations_FINAL_2010_11_10.pdf

IdahoSTARS QRIS Star Rating Verification Sample Checklist. http://idahostars.org/sites/default/files/documents/qris/qris_star_rating_verification_home_group.pdf

Kentucky STARS for KIDS NOW Operations Manual. http://chfs.ky.gov/NR/rdonlyres/A1AE772D-AE26-455A-A5AA-7A4605A256DE/0/STARSOperationsManual.pdf

McDonald, D. (2007). Elevating the field: Using NAEYC early childhood program accreditation to support and reach higher quality in early childhood programs. Washington, DC: National Association for the Education of Young Children. http://www.naeyc.org/files/naeyc/file/policy/state/NAEYCpubpolReport.pdf

National Center on Child Care Quality Improvement’s Program Standards Crosswalk Tool. https://occqrisguide.icfwebservices.com/?do=crosswalk

National Center on Child Care Quality Improvement (NCCCQI). (2013a) QRIS Financial Incentives. Washington, DC: Office of Child Care, Administration for Children and Families, U.S. Department of Health and Human Services. https://occqrisguide.icfwebservices.com/files/QRIS_Financial_Incentives.pdf

National Center on Child Care Quality Improvement (NCCCQI). (2013b) Accreditation accepted for QRIS. Washington, DC: Office of Child Care, Administration for Children and Families, U.S. Department of Health and Human Services. https://occqrisguide.icfwebservices.com/files/QRIS_Accred_Accepted.pdf

National Center on Child Care Quality Improvement (NCCCQI). (2013c) Use of ERS and other program assessment tools in QRIS. Washington, DC: Office of Child Care, Administration for Children and Families, U.S. Department of Health and Human Services. https://occqrisguide.icfwebservices.com/files/QRIS_Program_Assess.pdf

New Hampshire Licensed Plus Quality Rating System Option 1 Standards. http://www.dhhs.state.nh.us/dcyf/licensedplus/documents/option1standards.pdf

Pennsylvania Keystone Stars Resource Documents. https://www.pakeys.org/keystone-stars/resources/

QUALITYstarsNY Resource Guide. http://qualitystarsny.org/standardsguide.php

Washington State Department of Early Learning. (2013). Early Achievers Participant Operating Guidelines. Olympia, WA: Author. http://www.del.wa.gov/publications/elac-qris/docs/EA_operating_guidelines.pdf

Zellman, G. L., & Perlman, M. (2008). Child-care quality rating and improvement systems in five pioneer states: Implementation issues and lessons learned. Arlington, VA: RAND Corporation. http://www.rand.org/pubs/monographs/2008/RAND_MG795.pdf