Ch. 17 Quality Assurance

Author: Kelly Dedel, Ph.D.[1]

The Purpose of Quality Assurance

The best way to ensure that knowledge about best practice is dependably translated into the daily reality of a facility is to establish a robust quality-assurance process. Quality assurance involves systematic measurement of the various aspects of a facility’s operation, comparisons to an objective standard, and modifications of facility policy, procedure, and practices when the standard is not met. Facilities have a responsibility to ensure that youth rights are protected and that programs and practices are producing the desired outcomes among youth in custody. Opinions about what is reasonable to ask of correctional facilities, where responsibilities lie, and which outcomes should be pursued vary across states, systems, and staff.[2] For this reason, the emergence of professional and state standards for the operation of juvenile facilities is an important development in the field. These standards provide a framework and compel stakeholders to review all aspects of facility operations and engage in discussions about “How good is good enough?” Although professional organizations have not yet developed standards to guide programming for youth housed in adult facilities, the standards established for juvenile detention and correctional facilities have obvious relevance.

Although professional standards are important for many reasons, jurisdictions should also set and monitor their own internal standards to uphold their specific priorities. Robust quality-assurance processes can be an effective deterrent to liability claims and lawsuits regarding the conditions of confinement. However, simply submitting to an outside group’s audit or conducting one’s own inspection offers no real protection; the facility must also demonstrate a commitment to remedy any deficits noted. Without a commitment to quality improvement, the audit itself is rather worthless in terms of protecting youth rights and meeting their needs, and in shielding a jurisdiction from liability. Experts in prison oversight have noted that inspections and monitoring are most effective when they are routine, thorough, and transparent and when the group conducting the inspections has the power to require change.[3]

Inspections and monitoring are only one way to identify and address problems with safety, programming, and other services in correctional facilities. Other means are the regulation and licensing of facilities, the monitoring of contracts to ensure a provider is delivering the promised services, and the investigation of allegations of staff misconduct to determine their veracity.[4] Though legal remedies are often used to correct serious deficiencies, they rarely proactively identify or solve problems. Whatever the form of the inspection, audit, or monitoring, facility administrators are often hesitant to accept an outsider looking over their shoulder, making judgments, and telling them what to do. This reluctance is understandable. However, the best quality-assurance auditors and mechanisms engage in collaborative problem solving, focusing less on what is wrong than on the underlying causes of the identified problems. In so doing, they can offer a fresh perspective. Quality assurance can unite elected officials, advocates, administrators, and staff around the common goal of running safe and effective facilities.

Types of Quality Assurance

After presenting the developmental history of professional standards that guided practices for juvenile facilities, the previous Desktop Guide to Good Juvenile Detention Practice (1996) discussed only one type of quality-assurance process—audits conducted by external, standards-setting organizations. Since then, the field has embraced the concept of quality assurance. Most quality-assurance programs now use specific, objective performance indicators and routinely collect data to assess program performance and measure outcomes. Furthermore, facilities use the results to make decisions about operations. The options for accreditation and licensing, inspection, and monitoring have multiplied in the 20 years since the first Desktop Guide was written, along with procedures for implementing quality-assurance programs at the facility level.

The number of professional organizations that publish standards relevant to juvenile facilities has grown. In addition, many states have drafted and adopted their own standards for the operation and licensing of detention and commitment facilities. These are discussed in the section below on External Audits. Standards set and audited by organizations external to the agencies that operate facilities share several features:

  • Standards draw on practices and experience from across the entire field rather than being conceived around a single jurisdiction’s practices.
  • Input from a large number of practitioners and auditors who use the standards allows them to be refined, better articulated, and clarified on an ongoing basis.
  • Auditors are independent and do not have a stake in the outcome of the audits. They are less vulnerable to temptations to minimize poor performance to avoid repercussions from facility directors, agency administrators, or elected officials.
  • When the results of the audits are transparent and publicly available, facilities can compare their results to other facilities across the nation and can often access a network of colleagues who may have experienced similar problems and figured out how to solve them.

Although having the results of an audit made public may make administrators feel they are being called on the carpet, transparency has certain benefits. For one, if the audit results are made public, stakeholders can better understand the nature of the challenges faced by correctional facilities and may more clearly recognize their ability to contribute to solutions. Facilities and agencies often find themselves in a better position to advocate for needed resources from the legislature, other agencies, and community members.

Standards set and audited by external organizations are not without their flaws. Concerns include:

  • The cost to enroll in, subscribe to, or become accredited by an external organization may make participation unfeasible.
  • An overreliance on the mere presence of a written policy can leave questions unasked about whether the policy has been appropriately translated into practice and whether those practices are producing the desired outcomes.
  • Participation, certification, or accreditation does not create immunity to litigation, particularly when the standards being audited do not attend to specific outcomes around safety and treatment.

The concerns about national standards-setting organizations may have catalyzed the development of system- and facility-specific standards (internal quality assurance) that better pinpoint specific areas of operational concern and that better incorporate desired outcomes. Given the lack of standards that guide practices for youth housed in adult facilities, internal quality-assurance mechanisms are even more essential in these cases. Internal quality-assurance mechanisms are often far more detailed than standards set by external agencies; they delve more deeply into the extent to which facility policies have been effectively translated into practice to produce desired outcomes. For example, a general standard that requires a facility to “provide appropriate programming to address the rehabilitative needs of youth” may be further specified by an internal quality-assurance standard. It could require that “all youth shall complete at least 80% of the 50-session anger-management curriculum and reduce their involvement in violent institutional misconduct prior to release.”

The External Quality Assurance section below describes the various standards-setting organizations. The collection is meant only to identify some of the options available, not to endorse any particular organization or set of standards. The section on Internal Quality Assurance offers specific guidance for building a system that closes some of the gaps left open by more general standards. Overall, a quality-assurance system that involves both strategies will lead to a robust assessment of the strength of the facility’s programming and its ability to keep youth and staff safe from harm.

External Quality Assurance

As noted above, having independent organizations conduct audits can bring additional rigor, a broad base for comparison, and additional credibility to the findings.

National Organizations

A variety of national organizations have published professional standards to guide the operation of juvenile facilities. Some offer accreditation, some provide a team of auditors to inspect the facility, and some provide a do-it-yourself framework or technology for examining an area of operation.

  • The American Correctional Association (ACA) publishes standards for a broad range of correctional programs, including juvenile residential facilities, correctional facilities, detention facilities, day treatment programs, boot camps, therapeutic communities, and small detention facilities. In 2001, ACA began accrediting correctional healthcare programs as well. The Standards Committee continually revises standards based on agency experiences, evolving practices, and new case law. Standards cover the full range of facility operations—physical plant, staff training, sanitation and life safety, safety and security, programs, due process and discipline, access to courts, mail and visitation, searches, and other conditions of confinement. A narrative to clarify the intent and other information to assist with implementation follows each standard. A large number of corrections professionals have been trained as auditors. The ACA audit teams keep their findings confidential, but participating agencies are encouraged to share the results of their compliance audits with the media.[5]
  • The Correctional Education Association (CEA) updated its Performance Standards for Correctional Education Programs in Juvenile Institutions in October 2004. The 67 individual standards are divided into four categories: administration, personnel, students, and programming. Agencies must contract with CEA to pursue accreditation, which is awarded if the agency is 100% compliant with the 24 required standards and at least 90% compliant with the 43 non-required standards. Trained CEA auditors observe programs, interview staff and students, review policy and procedures, and examine documentation related to the implementation of standards.[6]
  • The Juvenile Detention Alternatives Initiative (JDAI), supported by the Annie E. Casey Foundation, includes improving conditions of confinement as one of its essential core strategies for juvenile detention reform. Toward this end, the Youth Law Center and the Center for Children’s Law and Policy developed JDAI’s Standards for Detention Facility Conditions. They did so in consultation with national experts in all aspects of juvenile facility conditions and with input from juvenile justice system professionals in sites around the country. The standards reflect JDAI’s core values, address constitutional and statutory requirements, and embody professional best practice (including those from many of the organizations listed here). The 338 standards cover every aspect of a detention facility’s operation: classification (intake, screening, living unit assignments); health (medical, mental health, dental); access (mail, visits, telephone, legal access); programming (education, exercise, recreation, religion, work); training and administrative oversight; environment (sanitation, physical plant, food, crowding, privacy); restraints, isolation, discipline, and grievances; and safety. Last updated in 2006, the standards are slated for revision in 2014. A team of stakeholders from each participating JDAI site is trained to conduct an assessment of facility practice. The team’s findings are then used to inform the site’s JDAI goals and workplan to improve performance in any area in which a deficit was noted.[7]
  • The National Commission on Correctional Health Care (NCCHC) publishes Standards for Health Services in Juvenile Detention and Confinement Facilities. The most recent version, updated in 2011, includes governance and administration; safety; personnel and training; healthcare services and support; juvenile care and treatment; health promotion; special needs and services; health records; and medical–legal issues. Many facilities use the standards to guide facility practices without seeking accreditation; however, accreditation is also offered via a peer-review process. Survey teams composed of physicians, nurses, health administrators, and other professionals measure compliance and provide technical assistance.[8]
  • The National Fire Protection Association (NFPA) updates its Life Safety Code (LSC) every three years, most recently in 2012. The Code addresses life-safety issues in correctional environments and includes standards for egress, features of fire protection (sprinkler systems, alarms, emergency lighting, smoke barriers), and special hazard protection. The LSC does not differentiate between adult and juvenile facilities, but has standards for both new construction and existing facilities. The NFPA standards are voluntary and take effect only when a user jurisdiction adopts them. The NFPA does not conduct audits or offer accreditation. Once a jurisdiction has adopted the standards, the NFPA offers technical support to assist with compliance.[9]
  • The Council of Juvenile Correctional Administrators (CJCA) developed the Performance-based Standards for Secure Juvenile Facilities (PbS) to provide a system for juvenile agencies to identify, monitor, and improve critical areas of facility operation using best practice standards and performance outcome measures. PbS member sites measure outcomes in seven key areas: safety, security, order, health and mental health services, justice and legal rights, programming, and reintegration planning. Rather than using a dichotomous accreditation process, PbS features a continuous improvement cycle that includes data collection, performance reports, an outcome measure analysis, and a rigorous facility improvement process to remedy any deficits.[10]
  • The Prison Rape Elimination Act (PREA) created the National Prison Rape Elimination Commission and gave it the task of developing a set of national standards to reduce the incidence of sexual violence in correctional facilities. The 43 Juvenile Facility Standards cover the broad range of issues involved in preventing, detecting, and responding to sexual assault: prevention planning (policy, staffing, supervision, monitoring); responsive planning (investigation, access to forensic services, rape crisis advocates); training and education (for staff, youth, investigators, medical and mental health staff); screening for risk; reporting; official response; investigation; discipline; emergency medical and mental health treatment; and data collection and review. Beginning in August 2013, all states were required to audit their facilities on a staggered schedule every three years (each year, one-third of the facilities must be audited). Auditors must be trained and certified by the DOJ and may not work for the agency that operates the facilities. In any area that does not meet the standard, the facility is required to draft and implement a corrective action plan within a 180-day period. States that do not comply with PREA Standards are subject to the loss of 5% of the DOJ grant funds that they would otherwise use for prison purposes, unless the state’s governor ensures that those funds will be used to promote compliance in future years.[11] (See Ch. 1: Historical Perspective: Prison Rape Elimination Act)

State Agencies

As noted above, many states have developed additional mechanisms to oversee the operation of correctional facilities via another state agency. The models typically have two parts:

  • The development of state standards to guide the operation of juvenile facilities. A national inventory has not been compiled; however, anecdotal information suggests that a significant number of states have undertaken the task of developing standards (California, Texas, New York, New Jersey, Pennsylvania, Indiana, Maryland, Oregon, and Wyoming, among others).
  • A specific agency or individual responsible for overseeing the facilities and monitoring the extent to which facilities comply with state standards. This may take the form of an Ombudsman (Texas), an Independent Monitor (Maryland), an Inspections Committee (Ohio), a State Association (Wyoming, Louisiana), a County Association (Los Angeles), or reciprocal auditing by qualified staff among an agency’s facilities (Michigan).

Although the format, specificity, and quality of standards vary, they all make an effort to establish minimum guidelines for the care and treatment of incarcerated youth. Most include an audit process at specific intervals (annually, bi-annually, every three years), some sort of license or accreditation, and a process for reporting results that brings transparency to the process. Finally, other state and local agencies (such as the Department of Education, Health Department, or Fire Marshal) may also have standards for the delivery of services in detention and correctional facilities.

Internal Quality Assurance

External audits can lend credibility and objectivity to the process; however, standards issued by national organizations are sometimes not specific enough to assess to what degree a facility has implemented its program according to design or whether specific outcomes are being achieved. Particularly in the case of adult facilities that house youth, a set of internal standards relevant and specific to the youth population in the adult correctional environment is essential. Developing a set of standards—customized to the requirements of local policy and procedure—and auditing them according to a rigorous methodology will reveal the unique successes and challenges a facility faces. Furthermore, an ongoing system of quality assurance—one that pays particular attention to correcting any identified deficits—provides additional assurance of safe conditions of confinement and adequate medical, mental health, and education services. Investing time, energy, and resources in internal quality assurance usually means that external reviews bring no surprises.

Establishing the Process

A number of decisions need to be made to establish the internal quality-assurance process.

Audit Schedule/Frequency

Semi-annual audits provide an ongoing assessment of the facility’s performance while also providing sufficient time to implement quality-improvement plans to remedy any identified deficits.

Performance and Compliance Levels

Assigning a performance rating to each standard highlights the areas in which the facility is doing well and draws attention to areas in which modifications are needed to ensure that procedures and outcomes meet expectations. While binary systems (pass–fail) are simple to use, they do not offer a sense of whether significant modifications are needed and if so, to what extent. For this reason, rating schemes that use three or four levels (Exceptional Performance, Satisfactory Performance, Minor Modifications Needed, or Significant Modifications Needed) are recommended.
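For facilities that track compliance numerically, the rating logic can be written down explicitly so that every auditor applies it the same way. The short Python sketch below illustrates one way to do this; the four labels come from the example above, but the percentage cutoffs are illustrative assumptions only and should be set by each jurisdiction.

```python
# Minimal sketch of a four-level rating scheme. The labels come from the
# example above; the percentage cutoffs are illustrative assumptions only.
def rate_standard(compliance_pct: float) -> str:
    """Map a compliance percentage (0-100) to a performance level."""
    if compliance_pct >= 95:
        return "Exceptional Performance"
    if compliance_pct >= 80:
        return "Satisfactory Performance"
    if compliance_pct >= 60:
        return "Minor Modifications Needed"
    return "Significant Modifications Needed"

print(rate_standard(85))  # Satisfactory Performance
```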

Auditor Qualifications and Training

If individuals who lack subject matter expertise conduct audits, the staff—who are asked to accept and make modifications based on the auditors’ recommendations—are not likely to perceive the auditors or their reports as credible. Therefore, auditors need substantive knowledge of the area they are being asked to audit. In addition, staff should not audit their own work. Larger systems can use an independent quality-assurance office with auditors trained in each of the key subject areas. Smaller jurisdictions can use a system of exchange across facilities to ensure auditor objectivity. Very small systems (those with only a single facility) can use supervisory staff or community stakeholders with the required expertise. All auditors need to be trained to understand both the intent and requirements of the standards, including the minimum performance level needed for compliance.

Reporting Results

The audit results need to be formally presented in a written narrative that is suitable for multiple audiences (facility staff, agency heads, advocates, elected officials). For this reason, the reports should be specific and detailed and should include precise conclusions about whether performance in each area met expectations.

Setting Standards

A comprehensive set of standards needs to be established to cover the full range of operations within each of the major functional and programming areas. At a minimum, standards should cover safety, security, and protection from harm; suicide prevention; mental health services; medical and dental services; education; fire safety; and sanitation. The various sets of national standards described above are useful for the task of sketching out the component parts of each major area. For example, “protection from harm” would include indicators of youth violence, behavior management and discipline, orientation, grievance, access issues, staffing, classification, and investigations of employee misconduct.

Within each of the component parts, specific standards should be crafted to assess the extent to which the procedures required by local policy have been implemented. These should be far more specific to the facility or agency than the national standards discussed above. A national standard may require “appropriate programming to address youth’s rehabilitative needs.” The local standard should detail the specific requirement for the programs of choice. For example, “Youth shall complete the 30-session anger-management program prior to their release” or “Youth shall complete the 10-week substance-abuse treatment program.” Using such specific standards and assessing the underlying causes for a failure to meet them offers a wealth of information about reasons that specific youth outcomes (refraining from violence; refraining from substance use) may or may not be achieved.

Specifying a Methodology

The way in which auditors will go about determining the level of performance with regard to a specific standard must be clearly articulated to ensure that it is sufficiently rigorous, that it benefits from all available information, and that its auditors apply it consistently. To the extent practical, each standard should be measured from multiple angles, such as direct observation, document review, and interviews with staff and youth. Once an agency identifies the sources of information, a sampling strategy is needed. How many documents of a certain type will be reviewed? How many staff and youth will be interviewed? From there, the specific questions to be asked and the specific information to be extracted from the documents need to be articulated. The technical aspects of the audit of each standard should be specified on an audit tool, such as a written interview guide or a data collection form. Not only do such devices ensure consistency across audits and auditors, but they also can be used to substantiate the auditors’ findings, should those findings be questioned.
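One way to keep the methodology consistent is to record, for each standard, the evidence sources, sample sizes, and questions on a structured audit tool. The sketch below illustrates the idea in Python; the field names, identifiers, and example values are hypothetical and are not drawn from any published standard.

```python
# Hypothetical structure for one item on an audit tool. Field names and
# example values are illustrative assumptions, not prescribed by this chapter.
from dataclasses import dataclass, field

@dataclass
class AuditItem:
    standard_id: str                 # local identifier for the standard
    evidence_sources: list           # e.g., observation, document review, interviews
    sample_size: int                 # how many records or people to sample
    interview_questions: list = field(default_factory=list)
    data_to_extract: list = field(default_factory=list)

treatment_item = AuditItem(
    standard_id="PROG-03",
    evidence_sources=["document review", "youth interviews"],
    sample_size=20,
    interview_questions=["How many anger-management sessions have you attended?"],
    data_to_extract=["sessions offered", "sessions attended"],
)
print(treatment_item.standard_id, treatment_item.sample_size)
```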

Analyzing Data

Once the data have been collected, the auditor must analyze them to identify trends and patterns. The purpose of the analysis is to make sense of all the data that are collected. A compilation of unanalyzed data is of little use for identifying and solving problems with the facility’s operation. Instead, raw data from individual documents, youth records, observations, or interviews need to be combined to identify trends. For example, if a standard regarding staffing were assessed by counting the number of youth and staff assigned to five housing units over a six-month period, auditors should calculate a staff–youth ratio for each day and count the number of days on which the facility met, or fell short of, the ratio required by policy (1 staff for every 8 youth). Alternatively, if a standard regarding the provision of a specific treatment program were analyzed by examining youth attendance records, the proportion of sessions attended by each of the youth included in the sample could be calculated and analyzed to determine whether youth received the expected level of exposure to the treatment program.
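Both analyses reduce to simple calculations once the raw counts are extracted. The Python sketch below works through them with invented numbers; the 1:8 ratio comes from the example above, but the daily counts and attendance figures are made up for illustration.

```python
# Staffing analysis: count the days on which the 1:8 staff-youth ratio was met.
# The (staff, youth) counts below are invented for illustration.
REQUIRED_RATIO = 1 / 8

daily_counts = [(2, 14), (1, 12), (2, 16), (3, 15), (1, 10)]
days_met = sum(1 for staff, youth in daily_counts if staff / youth >= REQUIRED_RATIO)
print(f"{days_met} of {len(daily_counts)} sampled days met the required ratio")

# Treatment analysis: proportion of offered sessions attended by each sampled youth.
sessions_offered = 10
sessions_attended = {"Youth A": 10, "Youth B": 8, "Youth C": 6}
for youth, attended in sessions_attended.items():
    print(f"{youth}: {attended / sessions_offered:.0%} of sessions attended")
```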

Interpreting Data

Once the data have been collected and analyzed to identify trends, they must be interpreted to determine the extent to which the facility’s performance is meeting expectations. As discussed above, multiple performance levels should be identified, and criteria for each level should be established. For example, a standard might be rated “Satisfactory Performance” if at least 80% of the youth records surveyed indicate attendance at 80% of the required treatment sessions, or if minimum staffing ratios were met on at least 80% of the days each month. In general, the threshold for Satisfactory Performance (or whatever term is used) should not require perfection. It should be an attainable level that indicates that, most of the time, things go as planned. Specific numerical values may not make sense for all of the performance levels. For example, a qualitative assessment of creativity or innovation could be required for “Exceptional Performance,” and the lower performance levels could be distinguished by the magnitude of the changes necessary to bring the operation up to expectations.
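Applying such criteria is then a mechanical step. The sketch below applies the example criterion above (at least 80% of sampled youth attending at least 80% of required sessions) to an invented sample; the data and the “Below Expectations” fallback label are assumptions for illustration.

```python
# Apply the example criterion: "Satisfactory Performance" requires at least 80%
# of sampled youth to have attended at least 80% of required sessions.
# The sample data are invented for illustration.
def interpret_attendance(records: dict, sessions_required: int) -> str:
    met = sum(1 for attended in records.values()
              if attended / sessions_required >= 0.80)
    share = met / len(records)
    return "Satisfactory Performance" if share >= 0.80 else "Below Expectations"

sample = {"Y01": 10, "Y02": 9, "Y03": 8, "Y04": 6, "Y05": 10}
print(interpret_attendance(sample, sessions_required=10))  # Satisfactory Performance
```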

The written narrative should include a performance rating for each standard and a summary of the data analysis that provides the rationale for the rating. If performance deficits were noted, identifying the underlying causes of the problem is helpful. Using the example about youth exposure to a specific treatment program discussed above, the interpretation section could read as follows:

The auditor reviewed treatment records for 20 youth who were housed at the facility during the past three months. Of these, only 25% attended all 10 sessions of the treatment program and earned a certificate. Another 40% completed at least 8 of the 10 sessions and made up the missed content during individual sessions. Nearly all of the 35% who attended 7 or fewer sessions were from the same housing unit. The Unit Manager reported that the Counselor had chronic attendance problems, which often resulted in the group being cancelled. Furthermore, at times, the Recreation Staff would “claim” the treatment hour as their own, given the priority the facility has placed on ensuring that youth receive daily recreation. For these reasons, performance for this standard is rated “Below Expectations.”

Such a narrative clearly identifies the threshold to be used (attending at least 80% of the treatment sessions), the methodology (record review for a sample of 20 youth), the analysis (only 65% met the threshold), and the underlying causes of the problem (chronic absenteeism by the Counselor and competing recreation activities), and it clearly states that the performance level is not acceptable (“Below Expectations”). Quality-assurance narratives of this type are a perfect setup for Quality Improvement Plans that address the underlying causes of the problem, restore program functioning, and improve youth outcomes.
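The 65% figure cited above can be reproduced directly from the narrative: in a sample of 20 youth, the 25% who attended all sessions and the 40% who completed at least 8 of 10 (the 80% threshold) together account for 65%, short of the 80%-of-youth criterion. A quick check in Python, using the counts implied by those percentages:

```python
# Counts implied by the narrative above (sample of 20 youth).
attended_all = 5        # 25% attended all 10 sessions
attended_8_or_9 = 8     # 40% completed at least 8 of the 10 sessions
attended_7_or_less = 7  # 35% attended 7 or fewer sessions

met_threshold = attended_all + attended_8_or_9  # youth at or above 80% attendance
print(met_threshold / 20)                       # 0.65, below the 0.80 criterion
```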

Quality Improvement Planning

If we decide we need to lose weight, jumping on the scale over and over again and recording our weight will not produce any measurable results. Unless the underlying causes (poor diet and lack of exercise) are identified and addressed (by eating differently and going to the gym), we should not expect to see any change in how we look or how our clothes fit. The same is true for improving the conditions and services in facilities that hold youth. Simply collecting and analyzing data repeatedly will not improve program performance or outcomes for youth. Instead, facilities and agencies need to undertake a problem-solving analysis, identify the underlying causes of problems, and design and implement appropriate strategies to address them.

A problem-solving analysis has three key steps:

  1. Identifying the underlying causes of the problem.
  2. Crafting strategies to impact these underlying causes.
  3. Conducting targeted reviews to determine whether the strategies were effective and the desired effects are being achieved.

Identifying the Underlying Causes

Across the nation, facilities that house youth experience similar problems with the services they provide, the youth who receive them, and the staff who are responsible for their delivery. However, the underlying causes of the problems may be totally different. For example, many facilities have difficulty maintaining required staff–youth ratios. In some places, the problem lies in an inability to fill vacancies, which may be driven by low pay, poor recruiting practices, or rigorous background checks that disqualify most of the interested applicants. In other places, the problem lies in an inability to ensure that staff report to work, which may be driven by low pay that requires most staff to have another (often better-paying) job, or high rates of youth-on-staff assault that destroy morale and staff attendance. It is essential to understand the cause of the problem.

When problems emerge around program performance (staff are not taking youth to the clinic following a use of force), the answer is rarely as simple as “Draft policy and train staff.” Usually, policy and procedure already exist, and staff have already been trained, yet these measures were not sufficient to improve performance. Instead, the key is to look for the reason that staff do not do what is expected of them. Perhaps the living unit is understaffed, and transporting a youth to the clinic would leave the unit with only one staff to supervise 14 youth. Perhaps the nurses give conflicting advice about who should be brought to the clinic, and transporting youth without visible injuries is discouraged. Perhaps most of the use-of-force incidents occur after the nursing staff have left for the day, and the morning shift is not told that the youth needs medical attention. Finding the underlying cause of a problem is like striking oil—once the well is tapped, the solutions begin to flow.

Creating Strategies to Impact the Underlying Causes

A variety of dynamics could explain the problems with staffing or transporting youth to the clinic (discussed above); finding the correct explanation is essential. Otherwise, the facility risks developing a strategy that is ultimately ineffective. In the staffing example above, if the facility develops a strategy to improve recruiting, this will have no bearing on a situation where staff are afraid to come to work because of the level of violence. In the medical transport example, expanding the hours the nurses are available will not solve a problem that is related to the lack of direct care “floaters” who can transport the youth or provide additional coverage on the unit. The strategy developed to solve the problem must be tailored to address the underlying causes of the problem.

Assessing Effectiveness

The only way to determine whether the underlying causes were identified correctly is to measure whether the size and scope of the problem are changing. For example, once “floaters” are assigned to supplement unit coverage, do the rates of youth being transported to the clinic increase? If so, the strategy was appropriately targeted. If not, leaders should re-examine the process for determining the underlying cause of the problem and apply different strategies. They should conduct targeted reviews of this nature for several months following the identification of a program deficit and the implementation of strategies to address the underlying causes. A single intervention is unlikely to solve the problem; instead, a constellation of strategies that address the problem from multiple angles is most likely to reap rewards.
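A targeted review of this kind often amounts to a simple before-and-after comparison. The sketch below compares the share of use-of-force incidents followed by a clinic visit before and after floaters were assigned; the incident counts are invented for illustration.

```python
# Before/after comparison for a targeted review. Incident counts are invented.
before = {"incidents": 40, "clinic_visits": 22}
after = {"incidents": 38, "clinic_visits": 33}

for label, period in (("Before floaters", before), ("After floaters", after)):
    rate = period["clinic_visits"] / period["incidents"]
    print(f"{label}: {rate:.0%} of use-of-force incidents followed by a clinic visit")
```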

These are the major components of a quality improvement plan. Quality improvement planning should be undertaken by a committee of stakeholders, not by any one individual. The planning should include a broad search for contributing factors and an openness to creative solutions. The plan should be in written form, should identify responsible parties, and should establish a specific timeline for implementing strategies and for assessing whether the desired results are being achieved. Whether operational problems or poor outcomes are identified via an accrediting body, a comparison of facility practices against national standards, or a rigorous internal review, a Quality Improvement Plan is necessary to ensure that the next accreditation visit or audit yields more positive findings.

Together, Quality Assurance and Quality Improvement Planning create a dependable pathway to ensure safety and humane treatment, deliver services that meet the wide variety of needs, and support positive outcomes for youth in custody.

 

Endnotes


[1] Consultation and subject matter expertise were sought and received from Dr. Daphne Glindmeyer, Bill Wamsley, Leonard Rice, Dana Shoenberg, Dr. Peter Leone, Kim Godfrey and Michele Deitch. While their contributions were of great value, the opinions contained in this report—and any errors—are my own.

[2] N. Katzenbach, “Reflections on 60 Years of Outside Scrutiny of Prisons and Prison Policy in the United States,” Pace Law Review 30 (2010): 1446-1452.

[3] M. Mushlin and M. Deitch, “Opening Up A Closed World: What Constitutes Effective Prison Oversight?” Pace Law Review 30 (2010): 1383-1429.

[4] M. Deitch, “Distinguishing the Various Functions of Effective Prison Oversight,” Pace Law Review 30 (2010): 1438-1445.

[5] More information on the ACA standards is available at: https://www.aca.org/ACA_Member/Standards___Accreditation/About_Us/ACA/ACA_Member/Standards_and_Accreditation/SAC_AboutUs.aspx.

[6] More information on CEA standards is available at: https://ceanational.org/standards-commission/.

[7] More information on the JDAI Standards is available at: https://stopsolitaryforkids.org/jdai-standards/#:~:text=The%20JDAI%20standards%20require%20that,toilet%20facilities%2C%20and%20hygiene%20supplies..

[8] More information on the NCCHC standards is available at: https://www.ncchc.org/juvenile-facilities.

[9] More information on the NFPA Codes is available at: https://www.nfpa.org/codes-and-standards/all-codes-and-standards/list-of-codes-and-standards?mode=code&code=101.

[10] More information on the PbS Standards is available at: http://pbstandards.org.

[11] More information on the PREA Standards is available at: https://www.prearesourcecenter.org/implementation/prea-standards/juvenile-facility-standards.
