
Governmentwide Findings: Accessibility Conformance Testing and Technology Lifecycle

The Assessment continued to ask questions about outcome-based results to determine whether policies, practices, and procedures were effectively leading to Section 508 conformant ICT. The average c-index value for all reporting entities was 1.74 out of 5, indicating a low level of compliance and highlighting that policies, practices, and procedures are not yet resulting in more conformant ICT.

ICT Testing Outlook

As anticipated, most reporting entities continue to utilize a mix of automated and manual tools for digital accessibility testing. While manually testing all ICT may not be feasible due to resourcing, strategically combining automated tools with manual testing enables reporting entities to achieve both broad coverage and depth in their testing.

The FY24 Assessment asked a new question to better understand how many entities have conformance test processes used to evaluate different types of ICT. The results showed an overwhelming majority – 211 respondents – use a Section 508 conformance test process for web content. Additional data shows:

  • 180 respondents use an electronic documents test process.
  • 162 respondents use a software test process.
  • 101 respondents use a mobile application test process.
  • 89 respondents use a hardware test process.
  • 30 respondents use a kiosk test process.
  • 23 respondents use none of the test processes listed.
M-24-08 covers the following pillars of building and sustaining accessible federal technology:

  • Establish Digital Accessibility Programs and Policies
  • Buy Accessible Products and Services
  • Design and Develop Accessible Digital Experiences
  • Create, Communicate, and Deliver Accessible Content
  • Evaluate, Monitor, Collect Feedback, and Remediate for Accessibility
  • Cultivate a Positive Culture of Digital Accessibility

We continued to ask respondents what manual or hybrid testing methodology they use for digital accessibility testing. Data shows a governmentwide increase in the use of each methodology except the reporting entity-specific methodology, which showed no change. Additionally, the highest percentage reported for a standalone methodology was Trusted Tester 5.x for web, meaning some entities use Trusted Tester as their only test process. Two hundred two (202) respondents reported using one or more of the manual or hybrid ICT accessibility test methodologies for web content shown in Table 11 below:

Table 11: Testing Methodologies Used by Respondents YOY

Methodology                                | FY23 % of Reporting Entities | FY24 % of Reporting Entities | % Change YOY
Manual Testing with Developer Tools        | 61%                          | 76%                          | 22% increase in utilization
Assistive Technology                       | 48%                          | 58%                          | 18% increase in utilization
Manual Code Inspection                     | 41%                          | 57%                          | 37% increase in utilization
Trusted Tester 5.x                         | 39%                          | 49%                          | 24% increase in utilization
Reporting Entity-Specific Test Methodology | 29%                          | 29%                          | No change
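The % Change YOY column can be approximately reproduced from the two percentage columns. The sketch below (illustrative Python, not the Assessment's tooling) shows the calculation; applied to the rounded percentages above it yields slightly different values (e.g., 61% to 76% is a 24.6% relative increase versus the published 22%), which suggests the published changes were computed from unrounded entity counts.

```python
def pct_change_yoy(fy23: float, fy24: float) -> float:
    """Relative year-over-year change, expressed as a percentage of the FY23 value."""
    return (fy24 - fy23) / fy23 * 100

# Applied to the rounded Table 11 percentages (illustrative only):
manual_dev_tools = pct_change_yoy(61, 76)  # ~24.6, vs. the published 22%
entity_specific = pct_change_yoy(29, 29)   # 0.0 -- consistent with "No change"
```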

Additionally, YOY data revealed no change, with 61% of reporting entities, or 151 respondents in FY24, using at least one automated accessibility testing tool for comprehensive, large-scale monitoring of web content. Of those reporting entities, 84% responded that personnel who use the tool and interpret the results received training on the tool, increasing from 67% reported in FY23. However, 83 respondents or 34% reported not using any automated accessibility tool. Figure 16 and Figure 17 show the percentage of entities that employ at least one automated tool by maturity and conformance brackets.

Line graph showing percentage of entities with an automated testing tool by Maturity Bracket: Very Low: 28%, Low: 39%, Moderate: 71%, High: 80%, and Very High: 93%. Generally, the higher the maturity, the higher the percentage of reporting entities with automated testing tools.
Figure 16. Percentage of entities with an automated testing tool by Maturity Bracket.
Line graph showing percentage of entities with an automated testing tool by Conformance Bracket: Very Low: 42%, Low: 67%, Moderate: 64%, High: 82%, and Very High: 92%. Generally, the higher the conformance, the higher the percentage of reporting entities with automated testing tools.
Figure 17. Percentage of entities with an automated testing tool by Conformance Bracket.

In Figures 16 and 17, the percentages of each overall category with the same bracket (Very Low-Very Low, Low-Very Low, Moderate-Very Low, etc.) were averaged and then included in the maturity and conformance charts. Generally, the higher the conformance or maturity, the higher the percentage of reporting entities with automated testing tools, mirroring the pattern in average resources per entity by bracket discussed in previous sections.

One of the top primary challenges, noted by 104 entities, was a lack of, or inadequate, consideration of Section 508 at the early stages of the ICT lifecycle management process. As explained below, the Assessment continued to ask how often Section 508 conformance is integrated throughout technology development lifecycle activities:

  • Similar to last year, just over half of the reporting entities – 53% in FY24 compared to 51% in FY23 – reported Section 508 conformance is regularly, frequently, or almost always integrated throughout technology development lifecycle activities, with 23% of those almost always integrating Section 508.

  • Conversely, about 41% of all FY24 respondents reported they sometimes or never integrate Section 508 conformance into technology development lifecycle activities or don’t know how often this occurs.

The Assessment asked each reporting entity how often they implement and produce reliable test results using standard processes for validating web content conformance to Section 508 standards. The data for FY24 shows 60% of entities reporting they regularly, frequently, or almost always (25% or more of the time) implement a standard process. The majority of those entities said they almost always implement a standard process. One third (33%) of entities never or only sometimes implement a standard test process.

Entities reported increasing the frequency with which they conduct testing on web content as part of standard operations, with a decrease in those who do not perform testing.[28] When asked how often reporting entities conduct comprehensive conformance validation testing for web content, both internet and intranet, prior to deployment, GSA found that:

  • 31% of entities stated they frequently or almost always perform manual testing 60% or more of the time.

  • Slightly more entities – 36% – reported they frequently or almost always perform comprehensive automated testing on web content prior to deployment.

  • 42% of respondents or 104 entities reported they sometimes or never conduct comprehensive manual tests on web content for Section 508 conformance, an improvement from 48% in FY23. The YOY difference indicates a small, meaningful shift (WRST: extremely statistically significant), suggesting more reporting entities are moving towards regular testing.

  • 41% of respondents or 101 entities reported they sometimes or never conduct comprehensive automated tests on web content for Section 508 conformance, an improvement from 45% in FY23, noting a small, meaningful increase (WRST: extremely statistically significant).
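The "WRST" annotations throughout the report refer to the Wilcoxon rank-sum test applied to the ordinal frequency responses. As a rough illustration of the mechanics only (hypothetical data, not the Assessment's actual responses or implementation), a minimal normal-approximation version can be sketched as:

```python
from itertools import chain
from math import erf, sqrt

def rank_sum_test(x, y):
    """Two-sided Wilcoxon rank-sum test via the normal approximation,
    using average ranks for ties (no tie correction to the variance).
    Returns (z, p)."""
    combined = sorted(chain(x, y))
    # Assign each distinct value the average of the ranks it occupies.
    ranks = {}
    i = 0
    while i < len(combined):
        j = i
        while j < len(combined) and combined[j] == combined[i]:
            j += 1
        ranks[combined[i]] = (i + 1 + j) / 2  # mean of ranks i+1 .. j
        i = j
    n1, n2 = len(x), len(y)
    w = sum(ranks[v] for v in x)                 # rank sum of first sample
    mu = n1 * (n1 + n2 + 1) / 2                  # expected rank sum under H0
    sigma = sqrt(n1 * n2 * (n1 + n2 + 1) / 12)   # variance, ignoring ties
    z = (w - mu) / sigma
    p = 2 * (1 - 0.5 * (1 + erf(abs(z) / sqrt(2))))  # two-sided p-value
    return z, p

# Hypothetical responses coded 0 = never .. 4 = almost always:
fy23_scores = [0, 1, 1, 2, 2, 3]
fy24_scores = [1, 2, 2, 3, 4, 4]
z, p = rank_sum_test(fy24_scores, fy23_scores)
```

In practice a library implementation (e.g., SciPy's `ranksums`) would be used; this sketch only shows why a shift in the distribution of ordinal answers yields the significance calls quoted in the report.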

M-24-08 guidance stated that “prior to deployment, agencies should test and validate design and development solutions with individuals with disabilities and assistive technology users.” Additionally, the FY23 Assessment made recommendations for agencies to explore ways to include users with disabilities throughout the technology lifecycle. While the majority of respondents still noted they sometimes or never conduct user testing with people with disabilities prior to deployment to address all applicable Section 508 standards, there was a 21.6% decrease YOY in the number of entities who selected this option, with 71% of entities in FY24 compared to 89% of entities in FY23. Similar percentages YOY were found in those who regularly or frequently conduct the aforementioned user testing, with 11% in FY24 compared to 10% in FY23.

Respondents also noted an increase in engaging users with disabilities YOY:[29]

  • 47% of entities selected engaging users with disabilities in defining user needs, compared to 15% in FY23.

  • 38% of entities selected engaging users with disabilities in development of Section 508 conformance validation test processes, compared with 12% in FY23.

  • 43% of entities selected engaging users with disabilities in user acceptance testing, compared to 12% in FY23.

Furthermore, entities reported increasing the frequency of integrating Section 508 reviews into electronic content prior to publication, with the majority of respondents (66%) regularly, frequently, or almost always integrating Section 508 reviews, compared to only 44% in FY23. Only 31% of entities reported they sometimes or never integrate Section 508 reviews, a substantial improvement from 52% in FY23, resulting in a 40% improvement YOY. This difference indicates a moderate, meaningful increase in the integration of Section 508 reviews prior to publication (WRST: extremely statistically significant). Investments in maturing the technology lifecycle are also reflected in the frequency with which reporting entities utilize a process or plan for creating accessible official agency communications, with 46% of respondents selecting they regularly, frequently, or almost always utilize said plan, up from 44% in FY23, indicating a small, meaningful increase (WRST: extremely statistically significant).

Entities also reported on how often public, online documents are tested for Section 508 conformance prior to distribution. In FY23, half of the reporting entities indicated they regularly test electronic documents prior to posting. This improved in FY24 by a small, meaningful increase (WRST: extremely statistically significant), with half of the reporting entities now testing frequently, which happens approximately 60%-89% of the time. Furthermore, 51% of entities reported frequently or almost always testing documents prior to distribution, with entities still reporting additional bandwidth to perform more comprehensive electronic document testing than web page testing.

Nonconformance Tracking and Remediation

Agencies across the federal government are still procuring, using, maintaining, and developing inaccessible ICT, as demonstrated by low conformance in Compliance Key Findings. As detailed in the previous section, testing methods for reporting entities have improved; however, leadership decisions regarding the tracking and remediation of defects are also crucial for improving ICT conformance.

The Assessment continued to ask how the reporting entity escalates and takes action on nonconformance issues with vendors or contractors who produce or deliver inaccessible ICT despite contractual requirements. Thirty-six percent (36%) of reporting entities escalate or take action on inaccessible products and deliverables less than 24% of the time. However, there was a 33% decrease in the number of entities who never take action, an improvement from 9% in FY23 to 6% in FY24. Reporting entities also increased how frequently they take action to enforce digital accessibility, with 40% of entities regularly, frequently, or almost always taking action to escalate and enforce compliance in FY24 compared to 33% in FY23, a 21% increase.

Furthermore, the Assessment asked how entities remediate nonconformant ICT that is deployed or distributed, with 35% of entities sometimes or never taking action for remediation, down from 48% in FY23. This difference indicates a moderate, meaningful improvement in the remediation of nonconformance issues (WRST: extremely statistically significant). As Figure 18 shows, 20% almost always immediately remediate high-severity issues, up from 10% in FY23, an 88% increase. Reporting entities are making investments in their remediation efforts that may become more evident in conformance outcomes in future years.

Bar chart showing YOY response count comparison of how known nonconformance issues are remediated after deployment or distribution of ICT (Q56): 13 entities never remediate, up from 6 in FY23; 81 entities sometimes remediate, down from 107; 33 entities regularly remediate, down from 52; 33 entities frequently remediate, compared to 31; 49 entities almost always remediate, up from 26; and 23 entities responded Unknown in FY24, compared to 20 in FY23.
Figure 18. YOY response count comparison of how known nonconformance issues are remediated after deployment or distribution of ICT (Q56).
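The count data in Figure 18 lets a reader reproduce the YOY changes cited in the text. A short sketch (Python, with counts transcribed from the figure) recovers, for example, the 88% increase in entities that almost always remediate:

```python
# Response counts transcribed from Figure 18 (Q56).
fy23 = {"never": 6, "sometimes": 107, "regularly": 52,
        "frequently": 31, "almost always": 26, "unknown": 20}
fy24 = {"never": 13, "sometimes": 81, "regularly": 33,
        "frequently": 33, "almost always": 49, "unknown": 23}

def pct_change(old: int, new: int) -> float:
    """Relative change from FY23 to FY24 as a percentage of FY23."""
    return (new - old) / old * 100

change = {k: round(pct_change(fy23[k], fy24[k]), 1) for k in fy23}
# change["almost always"] is ~88.5, matching the 88% increase cited in the text.
```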

Additional technology lifecycle findings show:

  • In FY24, the majority of respondents (52%) reported regularly, frequently, or almost always tracking and remediating digital content, whereas last year the majority (51%) did not. The YOY difference indicates a small, meaningful increase (WRST: extremely statistically significant). In FY24, 42% of respondents reported they do not track nonconformant digital content, or track it but only sometimes take action to remediate, a 17.6% decrease.

  • There was a 58% increase in the number of respondents who engage in technology lifecycle activities but do not, or only sometimes, assess the risk of Section 508 nonconformant ICT throughout the technology development lifecycle.[30] Forty percent (40%) of respondents selected this response in FY23, compared with 63% in FY24. Assessment data does not detail why these response options increased, but it may be due to more accurate reporting in FY24.

Many respondents reported focusing on enhancing digital accessibility throughout the technology lifecycle and advancing their testing practices. However, the differing accounts of successes and challenges highlight the varying levels of maturity among reporting entities. Some specific actions include:
  • Embedding Section 508 more thoroughly into the procurement process, including more reviews, accountability for vendors, and consideration of the needs of people with disabilities (PWD).
  • Improving ICT testing through training, thorough hybrid testing, and increasing the number of test reports produced by Trusted Testers.
  • Enhancing the thoroughness of accessibility testing and ensuring effective remediation of identified issues.
  • Establishing an ICT conformance reporting portal to collect and review Section 508 testing and conformance documentation.
  • Integrating best practices throughout the technology lifecycle and providing guidance earlier in the process in order to avoid costly remediation.
  • Developing a targeted engagement approach with product line and portfolio managers to prioritize Section 508 in the technology lifecycle.
  • Gaining traction with development teams to integrate Section 508 testing early in the technology lifecycle.
  • Focusing on testing applications for conformance as part of the integration with the security authority to operate (ATO) process.

Respondents still noted numerous challenges: varied commitment to accessibility across programs, accessibility still being treated as an afterthought, digital accessibility conformance not being part of the technology lifecycle, limited or no testing tools, and lack of testing personnel.

Overall, Section 508 testing and its integration into the technology lifecycle has improved over the past year. The majority of reporting entities now use a combination of automated and manual tools to test comprehensively. However, inadequate or absent consideration of Section 508 at the early stages of the ICT lifecycle remains a significant challenge. Approximately 41% of respondents report they sometimes or never integrate Section 508 conformance into technology lifecycle activities, or are unsure how often it occurs.

Despite this, entities reported increasingly conducting testing on web content and integrating Section 508 reviews into electronic content prior to publication. The majority of respondents now regularly, frequently, or almost always incorporate Section 508 reviews. Additionally, entities are taking more frequent actions to enforce digital accessibility, with 40% of respondents reporting they regularly, frequently, or almost always escalate and enforce Section 508 conformance in FY24. Investments in enhancing Section 508 testing, prioritizing remediation, and embedding Section 508 considerations into the technology lifecycle are positive steps that should lead to improved ICT conformance in the coming years.


  28. While Conformance data does not support an increase in testing, this does not necessarily invalidate the data reported by entities for the Testing and Technology Lifecycle activities dimensions. Entities may be testing as standard practice, as reported here, but were unable to pull required data for Conformance dimension questions.
  29. Q24 underwent revision in FY24; some response options are not directly comparable.
  30. Sixteen (16) reporting entities or 6% noted they do not engage in technology lifecycle activities and were removed from the calculation of overall percentage.

Reviewed/Updated: December 2024

Section508.gov

An official website of the General Services Administration
