GSA Government-wide Section 508 Accessibility Program

Build Organizational Support for Accessible Electronic Content

The Revised 508 Standards clarify which electronic content must be accessible (scope) and how to make it accessible (technical requirements). The steps outlined below can help you ensure staff at your agency understand their responsibilities and have the tools they need to produce accessible electronic content.

Understand Scope and Technical Requirements
Update Agency Policies
Identify Covered Electronic Content
Train Staff
Create Accessible Content
Validate for 508 Conformance
Publish
Track and Report Conformance

Understand Scope and Technical Requirements

In the Revised 508 Standards, Section E205 - Electronic Content specifies which electronic content, including web, software, multimedia and electronic documents, must conform to the technical requirements.

Public Facing Content

All public facing electronic content must be accessible.

The Revised 508 Standards define “public facing” as “content made available by an agency to members of the general public.”  Usually, such content is published on an agency website, blog, form, or via social media. However, public facing content might also be made available in non-web formats, such as information displayed on screens or interactive kiosks in waiting areas.

Agency Official Communication

Electronic content that is not public facing but is official business and is communicated through one or more of the nine categories below is an “agency official communication” and must be accessible.

The content might be broadly disseminated or sent to individual agency employees or members of the public. The method of delivery does not matter; such content may be disseminated via an internal agency website or intranet, or by other delivery modes, including, but not limited to: emails, text messages, phone alerts, storage media, and downloadable documents.

Categories of agency official communication (with examples) are listed below. The examples are not all-inclusive, but are meant to help you understand these categories. Contact the US Access Board if you need help interpreting or applying these categories.

  1. An emergency notification.  Examples: Evacuation notices, active shooter alerts, text messages conveying emergency instructions (e.g., “remain in place”), hazardous weather alerts, and operational notices regarding unscheduled closures.
  2. An initial or final decision adjudicating an administrative claim or proceeding.  Examples: An electronic notice or alert of an approved, denied, or pending claim sent to a business or other organization, or to an individual.
  3. An internal or external program or policy announcement.  Examples: An electronic notification of a new agency policy, or a change to an existing program requirement.
  4. A notice of benefits, program eligibility, employment opportunity, or personnel action.  Examples: An electronic notice sent to a member of the public or employee describing government benefits to which they are entitled; information on whether an individual is eligible for benefits from, or to participate in, a government program; information on the status of an application for enrollment in a program; a notification of an official personnel action indicating a promotion, adverse action, or other personnel decision affecting a government employee; or a job announcement.
  5. A formal acknowledgement of receipt. Examples: An email acknowledging receipt of payment; a notice posted to a program participant’s web page containing his or her personal account information and acknowledging that he or she successfully submitted certain records.
  6. A survey or questionnaire. Examples: A set of written questions (open-ended or multiple choice) developed for the purpose of a survey or data analysis, such as a questionnaire assessing employee training needs; an employee satisfaction survey; or a questionnaire used to gather information to gauge satisfaction with a government program. This category does not include questions submitted during litigation or legal proceedings.
  7. A template or form.  Examples: An electronic document template used to create official agency documents or presentations; a web page template created to establish a common look and feel for a website; or an official agency form that must be completed by employees or members of the public.
  8. Educational or training materials. Examples: Interactive online training courses; self-paced training courses; educational webinars; other educational presentation formats; and support materials for such activities, including electronic worksheets, training manuals, and tests.
  9. Intranet content designed as a web page. Examples: An intranet page listing files for downloading; shared calendars; an internal employee locator; or other HTML web pages distributed internally via an agency intranet. This category does not include files distributed via the agency intranet that are not in one or more of the eight categories above.

NOTE: An exception provides that the National Archives and Records Administration (NARA) is not responsible for remediating records sent to it by other agencies.

Refer to Section E205 for the specific technical provisions that apply to electronic content that falls under one of the above categories.

Update Agency Policies

Update your agency accessibility policy (including policies for all agency websites) to address how your agency manages the accessibility of electronic content.

If you do not yet have an agency accessibility policy, create one.

Identify Covered Electronic Content

Use the electronic content categories in Section E205 to identify the types of electronic content produced by your agency that are covered under the Revised 508 Standards.

Prioritize electronic content to review for accessibility based on the size of the target audience, frequency of user access, and criticality to the agency and content users. Focus on the most-accessed content first. For internal content, pay attention to any content that is mandatory for users to view or use.

Train Staff

Refer to the scoping provision in Section E205 to see how it applies to electronic content created and updated by web content managers, developers, authors of electronic (e.g., Microsoft Office or Adobe PDF) documents, and other content authors. Generally, content authors should be responsible for ensuring their content is accessible.

  • Identify who is responsible for generating high-priority electronic content. Ensure content creators understand how to create WCAG-conforming websites, web applications, software, multimedia, and eLearning content.
  • Reach out to your Communications Office to solicit their help to identify major components and processes within your organization that support electronic content development and publication. Some of these groups may not be aware of Section 508 requirements.
  • Communicate to key internal stakeholders the legal mandate, requirements, and benefits of accessible electronic content for all users, as well as resources and training available to content authors.
  • Identify a single person within each organizational group who is responsible for obtaining advanced training on how to make electronic content accessible.
  • Establish review guidelines with each department responsible for publishing covered electronic content. Insert accessibility reviews into existing development and publication lifecycles.

Create Accessible Content

Word Documents

  • Ensure the file format is .docx in order to preserve accessibility features. Other formats that can be produced by Microsoft Word (RTF, DOC, TXT, and ODF) may not be accessible.
  • Establish reusable accessible design templates to reduce the level of effort associated with generating accessible electronic content.  
    • When developing templates, follow responsive design principles that support accessibility on different types of devices.
  • Advocate for using authoring tools and document converters that enable users to author accessible documents. Where possible, upgrade existing tools (e.g., Microsoft Office 365 and Adobe Acrobat Professional DC) enterprise-wide to the latest version. Benefits include greater support for:
    • Authoring and evaluation features that make it easier to create accessible documents;
    • Conformance to the new software-authoring tools requirements in the Revised 508 standards.
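
For illustration, the following sketch (assuming the open-source python-docx package, which your agency may or may not use) flags two common problems before manual review: documents that never use built-in heading styles, and inline images with no alternative text. It is a starting point, not a substitute for full conformance testing.

```python
# Sketch: quick structural checks on .docx files before deeper manual review.
# Assumes the python-docx package; file names here are illustrative.
from docx import Document

def quick_checks(path):
    doc = Document(path)
    findings = []

    # Accessible documents should use built-in heading styles, not bold body text.
    has_headings = any(p.style.name.startswith("Heading") for p in doc.paragraphs)
    if not has_headings:
        findings.append("No built-in heading styles found")

    # python-docx does not expose alt text directly; this reaches into the
    # underlying drawing XML ('descr' attribute) and may miss some image types.
    for shape in doc.inline_shapes:
        descr = shape._inline.docPr.get("descr")
        if not descr:
            name = shape._inline.docPr.get("name")
            findings.append(f"Inline image '{name}' has no alt text")

    return findings

if __name__ == "__main__":
    for finding in quick_checks("example.docx"):
        print(finding)
```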

PDF Documents

  • When generating PDF documents through conversion from another format, or through an automated tool, ensure the resulting PDF file is properly tagged to support accessibility.
  • If using an authoring tool, make sure the tool is capable of generating accessible PDF documents that conform to PDF/UA (some do not).
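
As a rough screening step, the sketch below (assuming the open-source pypdf package) flags PDF files that are not tagged at all. A tagged structure is necessary but not sufficient for accessibility, so tagged files still require manual review.

```python
# Sketch: flag PDFs with no tag structure at all. File names are illustrative.
from pypdf import PdfReader

def is_tagged(path):
    root = PdfReader(path).trailer["/Root"]
    # A tagged PDF declares MarkInfo and carries a structure tree.
    return "/MarkInfo" in root and "/StructTreeRoot" in root

if __name__ == "__main__":
    for pdf in ["form.pdf", "report.pdf"]:
        status = "tagged" if is_tagged(pdf) else "NOT tagged - remediate or regenerate"
        print(f"{pdf}: {status}")
```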

Multimedia and eLearning Files

  • Ensure video content includes synchronized captions (a basic caption-coverage check is sketched after this list).
    • Carefully review and edit the accuracy and synchronized timing of the captions, even if captions are provided through automated captioning software.
  • Ensure audio descriptions for people with visual disabilities are included in the default sound track, or as a separate selectable sound track.
  • If you are providing a multimedia player to view electronic content, make sure the player addresses the software provisions in the standards.
  • If you are providing a learning management system tool to help users locate and track usage of the content, make sure that:
    • The system addresses the software requirements; and
    • The electronic content provided through the system meets the electronic document provisions.
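
A minimal sketch of the caption-coverage check referenced above, assuming videos and captions are published side by side with matching base names; adjust the extensions and naming convention to your own publishing process.

```python
# Sketch: inventory check that every video has a companion WebVTT caption file.
# Assumes videos and captions share a base name (video.mp4 / video.vtt).
from pathlib import Path

VIDEO_EXTS = {".mp4", ".mov", ".webm"}

def videos_missing_captions(root):
    missing = []
    for video in Path(root).rglob("*"):
        if video.suffix.lower() in VIDEO_EXTS and not video.with_suffix(".vtt").exists():
            missing.append(video)
    return missing

if __name__ == "__main__":
    for video in videos_missing_captions("published_media"):
        print(f"Missing captions: {video}")
```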

Validate for 508 Conformance

There are several ways to validate conformance to the Revised 508 Standards:

  • Automated - High volume 508 conformance testing tools automatically scan and test electronic content;
  • Manual - Manual testing uses a documented, consistent, repeatable process;
  • Hybrid - A combination of automated and manual testing.

Automated Testing

Take advantage of high volume (automated) 508 compliance scanning tools, but be aware of their limitations.

  • Automated scanning tools cannot apply human subjectivity, and therefore either produce excessive false positives or—when configured to eliminate false positives—test for only a small portion of the requirements.
    • Determine the right balance between false positives and coverage of your agency's requirements by asking the tool vendor to define and quantify the method and accuracy of its rulesets, and how they align with your agency's standards and expectations.
  • Consider whether, and how, server-based automated scanning tools will be able to access content secured behind firewalls, passwords, or other protections.
  • Select tools that test using the document’s native format. Tools that scan documents often convert files into HTML before testing. This conversion process reduces the fidelity and accuracy of conformance testing.
  • Your agency may need to deploy multiple scanning tools to cover multiple content types (e.g., HTML, Word, Excel, and PDF). It can be a challenge to extract and aggregate results to identify trends and focus remediation efforts.
  • Plan and deliver reporting tailored to your stakeholders. You may want to provide output from scanning tools directly to developers. Additional work may be required to integrate results into dashboard reporting to tell your organizational story.
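
For example, a simple aggregation script along the following lines can merge exported findings from several tools into one view; the input record shape shown here is an assumption, so map each tool's actual export fields into it first.

```python
# Sketch: merge findings exported by different scanning tools into one summary.
# Assumes each export is a JSON list of {"url", "rule", "severity"} records.
import json
from collections import Counter

def aggregate(result_files):
    counts = Counter()
    for path in result_files:
        with open(path) as f:
            for finding in json.load(f):
                counts[(finding["severity"], finding["rule"])] += 1
    return counts

if __name__ == "__main__":
    totals = aggregate(["html_scan.json", "pdf_scan.json"])
    for (severity, rule), count in sorted(totals.items()):
        print(f"{severity:8} {rule:40} {count}")
```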

Key Success Factor:  To provide value for the agency and support the highest level of accessibility improvement, the tool or tools you select must foster adoption and buy-in across multiple applicable roles (UX designers, developers, etc.) within the agency.

Technical Requirements

When reviewing automated tools for potential purchase, consider their ability to:

  • Scan the types and volume of electronic content your agency produces. Many tools focus on web pages, but some also scan PDF and Microsoft Office documents.
  • Customize scanning and test ruleset parameters.
  • Apply a centralized custom ruleset across all tool feature sets.
  • Assign and control the ruleset version available to users from a central administrative location.
  • Scan code on a local PC to support full compliance assessments in a designer/developer unit-test environment.
  • Control and synchronize error and remediation messages presented to users for customized rules.
  • Flag false positives and ensure the errors are not repeated in subsequent test results.
  • Categorize issues by type, frequency, and severity.
  • Configure, schedule, and suspend scans; change the rate of scans; and restart in-process scans.
  • Fully customize all evaluation rule sets to address inaccurate interpretation of requirements or reduce false positives.
  • Support exclusion of specific domains, URL trees, pages, or sets of lines.
  • Emulate multiple browsers during scans.
  • Customize summary and detailed reports to monitor current 508 conformance; analyze trends by website and by organizational component; and export summary and detailed results to external reporting tools.
  • Direct users to specific code location(s) that are generating errors, and provide contextually relevant remediation guidance.
  • Integrate test tools and conformance monitoring into test automation environments (DevOps).
  • Produce accessible system and report outputs.

Support Services Requirements

  • Installation, configuration, validation, and customization of 508 test rulesets, scans, and reporting capabilities.
  • Integration of 508 test tools, reporting, and monitoring capabilities into test automation environments.
  • Online self-paced training for web content managers, developers, programmers, quality assurance testers, project and program managers, and tool administrators.
  • Operations & maintenance support, including ongoing configuration and customization.

Validate Rulesets

  • Determine whether separate rulesets exist for different types of web content (web pages, Microsoft Office documents, Adobe PDF documents, etc.).
  • Look for a setting labeled “WCAG 2.0 Level AA Success Criteria,” which should test all the Level A and AA (Revised Section 508) requirements applicable to the web content types supported by the tool.
  • Assess each ruleset for reliability, accuracy, and degree of alignment with agency requirements in your environment. Suggested steps:
    1. Create a test bed of sample content. Ensure the test bed includes as many known ways to fail each checkpoint as possible, then uniquely identify each failure point so you can quantify alignment with agency guidelines as testing progresses.
    2. Configure the scan to evaluate the test bed.
    3. Run the rule set.
    4. Compare the results against manual test results to validate the script’s accuracy.  Ensure this comparison is performed by senior subject matter experts who are trained to perform manual testing.
    5. After constructing a viable initial ruleset framework by “passing” the internal test bed tests, test the resulting rules “in the wild” by scanning multiple sites or applications built by technical staff not involved in the internal rule testing effort. This helps identify false positives and rules whose detection needs correction.
    6. Delete inaccurate scripts, or obtain developer assistance to customize the scripts to increase reliability in your environment.
    7. Continue testing until you end up with rule sets that provide an acceptable level of accuracy in your environment.
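
A minimal sketch of the comparison in step 4, assuming each seeded failure point in the test bed carries a unique identifier that both manual testers and the scanning tool report; the identifiers shown are illustrative.

```python
# Sketch: compare a tool's findings against the known failure points seeded in
# the test bed, to quantify coverage gaps and false positives.
def compare(expected_ids, reported_ids):
    expected, reported = set(expected_ids), set(reported_ids)
    return {
        "caught": sorted(expected & reported),
        "missed": sorted(expected - reported),          # coverage gaps
        "false_positives": sorted(reported - expected), # candidates for rule tuning
    }

if __name__ == "__main__":
    manual_baseline = {"IMG-ALT-01", "TBL-HDR-02", "FRM-LBL-03"}
    tool_findings = {"IMG-ALT-01", "FRM-LBL-03", "CLR-CON-09"}
    for bucket, ids in compare(manual_baseline, tool_findings).items():
        print(f"{bucket}: {ids}")
```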

Configure Scans

When configuring scans, account for:

  • Firewall restrictions.
  • Scan depth.
  • How the results should be aggregated.
  • Server capacity and length of time to run scans.
  • How to abort and restart scans.
  • The ability to eliminate rulesets that only generate warnings.
  • The ability to identify content subject to the safe harbor provision. Content that conformed to the Original 508 Standards and has not been altered on or after January 18, 2018 (i.e., legacy content) does not need to conform to the Revised 508 Standards. See the Legacy Content section below for tips on identifying legacy content.
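
For illustration only, the settings above might be captured as a configuration such as the following; the keys are hypothetical, since each commercial tool exposes equivalent options through its own interface or API.

```python
# Sketch: a scan configuration captured as data. Keys and values are illustrative.
from datetime import date

SCAN_CONFIG = {
    "start_urls": ["https://www.example.gov/"],
    "max_depth": 5,                          # scan depth
    "exclude_url_prefixes": ["/archive/", "/legacy/"],
    "suppress_warning_only_rules": True,     # skip rulesets that only generate warnings
    "schedule": {"frequency": "weekly", "window": "02:00-05:00"},
    # Safe harbor: skip content recorded as conforming to the Original 508
    # Standards and not altered on or after the cutoff date.
    "legacy_cutoff": date(2018, 1, 18),
    "legacy_metadata_tag": "conforms-original-508",
}
```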

Configure Reports

When configuring reports, consider:

  • The target audiences (web managers, program managers, executive managers).
  • Reporting scope (issue description, category, impact, priority, solution recommendation, and location in the code).
  • Reporting format (single scan view vs comparison against previous scans, trend highlighting, and identification of major positive and negative changes).

Manual Testing

Follow the instructions outlined in Accessibility Testing for Electronic Content, endorsed by the Federal CIO Council’s Accessibility Community of Practice.

Hybrid Testing

A hybrid testing approach is usually the best solution to handle a large volume of electronic content. Consider the following:

  • Ensure developers build accessibility into code during development.
  • Whenever possible, perform manual testing prior to publishing new content.
  • Use stand-alone automated testing tools to identify obvious errors and augment manual testing.
  • Integrate automated rulesets into developer operations to scale 508 validation efforts for applications prior to release.
    • Use automated scanning tools to scan as much electronic content as possible and periodically conduct manual testing on high priority published content.  Focus on content that is returning poor test results in scans and is frequently accessed.

Publish

  • When publishing electronic content that does not fully support the 508 Standards, provide information on how to access the content via an alternative means.
  • When posting electronic content to social media:
    • Learn how to use social media platforms to publish accessible content. For example, when posting a YouTube video, follow YouTube protocols for uploading captions.
    • Where it is not possible to publish accessible social media content, publish a duplicate but accessible version of the content on an agency website or through some other means.  Where possible, note in the social media post where users can find the accessible version.
  • When the agency provides software to view content, ensure the software is also compliant with the Section 508 Standards.

Track and Report Conformance

Measuring and reporting 508 conformance issues drives incremental improvement.

Test Reports

At their very best, automated scanning tools currently only cover 30%-35% of accessibility requirements. Until test coverage dramatically improves, results from automated scanning tools should never be represented as “508 compliance results”.  Rather, they should be used and consistently communicated to agency stakeholders as risk monitoring mechanisms, where reported test results indicate potential accessibility issues.

It is generally better to create conformance reports using manual test results, especially if they are well documented and based on normative testing using the guidance provided by the CIO Council’s Accessibility Community of Practice. Augmentation with automated testing results may be acceptable if the automated test rule sets have been validated for accuracy before running the tests.

Legacy Content

Section E202.2 of the Revised 508 Standards provides an exception for legacy content:

  • “Any component or portion of existing ICT that complies with an earlier standard issued pursuant to Section 508 of the Rehabilitation Act of 1973, as amended, and that has not been altered on or after January 18, 2018, shall not be required to be modified to conform to the Revised 508 Standards.”

This new exception presents several challenges from a monitoring perspective:

  • How can your agency know which electronic content, or what component or portion of content, conformed to the Original 508 Standards?
  • How can your agency know if any conforming “components or portions” were changed or revised on or after January 18, 2018?
  • How can your agency track and monitor which electronic content does not need to conform to the Revised 508 Standards?

Suggestions for addressing these challenges include:

  • If you already have tracking mechanisms in place that record compliance levels (and preferably test results) for individual electronic content assets before January 18, 2018, identify those that fully conform to the Original 508 Standards.
  • Where applicable, use metadata to identify legacy content as conformant under the Original 508 Standards (see the sketch after this list). Configure your automated scanning tool to bypass any legacy content that contains this tag and has a “last updated” date prior to January 18, 2018. If your scanning tool does not have this capability and you cannot find another way to exclude the legacy content from the scan, let the tool scan all the content with the understanding that the scanning tool is only designed to provide an indication of accessibility issues; it is not a proxy for determining legal compliance.
  • If the legacy content is not posted to the web (or cannot be tested by your automated scanning tool), integrate information about conforming legacy content into compliance reports. If the date last updated for the legacy content is on or after January 18, 2018, test the content against the Revised 508 Standards. Determine if there are components or portions of the legacy content that do not need to conform to the Revised 508 Standards. Make a case-by-case determination of remediation needed to mitigate compliance risk for “legacy” assets.
  • If you have previously scanned legacy content and did not identify any issues, do not automatically assume this exception applies. Scanning tools can only provide limited test coverage against the 508 Standards (Original or Revised).  Only comprehensive manual testing can validate full conformance with either standard.
  • If you have not manually tested your legacy content (or have no testing records), there is no way to tell whether it can fall under this exception. Consider testing this legacy content using the Revised 508 Standards to establish a baseline for potential accessibility risk.
  • When compiling your overall reports for key stakeholders, include results from the “previously conforming legacy content” with the results from the automated tool, and any other manual test results based on the Revised 508 Standards.
  • To distinguish conforming legacy content from non-conforming legacy content, keep automated test results separate from manual test results.
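
The sketch below illustrates the metadata-based bypass described above, assuming your content inventory records a conformance tag and a last-updated date; the field names and records are illustrative.

```python
# Sketch: decide which inventory records the scanner should bypass under the
# legacy-content ("safe harbor") exception.
from datetime import date

CUTOFF = date(2018, 1, 18)

def is_safe_harbor(record):
    return (
        record.get("conforms_original_508") is True
        and record["last_updated"] < CUTOFF
    )

inventory = [
    {"id": "doc-001", "conforms_original_508": True,  "last_updated": date(2016, 5, 2)},
    {"id": "doc-002", "conforms_original_508": True,  "last_updated": date(2019, 3, 9)},
    {"id": "doc-003", "conforms_original_508": False, "last_updated": date(2015, 7, 1)},
]

to_scan = [r for r in inventory if not is_safe_harbor(r)]
print([r["id"] for r in to_scan])  # doc-002 and doc-003 still need to be scanned
```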

Public Facing Content

  • Identifying what public facing electronic content you need to track is generally easier when the content is posted to the web. Since all public facing content is covered under Section E205 - Electronic Content, there is no need to distinguish between content that is, and is not, covered by the standards when configuring scanning tools.
  • An agency that has a robust policy and dedicated resources can require authors or content managers to manually record testing results in a central repository, from which results can be incorporated into overall agency tracking reports and dashboards.
  • For content posted on the web, develop conformance reports that identify the current level of conformance of each type of electronic content, sorted by responsible component, with drill-down options to individual websites or content assets when feasible (see the sketch after this list). This will require combining automated and manual test results; you may need to experiment with both types of testing to determine the right mix.
  • For public facing content that is not posted to the web (e.g., information screens on kiosks in waiting areas), identify who is responsible for creating and publishing this content, as well as the content lifecycle for publication and ongoing monitoring. Establish responsibilities for 508 testing and identify a method to track, test, and monitor results.
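
A minimal sketch of the roll-up described above; the record shape, component names, and issue counts are illustrative.

```python
# Sketch: roll manual and automated results up by responsible component and
# content type for a summary report.
from collections import defaultdict

results = [
    {"component": "OCIO", "content_type": "HTML", "source": "automated", "issues": 14},
    {"component": "OCIO", "content_type": "PDF",  "source": "manual",    "issues": 3},
    {"component": "OPA",  "content_type": "HTML", "source": "manual",    "issues": 0},
]

rollup = defaultdict(int)
for r in results:
    rollup[(r["component"], r["content_type"])] += r["issues"]

for (component, content_type), issues in sorted(rollup.items()):
    print(f"{component:6} {content_type:5} open issues: {issues}")
```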

Non-Public Facing Content

  • While many of the above challenges and approaches apply to tracking and reporting internal content (particularly when it is deployed on the web), there is the additional challenge of tracking whether non-public facing electronic content falls under at least one of the scoping categories in Section E205.
  • Agencies with much higher levels of risk (those that serve a large segment of the public, or with a large number of employees with disabilities) may need a tracking system (e.g., a way to record due diligence and compliance against specific documents, notifications, PDF fillable forms, multimedia content publications, etc.). For instance, some agencies provide fillable PDF forms to employees and members of the public, and the authors of these forms should record information such as manual testing results, and when and by whom the forms were tested. An agency with a robust policy and dedicated resources can require authors or gatekeepers to manually record testing results in a central repository. A 508 program can establish a custom web application or a simple tracking tool using a content management system to keep track of testing results.
  • While the process sounds labor intensive (and at first it is), customized tracking tools can streamline it by including an automated script that scans the network to determine whether content still exists, or whether it has been updated since the last testing record.
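
A minimal sketch of such a script, assuming the testing inventory is kept as a CSV with a file path and last-tested date per record; the layout and share path are illustrative.

```python
# Sketch: re-check a testing inventory against files on a shared drive, flagging
# records whose content is gone or has changed since it was last tested.
import csv
from datetime import datetime, timezone
from pathlib import Path

def review(inventory_csv):
    with open(inventory_csv, newline="") as f:
        for row in csv.DictReader(f):  # columns: path, last_tested (ISO date)
            path = Path(row["path"])
            last_tested = datetime.fromisoformat(row["last_tested"]).replace(tzinfo=timezone.utc)
            if not path.exists():
                print(f"REMOVED since last test: {path}")
            elif datetime.fromtimestamp(path.stat().st_mtime, tz=timezone.utc) > last_tested:
                print(f"UPDATED since {row['last_tested']}: {path} - retest required")

if __name__ == "__main__":
    review(r"\\fileshare\508\testing_inventory.csv")
```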

This guidance was developed by the U.S. Federal Government Revised 508 Standards Transition Workgroup. Members include the U.S. Federal CIO Council Accessibility Community of Practice, the U.S. Access Board, and the General Services Administration.

Page Reviewed/Updated: January 2018