Automated vs Manual Accessibility Testing

Automated testing tools can be a routine part of the accessibility testing process, but they cannot cover every part of it. Determining whether your website or application complies with the Web Content Accessibility Guidelines (WCAG) or other accessibility regulations can be a frustrating experience.

Automated tools are most organizations’ first choice for scanning a site for issues, which makes sense: most sites are simply too large and complex to comb through by hand, and many teams doubt they could catch every potential accessibility issue if they picked through their site manually. At TestPros, we use automated testing tools as a routine part of our testing process, but we cannot rely on them for every step. Many accessibility issues cannot be evaluated by a machine or computer program, so automated tools cannot test everything, and their reports often include false positives and false negatives.

Here are some of the primary areas in which automated scanning often falls short:

  • Screen reader compatibility 
  • Color adjustments 
  • Page titles 
  • Proper coding 
  • Keyboard-only navigation, including whether your site supports: 
      ◦ Moving between sections of a web page 
      ◦ Accessing all menus 
      ◦ Top-of-page links that allow users to skip directly to each page’s vital content 
      ◦ Links and form fields that can be highlighted using keyboard commands 

Ultimately, the best option for testing accessibility is to combine both automated and manual testing. The lists below show some of the testing you can perform with each option. 

Manual Testing 

  • Distinguishable links 
  • Accurate alternative text 
  • Actual color contrast 
  • Use of color 
  • Keyboard accessibility
  • Accurate form labels 
  • Form error messages 
  • Consistent navigation 
  • Text resize 
  • Timing
  • Use of sensory characteristics 

Automated Testing 

  • Empty links 
  • Presence of alternative text 
  • Basic color contrast
  • Presence of page title 
  • Presence of document language 
  • Presence of form labels 

Optimizing Automated Testing 

As noted above, one of the issues with automated testing is that reports can contain false positives and false negatives. False negatives are the more concerning of the two, as they often mean barriers for individuals with disabilities go overlooked.

Here are some of the most common false negatives that occur with automated testing: 

Structure and Presentation

A major shortcoming of programs and machines is that they can’t understand the context or purpose of content; this means that they also can’t determine the compliance of a website’s structural layout or organization of content.  

[Screenshot: a web page whose visual layout differs from its source-code order]

The section “Your Shout” appears visually below, and as part of, the “Man Gets Nine Months in Violin Case” article, but in the HTML source it only appears after the “Tough Wahoonie” section. Screen readers will therefore announce this section out of the intended order, and automated tools cannot detect this.

Alternative Text

While automated tests can find non-text alternatives, they cannot determine if the descriptions are accurate and/or descriptive. An example can be seen below:

[Image: a website whose images have alt text present, demonstrating that presence alone does not make them accessible]

These three images pass automated testing because they have alt text, but the alt text for all three is simply “image,” which is not compliant with WCAG 2.0.
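A presence check passes alt="image" just as readily as a useful description. A tool can go slightly further with a heuristic word list for obviously generic values, but judging accuracy remains a human task. A minimal sketch in Python (the GENERIC_ALT list and the sample markup are illustrative assumptions):

```python
from html.parser import HTMLParser

# Illustrative heuristic: alt values a human reviewer would reject outright.
GENERIC_ALT = {"image", "photo", "picture", "graphic", ""}

class AltTextChecker(HTMLParser):
    def __init__(self):
        super().__init__()
        self.automated_pass = []   # alt attribute is merely present
        self.flagged = []          # generic/empty alt a human would reject

    def handle_starttag(self, tag, attrs):
        if tag != "img":
            return
        alt = dict(attrs).get("alt")
        if alt is not None:
            self.automated_pass.append(alt)          # passes a presence check
            if alt.strip().lower() in GENERIC_ALT:
                self.flagged.append(alt)             # but fails human review

checker = AltTextChecker()
checker.feed('<img src="a.png" alt="image">'
             '<img src="b.png" alt="Bar chart of 2023 sales">')
```

Even the heuristic only catches the blatant cases; alt text that is specific but wrong (say, describing the wrong chart) sails through.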

Color

While automated scans can identify many color-contrast issues, most cannot detect contrast problems involving images, such as text rendered inside a graphic or text placed over a background image. Automated tests cannot “see” the colors in a graphic, so if a graphic contains text, the scan will not evaluate its contrast. An automated report may tell you there is no contrast error between the page’s text and background colors when, in reality, a background image or CSS styling creates a contrast problem the tool cannot measure.

[Image: a bar chart whose colors fail accessibility requirements due to poor contrast]

Images like the one above cannot be tested automatically for color contrast, and since the bar chart’s colors carry the information being presented, poor contrast makes that information hard to understand.
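For colors that appear as values in markup or CSS, a tool computes the contrast ratio from relative luminance as defined by WCAG 2.0. The sketch below implements that published formula in Python, and it also illustrates the limitation: it can only run on color values a tool can read, never on colors baked into an image.

```python
def relative_luminance(rgb):
    """WCAG 2.0 relative luminance for an sRGB color given as 0-255 ints."""
    def channel(c):
        s = c / 255
        # Linearize the sRGB channel value per the WCAG definition.
        return s / 12.92 if s <= 0.03928 else ((s + 0.055) / 1.055) ** 2.4
    r, g, b = (channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """Contrast ratio (L1 + 0.05) / (L2 + 0.05), lighter color on top."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)),
                    reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

ratio = contrast_ratio((0, 0, 0), (255, 255, 255))  # black text on white
```

WCAG 2.0 Level AA requires a ratio of at least 4.5:1 for normal text (3:1 for large text); pure black on white yields the maximum ratio of 21:1.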

Non-distinguishable links

Automated tools will flag an error when the same link text appears in multiple places on a page but the URLs are not identical. You will need to click each such link manually to confirm that links with the same text actually lead to the same place.

[Image: a page full of “Click here” links, which disorient users of assistive technology such as screen readers]

Automated tools cannot tell whether link text is distinguishable. Numerous “Click here” links can disorient users of assistive technology, and this is one of the primary areas where automated testing falls short.
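A script can at least group links by their visible text and surface cases where identical text points to different URLs; deciding whether the text itself is meaningful remains a manual job. A sketch using Python's html.parser (the markup is hypothetical):

```python
from html.parser import HTMLParser
from collections import defaultdict

class LinkAudit(HTMLParser):
    """Group links by visible text so that duplicates pointing to
    different URLs can be handed to a human for review."""

    def __init__(self):
        super().__init__()
        self.links = defaultdict(set)   # link text -> set of hrefs
        self._href = None
        self._text = []

    def handle_starttag(self, tag, attrs):
        if tag == "a":
            self._href = dict(attrs).get("href")
            self._text = []

    def handle_data(self, data):
        if self._href is not None:
            self._text.append(data)

    def handle_endtag(self, tag):
        if tag == "a" and self._href is not None:
            text = "".join(self._text).strip().lower()
            self.links[text].add(self._href)
            self._href = None

def ambiguous_links(html):
    audit = LinkAudit()
    audit.feed(html)
    # Same text, different destinations: a machine can flag these,
    # but only a person can judge whether the text is meaningful.
    return {t: urls for t, urls in audit.links.items() if len(urls) > 1}

flags = ambiguous_links(
    '<a href="/pricing">Click here</a> <a href="/docs">Click here</a>'
)
```

Note that a page of "Click here" links that all happen to share one URL would pass this check while still being disorienting, which is exactly the gap manual review fills.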

Manual Accessibility Testing 

Human beings have varied experiences, and any given individual will experience your website differently than a computer will. Because machines and programs are not able to understand context the way we do, automated tools can’t account for individual variation. This is why manual accessibility testing is essential: the best way to ensure that your website – as well as any accessibility solution you’ve implemented – is actually usable for visitors with disabilities is to have real people evaluate your website for barriers to accessibility that an automated review could miss.  

Here are the two areas in which manual accessibility testing is absolutely necessary: 

Screen Reader Compatibility 

Screen readers are among the most commonly used assistive tools for users with low vision. Models differ, but in essence a screen reader scans the content of a web page and uses the source code to determine what the user needs to know and when. To do this, most screen readers start from the page’s title and then move through the source code, announcing each element of text content in the order it appears in the markup.

The trouble with automated testing is that while an automated scan can identify missing titles on pages, what it can’t do is determine based on context whether a page’s title is appropriate or useful. If you’re combing through the site yourself – or if a professional is doing so on your behalf – you can ensure that the titles are compatible with assistive devices while also maintaining the clarity and readability of your site for those who don’t require screen-readers.  

In other words, manual testing can determine how readable your content is in assistive situations that automated programs can’t replicate; if you only use automated accessibility tools to test screen reader compatibility, you are going to miss a lot. 

To get a better idea of why screen reader compatibility is so important, you can view this video of a Screen Reader Demo created by UCSF.

Keyboard Only Navigation

A key part of making a website accessible is making it possible to operate all of its functions without a mouse. A compliant website has to be compatible with various assistive technologies, but don’t forget that it also has to be navigable with keyboard-only commands. 

Your site is not compliant unless any user can use a keyboard in order to:  

  • Access all menus on the site 
  • Move from section to section on a page 
  • Use top-of-page links to skip to content 
  • Highlight form fields and links 
  • Access all interactive elements (drop-down menus, buttons, dialog boxes, forms, etc.) 
  • Use a skip navigation link: many pages expect sighted keyboard users to move through elements in a tab order coded to reflect the layout, which becomes cumbersome when there are many tab stops; a skip navigation link lets keyboard-only users jump straight to the main content 

In any case, it is imperative to make sure that your website and all of its features can be accessed and engaged with from the keyboard alone. 
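Whether a page offers a skip link is one keyboard-navigation property a script can partially check, for example by testing whether the first link on the page is an in-page anchor. Whether the link actually works for a keyboard user still has to be verified by hand. A rough sketch (the markup is hypothetical):

```python
from html.parser import HTMLParser

class SkipLinkFinder(HTMLParser):
    """Records the href of the first link on the page; a skip
    navigation link should appear first and target an in-page anchor."""

    def __init__(self):
        super().__init__()
        self.first_href = None

    def handle_starttag(self, tag, attrs):
        if tag == "a" and self.first_href is None:
            self.first_href = dict(attrs).get("href", "")

def has_skip_link(html):
    finder = SkipLinkFinder()
    finder.feed(html)
    # An in-page anchor (href="#...") as the first link suggests a skip link.
    return bool(finder.first_href) and finder.first_href.startswith("#")

ok = has_skip_link(
    '<body><a href="#main">Skip to content</a><nav>...</nav></body>'
)
```

This heuristic only confirms the link exists in the right place; tabbing through the page yourself is the only way to confirm focus actually lands on the main content.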

The Bottom Line 

Automated accessibility testing is crucial, and many such tools do an excellent job of catching programmatic issues. After the long and grueling process of testing your website, it is tempting to declare the job done once a few of these tools have run clean.

According to Section508.gov, “Automated testing and evaluation tools are not sophisticated enough to tell you, on their own, if your site is accessible, or even compliant. You must always conduct manual testing to ensure full compliance with the Revised 508 Standards.” 

You should always follow up automated accessibility testing with manual testing. Manual accessibility testing is an essential part of ensuring your website is accessible to all individuals, because no program can (yet) replace or replicate human understanding and judgment.

What you need to remember is that as effective and useful as automated tools are, they cannot guarantee a compliant website, and that means they are not enough. If you want to ensure that your website meets compliance standards and allows access to all potential visitors, we recommend you use an independent company such as TestPros to conduct manual and automated accessibility testing of your website and/or application.