CHECK

Checklists and auditing tools to help improve individual book files.

Our Auditing Tool - OARC


The Open Accessibility Review Checker

Infographic showing the structure of OARC in sections A-F

Our custom auditing tool has been developed by Open Book Futures to help small presses audit their static eBook files for accessibility. There are 40 points on the checklist, split into 5 sections: Text Features, Non-Text Features, Semantic Tagging, Reading Order and Navigation, and Metadata. There is an optional 6th section, with 2 further points, for books that include pre-recorded media. To determine these checklist points, we selected just the elements of WCAG that apply to static eBook files, removing those that apply to dynamic and interactive web pages, and adding some additional EPUB- or PDF-specific points. The checklist is therefore not suitable for auditing websites.

You can also access OARC formatted as a spreadsheet: OARC [Google Sheets]

The spreadsheet includes some additional context, including whether each feature can be audited by a machine or needs to be checked manually, and our approximation of how complicated each task is. Below, therefore, we present the checklist 3 times: in section order, then split by human or machine auditing, and then by complexity.

View this information as a video.

OARC

A Text Features

  1. Text is actual text; not images of text.
  2. Colours of text have a contrast ratio of at least 4.5:1
  3. Headings are descriptive of the content they contain
  4. Text is reflowable without causing horizontal scrolling or other problems
  5. Text can be resized without causing horizontal scrolling or other problems
  6. Line height and spacing, letter spacing and word spacing can all be changed without problems 
  7. Orientation can be changed without problems 
  8. Fonts are coded correctly
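
Point A2 is one of the few checks in this section that a machine can perform exactly. As an illustration, the contrast ratio between two colours can be computed from their relative luminance using WCAG's published formula; the sketch below is ours (the function names are our own, only the formula comes from WCAG):

```python
def srgb_channel(c8):
    """Convert an 8-bit sRGB channel to its linearised value (WCAG formula)."""
    c = c8 / 255.0
    return c / 12.92 if c <= 0.03928 else ((c + 0.055) / 1.055) ** 2.4

def relative_luminance(rgb):
    """WCAG relative luminance of an (R, G, B) tuple of 0-255 values."""
    r, g, b = (srgb_channel(c) for c in rgb)
    return 0.2126 * r + 0.7152 * g + 0.0722 * b

def contrast_ratio(fg, bg):
    """WCAG contrast ratio: (L1 + 0.05) / (L2 + 0.05), lighter colour first."""
    l1, l2 = sorted((relative_luminance(fg), relative_luminance(bg)), reverse=True)
    return (l1 + 0.05) / (l2 + 0.05)

# Black text on a white page is the maximum possible ratio, 21:1,
# comfortably passing the 4.5:1 threshold in point A2.
print(round(contrast_ratio((0, 0, 0), (255, 255, 255)), 1))  # 21.0
```

The same function serves point B2, where the threshold for non-text features is 3:1 rather than 4.5:1.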

B Non-Text Features

  1. Non-text features (figures, graphics, captions, links, mathematical expressions) have meaningful ALT text
  2. Colours of non-text features (figures, graphics) have a contrast ratio of at least 3:1
  3. Non-text features (figures, graphics, captions, links, mathematical expressions) have multiple ways of conveying meaning
  4. Links are accessible and meaningful
  5. Other clickable elements are at least 24 x 24 pixels
  6. Other clickable elements have visible text that matches the text in the underlying code
  7. A list's numbers, letters or bullets are displayed and tagged correctly
  8. A table's headers, rows and columns are tagged correctly

C Semantic Tagging

  1. Non-decorative/real and decorative/artefact content is all tagged correctly
  2. Non-text features (figures, graphics, captions, links, mathematical expressions) are tagged and grouped correctly
  3. Lists, tables and TOCs are tagged correctly
  4. Headers, footers, notes and references are tagged correctly
  5. Headings are tagged as headings
  6. Headings have just 1 <H1>, at the beginning
  7. Headings <H2>-<H6> don't skip levels
  8. No headings <H7> or higher
  9. Other non-PDF structure elements tagged correctly (EPUBs)
  10. PDF tags support the separate reading order (PDFs)
  11. PDF role mapping is correct (PDFs)
  12. Other structure elements in PDF tagged correctly (PDFs)

D Reading Order and Navigation

  1. Multiple ways to navigate
  2. Static page breaks are present (EPUBs)
  3. Static page breaks are navigable (EPUBs)
  4. Navigation consistent throughout
  5. Reading/focus order retains meaning when using tabs or a screen reader
  6. Repeating blocks of content can be skipped

E Metadata and Conformance reporting

  1. File has metadata
  2. File metadata has a title that is used instead of file name
  3. File metadata has a valid language
  4. Where the language changes, individual parts have a valid language
  5. Source of static page breaks/pagination is identifiable (EPUBs)
  6. File metadata includes full accessibility conformance information
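
As an illustration of points E2 and E3, an EPUB's OPF package document carries the title and language in its Dublin Core metadata, so both checks reduce to looking for non-empty elements. The sketch below runs against a hypothetical OPF snippet of our own invention:

```python
import xml.etree.ElementTree as ET

# Hypothetical minimal OPF package metadata, for illustration only
OPF = """<package xmlns="http://www.idpf.org/2007/opf"
         xmlns:dc="http://purl.org/dc/elements/1.1/" version="3.0">
  <metadata>
    <dc:title>An Open Access Monograph</dc:title>
    <dc:language>en</dc:language>
  </metadata>
</package>"""

NS = {"dc": "http://purl.org/dc/elements/1.1/"}

def check_title_and_language(opf_xml):
    """Return whether the package metadata has a non-empty title (E2)
    and a non-empty language declaration (E3)."""
    root = ET.fromstring(opf_xml)
    title = root.findtext(".//dc:title", namespaces=NS)
    lang = root.findtext(".//dc:language", namespaces=NS)
    return {
        "has_title": bool(title and title.strip()),
        "has_language": bool(lang and lang.strip()),
    }

print(check_title_and_language(OPF))  # {'has_title': True, 'has_language': True}
```

A real checker would go further for E3, validating the language value against BCP 47 tags rather than only checking presence.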

F Pre-recorded Media (optional)

  1. All audio-only content has an alternative suitable for individuals with a hearing impairment
  2. All video-only content has an alternative suitable for individuals with a visual impairment

OARC by auditing type

Machine/Automated Auditing

A2. Colours of text have a contrast ratio of at least 4.5:1
A8. Fonts are coded correctly

B2. Colours of non-text features (figures, graphics) have a contrast ratio of at least 3:1
B5. Other clickable elements are at least 24 x 24 pixels
B6. Other clickable elements have visible text that matches the text in the underlying code

C6. Headings have just 1 <H1>, at the beginning
C7. Headings <H2>-<H6> don't skip levels
C8. No headings <H7> or higher
C10. PDF tags support the separate reading order
C11. PDF role mapping is correct
C12. Other structure elements in PDF tagged correctly

D1. Multiple ways to navigate
D2. Static page breaks are present
D3. Static page breaks are navigable

E1. File has metadata
E2. File metadata has a title that is used instead of file name
E3. File metadata has a valid language
E4. Where the language changes, individual parts have a valid language
E6. File metadata includes full accessibility conformance information

Human/Manual Auditing

A1. Text is actual text; not images of text
A3. Headings are descriptive of the content they contain
A4. Text is reflowable without causing horizontal scrolling or other problems
A5. Text can be resized without causing horizontal scrolling or other problems
A6. Line height and spacing, letter spacing and word spacing can all be changed without problems
A7. Orientation can be changed without problems

B1. Non-text features (figures, graphics, captions, links, mathematical expressions) have meaningful ALT text
B3. Non-text features (figures, graphics, captions, links, mathematical expressions) have multiple ways of conveying meaning
B4. Links are accessible and meaningful
B7. A list's numbers, letters or bullets are displayed and tagged correctly
B8. A table's headers, rows and columns are tagged correctly

C1. Non-decorative/real and decorative/artefact content is all tagged correctly
C2. Non-text features (figures, graphics, captions, links, mathematical expressions) are tagged and grouped correctly
C3. Lists, tables and TOCs are tagged correctly
C4. Headers, footers, notes and references are tagged correctly
C5. Headings are tagged as headings
C9. Other non-PDF structure elements tagged correctly

D4. Navigation consistent throughout
D5. Reading/focus order retains meaning when using tabs or a screen reader
D6. Repeating blocks of content can be skipped

E5. Source of static page breaks/pagination is identifiable

F1. All audio-only content has an alternative suitable for individuals with a hearing impairment
F2. All video-only content has an alternative suitable for individuals with a visual impairment

OARC by complexity

Easy wins

A1. Text is actual text; not images of text
A2. Colours of text have a contrast ratio of at least 4.5:1
A4. Text is reflowable without problems
A5. Text can be resized without problems
A6. Line height and spacing, letter spacing and word spacing can all be changed without problems
A7. Orientation can be changed without problems

B5. Other clickable elements are at least 24 x 24 pixels
B6. Other clickable elements have visible text that matches the text in the underlying code
B7. A list's numbers, letters or bullets are displayed and tagged correctly

C1. Non-decorative/real and decorative/artefact content is all tagged correctly
C2. Non-text features (figures, graphics, captions, links, mathematical expressions) are tagged and grouped correctly
C3. Lists, tables and TOCs are tagged correctly
C4. Headers, footers, notes and references are tagged correctly
C5. Headings are tagged as headings
C6. Headings have just 1 <H1>, at the beginning
C7. Headings <H2>-<H6> don't skip levels
C8. No headings <H7> or higher
C9. Other non-PDF structure elements tagged correctly

D1. Multiple ways to navigate
D4. Navigation consistent throughout
D6. Repeating blocks of content can be skipped

Medium

A3. Headings are descriptive of the content they contain
A8. Fonts are coded correctly

B2. Colours of non-text features (figures, graphics) have a contrast ratio of at least 3:1
B4. Links are accessible and meaningful
B8. A table's headers, rows and columns are tagged correctly

D2. Static page breaks are present
D3. Static page breaks are navigable
D5. Reading/focus order retains meaning when using tabs or a screen reader

E1. File has metadata
E2. File metadata has a title that is used instead of file name
E3. File metadata has a valid language
E5. Source of static page breaks/pagination is identifiable
E6. File metadata includes full accessibility conformance information

Complicated

B1. Non-text features (figures, graphics, captions, links, mathematical expressions) have meaningful ALT text
B3. Non-text features (figures, graphics, captions, links, mathematical expressions) have multiple ways of conveying meaning
C10. PDF tags support the separate reading order
C11. PDF role mapping is correct
C12. Other structure elements in PDF tagged correctly

Variable

E4. Where the language changes, individual parts have a valid language

F1. All audio-only content has an alternative suitable for individuals with a hearing impairment
F2. All video-only content has an alternative suitable for individuals with a visual impairment

Auditing Advice

This page explains how to audit the current accessibility of all aspects of the organisation, including frontlist and backlist book files, website functionality and the backend submission platform. You could complete this yourself through self-auditing, or employ an external auditor. You could also assess current organisational knowledge of, attitudes towards, and motivations for engaging with accessibility work. For a full audit, especially if you have ambitions to go beyond legal compliance, consider all four of the steps below; completing the first two would be sufficient as a minimum.

  1. Automated Testing
  2. Manual Checking
  3. Assistive Technology Tests
  4. End user testing from print disabled people

Automated Testing

There are many proprietary and open source tools available to audit accessibility using automated testing. Below we have collated our top picks among open source tools; however, many publishers may have the budget to purchase a tool, so we have also included links to other curated lists of accessibility tools from recommended sources. It is important to note that automated testing is only part of the process and can only take you so far, as many accessibility features require human assessment. For example, automated tools can check for the presence of ALT text, but can only guess at its quality, for instance by its length or whether it merely matches the file name; full quality checking will always need a human.
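
To show how far that guessing can go, a toy checker might flag only the obvious ALT text failure modes mentioned above. The function and its heuristics are ours, for illustration only:

```python
import os

def flag_suspect_alt(images):
    """images: list of (src, alt) pairs extracted from a book file.
    Flags missing or obviously weak ALT text, the limit of what an
    automated checker can do; judging real quality still needs a human."""
    flags = []
    for src, alt in images:
        if alt is None:
            flags.append((src, "no ALT text at all"))
        elif not alt.strip():
            flags.append((src, "empty ALT text (fine only if decorative)"))
        elif alt.strip().lower() == os.path.basename(src).lower():
            flags.append((src, "ALT text just repeats the file name"))
        elif len(alt) < 10:
            flags.append((src, "very short ALT text"))
    return flags

# Hypothetical sample: one good description, one file-name echo, one omission
sample = [
    ("fig1.png", "Line chart of monthly open access downloads, 2019-2024"),
    ("fig2.png", "fig2.png"),
    ("logo.png", None),
]
for src, reason in flag_suspect_alt(sample):
    print(src, "->", reason)
```

Note what the checker cannot catch: ALT text that is present, long enough, and distinct from the file name, yet still describes the image inaccurately.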

Top Picks:

EPUBs

Ace by DAISY

SMART by DAISY

PDFs

PAC (PDF Accessibility Checker)

HTML and Web Pages

WAVE browser extensions

Accessibility Checker

More tools:

W3C Web Accessibility Evaluation Tools List

DWP Accessibility Tools and Resources

Accessibility Resources

A11y Project Resources

Manual checking

EPUBs and PDFs

We recommend our own auditing tool, OARC, which includes just the parts of WCAG that are relevant to static files, plus additional checklist items for the two most common file formats for open eBooks.

HTML and Web Pages

For HTML books and web pages, you would need to consider all of WCAG AA, rather than just the selected checklist above, which covers only those aspects of the standards that apply to eBook files. Below is a list of widely used, readily available full WCAG-based checklists:

WebAIM's WCAG 2 Checklist

Checklist of Checkpoints for Web Content Accessibility Guidelines 1.0

Deque Accessibility Developer's Guide

Other checklists and related tools:

W3C Easy Checks

UK Government: Basic accessibility check

Web Content Accessibility Guidelines - Quick Reference

Web Content Accessibility Guidelines in Plain English

BCCampus - Accessibility Toolkit - 2nd Edition

Open University Library - WCAG 2.2 Level A and AA Basic Primo VE Checklist

Assistive Technology Tests

We recommend running at least a sample of eBooks through assistive technology to double check that everything works, ideally covering a range of the most commonly used tools that fulfil a range of functions.

There are different types of assistive technology that are commonly in use and you should check through at least one example of each type.

Contrast, Colour and Font Changers

Try different settings using:

Screen Readers

NVDA is a commonly used open source desktop screen reader that you can download and test with. It is also recommended to check using mobile screen readers such as VoiceOver on iOS or TalkBack on Android. Complete the following tests using these technologies:

Screen Magnifiers

Use desktop features such as Windows Magnifier or mobile features such as Apple Zoom to check this. Complete the following checks using these features:

Speech Recognition

Dragon is a commonly used proprietary desktop speech recognition application that you can test with. It is also recommended to check using mobile speech recognition on iOS or Android. Complete the following tests using these technologies:

Make sure you speak clearly, but naturally. You should also use a high quality headset rather than your machine's built-in microphone, and make sure you are at a consistent distance from the microphone.

More information:

WebAIM articles

using NVDA to evaluate web accessibility
using VoiceOver to evaluate web accessibility

UK Government advice

GOV.uk Testing with assistive technologies

End user testing from print disabled people

While not common for small presses, and likely beyond available capacity, best practice would be to approach end users with disabilities to test a sample of book files, web pages and submission systems. Below is some advice on finding user testing opportunities of this kind, should presses decide to go down this route.

The best feedback will always come from end users with disabilities, and from older users, as it can uncover accessibility barriers that are commonly experienced by your readership, yet are not captured within legal minimum accessibility requirements. 

In most cases, including users in evaluation involves:

Sources:

W3C Involving Users in Evaluating Web Accessibility

W3C Involving Users in Web Projects for Better, Easier Accessibility

More information:

AbilityNet - A Step-by-Step Guide to User Testing

AbilityNet - Product and Services - User Accessibility Testing and Research

The GOV.uk website includes a set of hypothetical user profiles to give you working examples of the range of users and their needs. These can be used to develop a strong idea of accessibility use cases and may help make content design decisions.

Understanding disabilities and impairments: user profiles

Other Auditing Tools and Checklists

WebAIM's WCAG 2 Checklist

Checklist of Checkpoints for Web Content Accessibility Guidelines 1.0

Deque Accessibility Developer's Guide

W3C Easy Checks

UK Government: Basic accessibility check

Web Content Accessibility Guidelines - Quick Reference

Web Content Accessibility Guidelines in Plain English

BCCampus - Accessibility Toolkit - 2nd Edition

Open University Library - WCAG 2.2 Level A and AA Basic Primo VE Checklist

The Matterhorn Protocol PDF Association 

AccessiblePublishing.ca Accessibility Features Checklist