Includia Accessibility Checker Coverage
Get clear accessibility reports within seconds of scanning your website.
WCAG Success Criteria Coverage
Gain insight into the effectiveness of Includia Accessibility Checker with a detailed breakdown of its WCAG success criteria coverage. Our coverage percentages are based on Accessibility Conformance Testing (ACT) rules, providing transparency into what can be automated and what requires manual verification for full compliance. An illustrative sketch of the kind of check that can be fully automated appears below the table.
Success Criterion | Coverage | Notes |
---|---|---|
1.1.1 Non-text Content (A) | 85.0% | IAC automates the detection of non-text content issues like missing alt attributes on images, but it cannot assess the quality or appropriateness of the alt text itself, which requires manual review. |
1.2.1 Audio-only and Video-only (Prerecorded) (A) | 70.0% | IAC can detect some issues with prerecorded audio and video, such as the presence of a video player and some attributes related to accessibility. However, it cannot verify the accuracy or completeness of audio descriptions, transcripts, or captions, which requires human review. |
1.2.2 Captions (Prerecorded) (A) | 70.0% | IAC can automatically detect the presence of captions and some common issues, but it cannot verify the accuracy or synchronization of the captions with the prerecorded media, which requires manual review. |
1.2.3 Audio Description or Media Alternative (Prerecorded) (A) | 70.0% | IAC can detect if an audio description or media alternative is present for a prerecorded video, but it cannot verify the quality or accuracy of the description. Manual testing is required to confirm that the description is an accurate and complete representation of the video content. |
1.3.1 Info and Relationships (A) | 80.0% | IAC can automatically detect many common issues related to ARIA usage, heading structure, and list markup, but it cannot verify all relationships and information communicated visually, which requires manual inspection. |
1.3.2 Meaningful Sequence (A) | 70.0% | IAC can detect some issues with meaningful sequence, such as tab order and certain structural issues. However, it cannot fully determine the logical reading order of all content, as this often requires human judgment. |
1.3.3 Sensory Characteristics (A) | 60.0% | IAC can detect some instances where sensory characteristics are the only way to understand information, but it cannot guarantee full coverage. It can't, for example, determine if instructions like "Press the red button" are also accompanied by a textual label or other non-sensory cues. |
1.4.1 Use of Color (A) | 60.0% | IAC can automatically detect some issues related to color contrast, but it cannot evaluate all aspects of WCAG 1.4.1. For example, it cannot determine if color is the sole means of conveying information, which requires manual inspection. |
1.4.2 Audio Control (A) | 90.0% | IAC can automatically detect most issues with automatically playing audio, such as <audio> and <video> elements that use the autoplay attribute without user controls. However, it cannot verify that a mechanism to pause or stop the audio, or to control its volume independently, is available for audio started by scripts, which requires manual testing. |
2.1.1 Keyboard (A) | 40.0% | IAC can detect many, but not all, keyboard accessibility issues. It can identify elements that are not keyboard focusable, but it cannot test all possible keyboard interactions or complex custom components that require manual testing. |
2.1.2 No Keyboard Trap (A) | 35.0% | IAC can detect some instances of keyboard traps, but it cannot reliably detect all of them. Many keyboard traps are related to complex dynamic content or custom components that require manual testing to uncover. |
2.1.4 Character Key Shortcuts (A) | 0.0% | IAC does not currently have a rule for this success criterion. Verifying that single-character shortcuts can be turned off, remapped, or are only active when the relevant component has focus requires manual testing. |
2.2.1 Timing Adjustable (A) | 45.0% | IAC can detect some issues related to timing. However, it cannot test for all instances where a time limit may exist, as many of these are handled by JavaScript and are not easily detectable by an automated tool. Manual testing is required to fully verify this success criterion. |
2.2.2 Pause, Stop, Hide (A) | 35.0% | IAC can detect some instances of automatically playing content, such as <audio> and <video> elements with the autoplay attribute. However, it cannot reliably detect all instances of moving, blinking, or scrolling content that may be triggered by JavaScript. Manual testing is often required to fully verify this success criterion. |
2.3.1 Three Flashes or Below Threshold (A) | 0.0% | IAC cannot reliably detect flashing content. This is a very difficult criterion for an automated tool to test, as it requires analyzing the visual presentation and timing of content, which is a task best suited for human review. |
2.4.1 Bypass Blocks (A) | 90.0% | IAC can detect many bypass-block issues, such as a missing "skip to content" link or problems with heading structure. However, it cannot verify that the bypass mechanism is effective or that it skips all repeated content. Manual testing is needed to confirm the bypass block works as intended. |
2.4.2 Page Titled (A) | 100.0% | IAC can automatically and reliably detect whether a page has a <title> element with a non-empty value. This is a straightforward check that doesn't require any manual intervention. |
2.4.3 Focus Order (A) | 75.0% | IAC can detect some focus order issues, such as elements with a tabindex greater than zero, which can disrupt the logical flow. However, it cannot verify the entire sequential navigation of a complex page to ensure it's logical and preserves meaning. This requires a human to manually test the tab order. |
2.4.4 Link Purpose (In Context) (A) | 70.0% | IAC can detect some instances where a link's purpose may not be clear, such as generic text like "click here" or "read more" when there are multiple links with the same text on a page. However, it cannot reliably determine the semantic meaning of a link's purpose in all contexts. This is a complex task that requires a human to evaluate the surrounding content and understand the link's destination. |
2.5.1 Pointer Gestures (A) | 0.0% | IAC cannot detect this criterion. It requires human testing to verify that all functionality that can be operated with complex gestures can also be operated with a single pointer without a path-based gesture. |
2.5.2 Pointer Cancellation (A) | 0.0% | IAC cannot test for this criterion. It requires manual interaction and observation to ensure that a function can be completed on the "up" event and that the user can abort the action. |
2.5.3 Label in Name (A) | 85.0% | IAC can detect many instances where the visible label does not match or is not included in the accessible name of a control. It can identify common issues such as a label and button text not matching. However, there are more complex cases involving custom components or multiple labels that require manual verification. |
2.5.4 Motion Actuation (A) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that functionality triggered by motion can also be operated by a UI component, and that the motion-based trigger can be disabled to prevent accidental activation. This is a task that requires human interaction with the device. |
3.1.1 Language of Page (A) | 100.0% | IAC can automatically and reliably detect whether a page has a lang attribute on the <html> element. This is a simple, direct check that does not require human intervention. |
3.2.1 On Focus (A) | 50.0% | IAC can detect some instances where a focus event might trigger a change of context, such as a script that automatically redirects or submits a form on focus. However, it cannot reliably detect all instances where a change of context might occur, as these are often triggered by complex JavaScript behaviors that require manual testing to uncover. |
3.2.2 On Input (A) | 30.0% | IAC can detect some instances where a change of context occurs on input, such as a form that auto-submits when a field is changed. However, it cannot reliably detect all such instances, especially when the behavior is triggered by complex JavaScript. This is a task that often requires manual testing to verify. |
3.2.6 Consistent Help (A) | 0.0% | IAC cannot detect this criterion. It requires manual verification to ensure that help functionality, if present, is located consistently across a set of web pages. This involves a human reviewing multiple pages to identify the location of help features. |
3.3.1 Error Identification (A) | 85.0% | IAC can detect many common error identification issues, such as form fields with invalid input that don't have an error message or have a message that isn't programmatically associated with the input. It can't, however, determine if the error message is clear or easy to understand. |
3.3.2 Labels or Instructions (A) | 85.0% | IAC is effective at checking for the presence of a label associated with form controls. It also checks for common failures like using a placeholder as a label. However, it cannot determine if the label's text is sufficiently descriptive or if additional instructions are needed for complex forms. |
3.3.7 Redundant Entry (A) | 30.0% | IAC offers only limited automated coverage for this criterion. Manual verification is still required to determine whether a form asks the user to enter the same information multiple times and whether previously entered data can be selected or auto-filled. This is a complex task that requires human observation and judgment. |
4.1.2 Name, Role, Value (A) | 90.0% | IAC is very effective at detecting common issues related to name, role, and value, such as missing accessible names for buttons and links, incorrect use of ARIA attributes, and incorrect roles. It is a core part of its functionality. However, it cannot verify the semantic correctness of every name and role in all possible contexts, which may require manual inspection. |
1.2.4 Captions (Live) (AA) | 0.0% | IAC cannot test for live captions. This is a very complex and dynamic process that requires manual verification to ensure that captions are provided for all live audio content. |
1.2.5 Audio Description (Prerecorded) (AA) | 55.0% | IAC can detect if an audio description track is present, but it cannot verify if the description is complete or accurate. This is a task that requires a human to watch the video and evaluate the description. |
1.3.4 Orientation (AA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the content is not restricted to a single display orientation (e.g., portrait or landscape). This is a physical test that requires a human to rotate the device and observe the content. |
1.3.5 Identify Input Purpose (AA) | 95.0% | IAC is highly effective at detecting input fields that can be automatically filled by a browser but are missing the correct autocomplete attribute. It can reliably identify fields that require this attribute and report a violation. |
1.4.10 Reflow (AA) | 50.0% | IAC can detect some issues that may prevent reflow, such as fixed pixel widths or min-width values on containers that prevent content from wrapping. However, it cannot fully simulate or test all possible layouts and content reflow scenarios, which requires a human to manually test the page at different screen sizes and zoom levels. |
1.4.11 Non-text Contrast (AA) | 45.0% | IAC can detect some instances of low contrast for non-text elements, but it is not a comprehensive solution. It primarily checks for contrast on user interface components. It cannot reliably check the contrast of complex graphics or icons that are part of a larger image, which requires manual review. |
1.4.12 Text Spacing (AA) | 60.0% | IAC can detect some issues with text spacing, such as the use of !important declarations on properties like line-height that prevent users from overriding styles. However, it cannot verify if the content remains readable when a user increases the text spacing. This requires a human to manually test the page with a browser extension or user stylesheet. |
1.4.13 Content on Hover or Focus (AA) | 40.0% | IAC can detect some instances where content on hover or focus is not dismissible, such as a tooltip that cannot be closed. However, it cannot reliably test for all aspects of this success criterion, as many of these behaviors are complex and require manual interaction and observation. |
1.4.3 Contrast (Minimum) (AA) | 95.0% | IAC is highly effective at checking for minimum contrast ratios between text and background colors. It can reliably detect most text contrast issues and is a core part of its automated testing suite. There may be some complex cases with gradients or background images that require manual verification. |
1.4.4 Resize text (AA) | 55.0% | IAC can detect some issues that may prevent text resizing, such as the use of fixed-size units for font sizes. However, it cannot fully verify that all content and functionality remains available and readable at different text sizes. This is a task that requires a human to manually test the page with a browser zoom or text resizing feature. |
1.4.5 Images of Text (AA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to determine if a page uses an image of text rather than actual text. This is a visual check that requires a human to distinguish between a graphic and text. |
2.4.11 Focus Not Obscured (Minimum) (AA) | 40.0% | IAC provides only partial automated coverage for this criterion. Manual verification is still required to ensure that a focused component is not hidden by other content, such as a sticky header. This requires a human to interact with the page and observe the focus state. |
2.4.5 Multiple Ways (AA) | 35.0% | IAC can detect some issues with navigation, but it cannot reliably verify the existence of multiple navigation methods on a website. This is a complex task that requires human judgment to determine if there is a site map, search functionality, or multiple navigation menus. |
2.4.6 Headings and Labels (AA) | 55.0% | IAC can detect some issues with headings and labels, such as missing labels on form controls or incorrect heading levels. However, it cannot reliably determine if all headings and labels are descriptive and accurately reflect the topic or purpose of the content they introduce. This is a task that requires human judgment. |
2.4.7 Focus Visible (AA) | 80.0% | IAC can detect some instances where focus is not visible, such as an element with an outline: none CSS rule. However, it cannot reliably detect all issues, especially with complex custom focus indicators or cases where the indicator is technically present but visually imperceptible. This is a task that requires a human to manually test the page with a keyboard to ensure the focus is always visible. |
2.5.7 Dragging Movements (AA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that functionality that relies on dragging movements can also be operated by a single pointer without a path-based gesture. This is a task that requires human interaction with the UI. |
2.5.8 Target Size (Minimum) (AA) | 85.0% | IAC can detect if a target is at least 24 by 24 CSS pixels in size, and it can also check if a smaller target has sufficient spacing from other targets. However, there are many exceptions and complexities, such as inline links or essential targets, that require manual verification. |
3.1.2 Language of Parts (AA) | 85.0% | IAC can detect instances where a different language is used for a part of the content but is not marked up with the correct lang attribute. It is highly effective at checking for this issue. However, it cannot verify the accuracy of the language change, which requires human judgment. |
3.2.3 Consistent Navigation (AA) | 55.0% | IAC cannot reliably test for consistent navigation. This is a complex criterion that requires human judgment to verify that navigation is consistent across a set of web pages. It is difficult for an automated tool to determine if the navigation is consistent in its presentation and location. |
3.2.4 Consistent Identification (AA) | 50.0% | IAC cannot reliably test for consistent identification. This is a complex criterion that requires human judgment to verify that components with the same functionality are identified consistently across a website. For example, it would be difficult for an automated tool to determine if a "search" button is always labeled with the same text or icon. |
3.3.3 Error Suggestion (AA) | 30.0% | IAC can detect some instances where an error is present but a suggestion for correction is missing. However, it cannot reliably generate a meaningful suggestion or verify the effectiveness of the suggestion, as this requires human understanding of the error. |
3.3.4 Error Prevention (Legal, Financial, Data) (AA) | 35.0% | IAC can detect some issues with form submissions, such as forms that lack a review or confirmation step. However, it cannot reliably determine if a form is related to legal, financial, or user data submission, nor can it verify all the necessary error prevention mechanisms. |
3.3.8 Accessible Authentication (Minimum) (AA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that authentication processes do not rely on a cognitive function test unless there are alternatives. This involves human judgment and is not something an automated tool can reliably detect. |
4.1.3 Status Messages (AA) | 85.0% | IAC can detect some status messages that are not announced to screen readers. It can identify aria-live regions and check for their correct implementation, but it cannot always determine if a status message is important enough to be announced. |
1.2.6 Sign Language (Prerecorded) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that sign language interpretation is provided for all prerecorded audio content. This is a visual check that requires a human to verify the presence and accuracy of the sign language interpreter. |
1.2.7 Extended Audio Description (Prerecorded) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that extended audio descriptions are provided for all prerecorded video content. This is a very complex process that requires a human to watch the video and evaluate the description. |
1.2.8 Media Alternative (Prerecorded) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that a text alternative is provided for all prerecorded video and audio content. This is a very complex process that requires a human to review the media and the alternative. |
1.2.9 Audio-only (Live) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that a text alternative is provided for all live audio content. This is a very complex process that requires a human to review the live audio stream and the alternative. |
1.3.6 Identify Purpose (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the purpose of UI components, icons, and regions is programmatically determinable. This is a very complex task that goes beyond what automated tools can reliably check, as it requires a deep understanding of the intended purpose and context of each element. |
1.4.6 Contrast (Enhanced) (AAA) | 95.0% | IAC is highly effective at checking for enhanced contrast ratios between text and background colors. It can reliably detect most text contrast issues and is a core part of its automated testing suite. There may be some complex cases with gradients or background images that require manual verification. |
1.4.7 Low or No Background Audio (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that audio content does not have background sounds or that the background sounds can be turned off. This is a very complex process that requires a human to listen to the audio and evaluate the background sounds. |
1.4.8 Visual Presentation (AAA) | 55.0% | IAC can detect some issues with visual presentation, such as the use of justified text. However, it cannot reliably test for all aspects of this success criterion, as many of these are complex and require manual observation to ensure the visual presentation does not interfere with accessibility. |
1.4.9 Images of Text (No Exception) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that a page does not use images of text. This is a visual check that requires a human to distinguish between a graphic and text. |
2.1.3 Keyboard (No Exception) (AAA) | 80.0% | IAC can detect many keyboard accessibility issues, such as elements that are not keyboard focusable, but it cannot test all possible keyboard interactions or complex custom components that require manual testing. This criterion is also an enhanced version of 2.1.1, so it inherits the same limitations. |
2.2.3 No Timing (AAA) | 70.0% | IAC can detect some issues with timing, such as the use of <meta http-equiv="refresh">, but it cannot reliably detect all instances of time limits that may be handled by JavaScript or server-side code. This is a very complex criterion that requires manual testing. |
2.2.4 Interruptions (AAA) | 75.0% | IAC can detect some interruptions, such as auto-playing video or audio. However, it cannot reliably detect all types of interruptions, especially those that are not media-based, and it cannot verify that all interruptions can be postponed or suppressed. This is a task that requires a human to manually test the page with and without the interruptions. |
2.2.5 Re-authenticating (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the user does not lose data when they are re-authenticated. This is a very complex process that requires a human to test the authentication flow and observe the data. |
2.2.6 Timeouts (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the user is warned of any inactivity that could cause data loss. This is a very complex process that requires a human to test the application and observe the warnings. |
2.3.2 Three Flashes (AAA) | 25.0% | IAC cannot reliably detect flashing content. This is a very difficult criterion for an automated tool to test, as it requires analyzing the visual presentation and timing of content, which is a task best suited for human review. |
2.3.3 Animation from Interactions (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that animations triggered by user interaction can be disabled. This is a very complex process that requires a human to test the animations and the disabling mechanism. |
2.4.10 Section Headings (AAA) | 65.0% | IAC can detect some issues with headings and labels, such as incorrect heading levels. However, it cannot reliably determine if all sections of a page are organized with headings. This is a task that requires human judgment to determine if the headings are logically structured and accurately reflect the content. |
2.4.12 Focus Not Obscured (Enhanced) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that a focused component is not hidden by other content, such as a sticky header. This is an enhanced version of 2.4.11 and also requires a human to interact with the page and observe the focus state. |
2.4.13 Focus Appearance (AAA) | 25.0% | IAC can detect some focus-indicator issues, such as styles that remove the default outline. However, it cannot measure the size or contrast of a custom focus indicator against this criterion's requirements. A human needs to tab through the page and evaluate the appearance of the focus indicator. |
2.4.8 Location (AAA) | 30.0% | IAC can detect some issues with breadcrumbs and site maps, but it cannot reliably verify that the user's location within a set of web pages is clearly indicated. This is a complex task that requires human judgment to determine if the location is clear and easily understood. |
2.4.9 Link Purpose (Link Only) (AAA) | 30.0% | IAC can detect some instances where a link's purpose may not be clear, such as generic text like "click here" or "read more." However, it cannot reliably determine if the link text alone is sufficient to describe the purpose of the link. This is a complex task that requires human judgment to evaluate the link's text and context. |
2.5.5 Target Size (AAA) | 30.0% | IAC can detect if a target is at least 44 by 44 CSS pixels in size, and it can also check if a smaller target has sufficient spacing from other targets. However, there are many exceptions and complexities, such as inline links or essential targets, that require manual verification. |
2.5.6 Concurrent Input Mechanisms (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the user can use multiple input methods, such as a keyboard and a mouse, concurrently. This is a very complex process that requires a human to test the interaction with the UI. |
3.1.3 Unusual Words (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the definition of unusual words, idioms, and jargon is provided. This is a very complex process that requires a human to review the text and determine if a definition is needed. |
3.1.4 Abbreviations (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the expansion of abbreviations is provided. This is a very complex process that requires a human to review the text and determine if an expansion is needed. |
3.1.5 Reading Level (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the content is written at a lower secondary education level. This is a very complex process that requires a human to evaluate the text and its complexity. |
3.1.6 Pronunciation (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that the pronunciation of words is provided when the meaning of the words is ambiguous without it. This is a very complex process that requires a human to evaluate the text and its context. |
3.2.5 Change on Request (AAA) | 40.0% | IAC can detect some instances where a change of context occurs, such as a script that automatically redirects or submits a form. However, it cannot reliably detect all instances where a change of context might occur, as these are often triggered by complex JavaScript behaviors that require manual testing to uncover. |
3.3.5 Help (AAA) | 35.0% | IAC cannot reliably test for help. This is a complex criterion that requires human judgment to verify that context-sensitive help is available for all form controls. It is difficult for an automated tool to determine if the help is clear and easily understood. |
3.3.6 Error Prevention (All) (AAA) | 45.0% | IAC can detect some issues with form submissions, such as forms that lack a review or confirmation step. However, it cannot reliably determine all the necessary error prevention mechanisms for all forms. This is a complex task that requires human judgment and manual testing. |
3.3.9 Accessible Authentication (Enhanced) (AAA) | 0.0% | IAC cannot test for this criterion. It requires manual verification to ensure that authentication does not rely on a cognitive function test unless an accessible alternative or assistance mechanism is available. This is an enhanced version of 3.3.8, and it likewise requires human judgment rather than automated detection. |
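To make the distinction between automated and manual coverage in the table above more concrete, here is a minimal TypeScript sketch of the kind of check that can be fully automated, such as 2.4.2 Page Titled, 3.1.1 Language of Page, and the missing-alt-attribute portion of 1.1.1 Non-text Content. It is an illustration only; the runBasicChecks function and the Violation shape are hypothetical and do not represent IAC's actual rule engine.

```typescript
// Minimal illustrative sketch, not IAC's actual rule engine.
// Demonstrates the deterministic DOM checks behind fully automatable
// criteria (2.4.2 Page Titled, 3.1.1 Language of Page) and the
// missing-alt-attribute portion of 1.1.1 Non-text Content.
// The Violation shape and function name are hypothetical.

interface Violation {
  criterion: string; // WCAG success criterion, e.g. "2.4.2"
  message: string;   // human-readable description of the failure
}

function runBasicChecks(doc: Document): Violation[] {
  const violations: Violation[] = [];

  // 2.4.2 Page Titled: the document needs a non-empty <title>.
  if (doc.title.trim().length === 0) {
    violations.push({
      criterion: "2.4.2",
      message: "Document has no non-empty <title> element.",
    });
  }

  // 3.1.1 Language of Page: <html> needs a non-empty lang attribute.
  const lang = doc.documentElement.getAttribute("lang");
  if (!lang || lang.trim().length === 0) {
    violations.push({
      criterion: "3.1.1",
      message: "The <html> element is missing a lang attribute.",
    });
  }

  // 1.1.1 Non-text Content (partial): every <img> needs an alt attribute.
  // This only finds a *missing* alt attribute; whether the alt text is
  // appropriate still requires manual review, as noted in the table.
  doc.querySelectorAll("img:not([alt])").forEach((img) => {
    violations.push({
      criterion: "1.1.1",
      message: `Image is missing an alt attribute: ${img.outerHTML.slice(0, 80)}`,
    });
  });

  return violations;
}

// Example usage in a browser console or content script:
// console.table(runBasicChecks(document));
```

Checks like these are deterministic DOM queries, which is why they can reach or approach 100% coverage. Judging whether alt text is meaningful, whether a reading order makes sense, or whether a caption is accurate has no equivalent query, which is why those rows pair a partial percentage with a note that manual review is still required.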