BAgel QA process

A summary of the testing process for each BAgel component. Our testing procedures are periodically reviewed to accommodate changes in browser versions, operating systems, accessibility standards, and business requirements.

Testing

Assistive technologies

The purpose of assistive technology is to enable individuals with disabilities to perform tasks that would otherwise be difficult or impossible. Examples of assistive technology include screen readers, magnifiers, and text-to-speech software.

To ensure that our components are accessible, we test them with various assistive technologies. We have found this to be an effective and quick method of identifying potential issues that may affect all users, not just those who use assistive technologies.

Testing is currently performed on real devices. However, we intend to introduce automation platforms at a later date to increase our coverage.

Our QA approach with assistive technologies

BAgel adheres to the W3C's Web Content Accessibility Guidelines 2.2 (WCAG) level AA and above.

Before each release, the design system team ensures that each component is compatible with the following combinations of assistive technologies and browsers:

Full list of assistive technology supported by BAgel

| Browser | Version | Operating System | Assistive technology |
| --- | --- | --- | --- |
| Chrome | latest | Android | TalkBack |
| Chrome | latest | Windows | JAWS * |
| Safari | latest | iOS | Voice Control |
| Safari | latest | iOS | VoiceOver |
| Safari | latest | MacOS | VoiceOver |
| Firefox | latest | Windows | NVDA |
| Chrome | latest | Windows | NVDA ** |
| Edge | latest | Windows | High Contrast Mode |

* We do not currently test on JAWS because testing on the other listed assistive technologies provides sufficient test coverage.

** We are currently evaluating the value of adding Chrome/NVDA support.

Browser support

We test the following browser and operating system combinations:

| Browser | Version | Operating System |
| --- | --- | --- |
| Chrome | latest | Windows |
| Chrome | latest | MacOS |
| Chrome | latest | iOS |
| Chrome | latest | Android |
| Safari | 15 | iOS |
| Safari | latest | iOS |
| Safari | 15 | MacOS |
| Safari | latest | MacOS |
| Firefox | latest | MacOS |
| Firefox | latest | Windows |
| Edge | latest | Windows |
| Samsung Internet | latest | Android |

Responsive Design

Testing for responsive design is crucial to ensure that BAgel functions well across various devices and screen sizes.

The device sizes we test:

320 pixels wide (portrait)
  • Adds support for older/dated mobile devices
  • Equivalent to 1280 pixels wide screens at 400% zoom. This is required for adhering to the WCAG 1.4.10 Reflow (AA) standard

1280 pixels wide and wider (portrait)
  • Adds support for wider viewport widths commonly found in desktop and laptop screens
  • BAgel has a maximum width of 992px, so there is no need to test multiple widths beyond 1280 pixels

Device widths between 320 pixels and 1280 pixels (portrait)
  • Adds support for mobile and tablet devices
  • There is no specific viewport width to test
  • Testing is performed by starting at 320 pixels and increasing the width gradually to 1280 pixels

Devices 256 pixels high (landscape)
  • Adds support for when users switch between portrait and landscape orientations on their devices
  • Equivalent to 1024 pixels high screens at 400% zoom. This is required for adhering to the WCAG 1.4.10 Reflow (AA) standard

Tips for testing responsive design

Browser Developer Tools:
Most modern browsers (e.g., Chrome, Firefox, Safari) come with built-in developer tools that allow you to simulate different device resolutions. Access these tools by right-clicking on the webpage and selecting "Inspect" or "Inspect Element", then navigating to the "Responsive" or "Device" mode.
Mobile Emulators:
Mobile emulators are very useful in the absence of actual devices. The Android Emulator (for Android devices) and the Xcode Simulator (for iOS devices) allow you to test the application as if it were running on an actual mobile device. These tools are especially useful for testing specific functionality such as touch interactions.
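Viewport checks can also be scripted. The sketch below is a minimal example assuming Playwright as the test runner (the QA process does not prescribe one) and a hypothetical test page URL; it steps through sample widths between the 320 pixel and 1280 pixel bounds described above and checks for horizontal overflow, which is the practical symptom of a WCAG 1.4.10 Reflow failure.

```ts
import { test, expect } from '@playwright/test';

// Sample widths between the 320 px and 1280 px bounds described above.
const widths = [320, 480, 768, 1024, 1280];

test('components reflow without horizontal scrolling', async ({ page }) => {
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  for (const width of widths) {
    await page.setViewportSize({ width, height: 800 });

    // If content reflows correctly, the document should not be wider
    // than the viewport (i.e. no horizontal scrollbar).
    const scrollWidth = await page.evaluate(
      () => document.documentElement.scrollWidth
    );
    expect(scrollWidth, `horizontal overflow at ${width}px`).toBeLessThanOrEqual(width);
  }
});
```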

Test HTML

When creating test HTML, it's important to adhere to accessibility standards, use the appropriate attributes, and ensure proper nesting of components. Below is an example of a simple HTML structure for testing BAgel components, following best practices:

Example of test HTML

Example of an HTML page with native HTML elements and BAgel components
<!DOCTYPE html>
<html lang="en">
  <head>
    <meta charset="utf-8" />
    <meta name="viewport" content="width=device-width, initial-scale=1" />
    <title>BAgel component test page</title>
    <!-- Links to BAgel scripts and CSS -->
  </head>
  <body>
    <ba-page-segment>
      <ba-content>
        <h1>This is the heading</h1>
        <ul>
          <li><a href="/url-1">First item in the list</a></li>
          <li><a href="/url-2">Second item in the list</a></li>
          <li><a href="/url-3">Third item in the list</a></li>
        </ul>
      </ba-content>
    </ba-page-segment>
  </body>
</html>

This is a basic example, and in a real-world scenario, you would adapt it based on the specific components and features of your website or application. Always test with actual assistive technologies and consult the component documentation and the latest accessibility guidelines for more detailed information.

Test cases

The Given-When-Then formula is a widely used approach for writing test cases, especially in Behavior-Driven Development (BDD). It provides a clear structure for defining the context, actions, and expected outcomes of a test, which is why BAgel uses this approach.

What is Given – When – Then?

Given-When-Then is a formulaic approach used in software development and testing to structure and describe the behavior of a system, feature, or component. It is commonly associated with Behavior-Driven Development (BDD) and acceptance testing. The Given-When-Then structure helps to create clear, understandable, and executable specifications for software behavior.

Here's a breakdown of each part of the Given-When-Then structure:

GIVEN:
Describes the initial context or setup for the test.
Specifies the preconditions or existing state before a certain action occurs.
This is where you establish the initial conditions necessary for the test scenario.
WHEN:
Describes the specific action or event that triggers a change in the system.
Represents the event or action that you are testing.
This is the step where you execute the specific behavior or action you want to observe.
THEN:
Describes the expected outcome or result after the action specified in the `When` step.
Specifies the post-conditions or the expected state of the system.
This is where you assert the expected behavior or verify the results of the action.
I SEE:
Describes the actual outcome or result shown.
Shows the actual state of the system.
This is where you assert the actual outcome against expected results.
I HEAR:
Describes what a user hears when an assistive technology such as a screen reader is in use.
The user hears the actual state of the system.
The actual outcome is compared to the expected outcome.

Example of test case format

GIVEN THAT I am on a page with a header landmark

1. Keyboard for mobile & desktop
WHEN I use the tab key to enter the web browser window
THEN I SEE focus is strongly visually indicated on interactive components
2. Desktop screenreader
WHEN I use a desktop screenreader (NVDA, JAWS, VoiceOver)
AND I use the tab key to enter the web browser window
THEN I HEAR It is discoverable with screenreader shortcuts as header/banner landmark
THEN I HEAR It typically contains the name and primary navigation of the website
3. Mobile screenreader
WHEN I use a mobile screenreader (Talkback, VoiceOver)
AND I swipe to focusable elements in the header
THEN I HEAR It is discoverable with screenreader shortcuts as header/banner landmark
THEN I HEAR It typically contains the name and primary navigation of the website
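As a sketch of how a case like this could be automated, the example below assumes Playwright (the test cases above are written for manual execution) and covers step 1 only: tabbing into the page should land focus on an interactive element with a visible focus indicator. The URL and the outline assertion are illustrative, not BAgel's documented focus styling.

```ts
import { test, expect } from '@playwright/test';

test('GIVEN a page with a header landmark, WHEN I tab in, THEN focus is visible', async ({ page }) => {
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  // WHEN: use the tab key to enter the web browser window.
  await page.keyboard.press('Tab');

  // THEN I SEE: focus lands on an interactive element...
  const focused = page.locator(':focus');
  await expect(focused).toBeVisible();

  // ...and a focus indicator is rendered (here, a non-default outline;
  // adjust the assertion to match the component's actual focus styling).
  const outlineStyle = await focused.evaluate(
    (el) => getComputedStyle(el).outlineStyle
  );
  expect(outlineStyle).not.toBe('none');
});
```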

Tips for Writing Test Cases:

  1. Be Clear and Specific: Clearly define the actions and expected outcomes in each section (Given, When, Then).
  2. Use Concrete Examples: Provide specific examples for inputs, actions, and expected results.
  3. Consider Edge Cases: Include test cases for boundary values, invalid inputs, and unexpected scenarios.
  4. Ensure Isolation: Each test case should be independent of others to ensure accurate results.
  5. Verify Expected Outcomes: Clearly state the expected outcomes so that testers can verify results easily.
  6. Include Preconditions: Specify any necessary preconditions for the test to be executed successfully.

Visual regression

Visual regression testing compares screenshots of the UI in a reference state (baseline) with screenshots taken after changes have been made. It helps identify visual differences, such as changes in layout, styling, or content, that may occur unintentionally during development.

Why we use visual regression testing:

Consistency
Ensures that the UI remains consistent across different versions.
Improved User Experience
Helps prevent visual defects that might affect the user experience.
Cost Reduction
Identifies issues early, reducing the cost of fixing problems in later stages of development.
Quality Improvement
Contributes to overall product quality by maintaining a polished appearance.
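A minimal sketch of how this can be wired up, assuming Playwright's built-in screenshot comparison (the document does not mandate a specific visual regression tool): the first run records a baseline image, and later runs fail if the rendered component drifts from that baseline beyond the allowed threshold.

```ts
import { test, expect } from '@playwright/test';

test('ba-content visual regression', async ({ page }) => {
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  // Compare the current rendering against the stored baseline screenshot.
  // On the first run Playwright writes the baseline; subsequent runs diff
  // against it and fail when the difference exceeds maxDiffPixelRatio.
  await expect(page.locator('ba-content')).toHaveScreenshot('ba-content.png', {
    maxDiffPixelRatio: 0.01,
  });
});
```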

Components

BAgel components are designed to contain native HTML elements and other BAgel components. It's important to ensure that all possible combinations work well together.

Usually, you only need to test components from the perspective of the parent component, i.e. all the elements that can be placed inside that component.

Some BAgel components behave differently when nested inside another component. For instance, animations might turn off automatically when slotted inside another animated component. These scenarios need to be documented in the test cases.

Attributes

BAgel component attributes, like the attributes of native HTML elements, are additional details that can be attached directly to BAgel components in the HTML code. These attributes provide metadata or configuration details that can influence the behavior or appearance of the web component.

Attributes in web components serve several purposes:

Configuration and Customization
Attributes allow users of a web component to customize its behavior or appearance by providing values or settings. For example, a custom button component might have attributes like color or size that users can set to customize the button's visual style.
Data Binding
Attributes can be used for data binding, allowing dynamic updates to the component based on changes to the attribute values. When an attribute changes, it can trigger updates within the component, such as re-rendering or adjusting internal state.
Communication
Attributes can serve as a means of communication between a parent component and its children or between different parts of a web application. Changes to an attribute can signal events or actions within the component.
Styling
Attributes can be used to define styling properties or classes dynamically based on the attribute values. This allows for flexible and dynamic styling of web components based on user input or application state.

Testing attributes

Testing attributes involves validating that the attributes of the web components behave as expected under different conditions:

  • Verify individual attributes' behavior in isolation
  • Test all permitted values for attributes
  • Consider unhappy paths such as passing the wrong values to attributes
  • Check all attribute combinations
  • Create integration tests to ensure that each component's attributes interact correctly with other components or elements.
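The sketch below shows one way an individual attribute could be exercised, assuming Playwright and a hypothetical `variant` attribute on a hypothetical `ba-button` component; the attribute name, its permitted values, and the test URL are illustrative, not part of BAgel's documented API.

```ts
import { test, expect } from '@playwright/test';

// Hypothetical permitted values for a hypothetical `variant` attribute.
const permitted = ['primary', 'secondary'];

test('ba-button variant attribute (illustrative)', async ({ page }) => {
  // Assumes a test page that already loads the BAgel bundle so the
  // custom elements are registered before markup is injected.
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  for (const value of permitted) {
    // Happy path: each permitted value renders a visible component.
    await page.evaluate((v) => {
      document.body.innerHTML = `<ba-button variant="${v}">Save</ba-button>`;
    }, value);
    await expect(page.locator('ba-button')).toBeVisible();
  }

  // Unhappy path: an unsupported value should fall back gracefully
  // rather than break rendering.
  await page.evaluate(() => {
    document.body.innerHTML = `<ba-button variant="bogus">Save</ba-button>`;
  });
  await expect(page.locator('ba-button')).toBeVisible();
});
```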

Permitted aria roles

An ARIA role attribute can be added to an element to instruct assistive technologies to expose the element as something other than its native HTML element type.

For example, an <a> element with role="button" is exposed as a button, not a link. Some ARIA roles are only appropriate to specify on certain elements.

How do you test ARIA?

  • Launch the screen reader and browser combination listed in the assistive technology table above.
  • Navigate to the test page.
  • Identify all elements with an ARIA attribute.
  • Use the screen reader to navigate to these elements.
  • Perform any commands that the screen reader announces when you land on these elements.
  • Compare the behavior with the ARIA documentation on MDN.
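Screen reader checks are manual, but the role mapping itself can also be spot-checked programmatically. The sketch below (Playwright assumed; the markup and accessible name are illustrative) confirms that an anchor with role="button" is exposed to the accessibility tree as a button rather than a link, matching the example above.

```ts
import { test, expect } from '@playwright/test';

test('anchor with role="button" is exposed as a button', async ({ page }) => {
  // Minimal markup for the ARIA example from the section above.
  await page.setContent(`
    <a href="/save" role="button">Save</a>
  `);

  // Locating by accessible role asserts how assistive technologies
  // will expose the element: as a button, not a link.
  await expect(page.getByRole('button', { name: 'Save' })).toBeVisible();
  await expect(page.getByRole('link', { name: 'Save' })).toHaveCount(0);
});
```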

High contrast mode

High contrast mode is a feature that changes the colors and contrast on your device's screen to make it easier to read and navigate. This is especially helpful for individuals with visual impairments, color blindness, or those who have difficulty reading small text. By increasing the contrast between text and background, high contrast mode can make it easier to distinguish between different elements on the screen.

Turn on High Contrast Mode:

These instructions are general guidelines, and the exact steps might vary based on device models and software versions. Some devices may offer additional customization options within high contrast mode, allowing you to fine-tune the appearance based on your needs. The steps below cover iOS, Android, Windows, and MacOS.

iOS

  1. Open Settings:
    • Go to your device's home screen.
    • Tap on the Settings app.
  2. Accessibility Settings:
    • Scroll down and tap on Accessibility.
  3. Display & Text Size:
    • Under the Vision section, tap on Display & Text Size.
  4. High Contrast:
    • Look for an option like Increase Contrast or Classic Invert.
    • Toggle the switch to enable high contrast mode.
Android

  1. Open Settings:
    • Navigate to the Settings app on your Android device.
  2. Accessibility Settings:
    • Scroll down and tap on Accessibility.
  3. Vision Settings:
    • Look for an option related to Vision or Display.
    • Depending on the device and Android version, you may find options like High contrast text or Color inversion.
  4. Toggle High Contrast:
    • Enable the toggle switch for the high contrast mode or color inversion.
Windows

  1. Open Settings:
    • Press the Windows key on your keyboard or click on the Start button.
    • Select Settings.
  2. Ease of Access:
    • Click on Ease of Access.
  3. High Contrast Settings:
    • In the left sidebar, select High contrast.
  4. Choose a Theme:
    • Under Choose a theme, select a high contrast theme from the list.
MacOS

  1. Open System Preferences:
    • Click on the Apple logo in the top-left corner.
    • Select System Preferences.
  2. Accessibility:
    • Click on Accessibility.
  3. Display:
    • In the left sidebar, select Display.
  4. Increase Contrast:
    • Look for options like Increase contrast or Use grayscale.
    • Adjust the settings according to your preferences.

Testing steps

This is a structured approach to ensuring that high contrast mode works effectively and meets accessibility criteria. Here's a breakdown of the steps:

  1. Apply High Contrast Color Theme:
    • Within the high contrast settings, pick or create a customized color theme according to your preferences.
  2. Open the default browser for the operating system:
    • Edge on Windows
    • Safari on iOS and MacOS
    • Chrome on Android
  3. Navigate to Test URL:
    • Open the specified test URL.
  4. View Components:
    • Check that all colors have changed to reflect your customized color theme.
    • Ensure that only colors from your theme are visible, excluding images.
    • Verify that the colors of icons and graphics align with your custom theme.
    • Evaluate whether the visual design remains equivalent to the standard view without high contrast, considering design specifications, background colors, and overall layout clarity.
  5. Compare with Accessibility Acceptance Criteria:
    • Compare the results with the established accessibility acceptance criteria to ensure compliance.
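Manual checks on a real device remain the primary method, but forced colors can also be emulated in an automated check. The sketch below (Playwright assumed; the URL, component selector, and color value are placeholders) switches the page into forced-colors mode and verifies that a component no longer uses its usual palette color.

```ts
import { test, expect } from '@playwright/test';

test('ba-content respects forced colors / high contrast mode', async ({ page }) => {
  // Emulate Windows High Contrast Mode via the forced-colors media feature.
  await page.emulateMedia({ forcedColors: 'active' });
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  const component = page.locator('ba-content');
  await expect(component).toBeVisible();

  // Illustrative assertion: in forced-colors mode the component's text
  // color should come from the user's theme, not the design-system palette.
  const color = await component.evaluate((el) => getComputedStyle(el).color);
  expect(color).not.toBe('rgb(204, 0, 0)'); // placeholder brand color
});
```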

Reduced motion

Reduced motion is an accessibility feature designed to make digital content more comfortable for individuals who may experience discomfort, dizziness, nausea, or motion sickness when exposed to excessive or distracting motion effects on screens.

In the context of accessibility testing, reduced motion refers to the evaluation of a website, application, or digital interface to ensure that it accommodates users who have selected the reduced motion option. When reduced motion is enabled, it typically suppresses or minimizes certain types of animations, transitions, and visual effects.

In many operating systems and browsers, users can find a "Reduce Motion" or "Motion Preferences" option in the accessibility settings. Activating this option will trigger the system or application to minimize motion effects.

Here are key aspects to consider when testing for reduced motion accessibility:

Animations and Transitions:
Check for any animations or transitions within the interface, such as fading, sliding, or zooming effects.
Ensure that these effects are either eliminated or significantly reduced when reduced motion is turned on.
Parallax Effects:
Verify that parallax scrolling effects are either removed or significantly toned down to prevent discomfort.
User Interface Elements:
Examine interactive elements such as buttons, menus, and links.
Ensure that any motion associated with these elements is adjusted or eliminated to support reduced motion preferences.
Evaluate Reduced Motion alternatives:
Conduct testing with reduced motion settings turned on to experience the interface from the user's perspective.
Evaluate if the reduced motion feature successfully mitigates potentially discomforting visual effects.
User Feedback:
Collect feedback from users who may benefit from reduced motion settings to ensure that their needs are adequately addressed.

Example of a reduced motion test case

  • GIVEN THAT I am on a page with a <ba-component>
  • WHEN reduced motion is turned on
  • THEN I SEE <ba-component> working as expected
  • THEN I SEE the loading bar is not present
  • WHEN reduced motion is turned off
  • THEN I SEE the loading bar is present and working as expected
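A sketch of how the test case above could be automated, again assuming Playwright; the `<ba-component>` tag is the placeholder from the example and the loading bar selector and URL are hypothetical.

```ts
import { test, expect } from '@playwright/test';

test('loading bar is hidden when reduced motion is enabled', async ({ page }) => {
  // WHEN reduced motion is turned on (emulates prefers-reduced-motion: reduce).
  await page.emulateMedia({ reducedMotion: 'reduce' });
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  // THEN I SEE the component working and the loading bar not present.
  await expect(page.locator('ba-component')).toBeVisible();
  await expect(page.locator('.loading-bar')).toBeHidden(); // placeholder selector

  // WHEN reduced motion is turned off, the loading bar should be present.
  await page.emulateMedia({ reducedMotion: 'no-preference' });
  await page.reload();
  await expect(page.locator('.loading-bar')).toBeVisible();
});
```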

WCAG success criteria

WCAG provides a set of criteria and guidelines to ensure that web content is accessible to people with disabilities. Each component in a web application should adhere to these criteria to enhance accessibility. Below is an example of how a specific component can be evaluated against WCAG success criteria.

Example: WCAG success criteria evaluation for ba-message component

Additional Considerations:
Check for proper use of ARIA (Accessible Rich Internet Applications) attributes if applicable.
Test the component with screen readers and other assistive technologies to ensure a seamless user experience.
WCAG Documentation Reference:
[Understanding WCAG 2.2](https://www.w3.org/WAI/WCAG22/Understanding/)

It's important to conduct a thorough accessibility audit, including real-world usage scenarios with users who have disabilities, to ensure that your components meet the WCAG guidelines and provide an inclusive experience for all users.

Automated accessibility testing tools

The use of semi-automated tools for accessibility testing is a great practice to complement manual testing efforts. These tools can help identify common accessibility issues and provide insights into Web Content Accessibility Guidelines (WCAG) compliance.

Here's a brief overview of the tools we use in BAgel.

WAVE

Features:
Provides in-page feedback with an icon on the browser toolbar.
Identifies errors, alerts, and features requiring manual checks.
Offers details about elements contributing to each issue.
Allows navigation through code with highlighting on the webpage.
Usage:
Install the browser extension.
Open the webpage you want to test.
Click on the Wave icon to view accessibility information.
Notes:
Known issues with shadow DOM, which BAgel components use, so it cannot be relied on solely for testing.

Axe

Features:
Integrated with browser developer tools.
Analyzes individual web pages for accessibility issues.
Provides detailed reports with explanations for each issue.
Suggests possible fixes and includes code snippets.
Usage:
Install the browser extension.
Open developer tools and switch to the `Axe` panel.
Run the axe analysis on the current page.
Stark

Features:
Assists in testing color contrast and visual design for accessibility.
Simulates various types of color vision deficiencies.
Provides feedback on potential color contrast issues.
Useful for ensuring text and interactive elements are distinguishable.
Usage:
Install the browser extension.
Open the webpage you want to test.
Utilize Stark to check color contrast and design elements.
SkipTo

Features:
Facilitates navigation by providing shortcuts to landmarks and headings.
Allows users to jump directly to specific sections of a webpage.
Enhances efficiency for keyboard and screen reader users.
Supports testing the logical structure of a page.
Usage:
Install the browser extension.
Use SkipTo to quickly navigate to landmarks and headings.
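Alongside the browser extensions above, the same axe rules can run inside automated tests. A minimal sketch, assuming the `@axe-core/playwright` package (not something this document mandates) and a hypothetical test URL, that fails if any WCAG A/AA violation is detected on the page:

```ts
import { test, expect } from '@playwright/test';
import AxeBuilder from '@axe-core/playwright';

test('test page has no detectable WCAG A/AA violations', async ({ page }) => {
  await page.goto('https://example.test/bagel-test-page'); // hypothetical URL

  // Run the axe-core rule set scoped to WCAG A and AA success criteria.
  const results = await new AxeBuilder({ page })
    .withTags(['wcag2a', 'wcag2aa', 'wcag21aa'])
    .analyze();

  // Automated checks catch only a subset of issues (see the limitations
  // section below), so this complements rather than replaces manual testing.
  expect(results.violations).toEqual([]);
});
```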

Found a tool not listed here? Please let us know! We are always open to exploring new options that we can integrate into our testing process.

Usage tips

Integration
Integrate these tools into your development workflow for continuous testing.
Review Reports
Carefully review the reports generated by these tools to understand and address identified issues.
Combine with Manual Testing
While these tools are powerful, manual testing remains crucial for a comprehensive accessibility assessment.
Stay Updated
Periodically check for updates and new features in these tools to ensure you are using the latest capabilities.

By incorporating these tools into our accessibility testing process, we continuously catch and address potential issues, ultimately improving the overall accessibility of BAgel components.

Pros and limitations of automated accessibility tools

A wide range of tools can help you find accessibility issues, from browser plugins and website scanners to JavaScript libraries that analyze your codebase. These tools have strengths and weaknesses when it comes to catching accessibility errors, so you cannot rely on them alone.

Pros:

1. Easy to Set Up:
These tools typically have straightforward installation processes, making them accessible to developers with varying levels of experience.
2. Easy to Use:
User-friendly interfaces and integrations with developer tools make these tools accessible to a broad audience, facilitating quick adoption.
3. Known Reputable Sources:
Tools developed by reputable organizations and accessibility experts are more likely to align with industry standards and guidelines.
4. Accuracy of Feedback:
Automated tools can accurately identify and flag common accessibility issues, providing developers with actionable feedback to address potential problems.
5. Clarity of Feedback:
Many tools offer detailed reports with explanations and suggestions for fixing identified issues, aiding developers in understanding and resolving problems effectively.
6. Cost:
Many semi-automated tools are open-source or have free versions available, making them cost-effective for developers and organizations with budget constraints.

Limitations:

1. Do Not Catch All Accessibility Issues:
Automated tools may not catch certain nuanced or context-specific accessibility issues. For example, they may not identify issues related to target size, missing skip-to-content links, or situations where focus is removed from interactive elements.
2. Contextual Limitations:
Automated tools may lack the ability to understand the full context of a webpage, potentially leading to false positives or negatives.
3. Dependence on Code Patterns:
Some tools rely on specific code patterns to identify accessibility issues, which may result in false positives or overlook issues not covered by those patterns.
4. Limited to What's Detectable Programmatically:
Automated tools are limited to identifying issues that can be programmatically detected. They may not catch issues that require a more nuanced understanding of user experience.
5. Dynamic Content Challenges:
Tools may struggle with dynamic content generated by JavaScript, leading to potential oversights in the evaluation of accessibility.
6. Browser-Specific Limitations:
Some tools are browser-specific and may not identify issues that manifest only on certain browsers or assistive technologies.
7. No Substitute for Human Judgment:
Automated tools cannot replace the value of human judgment and expertise, particularly in assessing the overall user experience and considering the specific needs of diverse user groups.

In summary, while semi-automated tools offer valuable assistance in catching and addressing accessibility issues, they should be viewed as one component of a comprehensive accessibility testing strategy. Manual testing, user testing, and ongoing education are essential components to ensure a truly accessible user experience.


