Are bad graphical descriptions better than no graphical descriptions to someone with vision loss?
More than 125,000 websites worldwide use overlays rather than fixing their accessibility bugs, and the AI-generated alt text those overlays rely on is not 100% reliable.
[Image: customer service stoplight chart with a red sad face, a yellow neutral face, and a green happy face]
This is the first part of a two-part article. The second part, “Are bad captions better than no captions,” can be read here.
In previous articles on graphical descriptions (known as alt text), I described the importance of accurate, succinct alt text to people with vision loss.
I was recently sent a link to a public web page from one of the overlay companies that commented favorably on automatic, AI-based alt-text generation. I will not name this company, as I am hearing second-hand that they are now having their lawyers write cease and desist letters to people who are publicly critical of them.
I will digress here to make sure my position on overlays is crystal clear.
The first of many articles I wrote on this topic was titled “Overlays Are Not the Solution to Your Accessibility Problems.”
I am also a recent signatory (along with 200+ others) to OverlayFactSheet.com. In addition to containing a list of reasons not to use overlays, OverlayFactSheet.com also contains a list of accessibility professionals advocating against the use of overlays.
If my thoughts have changed since that article was written over a year ago, it is that my perspective on overlays (aka accessibility tools, widgets, plugins, “one line of code” accessibility solutions) is even more pessimistic now than it was in January 2020. This evolution is in part because:
- 10% of accessibility lawsuits are now filed against companies that use one of these tools, including a significant lawsuit against ADP.
- Their claims of “A single line of code for 24/7 automated compliance” are overblown. I have yet to see a site that uses one of these tools pass even an automated WCAG test, which only checks about 30% of the guidelines, much less a manual test covering the other 70%. On sites using overlays, it never takes me more than 30 seconds to run into my first non-compliant component.
- Over-the-top marketing by these companies, including fake reviews and erroneous statements about consulting with assistive technology companies.
- I’ve been repeatedly asked by my LinkedIn connections why I allow overlay ads to be displayed when they scroll past something I have posted. I’m not sure who is doing it (LinkedIn or the overlay company), but I know it seriously ticks me off. People assume an implicit link between advertising and the presumed blessing of the person whose media the advertising is placed on. Advertisers have pulled out of shows hosted by controversial figures like Laura Ingraham because they don’t want to be associated with her statements. Unfortunately, I can’t do the equivalent on LinkedIn.
- Many highly regarded accessibility professionals, including Claudio Luís Vera, Karl Groves, Timothy Springer, Eric Eggert, Lainey Feingold, Adrian Roselli, Meryl K. Evans, and Steve Faulkner, have been public about their belief that no overlay can adequately substitute for fixing accessibility problems in the code. A quickly expanding list of 200+ accessibility professionals who are publicly against the use of overlays is located here, about two-thirds of the way down the page.
AI-based identification of informative images is flawed
One of the issues all of us frequently cite when explaining why overlays do not provide an equal experience is that AI-based identification of images can be highly flawed.
So, back to the page put out by the overlay company. The page was designed to refute many of the common points made by me and the authors listed above, and it contains the following paragraph:
The current reality is that most websites don’t have alt text at all or have alt text which is just the name of the file, “banner”, or “image”. There are billions upon billions of images that have these kinds of descriptions today or nothing at all. How could anyone possibly provide accurate alt text for this overwhelming number of images without automation?
The company then provided a single example in which its automatically generated alt text was correct. A thorough analysis of the website at the center of the EyeBobs lawsuit, which uses a well-known overlay, counters that argument:
- The tool didn’t detect images using SVG format (and in fact, SVG is excluded in the overlay company’s terms of service), so no alt text was generated.
- The tool didn’t detect images sourced from third-party servers (again, excluded from the terms of service), so no alt text was generated.
- There were many places where the tool provided inaccurate descriptions.
AI-based identification of skeuomorphic images is problematic
Some images, largely those that control important navigation functions, are skeuomorphic: the image is supposed to convey information above and beyond what is literally pictured. For example, an image of a crescent moon might be described as a crescent, but as a skeuomorphic image it represents device sleep settings.
How would AI-generated alt text describe an internet connection icon, a Bluetooth icon, or the brightness and volume levels in slider controls? Even if the AI-generated alt text is perfect (“filled circle inside a larger unfilled circle”), it doesn’t describe what the image does, just what the image is.
What the skeuomorphic image does is what people with vision loss need to know. They generally aren’t interested in what a skeuomorphic image looks like.
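Here is a minimal sketch in TypeScript, using a hypothetical icon path and wording of my own, of the difference between describing what a skeuomorphic image is and what it does:

```typescript
// Hypothetical sleep-settings icon; the file path and wording are assumptions.
const sleepIcon = document.createElement("img");
sleepIcon.src = "/icons/crescent-moon.svg";

// A literal, AI-style description of the picture: technically accurate,
// but useless to someone deciding whether to activate the control.
// sleepIcon.alt = "Filled crescent shape on a dark background";

// What a screen reader user actually needs: the function the icon represents.
sleepIcon.alt = "Sleep settings";

document.body.appendChild(sleepIcon);
```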
AI-based identification of decorative images is currently non-existent
Overlays also do not reliably identify whether an image has been correctly or incorrectly marked as decorative. This requirement derives from the sixth situation in WCAG success criterion 1.1.1 Non-text Content, which specifically calls out the correct treatment of decorative images as follows:
Decoration, Formatting, Invisible: If non-text content is
- pure decoration,
- used only for visual formatting, or
- not presented to users,
then it is implemented in a way that it can be ignored by assistive technology.
Flagging images as decorative permits assistive technology such as screen readers to skip over them as if they did not exist, helping the user get through the content more quickly while still having an equal experience. This segment of WCAG 1.1.1 matters because it takes assistive technology users three to five times as long to get through content, complete forms, and so on, compared to other users with the same level of familiarity with the website.
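Here is a minimal sketch, again in TypeScript with a hypothetical image path, of how a correctly marked decorative image lets assistive technology skip it:

```typescript
// Hypothetical decorative divider graphic; the path is an assumption.
const flourish = document.createElement("img");
flourish.src = "/images/section-divider.png";

// An explicitly empty alt attribute (alt="") tells screen readers to skip
// the image entirely; omitting the attribute altogether does not.
flourish.alt = "";

// Redundant but harmless reinforcement for older assistive technology.
flourish.setAttribute("role", "presentation");

document.body.appendChild(flourish);
```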
Nowhere in the autogenerated alt text is it ever indicated that the description was, in fact, autogenerated. People who are blind have no way of knowing that there is a chance information is being misrepresented or left out. Adding that disclosure would make pages take longer to announce, but it is also the only way to notify blind users that the descriptions were autogenerated and might be wrong.
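As a sketch of what such a disclosure could look like (the prefix wording and the example image are my own assumptions, not any overlay’s actual behavior):

```typescript
// Prefix a machine-generated description so the listener knows it may be wrong.
function labelAutoGeneratedAlt(description: string): string {
  return `Automatically generated description: ${description}`;
}

const photo = document.createElement("img");
photo.src = "/uploads/team-photo.jpg"; // hypothetical image
photo.alt = labelAutoGeneratedAlt("four people standing in front of a whiteboard");
document.body.appendChild(photo);
```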
Back to the original question
To answer the question posed by the overlay company:
How could anyone possibly provide accurate alt text for this overwhelming number of images without automation?
It’s not hard, but you do actually have to do the work. There is no solution you can buy that will give you perfect alt text all the time. Here are the steps that need to be taken to do it right:
1. Review your site/product and fix your existing alt text issues.
a) For bonus points, hire people with non-visual disabilities to fix the existing site for you.
2. Take the following steps to make sure your site doesn’t backslide.
a) Notify all your content managers, including third parties, that alt text must be provided for every graphic addition or modification. Reject new graphics that come in without corresponding alt text. Whether an image is decorative can only be determined in the context of its use.
b) Don’t forget that if you offer your site in multiple languages, you need to get the alt text right for the non-English versions as well.
c) Modify your content management system (CMS) to require that alt text be entered or the decorative box be checked, along with a clear explanation of what “decorative” really means (a sketch of such a check follows this list).
and the three most important steps:
d) Praise the people and vendors who do accessibility well.
e) Retrain the people and vendors who don’t do accessibility well.
f) Factor accessibility into vendor and personnel performance assessments.
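Here is a minimal TypeScript sketch of the kind of CMS-side check described in step 2(c); the field names are assumptions, not any particular CMS’s API:

```typescript
// A graphic submission must either carry alt text or be explicitly flagged
// as decorative by the person who knows the context it is used in.
interface GraphicSubmission {
  fileName: string;
  altText?: string;          // author-supplied description
  markedDecorative: boolean; // author confirms the image conveys no information
}

function validateGraphic(submission: GraphicSubmission): string[] {
  const errors: string[] = [];
  const hasAlt = (submission.altText ?? "").trim().length > 0;

  if (!hasAlt && !submission.markedDecorative) {
    errors.push(
      `${submission.fileName}: provide alt text, or confirm the image is purely decorative.`
    );
  }
  if (hasAlt && submission.markedDecorative) {
    errors.push(
      `${submission.fileName}: an image cannot be both decorative and described; pick one.`
    );
  }
  return errors;
}

// Example: this submission would be rejected.
console.log(validateGraphic({ fileName: "banner.png", markedDecorative: false }));
```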
When you integrate accessibility into the DNA of your organization, it is not a monumental effort. It is just the way you do things and doesn’t take substantially longer. Retrofitting things done incorrectly is what is inconvenient and expensive.
What do the users say?
In the end, the only opinions that matter are the users’. Not those of the people trying to sell you “solutions,” and not even mine.
If auto-generated descriptions are “good enough” for the content owner then I assume that the image is not that important.
[T]he only objective of an overlay approach to accessibility is not to provide the best user experience, it is to check accessibility off the list.
Isn’t it ironic that overlays have become a target for lawsuits, which is exactly what the content owner was trying to avoid?
Sometimes it is not about matching with image but the message that needs to be conveyed. That needs manual evaluation to be accurate.
AI is unable to differentiate between decorative and informative images. So AI tries to give descriptions for decorative images as well. Screen reader users must ignore useless auto generated descriptions for decorative images, especially on Facebook.
Some blind users have gone so far as to post instructions for blocking overlay tools from ever coming up in the browser by blocking the overlay vendors’ servers.
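As a purely illustrative example of that approach (the domain below is a placeholder, not any real overlay vendor’s server), a hosts-file entry can keep the overlay script from ever loading:

```
# Hypothetical hosts-file entry: point the overlay provider's script host
# at a non-routable address so the widget never loads.
0.0.0.0 widget.overlay-vendor.example
```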
If the users you intend to benefit find your product so ineffective and annoying that they block it, what does that say about your product? That clearly answers the question, “Is bad alt text better than no alt text?”