The Real Accessibility Toolkit (RAT) is a plug-and-play Unity package of basic accessibility features, designed to be easy to use and to apply to a project with minimal setup. It was a proof of concept made in about four months alongside other projects, in conjunction with a thesis on digital accessibility techniques in games. Accessibility is an interest of mine, and at the time I was thinking a lot about the role of middleware in the game production pipeline. Through this project I hoped to learn more about techniques that are actually effective in aiding disabled players, and to investigate which of them could be implemented as a generic middleware package for any kind of game.
The RAT consists of settings objects, which are adjusted to suit the needs of the project and the accessibility features it wants to use. A controller object spawns itself when first needed and keeps track of controller modules for the vision, hearing, and motor function features.
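To make that concrete, here's a rough sketch of what such an on-demand controller could look like; the class and member names are placeholders for illustration, not RAT's actual API.

```csharp
using UnityEngine;

// Sketch of an on-demand controller that spawns itself the first time it is
// accessed and holds one module per accessibility category (illustrative names).
public class AccessibilityController : MonoBehaviour
{
    static AccessibilityController instance;

    // Accessing the controller for the first time creates its GameObject.
    public static AccessibilityController Instance
    {
        get
        {
            if (instance == null)
            {
                var go = new GameObject("RAT Controller");
                instance = go.AddComponent<AccessibilityController>();
                DontDestroyOnLoad(go);
            }
            return instance;
        }
    }

    // One module per accessibility category, created alongside the controller.
    public VisionModule Vision { get; private set; }
    public HearingModule Hearing { get; private set; }
    public MotorModule Motor { get; private set; }

    void Awake()
    {
        Vision = gameObject.AddComponent<VisionModule>();
        Hearing = gameObject.AddComponent<HearingModule>();
        Motor = gameObject.AddComponent<MotorModule>();
    }
}

// Placeholder module components for the sketch.
public class VisionModule : MonoBehaviour { }
public class HearingModule : MonoBehaviour { }
public class MotorModule : MonoBehaviour { }
```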
RAT's settings live in a ScriptableObject resource that can hold sub-resources for each accessibility category's settings; enabling vision tools, for example, creates a vision settings resource. There can be multiple of these top-level settings resources, with just one enabled at a time dictating which settings are applied, so developers can keep several variants of settings for use during a game. All the settings resources expose events that scripts can subscribe to in order to react to settings being changed at runtime.
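As an illustration of that pattern, a minimal sketch of a top-level settings asset with a sub-resource and a change event might look like the following; the names and fields are assumptions, not the package's real ones.

```csharp
using System;
using UnityEngine;

// Hypothetical sketch of a top-level settings asset holding optional
// per-category sub-resources and a change event.
[CreateAssetMenu(menuName = "RAT/Accessibility Settings")]
public class AccessibilitySettings : ScriptableObject
{
    public bool visionToolsEnabled;
    public VisionSettings visionSettings;   // sub-resource created when vision tools are enabled

    // Scripts subscribe to this to react to settings changes at runtime.
    public event Action SettingsChanged;

    public void ApplyChanges()
    {
        SettingsChanged?.Invoke();
    }
}

// Example sub-resource for the vision category.
public class VisionSettings : ScriptableObject
{
    public bool colourblindShaderEnabled;
    public bool lowContrastShaderEnabled;
    public bool outlineShaderEnabled;
}
```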
RAT's vision tools consist of three shaders and their corresponding identifier components, addressing the needs of certain visual impairments. All three are screenspace shaders that attach themselves to the main camera.
The colourblind shader applies a basic colour correction effect to make colours more distinguishable, using a LUT to shift colours according to the type of colourblindness being compensated for.
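For context, in Unity's built-in render pipeline a screenspace effect like this is typically applied with an OnRenderImage blit on the camera; the sketch below shows that general approach with an assumed component and shader property name, rather than RAT's actual code.

```csharp
using UnityEngine;

// Rough sketch of a camera-attached screenspace effect that feeds a
// colour-correction LUT to a shader via Graphics.Blit (built-in pipeline).
[RequireComponent(typeof(Camera))]
public class ColourblindEffect : MonoBehaviour
{
    public Material correctionMaterial;   // material using the colour correction shader
    public Texture colourLut;             // LUT chosen for the selected type of colourblindness

    void OnRenderImage(RenderTexture source, RenderTexture destination)
    {
        if (correctionMaterial == null || colourLut == null)
        {
            Graphics.Blit(source, destination);   // pass the image through unchanged
            return;
        }
        correctionMaterial.SetTexture("_ColourLut", colourLut);   // hypothetical property name
        Graphics.Blit(source, destination, correctionMaterial);
    }
}
```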
The low contrast shader is a combination of a greyscale effect and edge detection, along with the ability to highlight certain objects in a specific colour. This is achieved with a LowContrastHighlight component, which finds all the renderers attached to its GameObject and creates a list of their sub-mesh indexes. The highlight then registers itself with the vision controller module so it can be drawn during the low contrast effect pass. The highlight meshes are drawn in black to a render texture using a command buffer, with everything else drawn in white; this render texture is then used as a stencil in the low contrast shader to exempt those meshes from the effect, so they are drawn normally with a colour modifier applied. Material property blocks are used to adjust the instance colour of materials without duplicating them, which improves performance and avoids memory leaks or unnecessary cleanup. Highlights are defined in the vision settings using identifiers, each with a name and colour to use, plus a layer and tag to match against objects. The component can be added to objects manually, but there's also an option in the settings to auto-enable highlights, which adds a highlight on startup to any object in the scene that matches an identifier.
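A condensed sketch of how a stencil pass like this can be assembled, using a command buffer and per-instance colours through a MaterialPropertyBlock, might look like the following; the class, method, and property names are illustrative, not the real implementation.

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of drawing registered highlight renderers in black to a stencil
// texture (everything else cleared to white), with per-instance tinting
// via MaterialPropertyBlock so no materials are duplicated.
public class HighlightStencilPass
{
    readonly List<Renderer> highlightRenderers = new List<Renderer>();
    readonly MaterialPropertyBlock propertyBlock = new MaterialPropertyBlock();

    public void Register(Renderer renderer, Color highlightColour)
    {
        highlightRenderers.Add(renderer);

        // Tint this instance without creating a material copy.
        renderer.GetPropertyBlock(propertyBlock);
        propertyBlock.SetColor("_Color", highlightColour);
        renderer.SetPropertyBlock(propertyBlock);
    }

    public void FillCommandBuffer(CommandBuffer cmd, RenderTexture stencilTarget, Material blackMaterial)
    {
        cmd.SetRenderTarget(stencilTarget);
        cmd.ClearRenderTarget(true, true, Color.white);   // non-highlighted pixels stay white

        foreach (var renderer in highlightRenderers)
        {
            // Draw every sub-mesh of the highlighted renderer in black.
            for (int subMesh = 0; subMesh < renderer.sharedMaterials.Length; subMesh++)
                cmd.DrawRenderer(renderer, blackMaterial, subMesh, 0);
        }
    }
}
```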
The outline shader works much like the highlighting effect. The Outline component can be added to objects manually, or outlines can be auto-enabled on startup via identifiers; either way, the component registers itself with the vision controller module. The effect uses a greyscale stencil that maps each shade to a desired outline colour, and places outlines where one shade meets another by checking surrounding pixels for any difference. The stencil is drawn using a list of flat-coloured materials, from white down through varying shades of grey, one for each colour identifier. The meshes that need outlining are drawn with their corresponding grey material using command buffers onto a render texture, which is then used as the stencil/comparison image in the outline shader. This allows many outline colours to be used at once.
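A similar sketch for the outline stencil, assigning one shade of grey per colour identifier and drawing outlined renderers with it, could look something like this (again with placeholder names).

```csharp
using System.Collections.Generic;
using UnityEngine;
using UnityEngine.Rendering;

// Sketch of an outline stencil pass: each outline colour identifier gets its
// own shade of grey, and outlined renderers are drawn with that shade so the
// outline shader can map grey values back to the desired outline colours.
public class OutlineStencilPass
{
    readonly List<Material> greyMaterials = new List<Material>();

    // Build one flat-grey material per outline colour identifier,
    // spread evenly from white down through darker greys.
    public void BuildGreyMaterials(int identifierCount, Shader unlitShader)
    {
        greyMaterials.Clear();
        for (int i = 0; i < identifierCount; i++)
        {
            float grey = 1f - i / (float)identifierCount;
            var material = new Material(unlitShader);
            material.color = new Color(grey, grey, grey);
            greyMaterials.Add(material);
        }
    }

    // Draw a renderer into the stencil with the grey assigned to its identifier.
    public void Draw(CommandBuffer cmd, Renderer renderer, int identifierIndex)
    {
        for (int subMesh = 0; subMesh < renderer.sharedMaterials.Length; subMesh++)
            cmd.DrawRenderer(renderer, greyMaterials[identifierIndex], subMesh, 0);
    }
}
```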
RAT's hearing tools consist of display functions for both subtitles and captions. The hearing module handles automatic setup of the UI for these elements in the scene, and only needs to be called to show the corresponding subtitle or caption.
Subtitles are shown in a box at the bottom of the screen, with an optional animation for the text and the box. Colours can be defined for the speakers associated with subtitles, along with a delimiter for speaker names, so that if dialogue strings include the speaker's name, it can optionally be stripped out without editing the source text.
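To give a concrete sense of the delimiter option, a minimal sketch of that kind of speaker-stripping step could look like this; the helper and its signature are illustrative, not RAT's API.

```csharp
// Sketch of stripping an embedded speaker name from a dialogue string
// using a configurable delimiter, e.g. "Alex: Hello there" with ':'.
public static class SubtitleParser
{
    public static string StripSpeaker(string line, char delimiter, out string speaker)
    {
        int index = line.IndexOf(delimiter);
        if (index <= 0)
        {
            speaker = null;
            return line;   // no speaker prefix found, show the line as-is
        }
        speaker = line.Substring(0, index).Trim();
        return line.Substring(index + 1).TrimStart();
    }
}
```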
Captions describe sound effects being played in the game, and can show the direction of audio cues to help players locate them. Captions are defined in the hearing settings by a string ID, the text to display, and a priority, and are called using their ID. They stack on top of each other in a corner of the screen, with the most recent one at the bottom. Priority sorts captions by importance, so less important captions sit higher up and are replaced sooner by more important ones. Once the player's (or audio listener's) transform has been set as the centre, directional captions show an arrow pointing toward the source of the sound relative to it.
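For the directional part, the arrow rotation boils down to the signed angle between the listener's forward direction and the direction to the sound; a small sketch of that calculation, with an assumed helper name, is below.

```csharp
using UnityEngine;

// Sketch of pointing a directional caption arrow at a sound source
// relative to the player / audio listener transform.
public static class CaptionDirection
{
    // Returns the angle in degrees (clockwise) from the listener's forward
    // direction to the sound source, which a UI arrow can rotate to match.
    public static float AngleToSource(Transform listener, Vector3 soundPosition)
    {
        Vector3 toSource = soundPosition - listener.position;
        toSource.y = 0f;                      // only the horizontal direction matters
        Vector3 forward = listener.forward;
        forward.y = 0f;
        return Vector3.SignedAngle(forward, toSource, Vector3.up);
    }
}
```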
RAT's motor tools are very simple and provide just a menu scanning function. This uses Unity's built-in UI navigation and automatically steps through selectable UI elements once the starting selectable of the menu has been set. The scanner waits for the defined interval before checking nearby selectables in a clockwise fashion. If the selected element sits inside a scroll view, the scanner also scrolls it so the selected element stays visible.
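A rough sketch of that kind of scanner, built on the navigation queries Unity's Selectable class exposes, might look like this; the clockwise probing order and the component's fields are assumptions for illustration.

```csharp
using System.Collections;
using UnityEngine;
using UnityEngine.EventSystems;
using UnityEngine.UI;

// Sketch of a menu scanner that walks Unity's built-in UI navigation
// links at a fixed interval, starting from a chosen Selectable.
public class MenuScanner : MonoBehaviour
{
    public Selectable startSelectable;   // the first element of the menu
    public float scanInterval = 1.5f;    // seconds between moves

    IEnumerator Start()
    {
        Selectable current = startSelectable;
        while (current != null)
        {
            EventSystem.current.SetSelectedGameObject(current.gameObject);
            yield return new WaitForSeconds(scanInterval);

            // Check neighbouring selectables in a clockwise order,
            // taking the first navigation link that exists.
            Selectable next = current.FindSelectableOnRight();
            if (next == null) next = current.FindSelectableOnDown();
            if (next == null) next = current.FindSelectableOnLeft();
            if (next == null) next = current.FindSelectableOnUp();

            current = next != null ? next : startSelectable;   // wrap back to the start
        }
    }
}
```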
This project taught me a lot about shaders and screen effects in Unity, making use of different shader features and running into ordering issues between effects that had to be fixed. Using existing shader packages, modifying them, and creating my own taught me more about Unity's command buffers, render textures, and stencilling, and about optimisations when it came to materials.
Beyond the purely technical lessons, this project was an exercise in making a package that is simple and straightforward to use, since the goal was something developers could feasibly use out of the box. I wanted the controllers and settings for the RAT to be unobtrusive to set up and not require any manual cleanup from a developer, which is why they are created only when needed and set up UI elements and the like automatically. The settings resources taught me a few things about Unity's ScriptableObject system, including some quirks and bugs with its sub-asset feature that I had to work around, and how to use the OnValidate callback to handle automatic creation and destruction of the sub-settings. Loading these resources also caused issues, as the engine doesn't load them unless something references them, which forced me to make a dummy object just for loading them when needed.
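For anyone curious what the OnValidate approach looks like in outline, here's a simplified, editor-only sketch of creating and removing a sub-asset when a category is toggled; it glosses over the quirks mentioned above and uses placeholder names rather than the package's real ones.

```csharp
using UnityEngine;
#if UNITY_EDITOR
using UnityEditor;
#endif

// Sketch of using OnValidate to manage a nested sub-asset when a
// category is enabled or disabled in the inspector (editor-only).
public class AccessibilitySettingsAsset : ScriptableObject
{
    public bool visionToolsEnabled;
    public VisionSettingsAsset visionSettings;

#if UNITY_EDITOR
    void OnValidate()
    {
        if (visionToolsEnabled && visionSettings == null)
        {
            // Create the sub-settings and nest them inside this asset.
            visionSettings = CreateInstance<VisionSettingsAsset>();
            visionSettings.name = "Vision Settings";
            AssetDatabase.AddObjectToAsset(visionSettings, this);
            AssetDatabase.SaveAssets();
        }
        else if (!visionToolsEnabled && visionSettings != null)
        {
            // Remove and destroy the sub-settings when the category is disabled.
            AssetDatabase.RemoveObjectFromAsset(visionSettings);
            DestroyImmediate(visionSettings, true);
            visionSettings = null;
            AssetDatabase.SaveAssets();
        }
    }
#endif
}

// Placeholder sub-settings asset for the sketch.
public class VisionSettingsAsset : ScriptableObject { }
```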
If I were to do this project again, I'd look into Unity's alternative asset-loading techniques, such as Addressables, to avoid relying on the Resources folder and other workarounds that proved necessary. I'd also want to test the package with a wider variety of scenes and types of games to make it more robust and flexible for developers. As it stands, I don't know how smoothly it handles scene changes, and features like the menu scanning would need much more testing and could use more options, such as defining the selectable search order. The shaders could also use some improvements and optimisations, like batching, as I imagine they may struggle in more complex scenes.
Overall this project was a unique and educational experience. While I wish I could have added more features, a lot of accessibility techniques are specific to the game in question, so I'm happy with what I was able to make for this proof of concept and how it turned out.