Manage a tenancy: Weeknotes 24/04/2019

Manage a tenancy is a service we’re developing that enables housing officers to carry out their visit and check processes in residents’ homes using a mobile solution.

This week we held the Manage a tenancy show and tell at our Stamford Hill Neighbourhood Housing Office. Here’s the update from that session.

We’ve paused any further development work on the Tenancy and Household Check and Home Check processes. Most problems now being seen with these processes relate to tenants’ data and will be investigated as soon as resources are available.

The Introductory Tenancy Visit process can be released as soon as issues with the ‘scheduler’ for automatically adding the process to the housing officer’s work tray are resolved. Issues with the Review and submit page were resolved.

Introductory tenancy visit start page

Work continues on the ETRA (Enhanced Tenants and Residents Association) process, which records issues raised at TRA meetings and sends them to the relevant service areas for response. Our apprentice developer is doing great work developing the UI in OutSystems and linking it up with APIs. Some issues around saving assets remain to be resolved.

We’ve also deployed some improvements to the ‘hub’, which were immediately noticed (and liked) by a housing officer at the show and tell.

Raising awareness of how well designed technology can help overcome barriers for people with impairments

As part of Hackney Council’s User Research Week, we set up a mini ‘empathy lab’ in the Hackney Service Centre. Our aim was to raise awareness of how certain visual and physical impairments can impact people’s lives. We also wanted to demonstrate how technology built with good accessibility standards can help break down barriers for people with impairments, whether permanent or temporary.

What we did

We had two computer stations: one focussed on simulating partial sight and reduced manual dexterity, the other on severe visual impairment. Staff members passing by were encouraged to give one or both a try. We also promoted the event to colleagues across the Council by email and with Google+ community posts.

We displayed three personas, adapted from GDS digital inclusion and accessibility user profiles, to help illustrate how the types of impairments we were simulating affect people using technology in real life situations.

Partially sighted and reduced dexterity

At the partially sighted and reduced manual dexterity station, people were able to experience what it might be like to find using a mouse difficult and have to rely on a keyboard alone. The persona ‘Christopher’ prefers to use a keyboard as he has arthritis in his hands.

To simulate this impairment, buttons were taped tightly over the main knuckles on the back of the fingers and latex gloves were worn. This restricted the ability to bend the fingers and reduced sensation in the fingertips.

At this station, participants were asked to fill in an online form using only the keyboard, as well as its paper equivalent. Once they had become familiar with keyboard-only navigation, they were able to select radio buttons, checkboxes and drop-downs and enter text, despite the reduced dexterity in their hands. Because Google Forms is coded by default to meet good accessibility standards, participants found the online form easier than the paper version, where writing with a pen was difficult.

In addition, to simulate being partially sighted, safety glasses with a light smearing of Vaseline could also be worn. Because of her glaucoma, the persona ‘Claudia’ needs to be able to increase text size to read what’s on a screen. We used built-in functionality in the Chrome browser to do this.

When using the Vaseline-smeared safety glasses, participants were able to experience how a website that allows text to be resized can help people who are partially sighted to interact with an online service. This benefits people with impairments such as cataracts, or more generally the deterioration in vision associated with ageing, something likely to affect everyone.

After trying out the activities, a participant commented: “Was really interesting and gave me an appreciation of how difficult it can be for some people accessing digital services. Everyone should go and see what it’s like.”

Severe visual impairment

Our second station covered severe visual impairment. The persona ‘Ashleigh’ uses a screen reader, and for this activity we used the Safari browser and VoiceOver, the Mac’s built-in screen reader. Participants were able to experience how people can have a web page read out to them. Again, the better a web page meets accessibility standards, the better the screen reader is able to make sense of it.

Some basic accessibility considerations can make a big difference on a simple web page. For example, having a ‘skip to content’ link enables the screen reader user to avoid having to navigate through repetitive navigation links in headers. Correctly nesting heading styles on a web page also helps screen reader users to understand the structure of a page.
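To make the heading-nesting point concrete, here is a minimal sketch (not part of our service, and the function names are illustrative) of how correct nesting could be checked automatically: it parses a page with Python’s standard-library HTML parser and flags any place where a heading level is skipped (for example an h1 followed directly by an h3), which is the kind of jump that confuses screen reader users navigating by headings.

```python
from html.parser import HTMLParser


class HeadingChecker(HTMLParser):
    """Collects heading tags and flags skipped levels (e.g. h1 -> h3)."""

    def __init__(self):
        super().__init__()
        self.violations = []   # list of (previous_level, current_level)
        self._last_level = 0   # 0 means no heading seen yet

    def handle_starttag(self, tag, attrs):
        # Only look at h1..h6 tags.
        if len(tag) == 2 and tag[0] == "h" and tag[1] in "123456":
            level = int(tag[1])
            # A jump of more than one level down the hierarchy is a skip.
            if self._last_level and level > self._last_level + 1:
                self.violations.append((self._last_level, level))
            self._last_level = level


def heading_violations(html: str):
    """Return a list of (from_level, to_level) pairs where nesting skips."""
    checker = HeadingChecker()
    checker.feed(html)
    return checker.violations
```

For example, `heading_violations("<h1>Title</h1><h2>Section</h2>")` returns an empty list, while `heading_violations("<h1>Title</h1><h3>Sub</h3>")` reports the skipped level as `[(1, 3)]`.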

What next?

We’re considering how we can take our empathy lab forward and find a location in the office where we can give it a more permanent space. This will give the team a chance to understand what it is like for people with accessibility needs when they use the new services we are building, helping us to become the most user-centred team in the country.