Housing Register Service Assessment, 7 October 2021
This document summarises the outcome of the Service Assessment undertaken on the Hackney Housing Register project in October 2021. The purpose of the service assessment was not only to determine whether the design and implementation of the Housing Register service met the GDS Service Standard, but also to provide specific feedback to the project team and to enable wider lessons learned to be shared with the rest of HackIT.
This summary does not include the evidence the team presented to the assessors; that evidence can be found in the Trello board used to conduct the assessment.
This service assessment was carried out remotely, and we are grateful for the work done by both the project team and assessment panel in advance of the assessment meeting.
| Panel | Team members |
| --- | --- |
| Cate McLaurin: Panel Lead | Maya Knight: FG Delivery Manager |
| Darren McCormac: Delivery Management assessor | Britt Wood: FG Principal Consultant |
| Richard Smith: User Researcher / Service Designer assessor | Hannah Mills: FG |
| Dan Harper-Wain: Product Management assessor (external) | Ottla Arrigoni: FG |
| David Dean: Technical assessor | Chris Wensley: FG |
| | Thomas Morris: FG |
| | Darren Aitcheson: HackIT Delivery Manager |
Related documents / links
Service vision
The vision for the new housing register is for people wanting to join the register to understand the full range of options available to them and their likelihood of being housed. It is simple to join for people who qualify, minimises failure demand, is easy to administer, is sufficiently open, and gives all stakeholders confidence in the fairness of the process. The underlying applications are secure, reliable and adaptable to the changing needs of users.
Summary of learning
- The service and the team have generally met the standards required, with some particularly good practice that could be useful for other services to learn from or adopt.
- The assessors’ main concern is what will happen to the service once the development team rolls off and hands it over to Hackney to run and improve; at the time of the assessment it was unclear whether there was an agreed plan for this.
- The team has done a good job of user research and has not just rushed into developing the system before fully understanding the needs of service users. However, it is important that an accessibility audit is conducted as soon as possible.
- Whilst the team generally worked well in an iterative manner, and involved LBH staff by embedding them within the team, it was quite late in the timeline before the first usable version of the service was delivered. It appears that LBH placed a fixed “go-live” date of October and a relatively fixed scope on the work at the outset, which wasn’t particularly conducive to iterative development and Agile ways of working. This is something that may need to be addressed by LBH for future projects, products and services.
- The time allocated for the service assessment itself was not sufficient, which led to some of the later standards feeling a little rushed. A minimum of 2 hours, and preferably more, should be set aside for service assessments.
Service Standard points in detail
1. Understand users and their needs (Assessor: Richard)
Develop a deep understanding of users and the problem you’re trying to solve for them.
Look at the full context to understand what the user is trying to achieve, not just the part where they have to interact with government.
Assessor comments
- The team were able to demonstrate using a wide range of research methodologies to understand user needs and conduct usability research on the product.
- Testing potential product concepts at an early stage of the discovery research (such as adding nudge techniques to an application to sell the benefits and support of sourcing alternative private accommodation) teased out additional needs and behaviours quickly and in an innovative way.
- The team were also able to articulate where features had been iterated within the product as a result of the understanding gathered from the usability research, including the decision not to use text messaging to inform residents of their place on the waiting list – a finding that backs up previous research from other projects that considered using text messaging.
- Whilst not recorded explicitly in the research plan, the team were able to robustly articulate the work they did to make sure research was conducted in an ethical way; especially considering the potential participation risks of residents on the housing register and any perceived benefits it may bring towards their application.
Assessor score: Partially met
Recommendations
- Apart from a document detailing some demographic information about housing register applicants, there was only anecdotal detail within the materials provided and during the assessment that told the story of the resident. More is needed on their motivations, the life events that prompt them to join the register, and the varied barriers they experience, to build up empathy with the user.
- This would be extremely valuable, especially to those who aren’t as familiar with the project, such as people who might access research artefacts from the project via the User Research Library; a team member should add the project’s work there so others can learn from it.
2. Solve a whole problem for users (Assessors: Dan & Richard)
Work towards creating a service that solves one whole problem for users, collaborating across organisational boundaries where necessary.
Assessor comments
- There was good evidence to show the team took a holistic approach to scoping and designing the service. Whilst the original brief was to redesign the housing register form, the team unpicked this to explore the whole journey of accessing social housing (including alternative options), by mapping out the end-to-end service and analysing dropout/appeal data.
- It was clear how the components created by the team fit into the wider HackIT housing vision, including the focus on setting clear expectations about the wait time, and advising users to access alternatives.
- Perhaps due to the initial scope constraints, it wasn’t entirely clear how the overall service would develop over time, to ensure a coherent end-to-end experience for residents. The team articulated a number of areas they were keen to see improved, such as joining the application process up with the separate account users require to bid for properties, and keeping in touch with residents during the waiting period. However, there wasn’t a clear roadmap to indicate where these ideas might lead, or the level of resourcing they would require.
Assessor score: Partially met
Recommendations
- If it hasn’t already been explored, thought should also be given to how to maintain the quality of the register data during the waiting period, given that significant changes of circumstance are likely to happen to applicants during this time, which may change their priority or remove them from the register entirely.
- It also wasn’t clear how much ownership the team had over the service landing page (or how they worked with others in Hackney, if it is owned elsewhere). For example, the ‘Join the register’ link on the orientation page appears to be broken at present. If it hasn’t already been done, the team should review the consistency and usability of these pages, in the context of the end-to-end journey of signing up to the register.
3. Provide a joined up experience across all channels (Assessors: Dan & Richard)
Work towards creating a service that meets users’ needs across all channels, including online, phone, paper and face to face.
Assessor comments
- The service has been designed to operate effectively across a range of channels. The team have prioritised ease of contact for residents who struggle with the online service, with clear phone numbers and the use of a unique reference number to manage the transition between channels.
- Staff are able to view residents’ applications, and act on their behalf (for example where residents aren’t able to use the online service at all).
- There is also strong evidence behind the decision the team took to prefer email over SMS.
- Staff have been closely involved in the development and prioritisation of the work – both through the service owner and housing register manager being embedded in the team, as well as by bringing wider staff into the decision making process.
Assessor score: Met
Recommendations
- If it hasn’t already been explored, there is an opportunity during beta to test that the hand-off between the ‘apply’ step and the ‘provide evidence’ step works effectively in the online channel. These two steps currently appear as separate services, with an email linking residents to the second step, and could lead to drop-out.
- Conduct additional usability research with agents whilst they are supporting residents through the system to check the tool meets their needs in a live environment.
4. Make the service simple to use (Assessors: Dan & Richard)
Build a service that’s simple, intuitive and comprehensible. And test it with users to make sure it works for them.
Assessor comments
- The team conducted a significant amount of usability testing for the MVP, particularly given the tight project timescales. The team said they tested with a wide range of users and across tablets, mobiles and desktop devices.
- They were able to articulate improvements that were made throughout testing to help users succeed first time, as well as areas that users still struggled with, such as understanding the ‘task list’ pattern and providing information about household composition. Whilst these two issues are yet to be resolved, the team have clear plans in place to address them.
- The service also has a clear mechanism on each page for residents to seek help if they get stuck, and analytics will be in place to monitor any areas of difficulty they encounter.
- The team struck a good balance between usability and counter-fraud measures, with regard to the risk of residents submitting multiple applications to ‘game’ the service. Rather than barring repeat applications (which could prevent users who had a genuine change of circumstances from accessing the service), they opted to solve these issues through a back-end application matching capability; a sketch of this idea follows this list.
- On the staff side, it was clear that the team had a good understanding of the needs of officers, particularly to be able to understand the status of a case at a glance and the available evidence showed these needs were met.
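To illustrate the back-end application matching capability mentioned above, here is a minimal TypeScript sketch. The field names, matching rules and function names are assumptions for illustration only, not the team’s actual implementation; the point is that candidate duplicates are surfaced for officer review rather than repeat applications being blocked outright.

```typescript
// Hypothetical sketch of back-end application matching: flag new
// applications that appear to duplicate an existing one, rather than
// rejecting repeat submissions automatically.

interface Application {
  reference: string;
  nationalInsuranceNumber?: string;
  dateOfBirth: string; // ISO date, e.g. "1985-03-02"
  surname: string;
  postcode: string;
}

// Normalise free-text fields so trivial differences don't defeat matching.
const norm = (s: string) => s.trim().toLowerCase().replace(/\s+/g, "");

// Assumed rule: two applications "match" if they share an NI number, or if
// surname, date of birth and postcode all agree. The real rules are unknown.
function isLikelyDuplicate(a: Application, b: Application): boolean {
  if (a.nationalInsuranceNumber && b.nationalInsuranceNumber) {
    return norm(a.nationalInsuranceNumber) === norm(b.nationalInsuranceNumber);
  }
  return (
    norm(a.surname) === norm(b.surname) &&
    a.dateOfBirth === b.dateOfBirth &&
    norm(a.postcode) === norm(b.postcode)
  );
}

// Surface candidate duplicates for an officer to review, instead of
// barring the new application outright.
function findCandidateDuplicates(
  incoming: Application,
  existing: Application[]
): Application[] {
  return existing.filter((e) => isLikelyDuplicate(incoming, e));
}
```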
Assessor score: Met
Recommendations
- It will be important to monitor carefully for signs of further usability issues now the service is live, given the limited time available during alpha.
- The team should ensure that the high level of contextual knowledge about the users of the service, and their needs, is handed over well to the HackIT team that takes on responsibility for the service, so that this context is applied to future iterations.
5. Make sure everyone can use the service (Assessors: Dan & Richard)
Provide a service that everyone can use, including people with disabilities or other legally protected characteristics. And people who don’t have access to the internet or lack the skills or confidence to use it.
Assessor comments
- The team showed strong evidence of researching and usability testing across a wide range of accessibility needs. This included various visual impairments and neurodiverse conditions, as well as users with low levels of English literacy. It was clear that the findings of this research had actively shaped the design of the service, such as reducing the complexity of the content.
- One of the most impressive examples was designing for non-binary users; here, the team’s work went beyond just content design, to re-working the service’s policy for calculating bedroom entitlement.
Assessor score: Not met
Recommendations
- Further actions are needed to ensure the service is accessible and meets the relevant legislation. An accessibility audit is yet to be conducted. Accessibility issues are therefore likely to still be present in the live service, including for users with motor impairments (who didn’t appear to be reached through usability testing).
- The findings from this audit should be incorporated into an accessibility statement, which should be published before the product is rolled out any further.
6. Have a multidisciplinary team (Assessor: Darren McC)
Put in place a multidisciplinary team that can create and operate the service in a sustainable way.
Assessor comments
- The standard is met – currently. It’s good that Hackney officers were embedded with the agency team, and that a skills gap was addressed, though belatedly.
Assessor score: Met
Recommendations
- Assessors were concerned that, once the agency has rolled off, there would be insufficient skill and capacity to maintain and iterate the product. Given that what has been delivered is an MVP, it is important that HackIT identify a new team to take this application on for maintenance and iteration. There is clearly more work to be done.
7. Use agile ways of working (Assessor: Darren McC)
Create the service using agile, iterative user-centred methods.
Assessor comments
- There are some good aspects in how the team worked, considering they were constrained on budget and timescale. It is not clear how the team arrived at the MVP scope, but it is clear that it contains all the right minimum things for an application to be made and managed.
Assessor score: Partially met
Recommendations
- The retros seem to have been treated as overhead, with the time available compressed as the build went on. Thirty minutes every two weeks does not feel long enough to dig into issues properly and make a sustained change to process. The future team should allocate at least an hour every sprint to this (bearing in mind the Scrum Guide allows up to three hours for a four-week sprint).
8. Iterate and improve frequently (Assessor: Darren McC)
Make sure you have the capacity, resources and technical flexibility to iterate and improve the service frequently.
Assessor comments
- The standard is met. There is evidence elsewhere that changes were made during the life of the build phase, based on feedback from users (e.g. emails instead of SMS), despite the challenges of a fixed project timeline.
- The team also acted on weaker signals from users, building up a bank of assumptions to test in the future, where they had insufficient evidence to make immediate design changes.
Assessor score: Met
Recommendations
- The team has also constructed a backlog for iteration once they have departed – the onus will be on HackIT to ensure that it is acted upon, and to balance this backlog against new evidence that emerges after launch.
9. Create a secure service which protects users’ privacy (Assessor: David)
Evaluate what data the service will be collecting, storing and providing.
Assessor comments
- It appears as though reasonable assumptions have been made in the selection of a user identity framework. Developers only have access to resources that they should have. The data migration was done in line with acceptable practices, and sensible filters were chosen to discard irrelevant data.
- There is concern at the lack of a shared identity. Users signing up for this service are signing up only for this service, and their credentials are useless outside this system. This could lead to a greater incidence of password loss and is a missed opportunity.
Assessor score: Partially met.
Recommendations
- Actively monitor cross-government discussion and developments relating to authentication and citizen sign-on for public digital services, to inform Hackney’s future approach.
10. Define what success looks like and publish performance data (Assessor: Dan)
Work out what success looks like for your service and identify metrics which will tell you what’s working and what can be improved, combined with user research.
Assessor comments
- The team have developed a clear set of success measures for the service, which tie right back to the vision/principles of Hackney’s housing register policy. The right tools are or will be in place to monitor these once the service is live (including manual counting, time and motion studies, user research, Google Analytics and the planned introduction of Sentry). A minimal sketch of the Sentry setup follows this list.
- It wasn’t clear from the evidence we saw whether or not the team will also actively monitor for failure scenarios. For example, one of the service goals is to deter housing register applications from residents who are likely to face extremely long wait times. It’s possible this could also deter users with genuine priority need, who would be housed within shorter timescales.
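As an illustration of what the planned Sentry introduction might look like, here is a minimal sketch assuming the @sentry/nextjs package (given the React/NextJS frontend noted under point 11); the DSN and sample rate shown are placeholders, not the team’s configuration.

```typescript
// Minimal Sentry setup sketch for a Next.js frontend.
import * as Sentry from "@sentry/nextjs";

Sentry.init({
  dsn: process.env.NEXT_PUBLIC_SENTRY_DSN, // project-specific DSN (placeholder)
  tracesSampleRate: 0.1, // sample 10% of transactions for performance data
});

// Unexpected errors can be captured explicitly, alongside Sentry's
// automatic instrumentation of unhandled exceptions:
try {
  throw new Error("example failure");
} catch (err) {
  Sentry.captureException(err);
}
```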
Assessor score: Met.
Recommendations
- If they haven’t already explored this, the team should consider some of the potential negative outcomes of the service, and ensure they are able to measure any early indications of them where possible.
- The main risk here looks to be ensuring a stable team is in place to pick up the measurement activity once the service is fully live, and to act on what they observe (see point 6, Have a multidisciplinary team).
11. Choose the right tools and technology (Assessor: David)
Choose tools and technology that let you create a high quality service in a cost effective way. Minimise the cost of changing direction in future.
Assessor comments
- The tech stack the team described is as follows:
  - .NET Core / C# for the backend
  - API Gateway
  - DynamoDB
  - React/NextJS for the frontend
  - Serverless Framework
- This is in line with Hackney’s standards. The main question the assessor had here was about the choice of database (DynamoDB), and the team explained how this decision was arrived at. A sketch of the kind of key-based lookup DynamoDB suits follows below.
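As an illustration of the key-based access pattern DynamoDB suits well, here is a minimal TypeScript sketch using the AWS SDK for JavaScript; the table and attribute names are assumptions, and the actual backend is .NET Core / C# rather than Node.

```typescript
// Minimal sketch: fetch a single housing register application by its
// reference number via a DynamoDB partition-key lookup.
import { DynamoDBClient } from "@aws-sdk/client-dynamodb";
import { DynamoDBDocumentClient, GetCommand } from "@aws-sdk/lib-dynamodb";

const client = DynamoDBDocumentClient.from(new DynamoDBClient({}));

async function getApplication(reference: string) {
  const result = await client.send(
    new GetCommand({
      TableName: "HousingRegisterApplications", // hypothetical table name
      Key: { reference },                       // partition key lookup
    })
  );
  return result.Item; // undefined if no application exists
}
```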
Assessor score: Met.
12. Make new source code open (Assessor: David)
Make all new source code open and reusable, and publish it under appropriate licences. Or if this isn’t possible, provide a convincing explanation of why this can’t be done for specific subsets of the source code.
Assessor comments
- We didn’t get to talk about this card specifically; however, the Hackney playbook was used in the development of the product and this was backed up by conversations with the team.
Assessor score: Met.
13. Use and contribute to open standards, common components and patterns (Assessor: David)
Build on open standards and common components and patterns from inside and outside government.
Assessor comments
- This section was not specifically discussed during the assessment meeting due to time constraints. However, the assessor was satisfied that the Hackney playbook was used in the development of the product and this was backed up by conversations with the team.
- In addition, the government design system was used on the frontend, bringing this service in line with most others stylistically.
Assessor score: Met.
14. Operate a reliable service (Assessor: David)
Minimise service downtime and have a plan to deal with it when it does happen.
Assessor comments
- The team ensured that observability was in place using CloudWatch and X-Ray – _de rigueur_ for AWS services and certainly in line with Hackney expectations.
- Swagger was used for API documentation and will be a great help to anyone interfacing with the service in the future.
- Tests were automated as part of the CI pipeline (unit and integration only) during the staging and production release phases.
- Infrastructure was defined in code.
Assessor score: Partially met
Recommendations
- The main recommendation is around the use of end-to-end tests. The developers had not implemented them, did not plan to, and did not see their usefulness in this project. For a high-throughput, public-facing app such as this, end-to-end tests are essential; a minimal sketch of one follows.
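As an illustration only, here is a minimal sketch of what such a test might look like, assuming Playwright as the test runner; the URL, headings and labels are placeholders rather than the service’s actual content.

```typescript
// Minimal end-to-end test sketch for the public "apply" journey.
import { test, expect } from "@playwright/test";

test("resident can start a housing register application", async ({ page }) => {
  // Placeholder URL, not the live service address.
  await page.goto("https://example.hackney.gov.uk/housing-register");
  await page.getByRole("link", { name: "Join the register" }).click();

  // The start page should load and present a visible heading.
  await expect(page.getByRole("heading", { level: 1 })).toBeVisible();

  // Fill a first field and continue, exercising the real stack end to end.
  await page.getByLabel("First name").fill("Test");
  await page.getByRole("button", { name: "Save and continue" }).click();

  await expect(page).not.toHaveURL(/error/);
});
```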