Assessing a service against the Standard

We’ve just launched a private beta of a new service with a small pilot team. The service will enable our neighbourhood housing officers to work in a more mobile way. So now seemed a good time to do an internal assessment of how well our design and development of the service measured up against the Local Government Digital Service Standard.

To do this we worked in groups of 3 or 4 with a mixture of roles: UX researchers and designers, developers, and project managers. Some were involved in the development of the new service and others were not. Each group assessed different points of the standard, and then we came together with the Head of Digital to share our assessments.

What we found

Working as an agile, multidisciplinary team with a product owner who’s able to make decisions and prioritise user stories has enabled us to improve the product iteratively, based on needs identified through our user research and feedback from testing with users.

Rotating ‘show and tells’ at the users’ local offices has helped build the profile of the product and contributed to users’ understanding of it before the launch of the private beta with a pilot team. This has helped us build a simple and intuitive product and reduced the training time needed to use it. User feedback suggests it is a clear improvement on their previous mobile working tools, and the pilot team are engaged and eager to use it.

Our developers have a clear understanding of the tools and systems they use. They’ve reused existing components where possible, made code available to others through GitHub, and met internal standards on data management.

The new product provides improved data capture, greater monitoring capability and better data security than we had previously.

However, we need to show more clearly how the product has been iterated based on user research findings, and no specific research has been done on assisted digital user needs.

On project management, risks need to be identified and reported more clearly, and user stories need defined acceptance criteria to help the product owner sign off the product.

We’ve also found some constraints with the technology platform that hinder us from providing the experience users might expect. While security meets acceptable standards, more rigorous testing needs to be done. In addition, more automated testing tools could be used to ensure the product works well in different environments.

Our next steps

We will continue to review where we are against the standard and where we need to improve our working practices. We also need to do an accessibility review and a privacy impact assessment, and ensure good standards are built into future development.

We need to plan ongoing user research to continue improving the product, develop performance indicators, and monitor user uptake of and satisfaction with the product.

We also need to share what we’ve learned about the technical platform and create design patterns that others could reuse.

Our business continuity and communications plans need to be developed, and we need to make use of the Council’s existing processes if the product is unavailable. We also need to find a way to involve our Council Members in the work that’s been done.

Doing an internal assessment now has provided us with additional guidance in designing and developing a user-centred service.


