What we’re learning from HackIT Service Assessments

Service assessments are a central part of HackIT’s governance approach. In the last 18 months we’ve run more than a dozen and the Delivery team have also piloted GDS Service Assessor training to help skill up a pool of assessors. Last week we completed a service assessment of the beta stage of the Manage Arrears service. The assessment itself will be published in due course – what we’re sharing here is our experiences and the key lessons we’re taking away for next time…

Great prep is a great start:
From the panel: Soraya put together a really good Trello board (building on an earlier version from the ‘redesigning content on the Hackney website’ project). It provided really helpful extra context and is a great example for future teams to crib from. Without it, we’d have struggled to get to grips with the detail we needed to make a useful assessment. Attending a couple of the show & tells in the run-up would have given us a head start for the assessment without requiring much time from us or any extra effort from the project team – we’ll try this next time.

From the team: It was useful to start a Service Standard Assessment Trello board at the beginning of our phase and to link evidence to each criterion as the project progressed. This helped our team make sure we were capturing supporting evidence, as well as creating the best service we could. As the assessment approached, the Trello board reminded us that there were some aspects of the standards we would not have a chance to fully implement. We had plans in place for these and were able to share them confidently, but this was the difference between getting a ‘partially met’ and a ‘fully met’ for many of our criteria. The learning here is to allow more time ahead of the assessment to execute actions. Don’t just have a plan – implement the plan!

Keep some focus on the bigger picture:
From the panel: The team have a great product that they’re very passionate about, but as assessors we’d never seen it close up before. Even with a great Trello board and some show & tells under our belts, the panel still needed a brief overview of the end-to-end product and how the current phase fits in. Next time, I’ll take the time to remind the team in advance of the importance of ‘showing the thing’ on the day – a really strong narrative and an end-to-end demo are key to making sure the team sell all the benefits of their work.

From the team: As a team, we found this assessment invaluable. We learned that it is important not to assume that everyone is aware of what went on throughout the full project lifecycle, and to remember to set the narrative and background. This was missing from our presentation; including it would have let us share more insight into the Discovery work from earlier phases and emphasise the overall service vision and journey.

Take some time to come to a decision:
From the panel: We made a conscious decision at the start of the session to conclude the assessment on the day, which meant a nervy quarter of an hour at the end while the panel reached a conclusion on each of the standards. With hindsight, it might have been better to delay this, giving the panel more time to explore the detail of the Trello board before making a decision and the team a chance to challenge some of our assumptions.

From the project team: Next time, I would definitely include some extra time after the scoring for questions. This would give the assessors an opportunity to clarify anything they were unclear on, and offer the team a chance to question the outcomes. On this occasion, the team came away feeling slightly deflated because there was no opportunity to discuss and fully understand how some of the final scores were reached. Since then, our lead assessor, Liz, has done a great job of adding the reasons for the outcomes, which are available on our Service Standard Assessment Trello board.

Maybe a consistent panel could be useful:
From the panel: We also reflected on the potential benefits of keeping some of the same panel for the next stage service assessment of this product. Fresh eyes will be useful too, but some continuity would help the panel judge whether our recommendations have taken root. We were really lucky to be joined by Jess from ACAS as our external assessor, who was new to assessing but had been on the other side of the table a couple of times before. She was able to offer challenge from her experience in central government (where service assessments can act as gateways) but had enough experience of being in the project team’s shoes to ask probing questions in a supportive way.

From the project team: It would be good to use the same panel for this product’s next service standard assessment as they are aware of it and the recommendations that came from this assessment. They could see the product’s progress as well as question the outcomes of the last recommended actions. We had a great multi-disciplinary panel which included both external and internal assessors all of whom objectively evaluated our service and raised constructive feedback.

You can find links to all of the service assessments we’ve carried out on our HackIT site.

Written by Liz Harrison and Soraya Clarke

How can we make open data work for our users?

For a long time, there’s been an established view that publishing more open data will encourage a proliferation of new applications and services across the public and private sectors, built on common datasets that span local authority boundaries.

Historically in Hackney, we’ve only ever dipped our toes in the world of open data and to be honest, we don’t know whether there’s enough evidence to prove that simply publishing as much as possible will realise these benefits – we just don’t know enough about the user need to justify a big bang open data offering.

That being said, we’ve been inspired by the ODI, Nesta and others who are keen to encourage greater transparency and common standards in data publication, and we have reviewed some of the good practice of other local authorities who’ve experimented with open data portals in recent years. But our HackIT manifesto tells us to think big and act small – we don’t want to dive into the deep end of an expensive open data portal for publishing every non-personal data set we can find. Instead, we want to understand the kind of data users want to see, and work out the simplest and most cost-effective way of providing that data in a useful way.

Currently, we don’t proactively publish the data that is most commonly requested via Freedom of Information (FOI) requests, which means a significant amount of officer time is spent reactively responding to them. In his 2018 manifesto, Mayor Glanville committed us to more transparency, and this has helped shape a set of basic principles for experimenting with user-focused open data publication. We’re keen to open ourselves up to some challenge on this and share what we learn as we experiment:

  • Focus on releasing data commonly requested via FOI – We believe that concentrating our open data offering on the most frequently requested topics will release significant amounts of officer time and provide a better service for residents. Therefore, we will focus on these opportunity areas first.
  • Use common standards wherever possible – to be valuable for developers and organisations, we need to publish our data in a format that is easy to join across local authority areas. Where a standard doesn’t exist, we will try to match other Councils who are already publishing this data to maximise reusability. We will openly publicise our data structures to encourage reuse by other local authorities.
  • Automated with a focus on currency of data – to be of maximum value in reducing the FOI burden we face, the data included in our open data products should be as current as possible. To ensure we aren’t just moving the resource problem from FOI administration to open data publication, data updates should be automated and require a minimum amount of officer time to publish.
  • Adopt a flexible approach to the technology we use to publish our data – we don’t believe in being constrained by an online data portal for publishing open data. We will aim to use the best modern technology and data analytics techniques depending on the data involved to provide a data dashboard, visualisation and full download of the dataset. We are motivated by providing a great user experience that’s easily discoverable first and foremost.
  • Aim to meet GDS design standards – all of our products will be designed in line with modern, accessible design standards and we will always test our products with our users.
  • Understand our impact – we will always begin by understanding the baseline volume of FOI requests for the data set in question and monitor over time, the impact of publishing the dataset. We expect to see more exemptions (where an FOI is refused under the grounds that the data is already published) and over time, fewer FOI requests overall. If the data set isn’t reducing FOI demand, we will look for other areas where we can add more value by publishing new data.
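The ‘understand our impact’ principle boils down to a simple before-and-after comparison. A minimal sketch of the measure we’d track, in Python with made-up request counts (not real Hackney figures):

```python
# Sketch of the impact measure behind the "understand our impact" principle:
# compare FOI request volumes before and after a dataset is published.
# The counts below are illustrative only.

def foi_reduction(baseline_per_month: float, current_per_month: float) -> float:
    """Percentage reduction in FOI requests since publication."""
    if baseline_per_month == 0:
        return 0.0
    return 100 * (baseline_per_month - current_per_month) / baseline_per_month

# e.g. 40 FOIs a month on a topic before publication, 25 after
reduction = foi_reduction(40, 25)
print(f"{reduction:.1f}% fewer FOI requests")  # prints "37.5% fewer FOI requests"
```

Alongside this we’d watch the exemption count: a rise in refusals on ‘already published’ grounds is an early sign the data is being found.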

Our first experiment is with Penalty Charge Notice (PCN) data (that’s parking tickets to you and me…) – it’s one of the most commonly requested datasets and we think publishing this openly will help residents answer their questions in a more timely way and reduce the time we spend responding to FOIs on the topic. We’re experimenting with providing a simple app on the parking pages of our website, which will allow users to extract the type of information from the data that is often asked for in FOI requests. Our great User Research team are helping to keep us focused on delivering something simple and intuitive for residents. We’ll also be trialling a light-touch open data portal which will allow us to curate our open data sets in one place on the website. We’ll share more as we develop our MVP over the coming weeks.
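The kind of FOI-style question the PCN app needs to answer can be sketched like this (Python, with a hypothetical record layout – the field names and values are ours, not the live dataset’s):

```python
from collections import Counter

# Hypothetical PCN records; the live dataset's fields and values may differ.
pcns = [
    {"street": "Mare Street", "contravention": "Parked in a restricted street"},
    {"street": "Mare Street", "contravention": "Parked in a bus lane"},
    {"street": "Kingsland Road", "contravention": "Parked in a restricted street"},
]

def tickets_by_street(records):
    """Answer a typical FOI-style question: how many PCNs per street?"""
    return Counter(r["street"] for r in records)

print(tickets_by_street(pcns))
# Counter({'Mare Street': 2, 'Kingsland Road': 1})
```

The point of the app is to let residents run this sort of aggregation themselves instead of raising an FOI request.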

Verification Hub: Weeknotes 03/07/19

Last week, with lots of pushing from Tom (plus some 5:30am testing) and a hefty amount of support from Phil, Rasit and Lindsay, Farthest Gate deployed the Verification Hub in the live Liberator environment. Tom’s testing proved it hadn’t caused any problems with the permit application process itself but for now VH remains switched off…essentially this means Farthest Gate are lined up to flip the switch to on, just as soon as we iron out some new problems…

The main things we’re trying to solve (beyond the project team’s availability having hit leave season!) are some problems we’ve encountered with the VH API. We had previously been testing against ClearCore v4 but have since been upgraded to v5, which is set up with SSL certificates over HTTPS and requires authentication – neither of which was the case before. It’s much more secure, but means we now need to develop our API to call the ClearCore service using credentials.

Unfortunately .NET Core doesn’t currently provide the libraries needed to easily connect to a SOAP web service over HTTPS using credentials. Matt and Farthest Gate confirmed this as an issue and agreed on Matt’s two options: either generating our own SOAP requests or switching to the .NET Framework. Matt felt there was too much scope for potential issues in generating our own SOAP requests, which means he now needs to build a new API project in the .NET Framework and move the code over. This will involve rebuilding, but hopefully not too much rewriting.
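For context, the ‘generate our own SOAP requests’ option amounts to building the XML envelope and authentication headers by hand and POSTing them over HTTPS. A minimal sketch of what that involves, in Python rather than the team’s .NET code – the endpoint, SOAPAction, body and credentials below are all placeholders, not the real ClearCore details:

```python
import base64
import urllib.request

# Hand-rolled SOAP call over HTTPS with Basic auth. Everything passed in
# below (URL, action, body, credentials) is a placeholder for illustration.

def build_soap_request(url, action, body_xml, user, password):
    envelope = (
        '<?xml version="1.0" encoding="utf-8"?>'
        '<soap:Envelope xmlns:soap="http://schemas.xmlsoap.org/soap/envelope/">'
        f"<soap:Body>{body_xml}</soap:Body>"
        "</soap:Envelope>"
    )
    token = base64.b64encode(f"{user}:{password}".encode()).decode()
    return urllib.request.Request(
        url,
        data=envelope.encode("utf-8"),
        headers={
            "Content-Type": "text/xml; charset=utf-8",
            "SOAPAction": action,
            "Authorization": f"Basic {token}",
        },
        method="POST",
    )

req = build_soap_request(
    "https://clearcore.example/Match.asmx",  # hypothetical endpoint
    "http://example.org/MatchRecord",        # hypothetical SOAPAction
    "<MatchRecord><Name>Jane Doe</Name></MatchRecord>",
    "vh-api", "secret",
)
# urllib.request.urlopen(req) would send it; certificate verification
# happens automatically over HTTPS.
```

Getting the envelope, namespaces and headers exactly right for every call is where the ‘scope for potential issues’ lies – which is why falling back to the .NET Framework’s generated service clients is the safer route.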

At this point we’re hoping Matt will be able to focus on the changes over the next week by which time, Steve will be back with us and able to move housing benefits data into the VH data core to expand out what we’re verifying against. Charlene, our optimistic and very supportive product owner, continues to marshal us towards pressing the on button in the next 2 weeks so we can sit back and watch how it performs IRL.

Summer in Hackney web map prototype: Weeknotes 01/07/19

We’ve had it in the back of our minds to revisit the technology and presentation options for map.Hackney for a while, in particular, to make it more accessible and mobile friendly. The Re-engineering Hackney Content project has provided the impetus we needed to test out a new thematic approach to redesigning map.Hackney, with the added fun of super short timescales (4 weeks) and a very assertive product owner (Susan M-L).

Our project aim: to deliver a web map that enables residents to view topical services that are useful to them, is mobile friendly, and meets GDS accessibility standards.

Last week we:

  • Developed user stories and reviewed the range of mapping layers we’re currently offering through map.Hackney
  • Agreed to test an approach where we present a map based on a theme (we’ve selected summer in Hackney) rather than a service by service approach. This means that rather than a map that shows you all the parks and open spaces across all of Hackney and related info, we’ll present a range of layers (water fountains, adventure playgrounds, the Hackney carnival route) meeting a specific theme or activity (having a picnic, playing, going out etc). 
  • Developed a prototype with a new base map, that’s starting to incorporate some layers by theme
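The theme-first approach is essentially a regrouping of existing layers: instead of one layer per service, each theme pulls together the layers that support an activity. A rough sketch of that mapping (Python; the layer names come from the examples above, the structure itself is ours):

```python
# Theme-first grouping of existing map layers: each theme lists the layers
# that support a summer activity, rather than one layer per service.
THEMES = {
    "having a picnic": ["parks and open spaces", "water fountains"],
    "playing": ["adventure playgrounds", "parks and open spaces"],
    "going out": ["Hackney carnival route"],
}

def layers_for_theme(theme: str) -> list[str]:
    """Return the map layers to display for a chosen theme."""
    return THEMES.get(theme, [])

print(layers_for_theme("having a picnic"))
# ['parks and open spaces', 'water fountains']
```

Note that one layer can serve several themes – the regrouping is many-to-many, so the underlying layer data doesn’t need to change.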

This week we’ll be:

  • Carrying out user research in Hackney Central and some libraries and open spaces to understand what users need from a summer in Hackney map – Wing has helpfully designed a Google Form for us to record feedback
  • Developing the prototype further with improved design based on the UI toolkit
  • Knowledge transfer with Mo and Carolina providing some advice and support to Sandrine and Marta to develop the design elements of the prototype

Verification Hub: Weeknotes 17/06/19

Farthest Gate have completed the work to implement VH in front of CI in the test environment and we’ve been carrying out some functional testing. So far, so good – it’s performing as we’d expect (our dummy data is passing and failing where we want it to…see below) and we remain painfully optimistic that we can get it into live by the end of next week…but this is dependent on Farthest Gate deploying to the bridge environment first in the next day or so.

However, Matt has been able to snatch some minutes around the edges of other work to connect our API with ClearCore, and we’ve hit some blockers with security certificates. We think this will need a bit of infrastructure support to resolve (as well as time from Matt that we can’t yet guarantee). This is on our ‘must solve’ list – if we can’t resolve it, it won’t be possible to go live next week.

Thanks are due to Rasit and Phil who have become involved to look at the full CMR for Liberator given the changes we’re making. In effect they’re checking that the changes don’t break anything, that it’s safe to deploy to live and they have sight of all the documentation in case any issues arise.

In the meantime, InfoShare have completed installation of ClearCore 5 in our live environment. Tom C is now building an internal LLPG gazetteer and changing the VH project configuration so that it is ready to load data into the Hub. Steve F has the Housing Benefit data and has almost finished automating daily update delta files from Council Tax and Housing Benefit into the ClearCore data hub.
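The daily delta files Steve is automating follow a standard pattern: diff today’s snapshot against yesterday’s, keyed on a record ID, and load only the changes. A simplified sketch (Python, with made-up records – the real extracts come from the Council Tax and Housing Benefit systems):

```python
# Simplified daily-delta pattern: compare today's snapshot with yesterday's,
# keyed on a record ID, and emit only adds, updates and deletes for loading
# into the data hub. All records below are made up for illustration.

def daily_delta(yesterday: dict, today: dict):
    adds = {k: v for k, v in today.items() if k not in yesterday}
    updates = {k: v for k, v in today.items()
               if k in yesterday and yesterday[k] != v}
    deletes = [k for k in yesterday if k not in today]
    return adds, updates, deletes

yesterday = {"HB001": {"name": "J Smith", "postcode": "E8 1AA"},
             "HB002": {"name": "A Khan", "postcode": "E5 0BB"}}
today = {"HB002": {"name": "A Khan", "postcode": "E9 7CC"},    # moved house
         "HB003": {"name": "L Brown", "postcode": "N16 8DD"}}  # new claim

adds, updates, deletes = daily_delta(yesterday, today)
print(adds, updates, deletes)
```

Shipping only the delta keeps the daily load small and, crucially for the automation goal, needs no officer time once scheduled.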

We’re about to hit holiday season and we’re reaching crunch point – we’re 80% of the way through this work, but it will take some focused support from the whole project team to get it over the line.