Week Notes Ending 20/09/2019

I sometimes ask myself, “If I could have one superpower, what would it be?” Funnily enough, I always find myself wanting to control time and weather. What does this have to do with my weekly review, I hear you ask? I posed the same question to the Web first team to find out, in true superhero fashion, each project team member’s superhero strengths. (After all, you never know when you might need a superhero to come to the rescue.) The concept tied in nicely with the superhero-themed Retro.

Highlight of the week
The team retro was the highlight of this week. As a team, we unpacked some hidden truths which needed to be surfaced for us to acknowledge, understand, build and progress.

Objectives for next week
We have clearly defined goals courtesy of our sprint planning session, including:
– Finishing off requirements for GovWifi & govroam
– Completion of switch testing

Special Thanks!
Special thanks to my team this week for being so awesome! Thanks for trusting the process, eventually we will build on trusting the person.
Finally a big thank you to the ‘Storm Trooper’ who rocked up just as things were about to become worthy of a ‘Kleenex’ advert with group hugs to match. This whole week has helped me understand the team dynamic and superhero strengths quite early in the project lifecycle.
Like I said, you never know when you might need a superhero to aid and rescue.

An update on the Hackney pattern library

Today was an exciting day for front end at Hackney. We finished adding all of the current GOV.UK design system components (along with their Hackney branding) to our pattern library AND we were fortunate enough to have a great meeting with two GDS developers, Nick Colley and Hanna Laakso, who were kind enough to come to Hackney Service Centre and talk to us about many of the factors involved in maintaining a UI library.

A Work-In-Progress version of our pattern library can be seen here. It contains a lot of Hackney-branded GOV.UK components (many of these have very small branding changes – tweaks to colours and spacing) and a couple of new Hackney components, such as the Contact Block and the announcements components. I have some tidying up to do and spacing to update. I also have documentation to write.

Beyond that, my immediate priority is to work with one of our designers to develop a flexible, fit-for-purpose version of our header and footer for the library. A lot of our components so far have come out of our new design for the main Hackney website, but the site’s header and footer are quite bespoke and will need to be genericised and iterated so that our pattern library components can produce any combination of elements we might require: whether or not we have header navigation and how extensive that is, whether or not we have login and/or search in the header, and so on. Once we have the header and footer in place, I’m excited about starting to use the pattern library in the wild and letting people’s experiences of it drive the work we do going forward.

Our meeting this afternoon with Nick and Hanna was a real reminder of how important it is to be thoughtful in the work we are doing. We discussed issues that we at Hackney, being at such an early stage of our journey, have barely thought about, such as semantic versioning and the responsibility around breaking changes and deprecation, and had an enlightening discussion about the importance and effectiveness of communicating those changes well. We also spent a while discussing accessibility, and in particular accessibility testing tools (I point you in the direction of this blog post from GDS about assistive technology tools you can test with at no cost).

I was glad to be able to ask the experts about one of the issues I came up against while building our pattern library: including external libraries. Our new Contact Block component requires the Leaflet.js library, which adds a reasonably hefty chunk of JavaScript and CSS to our compiled code. There is a big question here around whether the benefits of including Leaflet in the pattern library are worth the increased page load times and data usage on pages which don’t use that component. There’s no simple answer, but it was really great to discuss the pros and cons with Nick and Hanna and hear my own concerns validated. It provided plenty of food for thought, and I have already made a change that will improve performance in the short term while I continue to investigate other possibilities.
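One option for that trade-off is loading Leaflet lazily, so pages without a map never pay the download cost. Here’s a minimal sketch of the idea – the `.hackney-contact-block` selector and the `'leaflet'` module specifier are illustrative assumptions, not our actual class names or build setup:

```javascript
// Sketch: fetch Leaflet only on pages that actually render the Contact Block,
// keeping it out of the shared pattern-library bundle.
// The selector name here is a placeholder, not the real Hackney class.
function pageUsesContactBlock(doc) {
  return doc.querySelector('.hackney-contact-block') !== null;
}

function loadLeafletIfNeeded(doc) {
  if (!pageUsesContactBlock(doc)) {
    // No map on this page: skip the download entirely.
    return Promise.resolve(null);
  }
  // Dynamic import: the browser fetches Leaflet's JS on demand and caches it,
  // so only pages that need a map pay the cost.
  return import('leaflet');
}
```

The same thinking applies to the CSS: a small stub stylesheet in the core bundle, with the full Leaflet styles pulled in alongside the lazy-loaded script.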

It was also really great to discover that Nick and Hanna both started off as apprentices, since three of our digital apprentices Andrew, Liam and Miles were participating in the meeting. I asked them how the meeting went from their perspective and will leave you with their responses:

“Great meeting which helped my understanding of how to make applications more accessible to users, while also giving me a clearer view of how important skills such as communication are to the IT industry”

“I found the meeting very valuable as it gave us the opportunity as apprentices to meet experienced developers from GDS who were more than happy to share their knowledge and expertise in web/frontend development. We discussed the technicalities behind developing the Hackney Frontend library and using the GDS Frontend as a dependency, the importance of web accessibility (especially in the public sector), and the importance of working in the open and sharing your work with other organisations.

“We also briefly discussed our backgrounds and how we got into web development/front-end development, which was very interesting and inspiring to be able to get insights into how people got into the work they do.”

“It was cool to know that both developers were also apprentices in the past”

Using local data to build a better picture of poverty in Hackney

In Hackney, our ambition is to be evidence-led in everything we do – this includes ensuring that our strategies, service plans and key decisions are informed by what we know about the borough and the people who live and work here. The Strategy team are currently developing two key strategies around poverty reduction and fostering an inclusive economy, and HackIT have joined forces with them to help build a picture of poverty in Hackney today.

The go-to data source on poverty for small areas is the Index of Multiple Deprivation (IMD) which combines 37 different indicators across 7 ‘domains’ of deprivation (income, employment, education, health, crime, barriers to housing and services, and living environment). Whilst this data offers valuable insight, it has limitations and doesn’t always tell the full story:

The first key limitation is that the IMD is good for understanding how deprived an area is in relation to others, but not how deprived an area is in absolute terms. This means that if nothing changed in Hackney but other areas got worse, Hackney would appear less deprived even though the experiences of our residents had not improved.

This relates to the second key limitation of the IMD: its relative nature does not enable us to see change over time, and this data is only published every 4-5 years. On top of that, the underlying data behind the index is generally 2-3 years old by the time it is released. We know that Hackney is changing quickly, so we need to make sure we have up-to-date information. We also need to be able to see whether poverty is increasing or decreasing to better understand whether our approach is working.

Over the past 6 weeks we’ve worked on a short data project to build a prototype of a Hackney Poverty Index in an attempt to fill this gap. We wanted to learn from the concept and methodology of the IMD to build our own index that combines open data with data we hold within the Council.
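In rough terms, an IMD-style composite index rescales each indicator across small areas and then combines them into a single score per area. A minimal sketch of that idea (the indicator names, values and equal weighting here are made up for illustration – the real index is built in Qlik from Hackney’s actual datasets):

```javascript
// Sketch of an IMD-style composite index: rescale each indicator to the
// 0-1 range across small areas, then average into one score per area.
function minMaxScale(values) {
  const min = Math.min(...values);
  const max = Math.max(...values);
  // Guard against a constant indicator (max === min) to avoid dividing by zero.
  return values.map((v) => (max === min ? 0 : (v - min) / (max - min)));
}

function compositeIndex(areas, indicators) {
  // indicators: { indicatorName: [one value per area, same order as `areas`] }
  const scaled = Object.values(indicators).map(minMaxScale);
  return areas.map((area, i) => ({
    area,
    // Equal weights here for simplicity; the IMD itself weights its
    // domains unequally and applies an exponential transformation first.
    score: scaled.reduce((sum, col) => sum + col[i], 0) / scaled.length,
  }));
}
```

An equally weighted average is the crudest combination rule; a later iteration could mimic the IMD’s unequal domain weights once we’ve settled on our final set of indicators.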

We aimed to identify at least one good indicator from each of the 7 IMD ‘domains’ to include in our prototype. We compiled a list of possible indicators and evaluated these in terms of:

  • data quality – can we trust this data?
  • granularity – is it available for small areas below borough level?
  • frequency – is it up-to-date and refreshed regularly?
  • coverage – is the data missing key sections of our population?
  • access – is it easily accessible from our systems, or openly available?

We were able to go above and beyond the 7 indicators we set out to include in our prototype, and in the end included 13 different datasets. These ranged from the proportion of households in debt to the council, to emergency admissions to hospital, to air quality. We did, however, face challenges identifying good datasets for some areas (education in particular) and had to be pragmatic with our choices.

We brought these datasets together in Qlik, our business intelligence tool for transforming, analysing and visualising data. The Hackney Poverty Index dashboard is now available for officers across the council to test out and give feedback on.

Our prototype is already generating insights that we didn’t have before. The maps below show that our local data can provide a much more granular picture than is available in the IMD (this data can be mapped at Output Area level). Our local data also shows a different pattern of income deprivation from the IMD 2015, with more poverty in the north of the borough. It is difficult to say whether this is due to changes over time (the underlying IMD data is likely to be from 2012–13) or to methodological reasons. We’ll be exploring this more when a new release of IMD data is available in October.

This prototype is just the beginning of a Hackney Poverty Index. We know that there is a lot more work to do! We expect to start the next phase of this project in late September when we’ll be:

  • further refining the themes and indicators we use to measure poverty
  • bringing together these indicators into an index which provides a single measure of poverty in Hackney
  • researching what functionality users need to understand and analyse the data
  • making the next iteration of the Index available to the public
  • providing more analysis alongside the data to tell a story

How can we make open data work for our users?

For a long time, there’s been an established view that publishing more open data should encourage the proliferation of new applications and services across the public and private sector based on the availability of common datasets across local authority boundaries. 

Historically in Hackney, we’ve only ever dipped our toes in the world of open data and to be honest, we don’t know whether there’s enough evidence to prove that simply publishing as much as possible will realise these benefits – we just don’t know enough about the user need to justify a big bang open data offering.

That being said, we’ve been inspired by the ODI, Nesta and others who are keen to encourage greater transparency and common standards in data publication, and we have reviewed some of the good practice of other local authorities who’ve experimented with open data portals in recent years. But our HackIT manifesto tells us to think big and act small – we don’t want to dive into the deep end of an expensive open data portal for publishing every non-personal dataset we can find. Instead, we want to understand the kind of data users want to see, and work out the simplest, most cost-effective way of providing that data in a useful form.

Currently, we don’t proactively publish the data that is most commonly requested via Freedom of Information (FOI) requests, which means a significant amount of officer time is spent reactively responding to them. In his 2018 manifesto, Mayor Glanville committed us to more transparency, and this has helped shape a set of basic principles to guide our experiments with user-focused open data publication. We’re keen to open ourselves up to some challenge on this and share what we learn as we experiment:

  • Focus on releasing data commonly requested via FOI – We believe that concentrating our open data offering on the most frequently requested topics will release significant amounts of officer time and provide a better service for residents. Therefore, we will focus on these opportunity areas first.
  • Use common standards wherever possible – to be valuable for developers and organisations, we need to publish our data in a format that is easy to join across local authority areas. Where a standard doesn’t exist, we will try to match other Councils who are already publishing this data to maximise reusability. We will openly publicise our data structures to encourage reuse by other local authorities.
  • Automated with a focus on currency of data – to be of maximum value in reducing the FOI burden we face, the data included in our open data products should be as current as possible. To ensure we aren’t just moving the resource problem from FOI administration to open data publication, data updates should be automated and require a minimum amount of officer time to publish.
  • Adopt a flexible approach to the technology we use to publish our data – we don’t believe in being constrained by an online data portal for publishing open data. We will aim to use the best modern technology and data analytics techniques depending on the data involved to provide a data dashboard, visualisation and full download of the dataset. We are motivated by providing a great user experience that’s easily discoverable first and foremost.
  • Aim to meet GDS design standards – all of our products will be designed in line with modern, accessible design standards and we will always test our products with our users.
  • Understand our impact – we will always begin by understanding the baseline volume of FOI requests for the data set in question and monitor over time, the impact of publishing the dataset. We expect to see more exemptions (where an FOI is refused under the grounds that the data is already published) and over time, fewer FOI requests overall. If the data set isn’t reducing FOI demand, we will look for other areas where we can add more value by publishing new data.

Our first experiment is with Penalty Charge Notice (PCN) data (that’s parking tickets to you and me…) – it’s one of the most commonly requested datasets and we think publishing this openly will help residents answer their questions in a more timely way and reduce the time we spend responding to FOIs on the topic. We’re experimenting with providing a simple app on the parking pages of our website, which will allow users to extract the type of information from the data that is often asked for in FOI requests. Our great User Research team are helping to keep us focused on delivering something simple and intuitive for residents. We’ll also be trialling a light-touch open data portal which will allow us to curate our open datasets in one place on the website. We’ll share more as we develop our MVP over the coming weeks.

Hackney’s drive for a digital ‘Submit My Planning Application’ service

Doesn’t time fly!

It seems like only yesterday (Nov ’17) that we were holding a Planning design sprint with @euanmills and his Connected Places Catapult team. Since then we’ve been fortunate enough to win an MHCLG funding bid to take forward our ambitious idea for the ground-breaking Submit My Planning Application digital service (SMPA).

For the past year Hackney ICT have been working tirelessly with Snook and their technology partners Hactar to deliver a really high-quality prototype of our SMPA digital service. Thanks to some great help from the Hackney Planning team, local businesses and residents, the service has been fully designed around user needs. Working collaboratively with the Open Systems Lab and the Southwark Planning team has saved us reinventing the wheel, by utilising their digital Local Planning Policy engine approach.

Going back to the original brief, our goal was to either build a new service or stimulate the market into providing excellent digital Planning services. We still want to meet the real needs of the Planning Authorities, businesses and residents that are suffering from a stagnant software market – our digital service will be the first step in that direction. However, I really didn’t see Northgate’s acquisition of Snook coming – a really interesting partnership and something to keep an eye on!

So after several sprints, retrospectives, all the iterative user research, many weeknotes and of course the obligatory show & tells, we now have a working prototype and plan to deliver a Live minimum viable product later this summer.

So what are we doing now?

Basically, lots of building, testing and deploying! We have six sprints scheduled to take us to a live working product. See our previous show & tells or, even better, the whole project – here’s our project site. Alternatively, you can follow us on Pipeline.

Our aim is to have a Live working digital service by the end of the summer – ‘A service so good people prefer to use it’. This is a bold statement, but we know demand failure is hitting a high of over 50%, so there’s plenty of room for improvement. Imagine Amazon getting 50% of deliveries wrong! In this case people do actually have alternatives, i.e. the Planning Portal and iApply, hence we intend to deliver an excellent alternative.

Normally when developing a new system from scratch, a lot of the work happens behind the scenes. Nowadays it’s not so behind the scenes any more: with Hackney being part of the digital declaration, projects posted on Pipeline and working in the open, you can find the code stored on GitHub. You’ll see we’ve also been utilising existing capabilities such as GOV.UK Notify and developing excellent, simple-to-follow documentation such as our API walkthrough. Currently we have 90% of the MVP under test, using real-life applications to make sure the software works as expected.

Our LLPG and GIS systems are now integrated – not forgetting the most important integration, with Tascomi, our soon-to-be back office Planning Management system. We’re also keeping a close eye on the GLA’s London Development Database. Our service will need to capture all the right data up front, in a standard format that can be used seamlessly right through the Planning process and not locked away in closed, difficult-to-use PDFs.

What’s coming up and worth looking out for

The next couple of weeks are all about finishing the build and testing, testing and more testing!

As well as building the MVP we’re focussing on the not so exciting logistics of making the service Live within our new DevOps ways of working and if timings work out well, we’ll be integrating with Tascomi!

Key dates to look out for:

Come and join us at our offices or remotely – we’ll be around afterwards for deeper discussions on how we can shape the service and how others can adopt it.

  • 17th Sep Connected Places Catapult (Plantech Week)

We’ll be presenting our new digital service ‘Submit My Planning Application’. Book your tickets through Eventbrite.

  • 19th Sep GLA – Event (Plantech Week)

14:00 GLA Planning Event, Plantech for Local Government

(Venue City Hall)

  • 19th Sep SMPA Product Launch! (Plantech Week)

18:30 Connected Places Catapult Third Thursday

(Venue Urban Innovation Centre)