Testers needed: designing inclusive retros

The What

In January I ran a workshop at UKGovCamp. I think it was called “How to love your introverts” – I’m struggling to remember that far back. With an audience made up mostly of introverts (and a couple of brave extroverts), we explored participants’ views on what introverts need to thrive in collaborative environments.* 

Borrowing from the fantastic workshop contributions, I’ve pulled together** a slide deck on designing and delivering introvert-friendly retrospectives. The focus is on ways to foster inclusivity as the norm, not the exception. Why retros? I needed a place to start. My hypothesis is that the principles translate to any type of facilitation. 

The Why

I am an evangelist for collaboration and I am a card-carrying introvert. Sometimes it feels like the two don’t sit well together. Open-plan working (although clearly not a thing at the moment), lots of visual stimulation and conversation-led work can be a recipe for disaster for introverts, who generally prefer low-stimulus settings and space for thinking and reflection. On the flip side, working in “pizza-sized” focused teams is fertile ground for introverts, who tend to blossom in smaller groups. 

Despite the connotations of the word, collaboration isn’t necessarily inclusive. Reflective of our extrovert-centric culture, facilitation techniques often reward the attributes and qualities of quick-thinking, talkative types. Figures vary, but up to 40 per cent of the population are introverts. Designing our agile practices to bring out the best in introverts is a necessity, rather than a nice to have. 

The Ask

I’m looking for feedback on the ideas in the slide deck. What works? What doesn’t? What have I missed? My contact details are in the deck, or use the comments section below. Thank you for helping me to make this piece of work better.

*As an aside, introversion and extroversion are not binary concepts. Personality traits exist on a continuum. People who exhibit both introvert and extrovert tendencies are dubbed ambiverts.  
**What took me so long, right?

DfE Technology Support Scheme – project note


Back in April, the government announced it would be providing free laptops and internet to school children. HLT were leading on this work and took care of the ordering and liaising with schools on the numbers needed. 

The TL;DR version is we smashed it! We made some mistakes, learned a lot, and adapted and changed our approach as needed, because none of us really knew what to expect or how it would go. We were thinking on our feet, and the timeframes didn’t allow us the luxury of a lot of things. 

A lot of what we did on this project can certainly help us in other projects, such as the Future Homeworking project, in designing how we deliver a thing and collate data. 

Also, we all laughed a lot and Christian, Anwar and Shakti (and Bruno) were a brilliant bunch to do this with and worked so well together to solve problems and get things done. 

My aversion to public transport meant I was walking in and home most days. By the end of it I’d walked a total of 71 miles.  

I also went deaf in one ear again due to an ear infection. This caused a lot of amusement. 

What we did: 

To be honest, my first thought on discussing this was ‘I have absolutely no idea how on earth to do this’. 

We had the task of designing and organising the end-to-end process of accepting delivery of 1600 devices and getting them to each school as quickly as we could. 

So, no pressure. 

On the HLT side we worked with Emma Claridge who handled the ordering and liaising with the schools on the numbers they needed and also arranged the drivers and vans for delivery from Environmental Services. 

We worked at an incredibly fast pace on this. We knew delivery was due on 30/06/20, so we had just less than a week to find a team and work out:

  • Where were we going to put 1500 devices?
  • Who would we need on the team?
  • How would we get them in?
  • How would we log them all?
  • How would we assign them to the destination? 
  • How would we batch them? 
  • How many deliveries could we get on the van? 

Where on earth were we going to put 1500 devices?

We met with Paul Hornsey who kindly lent us Robert House until 10th July. This meant we finally had a hard deadline to get these out by. 

Who would we need on the team?

Sandeep was onboard from the start with his ability to understand how to log assets and ensure we track them properly. These weren’t devices that were going to be managed by Hackney but we did need to ensure we logged them as received and delivered property and onto the asset register. 

How would we get them inside?

We were told the delivery would arrive on pallets, which would all be left on the pavement for us to get inside. The doors to the room we were using for storage weren’t wide enough to get a whole pallet through. This sent my anxiety levels sky high, as we needed enough people to help get them in as quickly as possible, and others to stand guard to keep them safe. Oh, and to hope it didn’t chuck it down.

Joss from HLT came down to help, Sandeep drove in from Kent and Paul came on site for the day.

The delivery was, alas, 5 hours late and some had to get home as they couldn’t stay that late. Zaf and Colin arrived with the key to solving our delivery headache – there was apparently a big allen key that opened a partition wall which allowed us to access the wide doors to get full pallets in! Just had to find the big allen key. Which we did! Hurrah! And with more help from HLT staff and facilities team we got the pallets in all ok and my anxiety subsided. 

How would we log them all? 

Sandeep had worked on a Google sheet that easily allowed the team to log the serial number of each device, the date received and the date dispatched. We’d been working on delivery receipts for the schools to sign, so were able to log that these had been signed for as received. But still – 1600 devices is quite tedious and time-consuming to type into a sheet. Enter three barcode scanners that made life super easy in this dept! Beep, beep, beep!

Lesson learned early on: best to scan small batches and check that they had all logged on the spreadsheet. The team had got to number 456 and realised there weren’t the correct number of populated rows in the sheet. With no way of knowing which had been missed, it was a case of learning the hard way and re-scanning them all. Going forward, they scanned in 10s and checked that the number of populated rows correlated. It turned out there were connectivity issues, with the internet dropping out while scanning, so the team was now alert to this and managed it better. 
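A minimal sketch of that verify-as-you-go idea, for anyone tempted to reuse it. The function name, batch size and serial numbers below are invented for illustration – this isn’t the team’s actual spreadsheet setup:

```python
# Illustrative sketch only: check each small scanned batch against the
# rows that actually landed in the sheet before moving on.
BATCH_SIZE = 10  # the team settled on scanning in tens

def verify_batch(scanned, sheet_rows):
    """True only if every scanned serial made it into the sheet."""
    missing = [s for s in scanned if s not in sheet_rows]
    return len(sheet_rows) == len(scanned) and not missing

# If a connectivity drop swallows a row, only this small batch needs
# re-scanning, instead of all 456 devices.
scanned = ["SN001", "SN002", "SN003"]
sheet = ["SN001", "SN003"]           # one row lost to a dropout
print(verify_batch(scanned, sheet))  # False -> re-scan just this batch
```

Scanning in small batches keeps the cost of a failure bounded: at worst ten re-scans rather than hundreds.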

How would we assign them to the destination? 

Once we had the names of the schools, the numbers going to each school and the type of device (Chrome or Windows), we got super high-tech: I sat with a bunch of post-it notes and wrote out stickers – one post-it with the name of the school on it for each batch of 5 or fewer.

  • Eg: one school had 53 devices
  • I wrote out 11 post it notes with the name of the school on. 
  • The team would scan 5 devices, tape them up and stick one post it note on it 10 times. 
  • The final 3 devices were scanned and batched as a 3.
  • For that school, there would be 10 bundles of 5 and 1 bundle of 3.  
  • There’s probably a much better way of doing that, but in the time we had, we couldn’t think of it. 
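For what it’s worth, the bundling arithmetic in the steps above can be sketched in a few lines (the school name and function are made up for illustration):

```python
# Sketch of the post-it bundling arithmetic: devices go into bundles of
# 5, with any remainder taped up as a smaller final bundle.
BUNDLE_SIZE = 5

def bundles_for(school, count):
    full, remainder = divmod(count, BUNDLE_SIZE)
    sizes = [BUNDLE_SIZE] * full + ([remainder] if remainder else [])
    return [(school, size) for size in sizes]

labels = bundles_for("Example Primary", 53)
print(len(labels))  # 11 post-it notes: 10 bundles of 5, 1 bundle of 3
print(labels[-1])   # ('Example Primary', 3)
```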

How would we batch them for delivery? 

We received the names of the schools and Sandrine and Marta were able to help in pulling these into a map. 

We knew the van had a minimum capacity of 100 devices, so Marta did a fabulous job of grouping the schools into local areas of no more than 100 devices each. This gave us the number of runs that would need to be made, and decided what went into each batch for each run. We now knew we had a total of 7 batches to get out. 
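That grouping exercise can be sketched roughly as a greedy pass over schools sorted by area, starting a new run whenever the van would go over capacity. The school names, areas and device counts below are invented:

```python
# Rough sketch of grouping schools into van runs of no more than 100
# devices, keeping schools in the same local area together.
VAN_CAPACITY = 100

def plan_runs(schools):
    """schools: list of (name, area, device_count) tuples."""
    runs, current, load = [], [], 0
    for name, area, count in sorted(schools, key=lambda s: s[1]):
        # Start a new run if this school would push us over capacity.
        # (A single school over 100 devices would get a run to itself.)
        if load + count > VAN_CAPACITY and current:
            runs.append(current)
            current, load = [], 0
        current.append(name)
        load += count
    if current:
        runs.append(current)
    return runs

schools = [
    ("School A", "North", 53), ("School B", "North", 40),
    ("School C", "South", 80), ("School D", "South", 30),
]
print(plan_runs(schools))
```

A real version would weigh distances between areas too; this only captures the capacity constraint.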

How many deliveries could we do in a day? 

We weren’t clear on the number of vans or drivers per day, so the worst-case scenario was 7 working days, which would take us up to the 10th. We had to hope everything was delivered and there were no returns! 

As it turned out, we had two drivers, and one van had a capacity of 200 devices. We were immediately ahead of schedule and all went home well smug. 

However, in the following days a couple of schools weren’t open for delivery and one driver was off sick, so we fell behind – but then raced ahead by re-delivering failed first attempts and ended up ahead of schedule! We also had some last-minute additions to deliver on the final day. We had to keep everything crossed that there were no hiccups and they all got delivered ok. And that they did.

We had one school (with over 100 devices) that we weren’t able to get hold of so these had to be safely stored in HLT for them to deliver at a later date so we could clear Robert House on the 10th as agreed. 

This was a short, fast-paced project to deliver, and despite so many unknowns and variables, the team performed amazingly well to get it over the line ahead of schedule and keep on track despite setbacks.

Oh, and the Mayor popped by with Cllr Bramble. Which was nice. 

Lies, damn lies and YouTube analytics – Part 1

Or the boring but effective title: What YouTube analytics are telling us about virtual council meetings

TL;DR

  • Online quizzes with your family are no longer fun, more like a war zone
  • Statutory council meetings are taking place online and available in real time to citizens
  • YouTube analytics are revealing things about viewer engagement that suggest we need to rethink the regulations and experiment with how council meetings are led and managed
  • Meetings are viewed asynchronously, rather than in real time
  • Viewers are watching short sections, rather than an entire meeting
  • Engagement across different types of meeting varies significantly
  • Don’t be a rookie analyst – large viewing numbers don’t equate to a successful, effective or valuable meeting. 

Back in April – when doing a virtual quiz with our relatives still felt like a novelty, not a chore – Parliament passed regulations making provision for statutory council meetings to take place remotely. Councillors could, for the first time, meet online and continue their role of governance and scrutiny virtually. To ensure access and accountability for local citizens, the regulations also stipulated that meetings should be available online and in real time. 

For many, the prospect of delivering council meetings virtually sparked ambition for increased participation. At Hackney, we’ve been live streaming our statutory meetings via YouTube. Early indications show that more people are watching the live stream than would have attended a meeting in the town hall prior to COVID-19. Whether this trend will continue is something we need to assess over the coming weeks. 

In June and July we’re collecting baseline data for all our statutory meetings at Hackney. We are using a combination of qualitative feedback and YouTube analytics. The analytics are revealing things about viewer engagement and behaviour that suggest assumptions in regulations are not reflected in online behaviour, and that the structure and delivery of meetings may need to change to increase participation levels.  

Here’s the health warning. The analytics tell us very little about the inherent value of a meeting or how effectively it was managed. These things need to be measured in other ways. To be clear, a low number of views doesn’t mean a meeting has little value or is ineffective, and popularity should not be taken as a proxy for quality. 

What do we know so far… 

Engagement varies hugely between different types of meeting

This makes it difficult to draw conclusions that are relevant for all meeting types, but over time we expect to see trends emerge for specific committees and commissions. In June scrutiny commissions attracted a higher number of views, both concurrent (in real time) and asynchronous (watching afterwards), as did our planning sub committee. 

People are viewing in their own time, rather than real time

Concurrent views measure the number of people watching in real time. This is small in comparison to the number of people who are watching asynchronously across all meetings. The regulations assume that people would want to watch the meeting as it was taking place, but the analytics we have don’t bear this out. 

People watch short sections, rather than the whole thing

People aren’t watching the entirety of a meeting. This chimes with online behaviour: we tend to skim and skip through online content, and have grown used to bite-sized pieces of information or entertainment. In 2020 it’s anathema to give your undivided attention to something for three hours. 

Long meetings, agendas, copious reports

Long meetings have traditionally been a bit of a badge of honour – a reflection of their seriousness and gravitas. Lovingly prepared, lengthy reports by Council officers are held in similar esteem. I’m not seeking to decry or undermine the Herculean efforts of councillors or officers (I’ve written and presented those reports myself). My point is that neither translates well into an online environment. The implications of that are huge – more than one blog post can convey, or one Council can tackle alone.

Conclusions, thus far

The new regulations are a game changer in some ways and not in others. No-one anticipates a wholesale rollback to face-to-face meetings – virtual is here to stay. The big question is how its value can be sustained and made meaningful. 

Lifting and shifting a face-to-face process online isn’t the answer. The emergency regulations replicate a process that is over 40 years old – something that wasn’t designed for internet-era culture, unresponsive to short feedback loops, out of step with people’s expectation of service provision and a 24 hour news cycle.

How might we: 

  • Improve the experience for viewers watching asynchronously (watching after the live stream has taken place)?
  • Improve the experience for viewers who skim content and watch short sections?
  • Experiment with structure and delivery of meetings?

We’re continuing to measure our meeting analytics in July and I’ll blog again about the results. 

Rediscovering leadership

In my (nearly) two years at HackIT as our Lead User Researcher, I’ve been really lucky to have a range of opportunities to develop my skills individually and also within our teams.

‘Things to Make You Think’ talks that our colleagues in the Learning and Organisational Development team regularly set up have been great. These are run with respected professionals across different sectors, sharing their personal experiences and challenges that help you reflect on what you do. Also, last year our team visited NatCen to spend time perfecting our skills in conducting robust in-depth interviews, essential when we regularly conduct research with our residents and staff.

Recently, we’ve partnered with Stride – a new type of leadership product. The Stride team are on a mission to democratise leadership development. I’ve joined 9 colleagues, all of us from different backgrounds and different points in our careers, in piloting the product. 

Right, I’ll be honest. When the opportunity first came up, I did see it and think “Oh no, not one of those leadership courses!”. I think the term “leadership” comes with a lot of baggage. Historically, I’ve seen it as a bunch of similar people from similar backgrounds in a room, learning about academic theories about how to be a “better leader”. 

Now, I know that might seem harsh. I know these programmes work for lots of people but, for me, I was always turned off by the idea. Personally, they just didn’t feel human. Whilst I always want to learn things that are based in academic rigour, I’d want them designed in a way that helps you connect with the people around you – not feel distant from them. 

However, even with that expectation in my head, I decided to give it a go. Maybe it is something a bit different.

Striding for the first time

Stride is based around an app which you can use whenever you want to help build your leadership practice. The early version of the product covers topics including giving feedback, setting goals and objectives and also understanding your values.  The Stride team is working to add more to the app all of the time.

Stride has also been running a series of webinars every other Thursday which expand on topics you cover in the app and we’ve set up a weekly ‘buddy’ check-in with a colleague who is also trying it out to compare notes.

When I first started using the app, I was quite critical. I initially found it difficult to connect and identify with the content. Maybe I wasn’t giving it a chance, but it did take a bit of time to get into the habit of regularly taking time out to reflect on my skills and practice.

One thing I did like was that I didn’t have to commit weeks at a time to something. Checking in with the app little and often meant I kept going with it. I was also really strict about keeping the time in to watch the webinars and check in with my buddy – something I’ve had to be more disciplined with than ever, especially whilst working remotely.

Whilst I wasn’t initially seeing the change myself, the people around me were. My team told me I was having more stretching conversations with them that got them thinking. My line manager could see how I became more energised about planning and developing both myself and people within the team.

Spelling out what we stand for

One of the strongest things that came out of my learning with Stride was really understanding what you, both as individuals and as a team, believe in. The reason you get up in the morning to do the job that you do. The beliefs you hold, and never compromise on, that help you deliver your best work. As individuals we had our own ideas, but we’d never got them down anywhere together or formally used those beliefs to shape what we do as a group.

We have got council wide values but I wanted to make them more relatable for us as a team. I set up an hour-long remote session over Google Meet where we individually reflected and then shared with the team what those broader values mean for us. 

I was slightly apprehensive going into the session as I wasn’t sure what we’d come up with. Doing everything remotely also felt like it was going to make it more difficult. However, by the end we came up with what our values are as user researchers – the reason why we do the job we do. 

Here is what we came up with: 

Proud

Crafting accurate research that compels teams to make people centred, evidenced decisions that impact residents’ lives.

Ambitious

Acting with courage to present our work far and wide; seeking opportunities to develop ourselves and support others in our sector to be the best we can be.

Pioneering 

Always evolving our research practice, not being afraid to try new things whilst being the experts on using the right research method at the right time to get a fair, accurate picture. 

Open

Truthfully sharing what we’ve learnt and how we’ve got there to impact decisions across the wider community whilst continually improving our own work.

Proactive

Pushing for ways research can continually make improvements to council services, successfully embedding ResearchOps within our team so we can do our best work. 

Inclusive

Representing the diversity of the people we serve, portraying the lived experiences of those who may be underrepresented whilst always treating participants and their data with the respect they deserve.

Reflecting on the journey

Using Stride has changed my perspective when I hear the word ‘leadership’. Leadership now feels more relatable, accessible and human for me. It’s not just something reserved for the few at a certain time in their career but something anyone can practice to get the best out of themselves and the people around them. 

Next week, the council is running our annual Leaders’ Conference – a week of online events to get people thinking more about what being a leader really means for them. I’ll now be going into that with a different perspective.

At the end of the day, we are all leaders – we just might not know it yet.

Why getting the right name for your APIs is so important

Series 1- Chapter 2

To continue from our previous blog post about our journey of defining platform and service APIs and what we learned from it: we knew that the next step in this journey should, without a doubt, be data taxonomy.

Organizations are often locked into legacy, middle-aged, clunky (;-)) databases. We face the challenge of on-premises databases where basic database principles haven’t been applied. In other cases, we found that people are still reluctant to bid goodbye to their tightly coupled processes, or are scared of any process change that might impact them. These databases are typically a developer’s nightmare, especially when it comes to integration. I have also been in situations where we would like to open up the data from the source database and make it more available (in a secure manner, of course). So when we’re building an API to unlock data from a legacy application or business process, our challenge is: ‘What do we call that API so that it’s clear what data it presents, without assuming how it should be used?’

So this is why data taxonomy is so important to the design of our Platform APIs. 

Data taxonomy is the process of classifying your data and building a hierarchical structure, which becomes the foundation for the API. Across our APIs, the taxonomy then helps us explain the relationship between different data models.
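As an illustration only, a taxonomy like this might be expressed as a nested structure. The domain names below are assumptions loosely based on the datasets mentioned later in this post, not our actual model:

```python
# Hypothetical taxonomy for the "People" and "Properties" core entities,
# expressed as a nested mapping: dicts are sub-domains, lists are leaves.
taxonomy = {
    "People": {
        "ResidentInformation": {
            "SocialCare": ["adults", "children"],
            "Housing": ["tenancies", "benefits"],
        },
        "ContactDetails": [],  # deliberately kept as a separate layer
    },
    "Properties": {
        "LLPG": ["addresses"],
    },
}

def domains_under(node, path=()):
    """Walk the hierarchy and yield the path to every domain."""
    for key, child in node.items():
        yield path + (key,)
        if isinstance(child, dict):
            yield from domains_under(child, path + (key,))

for p in domains_under(taxonomy):
    print(" / ".join(p))
```

Writing the hierarchy down like this makes the relationships between domains explicit before any API is named or built.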

We started our journey by identifying our core entities and modelling different data domains around them. Believe me, the workshop was well executed, and we ended up with a nicely drawn spider web of all the domains related to our core entities – People and Properties. This step was really important for building our landscape of Platform APIs and understanding how usage flows between different services, as we did not want to get our fingers burnt again. It also helped us identify future platform APIs to develop. 

We picked “People” as the first core entity. But in a local government context, that means different things depending on the context. For example, the housing tenant might be different from the council tax payer and the holder of the parking permit. So we refined that further into a Resident Information API (we’ve decided that contact details should sit as a separate layer). I’ll be honest here: initially I thought this would be easy, but slowly reality kicked in. In our discovery session, we realized we have 26 data sources – maybe more – that store contact information about residents. The team was gobsmacked. We immediately asked how we would cope with this mammoth task. The team knew this would involve a lot of iterations and improvements, so we decided to start with baby steps and learn from them. 

We started with 6 initial datasets which we thought would be a good starting point:

  • Adults’ and children’s social care data
  • Housing dataset
  • Housing benefits dataset
  • LLPG data (holding the address information)
  • Flags we’d identified that might indicate someone was vulnerable (eg living alone)
  • Data about people who asked for help during COVID

The use case that followed was to provide a platform API that retrieves data from individual line-of-business application APIs, in order to provide a consolidated view of all the data we hold about a given person across multiple data sources. As part of a future iteration, we are also considering storing audit logs of any changes made to the data. 
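A hedged sketch of that consolidated-view idea: the platform API calls each line-of-business API for the same person and merges the results. The source names and fetcher functions below are hypothetical stand-ins, not our real services:

```python
# Hypothetical line-of-business fetchers; in reality these would call
# the individual application APIs over HTTP.
def fetch_social_care(person_id):
    return {"source": "social-care", "name": "J. Doe"}

def fetch_housing(person_id):
    return {"source": "housing", "tenancy": "T123"}

SOURCES = [fetch_social_care, fetch_housing]  # in reality, 6+ datasets

def consolidated_view(person_id):
    """Return everything each source holds about one person."""
    view = {"person_id": person_id, "records": []}
    for fetch in SOURCES:
        view["records"].append(fetch(person_id))
    return view

result = consolidated_view("12345")
print(len(result["records"]))  # one record per data source
```

The platform API stays a thin aggregation layer here; matching and de-duplicating records across sources is the harder problem the Citizen Index work speaks to.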

Hackney has a track record of tackling this issue. Our Citizen Index is more than 10 years old and enables us to match people’s records across business applications and verify their identity. So next, we will build on our understanding of this to determine whether or how we create linkages between different records of people. 

Having a data taxonomy designed for our APIs helps us to think in the right direction in terms of data relationships and building a knowledge model around different types of data domain. In other words, the taxonomy for a given domain should be the foundation step for any API development. Having an idea of the relationship between those domains will help you when designing your microservice architecture, simplifying development and integration. 

What’s Next?

  1. Building the SwaggerHub definitions and best practices
  2. Publishing our domain structure
  3. Securing our data
  4. Data Migration Mechanism

Appendix

Data classification is the process of organizing a dataset into relevant categories, which enables more efficient data management, improved data security, and streamlined compliance audits when it comes to sensitive data.