During my years as a consultant I have worked with about 45 municipalities, running projects in the fields of digitalisation, management and leadership, holding workshops and carrying out interviews. I have gained valuable insight into how municipalities manage their organisations via selected strategies, the processes and organisational structures they implement, and the resulting challenges. I’d like to share with you the fruits of my long experience and offer you solutions in this and future blog posts.
In a municipality there are many different challenges when it comes to improving the efficiency of decision-making processes. This is an area in which I have a keen interest: I want to understand each organisation’s unique needs and prerequisites in depth, and on that basis put together the right set of tools, methods and skills.
In this blog post I will focus on two fundamental factors that often disrupt efficient decision-making: namely, the lack of access to collected data, and the lack of understanding and knowhow about this data.
Lack of access to collected data

In a municipality there are many different systems featuring a wide variety of data. Some of this data can be found in shared systems such as finance, HR and purchasing, but there are also operation-specific systems in education, health and social care, urban planning and so on. The problem arises when we want to analyse key figures that require data from, for instance, both the finance system and the education sector’s operation-specific systems. These systems often don’t speak to each other, and in many cases the required information has to be collated manually: very often by exporting data to Excel, where calculations are made and the results are visualised as a diagram or graph, which is then used for an analysis presented to a parent, politician or student. This is a very time-consuming process, and any manual process carries a risk of inaccurate data and of accidentally spreading sensitive data.
In order to create the foundation for good analysis, it is first necessary to review all your systems and examine how they can start talking to each other. In other words, make sure that it is quick and easy to extract data from the different systems, transform it, and then load it so it is ready for analysis (the classic extract-transform-load, or ETL, pattern). This is done in something called a data warehouse, which is the very heart of an analysis platform. Only after this is the information quality-assured and ready for analysis and visualisation. The time aspect is important: cutting the time it takes for extracted data to be loaded and ready for analysis is an important step on the way to becoming an efficient, data-driven organisation.
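The extract-transform-load idea above can be sketched in a few lines of Python. This is a minimal illustration, not a real integration: the finance and education tables, their column names, and the cost-per-pupil key figure are all invented for the example, and an in-memory SQLite database stands in for the data warehouse.

```python
import sqlite3
import pandas as pd

# Extract: in practice these tables would come from each system's
# export or API; here they are hypothetical sample data.
finance = pd.DataFrame({
    "school_id": [1, 2],
    "budget_sek": [4_000_000, 2_500_000],
})
education = pd.DataFrame({
    "school_id": [1, 2],
    "pupils": [400, 310],
})

# Transform: join the two sources and derive a key figure
# (cost per pupil) that neither system holds on its own.
merged = finance.merge(education, on="school_id")
merged["sek_per_pupil"] = merged["budget_sek"] / merged["pupils"]

# Load: write the combined table into the "warehouse",
# represented here by an in-memory SQLite database.
conn = sqlite3.connect(":memory:")
merged.to_sql("key_figures", conn, index=False)

# The analysis and visualisation layer can now query one place
# instead of collating exports by hand.
result = pd.read_sql("SELECT school_id, sek_per_pupil FROM key_figures", conn)
print(result)
```

The point is not the tooling (a real setup would schedule such jobs and add quality checks) but the shape of the flow: once the join and the calculation live in code rather than in a hand-built Excel sheet, the step from raw system data to analysable key figures becomes repeatable and fast.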
There are many different end-user tools such as QlikView (or Sense) and Power BI, each with their own advantages and disadvantages. I will talk more about specific tools in a separate blog post.
Data extraction can also create the foundation for future improvements. One example of such an improvement is the way the ePsychiatry Unit in Region Västra Götaland secured better patient discharge decisions by predicting which patients were likely to be readmitted soon after discharge. This solution minimised readmission risks, cut health-care system costs, and gave greater insight into which factors impacted patient readmission rates. Read more about this by clicking on the link to our customer case: Better patient discharge decisions with machine learning
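To make the readmission example concrete, here is a hedged sketch of how such a risk model can be trained. The features (length of stay, number of prior admissions), the synthetic data, and the model choice are all my own illustrative assumptions; they are not taken from the Region Västra Götaland solution itself.

```python
import numpy as np
from sklearn.linear_model import LogisticRegression
from sklearn.model_selection import train_test_split

rng = np.random.default_rng(0)
n = 500

# Hypothetical per-patient features: length of stay in days,
# and number of prior admissions.
X = np.column_stack([
    rng.integers(1, 30, n),   # length_of_stay
    rng.integers(0, 6, n),    # prior_admissions
])

# Synthetic label: in this toy data, more prior admissions
# means a higher chance of readmission.
y = (X[:, 1] + rng.normal(0, 1, n) > 3).astype(int)

X_train, X_test, y_train, y_test = train_test_split(X, y, random_state=0)
model = LogisticRegression().fit(X_train, y_train)

# At discharge time the model yields a readmission-risk score
# per patient, which can inform the discharge decision.
risk = model.predict_proba(X_test)[:, 1]
print(f"test accuracy: {model.score(X_test, y_test):.2f}")
```

A production model would of course use richer clinical features, careful validation, and strict handling of sensitive data; the sketch only shows the basic pattern of turning extracted data into a per-patient risk score.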
To get started on extracting data, we generally work with the client to conduct what we call a ‘health check’. We do this to quickly gain an honest and objective picture of the client’s existing systems. We work on the basis of the operation’s actual needs and suggest a plan of action to begin extracting the data needed for accurate analyses.
Lack of knowhow and understanding about how to interpret your data

In many cases a support function has produced data for, say, a school principal who then wants to analyse it. The aim may be to gain insight into whether the school is doing well in a particular area, or to evaluate whether improvement initiatives need to be implemented. Usually, however, the person whose job it is to analyse the data does not have sufficient knowhow about how to use it, which can lead to incorrect conclusions being drawn. In the field of analytics we call this Data Literacy (the ability to read, work with and communicate data in a given context).
Data Literacy is a crucial skill once your data has been extracted. In order for the organisation to be able to draw the right conclusions, generate new analyses and implement relevant improvements, it is important to train the people who will analyse the data so they understand where it is coming from, how it is defined, and how it ties in together with other data.
A few tips for spreading and enhancing knowledge about your data:
If you would like more in-depth information about Data Literacy and what it involves please click on the link to our blog post: What is Data Literacy?
Do you have a different perspective on how to secure good analyses, or would you like to discuss how best to create good decision-making support? If so, don’t hesitate to get in touch; I’m very interested in your thoughts.
And don’t forget to continue following my blog series about the challenges that municipalities and county councils face in quality-assuring information, analysing data and improving the efficiency of their decision-making processes.