Delegates from fifteen Singapore government agencies
gathered for an OpenGov Breakfast Insight session on data analytics on October
26. Mr. Mohit Sagar, Editor-in-Chief at OpenGov Asia, opened the discussion by
talking about the need to put the power of data analytics in the hands of
business users, rather than keeping it with the IT department. It is about
empowering the end-users.
Mr. Charlie Farah, Director, Healthcare and Public Sector,
APAC at Qlik, talked about six universal trends in the context of data and the
public sector. The first is rising costs, putting pressure on governments to
find efficiencies in the way they are spending public money. Then there is a
growing citizen thirst for open data. They want more transparency about how
their government is working.
There is a strong impetus for cross-agency collaboration.
Government agencies are breaking down silos and sharing information across
government, finding links and connecting services together. Connectivity and digitisation
are two other important trends, not just within countries, but with counterpart
agencies overseas too. The final trend is governments using data to drive improvement.
Mr. Farah said, “In an ideal world, it would be perfect to
have all your data sitting in one beautiful data warehouse or data lakes and
you can start doing your analytics and visualisations on top of that.”
But in the real world, data comes from multiple sources:
operations, finance, workforce, supply chain. Then there is citizen data from
multiple sources for governments. One option is to combine all the sources into
a single data warehouse, at an enormous investment of resources and time. The
alternative is to connect the existing data points from different
sources and start gaining insights.
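That second approach, joining existing sources on a shared key instead of consolidating everything first, can be sketched in a few lines of Python. All system names and figures below are hypothetical, purely for illustration:

```python
# Hypothetical example: linking two existing data sources on a shared
# key (a programme id) to derive an insight, without first moving
# everything into a single warehouse.

finance = {  # spend per programme, from the finance system
    "P1": {"spend": 120_000},
    "P2": {"spend": 80_000},
}
operations = {  # caseload per programme, from the operations system
    "P1": {"cases_closed": 300},
    "P2": {"cases_closed": 160},
}

def cost_per_case(finance, operations):
    """Join the two sources on programme id and derive a new metric."""
    result = {}
    for pid, fin in finance.items():
        ops = operations.get(pid)
        if ops and ops["cases_closed"]:
            result[pid] = fin["spend"] / ops["cases_closed"]
    return result

print(cost_per_case(finance, operations))  # {'P1': 400.0, 'P2': 500.0}
```

Neither source system changes; the insight (cost per case) emerges from the link between them.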
Today there is a lot of buzz around visualisation. But visualisation
is the last mile. It is about visual analytics, Mr. Farah said: being able to
connect data to see the story and make evidence-based decisions. The idea is
to put the functionality and the tools in the hands of the people on the ground.
Data at the Transport Accident Commission, Victoria
Mr. Bernie Kruger, Business Intelligence and Data Science Lead,
from the Transport Accident Commission (TAC)
spoke next. The TAC is a
Victorian Government-owned organisation whose role is to promote road safety,
improve the State's trauma system and support those who have been injured on the
roads, through an insurance scheme. The TAC is a 'no-fault' scheme. This
means that medical benefits will be paid to an injured person regardless of who
caused the accident.
TAC has a two-pronged 2020 strategy. It aims to reduce the
number of fatalities on the road towards zero. And it aims to deal better with
clients’ serious injuries, helping them get their lives back on track.
Traditionally the focus was on the scheme’s liabilities.
There were between 600 and 1,000 rules to apply and check with every single
payment, to prevent fraud. Now the focus has shifted to the client. He said
that it is all about the client and client-centricity.
Data insights will be a critical enabler for this strategy.
TAC plans to establish an
enterprise-wide approach to translating data. This will allow research and data
to be shared across the organisation.
Mr. Kruger went on to outline the challenges faced in the
use of data. Often there is inadequate buy-in from senior management. They do
not consider data to be an asset; it is seen rather as an operational tool. The
value of the data is seldom measured. Organisations are not aware of the
monetary value they can attach to certain data of a certain quality. Moreover,
there is chronic underinvestment in IT.
In addition to poor data quality, there is often a lack of
good data governance, further complicated with the move to the cloud. Who owns the
data: the business or IT? Weak data lineage also leads to problems. For
instance, Mr. Kruger said you tap into operational data and create a report. If
the operation changes, how easy is it to alter your report?
TAC was using a specific software as a Swiss Army Knife, for
everything from loading the data to cleaning, storage etc. But being married to
one software like this can be a trap. The total cost of ownership (TCO) can
turn out to be very high.
Another challenge is that the BI (business intelligence) team/
data scientists are viewed as service providers in many organisations. This
kind of culture hampers collaboration and results in an us vs them mentality.
How is TAC addressing these challenges? Firstly, a lot of
data management activities are being automated, enabling far more focus on high-value analytics
and leading to better reporting and better insights. Everyone is taken
along on the data journey, not just executives or IT. The benefits of data are
shared. People are shown what an analyst’s day involves and the
challenges involved in extracting insights from data. Solutions have to be
designed together with the business. Data scientists should not just take the
requirements and design the solution; the business has to be a part of the process.
If everyone is not on board, projects are going to fail.
Today everyone wants to jump straight to advanced analytics and AI,
but the entire data supply chain has to be dealt with. “We have implemented data
science to quickly get benefits out of predictive analytics, machine learning,
network analytics, geospatial analytics,” Mr. Kruger said. It is an
experimental approach. Whenever anything works, its results are shared. Open
source tools are used. For example, the data science team at TAC has adopted R
and has been using it extensively. There are analytics pockets all through the
organisation. To bring them together a Data, Reporting & Analytics competency centre has been set up.
Other considerations are the adoption of agile and design
thinking principles, borrowed from the software development lifecycle, whether
to have an enterprise data warehouse or a data lake (why not have both!), data
discovery (allow the end-users to discover the data and its value) and cloud vs
on-premise. There is still pushback against cloud computing due to security and
privacy concerns. But the massive computing power available in the cloud is a
major consideration for Mr. Kruger and his team.
TAC also extensively shares data with other agencies, which
results in richer data and richer insights. For instance, if someone is in an
accident, the ambulance picks them up. Their data is captured on an iPad. That data goes to the hospital. Before the
person is even in the hospital bed, a claim is lodged on their behalf with TAC.
The Trauma Reception and Resuscitation (TR&R®)
project is another one. It is a decision support system for the trauma
clinicians regarding resuscitation of the patients and the relevant protocols.
The system receives information from the ambulance and the vital signs monitor, and
displays it on Google Glass right in the clinicians’ field of vision. Algorithms
prompt the Trauma Team in real time to confirm the state of the patient,
perform procedures and administer drugs as well as assisting with diagnosing
injuries. Ultimately the data
is integrated back into TAC’s systems, so that it can follow up on the claim.
Polling questions and discussion
When asked about top drivers for improving business
information usage in public sector organisations, the response from the
delegates was split between improving speed and accuracy of decisions,
improving and optimising process performance, developing better policy/
products/ services and achieving better business transparency.
Mr. Chia Ti Yu, Director (Finance, Systems & Projects), Ministry
of Finance, said that the driver would vary according to the role of the
organisation and the individual. For his role, the primary use of data would be
optimising process performance; for a more frontline role, using data might be
about improved citizen interaction or developing better policy and services.
The inward and outward facing organisations have different priorities.
For instance, an agency like GovTech (Government Technology Agency of
Singapore) would be more focused on citizen satisfaction.
Around 56% of delegates rated their organisation’s use of
data and data analytics tools as fair (“we use data in our decision-making process,
but analysis is primarily a manual process”), while 44% rated it as good.
The two biggest barriers identified by the respondents to
integrating more data and analytics into day-to-day decision-making were the
need to manually compile data from many sources and limited or no access to
data. Significant amounts of data might not even be digitised.
Here also, the
situation varies a lot between organisations. For example, financial data is
completely digitised but some hospitals still use pen and paper for certain processes.
There is also a culture issue. One agency tried to get
business users to do more self-service analytics. They experienced pushback: the
users did not consider analytics to be part of their role.
Mr. Sagar asked the delegates if their organisations’
management understands the value of data, or whether analytics is considered an
expense. For some, it is still viewed as an expense.
The agencies represented at the session were at different
stages of their analytics journeys. Several are using a mix of manual tools and
commercially available analytics and visualisation platforms.
In an area like health, at least 60-70% clean data would be
required. A small difference in numbers can make a huge difference in health.
If the data from two hospitals is not of similar standard, they can’t be
consolidated or compared.
There could be different systems. There could be a lack of data
definitions and standardisation. There could be issues regarding a ‘source of
truth’, as in, when the same data is available from multiple sources, which should
be considered the definitive source.
Then with user-generated data, every agency has a slightly
different practice. Mr. Paul Loke, Chief
Information Officer at the Accountant-General's Department, Ministry of Finance,
said that you do not want officials to create a 100-line Purchase Order (PO)
for buying laptops; nor do you want a single line item showing ‘IT
investments’. The latter provides zero visibility. It is about striking a
balance between the two.
Mr. Loke added that while cleaning data, it is important to
know the objective. If the end-result is a dashboard, then data has to be
cleansed. But fraud or crime detection requires dirty data.
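The point about cleansing depending on the objective can be illustrated with a small, entirely hypothetical Python sketch: the same payment amounts are cleansed for a dashboard but kept raw for anomaly detection, where the outliers are precisely the signal of interest:

```python
# Hypothetical example: the same payment records, cleansed for a
# dashboard versus kept "dirty" for fraud detection.

payments = [120, 110, 115, 9_500, 118, 122]  # 9_500 looks anomalous

def cleanse_for_dashboard(values, cap=1_000):
    """Drop extreme values so aggregate charts are not skewed."""
    return [v for v in values if v <= cap]

def flag_anomalies(values, cap=1_000):
    """Fraud detection needs the raw data: the outliers ARE the signal."""
    return [v for v in values if v > cap]

print(cleanse_for_dashboard(payments))  # [120, 110, 115, 118, 122]
print(flag_anomalies(payments))         # [9500]
```

Cleansing the data before fraud analysis would have silently removed the very record an investigator needs to see.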
Mr. Farah pointed out that there will always be some data
quality issues. Organisations should not wait until they achieve 100% data
quality before embarking on analytics.
It is not enough just to get the right, cleansed data. The data
has to be received in a timely fashion. Some organisations such as the Economic
Development Board (EDB) have made the required investments and consolidated
their data. Now users can go and pick up whatever they require. But with
others, data is still in silos. Data stewards sometimes have a protectionist
attitude towards their data and treat data requests with suspicion.
Frequently, once the request is placed, it takes a long time to get the
information. Here, data governance is an area of concern. People are more
comfortable sharing aggregated data.
And at other times, it could simply be a matter of not
having enough time to respond to data requests, while running daily operations.
Challenges remain but progress is being made. All the
agencies have at least made a start on their data journeys. Some have already
laid a strong foundation. In others, pilot projects are demonstrating benefits and
senior management is gradually acquiring a better understanding of the
potential of data.
Concluding the discussion, Mr. Farah said that data
analytics will not provide all the answers on its own. It should complement
human reasoning, enabling people to ask questions of the data. It is about hitting that sweet spot between
Spock’s pure logic and Captain Kirk’s human intuition.