
FORUM ON TAX ADMINISTRATION

OECD Tax Administration Maturity Model Series

Analytics Maturity Model



OECD Tax Administration Maturity Model Series

Analytics Maturity Model


ANALYTICS MATURITY MODEL © OECD 2022



This document, as well as any data and any map included herein, are without prejudice to the status of or sovereignty over any
territory, to the delimitation of international frontiers and boundaries and to the name of any territory, city or area.

This document was approved by the Committee on Fiscal Affairs on 1 June 2022 and prepared for publication by the OECD
Secretariat.

Please cite this publication as:


OECD (2022), Analytics Maturity Model, OECD, Paris.
www.oecd.org/tax/forum-on-tax-administration/publications-and-products/analytics-maturity-model.htm

Photo credits: © Suppachok N – Shutterstock.com

© OECD 2022

The use of this work, whether digital or print, is governed by the Terms and Conditions to be found at www.oecd.org/termsandconditions


Preface

Analytics is increasingly becoming a common and integrated part of tax administrations across the world,
in developed and developing countries alike. Administrations find use for analytics for operational purposes
like reporting, risk modelling and fraud detection as well as for uncovering insight used to improve their
efficiency and effectiveness. A recent FTA report on digitalisation suggested benefits of using analytics in
seventeen different areas within management, taxpayer services, compliance, and tax functions. 1
Administrations that consider investing in analytics often need to assess their current status and research
common practice for the area. We are therefore pleased to present to the international tax community the
Analytics Maturity Model from the Forum on Tax Administration, which can be used for self-assessment
as well as for comparison with other administrations.
We would like to express our sincere thanks to the analytics experts from Canada, Ireland, New Zealand,
Norway and the United Kingdom who helped draft the model in collaboration with the FTA Secretariat.
Many thanks also to the members of the Analytics Community of Interest who helped with revision and
piloting; the Asian Development Bank for their assistance in increasing pilot participation; and the many
tax administrations across the world that have contributed to this report through self-assessment results.
The results included in anonymised format in this report reveal that very few administrations have
consistently assessed themselves to be on a single maturity level across all indicative attributes, and many
cover three levels. The results also show that there are attributes on the Emerging and Progressing levels
for around 90% of the participants, indicating that many administrations deem that they have some way
to go to reach a consistently Established level for analytics. This suggests that there is room for
improvement in the area of analytics across both developed and developing countries.
We therefore believe that this model may prove useful regardless of the size, characteristics and location
of the tax administration, and warmly encourage administrations to read and use the report.

Niall Cody Mike Cunnington


Commissioner Deputy Commissioner Information and Intelligence
Ireland New Zealand

1
OECD (2021), Supporting the Digitalisation of Developing Country Tax Administrations, Forum on Tax
Administration, OECD, Paris. www.oecd.org/tax/forum-on-tax-administration/publications-and-products/supporting-
the-digitalisation-of-developing-country-taxadministrations.htm


Table of contents

Preface 3
Executive Summary 5
1 Introduction 6
What are maturity models? 6
Model development and preparation for publishing 6

2 Using the Analytics Maturity Model 7


General background 7
Maturity levels 7
Layout of the maturity model 8
Recommendations for the self-assessment process 8
Recording of self-assessments 9

3 Results of pilot self-assessments 10


Self-assessment results 10
Self-assessment process 13
Summary 14

4 The Analytics Maturity Model 15


Strategic perspective 16
Operational perspective 21

5 Glossary of terms 29
Annex A. Self-assessment record sheet 32
Process-related questions 32
Self-assessment record 33
Additional considerations 33


Executive Summary

Analytics is increasingly becoming a fundamental and integrated part of tax administration, being used for
operational purposes as well as for uncovering new opportunities for increased efficiency and effectiveness
in fulfilling the administration mandate. The Analytics Community of Interest in the Forum on Tax
Administration (FTA) has therefore, together with the FTA Secretariat, developed an Analytics Maturity
Model to facilitate self-assessments by tax administrations globally of their maturity in the area of analytics.
Maturity models can aid tax administrations in self-assessing their current level of capability, developing a
common, strategy-based understanding of what changes may be necessary, and contribute to identifying
peers that may be able to share relevant experience.
This report contains three parts and an Annex:
• Chapters 1 and 2 introduce the model and offer suggestions for how to use it.
• Chapter 3 summarises the results of the self-assessments conducted by tax administrations that
participated in the pilot phase.
• Chapter 4 contains the Analytics Maturity Model, which can be used for self-assessment and
comparison with the anonymised results in the previous chapter.
• Annex A contains the forms that can be used to record the self-assessment process and results.

Caveat

Tax administrations operate in varied environments, and the way in which they each administer their
taxation system differs with respect to policy and legislative environments as well as administrative
practices and cultures. A standard approach to tax administration may be neither practical nor desirable in
a particular instance. Therefore, this report and the observations it makes need to be interpreted with this
in mind. Care should be taken when considering a tax administration’s distinct practices to fully appreciate
the complex factors that have shaped a particular approach. Similarly, regard needs to be had to the
distinct challenges and priorities each administration is managing. In particular, not all parts of this Analytics
Maturity Model will be relevant for all tax administrations.


1 Introduction
What are maturity models?

Maturity models are a relatively common tool, often used on a self-assessment basis, to help organisations
understand their current level of capability in a particular functional, strategic or organisational area. In
addition, maturity models, through the setting out of different levels and descriptors of maturity, are
intended to provide a common understanding of the type of changes that would be likely to enable an
organisation to reach a higher level of maturity over time should it so wish.
The OECD Forum on Tax Administration (FTA) has published other maturity models. The models and
more information about their usage can be found at: https://www.oecd.org/tax/forum-on-tax-
administration/about/maturity-model-series.htm.
The maturity model contained in this document covers the specialised area of analytics. As with the previously published maturity models, the aim of the Analytics Maturity Model is to:
• Allow tax administrations to self-assess through internal discussions how they see their current
level of maturity as regards the availability and usage of analytics.
• Provide senior leadership of the tax administration with a good oversight of the current level of
maturity based on input from other stakeholders across the organisation. This can help in deciding
strategy and identifying areas for further improvement.
• Allow tax administrations to compare themselves to their peers. An administration will know its
own level and will be able to compare itself to other tax administrations by studying this report. It is
also possible for tax administrations to reach out, through the Secretariat, to other tax
administrations at different levels of maturity for peer-to-peer discussion and learning purposes.

Model development and preparation for publishing

An advisory group of tax administrations from Canada, Ireland, Norway and the United Kingdom developed
the initial draft for this Maturity Model. The FTA Secretariat and the Chair of the Analytics Community of
Interest (COI) from Revenue Ireland revised the draft, which was subsequently piloted among the COI
members.
The FTA Secretariat received a large number of pilot assessments and many useful comments on the
piloted draft. The co-chairs of the COI from Revenue Ireland and the Inland Revenue Department of New
Zealand further revised the model with some assistance from the FTA Secretariat, to take into account the
feedback.
The model underwent a further revision by the FTA Secretariat before final piloting, and was adjusted based on comments from the final round of piloting. In preparation for publication, the results of the self-assessments conducted by pilot tax administrations were added.


2 Using the Analytics Maturity Model


General background

Maturity models are generally descriptive in nature, with a focus on processes and the broad outcomes of
those processes, rather than being heavily based on metrics. This recognises that even where the metrics
chosen may indicate a good or less good outcome, they do not by themselves show how that outcome has
been achieved, the sustainability of the outcome or its robustness and adaptability to changes in the
external environment.
By their nature, maturity models are not prescriptive as to the details of processes nor as to how broad
outcomes should be achieved. There is no one-size-fits-all nor any detailed method that should be
preferred to another in all circumstances. There is also no judgement within the models themselves as to
what the optimal level is for a particular tax administration. This will depend on their own circumstances,
objectives and priorities.
What the maturity model will help an administration assess, though, is where they see themselves as to
their current level of maturity and the kind of processes and broad outcomes they may wish to consider in
order to improve their maturity. In addition, being able to compare themselves to other tax administrations,
or to the average level of maturity of other administrations, can be a useful input to the consideration of
whether the current level of maturity is the right one for them.

Maturity levels

The model sets out five levels of maturity. The reason for choosing five levels is to help make it easier for
administrations to assess where they are by providing clear distinctions in the descriptions of maturity. This
would become more difficult the more maturity levels there are. At the same time, having five levels helps
to ensure that the distinctions between the levels are not so great that it becomes difficult for
administrations to see the pathway to higher levels of maturity.
In designing the maturity model, it was decided to use the middle level, termed “Established”, to provide a
description of where, on average, FTA members may be expected to cluster. Using this as an anchor, the
other levels of maturity were fleshed out by trying to describe the pathway from an “Emerging” level to
“Established”, and from “Established” to what might be possible in the future given expected developments.
The five levels are:
1. Emerging: this level is intended to represent tax administrations that have already developed to a
certain extent but which, at least in the area of analytics, have significant further progress they
could make. The intention is that, in general, the descriptions of this level do not focus on what is
not in place but rather on what is in place, while noting what some of the limitations might be.
2. Progressing: this level is intended to represent tax administrations that have made or are
undertaking reforms in the area of analytics as part of progressing towards the average level of
advanced tax administrations.


3. Established: this level is intended to represent where many advanced tax administrations, such
as FTA members, might be expected to cluster.
4. Leading: this level is intended to represent the cutting edge of what is generally possible at the
present time through actions by the tax administration itself.
5. Aspirational: the intention of this level is to look forward at what might be possible in the medium
term as the use of new technology tools develops and as administrations move towards more
seamless tax administration. Few tax administrations are expected to be consistently at this level
currently, in particular since in some cases it requires cooperation external to the tax administration
(such as whole of government approaches, access to a wide range of data sources etc.).

Layout of the maturity model

The Analytics Maturity Model is organised around the strategic perspective and the operational perspective
to using analytics. To assist in the understanding of what a given level of maturity means, a set of indicative
attributes is contained under each maturity level. As shown by the term itself, these are indicative and not
determinative.
Not all of the indicative attributes under a particular maturity level will necessarily be present in a particular
tax administration. A tax administration may also not fit all of the elements of a particular attribute. A further issue that may arise is that the self-assessment group may feel that in some cases indicators of different maturity levels are met within a particular theme, for example some “Progressing” indicators and some “Established” indicators.
There is no one-size-fits-all that can work across a large and diverse range of administrations. The
attributes are therefore intended to help guide discussions rather than determine them. In using the model,
tax administrations are asked to consider the best fit for them, taking account of both the descriptors and
indicators. The self-assessment group will then need to determine which maturity level it best fits, based
on discussions of the weight it attaches to the importance of particular indicators being present for the
relevant descriptor. The recognition that an administration may not fit all of the indicators may also provide food for thought about possible areas that the administration may wish to consider further.
In some cases the indicative attributes may be additive across the maturity model, and this should hopefully
be clear from the context. They will generally not be repeated across maturity levels. Where a tax
administration meets a number of indicative attributes within the same row, then its level of maturity within
that row will be the highest of the indicative attributes which are met. (For example if “Progressing”,
“Established” and “Leading” indicators in one row are all met, then the level of maturity for that row would
be “Leading”.)
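The row-scoring rule described above can be sketched in code. This is an illustrative sketch only, not part of the report; the function name, level list and data structure are assumptions made for the example.

```python
# Maturity levels in ascending order, as defined in Chapter 2 of the model.
LEVELS = ["Emerging", "Progressing", "Established", "Leading", "Aspirational"]

def row_maturity(met_levels):
    """Return the maturity level for one row of the model: the highest
    level among those whose indicative attributes are met."""
    ranks = [LEVELS.index(level) for level in met_levels]
    return LEVELS[max(ranks)]

# Example from the text: "Progressing", "Established" and "Leading"
# indicators in one row are all met, so the row scores "Leading".
print(row_maturity(["Progressing", "Established", "Leading"]))
```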
It is important to repeat, though, that the indicative attributes are not determinative. Rather, they are
intended to reflect what might be expected, in general form, to be in place at a particular maturity level
which will differ from the level below (for example by virtue of being more demanding or representing a
shift in approach).

Recommendations for the self-assessment process

The Analytics Maturity Model has been designed to be used as a self-assessment tool. To be effective,
this self-assessment should be done in a way which makes the process as objective as possible and avoids
group-think. The following key considerations are based on experience with using this and other maturity
models:


• Sufficient time should be allowed for the self-assessment discussion. Feedback from
administrations suggests that it may take from a few hours to a full day depending on the amount
of preparation before the group discussion.
• Ideally, there should be a range of staff with analytics services and analytics usage responsibilities
involved in the self-assessment, across grades. Care should be taken to ensure that the
conversations can be frank and open, and people should be encouraged to express their views.
• It can be helpful to ask someone outside of the management chain for analytics to facilitate the
discussions. This person should have read this report and understand the process for self-
assessment against the model. As well as facilitating discussions, the person should be able to
challenge the views of the self-assessment group, including asking for supporting evidence where
appropriate.
• Consideration should be given to how to reach a view where there is a division within the self-
assessment group on the appropriate assessment of maturity. The facilitator may, for example,
have a tie-break role.
• In addition to the facilitator, consideration should be given to involving staff from other tax
administration functions, ideally at a relatively senior level, to assist in the challenge function and
to provide insights from their different perspectives. A number of administrations have reported that
cross-organisational conversations when self-assessing can prove highly useful in joining up
different areas of business, helping people to see the scope for synergies and for mutual support
in achieving the administration’s objectives.
• Administrations sometimes find that their maturity matches several levels for a single indicative
attribute, with descriptions from more than one level matching their understanding of the
administration’s situation. In these cases, the administration should choose the maturity level that they find best fits their administration. In some cases, this may mean choosing the lowest level of
maturity, because that will clearly signal internally in the administration that there is room for
improvement.
• When decisions are taken on the level of maturity, it can be helpful to record the main reasons
behind that decision. This will assist in preparing for changes as well as future use of the model
within the tax administration, allowing an easier discussion of what, if anything, has changed.

Recording of self-assessments

The record sheet in Annex A can be used by tax administrations to record the results of their self-
assessment as well as answers to self-assessment process questions and open questions regarding the
model.


3 Results of pilot self-assessments


The Analytics Maturity Model has been tested through pilot self-assessments by 41 administrations from
the Americas, the Asia-Pacific region and Europe. Most of the pilot administrations are FTA members;
twelve of the respondents are developing country tax administrations. The feedback from the pilot testing
triggered a few minor textual adjustments and additional definitions. This chapter summarises the results
from self-assessments carried out by tax administrations.

Self-assessment results

The self-assessment record sheets received from the pilot tax administrations show that the majority assess the maturity of their analytics capacity and usage at the “Established” maturity level. This is
visualised in Figure 3.1, which illustrates the average maturity level for each of the 11 indicative attributes
across the Strategic and Operational perspectives of the model. This seems to indicate that the maturity
model is well calibrated, as the Established level was designed to be a description of the average maturity
level of FTA member administrations.

Figure 3.1. Results of the pilot self-assessments for the 11 indicative attributes of the model

[Radar chart: average pilot participant score on a scale of 0 to 5 for each of the 11 indicative attributes, grouped by perspective. 1. Strategic perspective: 1.1 Strategy, 1.2 Governance, 1.3 Culture, 1.4 Budget setting. 2. Operational perspective: 2.1 IT infrastructure, system development and tools, 2.2 Data management, 2.3 Talent management, 2.4 Business feedback and evaluation, 2.5 Analytics process and project management, 2.6 Analytics capabilities, 2.7 Usage areas.]

Source: FTA Secretariat, based on self-assessment responses.


The detailed results illustrated in the heat map in Table 3.1 show how each of the 41 tax administrations
assessed their maturity level across the indicative attributes. The results are anonymised to ensure that
administrations are not influenced in their use of the Maturity Model by concerns about external
perceptions. However, administrations that participated in the piloting of the model will be able to identify
themselves based on their record sheet submission.


Table 3.1. Results of the pilot self-assessments for the 11 indicative attributes of the model
Each row lists the self-assessed maturity levels (1 to 5) of the 41 anonymised administrations:

1.1 Strategy: 2 2 1 3 2 2 3 2 3 3 2 3 3 3 3 2 3 3 3 3 3 3 2 3 3 2 3 3 3 2 3 3 4 3 3 3 3 4 3 3 4
1.2 Governance: 1 1 3 2 2 2 2 2 2 2 2 1 3 2 2 3 2 3 3 3 2 2 3 2 3 2 3 3 3 3 2 3 3 3 3 3 3 3 3 3 4
1.3 Culture: 2 1 2 2 2 2 1 2 2 3 3 2 3 3 3 3 2 2 4 3 2 2 2 3 3 3 3 3 3 3 3 4 4 3 3 4 3 5 3 4 4
1.4 Budget setting: 1 2 2 1 2 2 2 2 1 2 2 2 1 2 2 2 2 2 2 3 2 3 4 2 2 3 3 3 3 3 3 3 3 2 2 3 3 4 4 4 4
2.1 IT infrastructure, system development and tools: 2 2 2 2 2 2 3 3 2 2 2 3 2 3 3 2 3 4 3 3 3 3 2 3 3 3 3 4 3 4 3 3 3 4 3 4 3 2 4 4 4
2.2 Data management: 2 2 3 2 2 2 4 2 2 2 2 2 3 3 3 2 2 2 2 3 3 2 3 3 3 3 3 3 3 3 2 3 3 4 3 3 3 3 4 3 4
2.3 Talent management: 1 2 1 1 1 2 1 2 3 2 2 3 1 2 2 2 3 2 2 2 3 3 2 3 2 3 3 2 2 4 4 3 4 3 3 3 4 2 3 4 4
2.4 Business feedback and evaluation: 1 1 1 1 2 2 1 2 2 2 2 2 3 2 2 3 2 2 2 2 3 2 3 3 3 3 3 2 4 3 3 3 3 3 3 3 4 4 3 3 4
2.5 Analytics process and project management: 1 1 1 1 2 2 2 3 3 2 3 2 3 1 3 3 3 3 3 3 3 3 3 3 4 3 3 4 4 3 3 3 3 4 5 4 3 2 4 4 4
2.6 Analytics capabilities: 1 1 1 2 2 2 2 2 2 3 3 3 2 3 3 3 4 3 3 3 3 4 3 3 4 4 3 3 3 3 4 4 2 4 4 4 4 4 3 4 4
2.7 Usage areas: 1 2 2 2 2 2 2 2 2 2 3 3 2 3 2 3 3 3 3 2 3 4 4 3 2 3 3 3 3 3 4 3 3 3 4 3 4 5 4 4 4

Heat map key: 1 Emerging, 2 Progressing, 3 Established, 4 Leading, 5 Aspirational

Source: FTA Secretariat, based on self-assessment responses.


With the dark blue and light blue colours (levels 1 and 2) representing the self-assessed indicative attributes below the Established level, it is clear from the pilot results that many administrations find that they would need to make some changes before reaching an Established level of maturity across the entire field of analytics.
This is further illustrated with additional statistics: 26 administrations self-assessed their average maturity
level to be lower than Established, while 13 self-assessed their average level to be higher than Established.

Table 3.2. Average number of times a maturity level was used during self-assessment
Emerging Progressing Established Leading Aspirational
7% 32% 46% 15% 1%
Source: FTA Secretariat, based on self-assessment responses.

Finally, Table 3.2 summarises how frequently each maturity level was selected during self-assessment of the 11 indicative attributes. This illustrates that although the Emerging and Progressing levels are selected more frequently than the Leading and Aspirational levels, the results are clustered around the “Established” category, as intended when the model was built and calibrated. For the time being, therefore, there does not seem to be a need to adjust the model.

Self-assessment process

Feedback from the self-assessment process shows that the process varied considerably between the
participating administrations, in terms of methodology used, the number of staff and managers involved as
well as time spent on assessments:
• Some administrations informed us that they reused their assessment from the piloting of the draft
model in 2021, and adjusted their responses according to the changes in the model and local
circumstances, consequently only spending a few hours on the self-assessment. Other
administrations reported devoting significant resources to the self-assessment process, spending
more than a week to reach their conclusions.
• Around 45% of the administrations that responded to the question reported that they assigned a
facilitator to organise the self-assessment process.
• While almost 95% of the administrations that responded reported that they managed to involve the
appropriate range of staff in the self-assessment discussions, only 60% reported involving officials
from other areas of the administration. Many, especially from smaller administrations, chose to do
an assessment within the analytics team, reasoning that the team had sufficient insight into the
capability and usage of analytics across their administration.
Figure 3.2 shows that the number of staff working with analytics or analytics services involved in the self-
assessment process varied from 2 to 401, with a median of 7 and an average of 22. A similar disparity in
time spent is also visible. Administrations reported spending between 1 and 94 hours to complete the self-
assessment, with a median of 4 and an average of 13.


Figure 3.2. Minimum, median and maximum values for analytics staff in self-assessment group and
self-assessment time

[Bar chart: minimum, median and maximum values for (a) the number of analytics staff in the self-assessment group and (b) the time taken to complete the self-assessment (in hours), on a scale from 0 to 450.]

Source: FTA Secretariat, based on self-assessment responses.

Summary

The range and diversity in data available for tax administration analytics is expanding every year, and it is
likely to increase faster with internationalisation of the economy in most jurisdictions. Complementing this
improved opportunity for useful analytics source data, the methodology, tools and processing for more
efficient and effective execution of analytics is continuously improving. The situation is therefore ripe for
intensified utilisation of analytics in a taxation context, allowing administrations to better fulfil their mandate
with the use of this type of methodology and technology.
With many pilot administrations assessing their analytics maturity level to be below the Established level,
indicated by 39% of per-attribute assessments being Emerging or Progressing and only 16% being
Leading or Aspirational 2, there seems to be significant potential for improvement. Given that these
administrations represent most regions of the world, the set of pilot results is likely to give a good
representation of the actual state of analytics capability and usage in tax administrations worldwide. It will
be interesting to follow the development of this field in the years to come as further progress is made on
the digitalisation and digital transformation of tax administrations.

2
See table 3.2.


4 The Analytics Maturity Model


The organisation of the maturity model into two parts reflects the progression from determining the strategic
approach to implementing the approach through practical action. Both parts also demonstrate the evolution
from the use of analytics being initiated by individuals and teams within the administration, via well-
regulated use of analytics for core tax administration functions, to the widespread use of analytics in
seamless tax administration.


Strategic perspective

This part of the model focuses on the framework within which the analytics activities are carried out, by examining four factors: strategy, governance,
culture and budget setting. The governance of analytics is approached by examining the governance of analytics services, how analytics projects are
prioritised, and the governance of ethics and transparency issues. The different maturity levels reflect the existence and sophistication of the analytics
strategy and governance, and the effects of these through a maturing culture of widespread analytics usage and appropriate funding.

MATURITY LEVELS: EMERGING, PROGRESSING, ESTABLISHED, LEADING, ASPIRATIONAL

Descriptor

Emerging: Pockets of analytics knowledge and good practice may exist in some business units depending on the background and experience of individual managers and staff. At the administration level, although there are some senior sponsors, there is not a shared view of the role of analytics in improving tax administration.

Progressing: The strategic importance of analytics for decision-making and the need for coordinated analytics services is largely understood at the senior level, but there is no overall strategy for analytics use in the administration. The development and use of analytics services are generally driven by individual business units.

Established: A high-level strategy and organisational structure is in place for the coordinated use of analytics, and the governance of analytics services is managed at senior level. The importance of coordinated analytics services and use of analytics for more effective tax administration is prioritised by senior leadership. This is increasingly reflected in budget setting, project planning and IT development.

Leading: Analytics capabilities and practices are well-integrated into strategic planning, performance management activities and operational decision-making across the administration. The importance of integrating analytics with every aspect of tax administration is embedded into the administration culture. An end-to-end governance function for analytics ensures proper prioritisation and value for money.

Aspirational: Analytics capabilities and practices are fully integrated into the administration’s strategy and the organisational processes supporting seamless taxation. The administration is innovation-focused at all levels with government-wide analytics coordination, supporting the use of analytics in assuring the proper application of tax rules within taxpayers’ natural systems.

Indicative attributes


Strategy

Emerging: While there is awareness of the power of analytics in some parts of the tax administration, there is not a consistent view across senior management as to how to develop the use of analytics for improving decision making across the administration.

Progressing: The strategic importance of analytics for decision-making is recognised at senior level and it is encouraged by senior management, but there is no overall strategy for how to improve the use of analytics and analytics professionals in the administration.

Established: A high-level strategy for analytics services is in place, setting out the role of analytics in enhancing the effectiveness and efficiency of tax administration functions and processes.

Leading: A detailed strategic framework is in place for the coordination of analytics services and the integration of analytics into all business areas, including planning for a move towards more seamless tax administration.

Aspirational: The development of the overall strategy to support seamless tax administration is informed and enabled by the use of analytics.

Emerging: Although analytics are used to good effect in some business units of the tax administration, in other units there is little awareness of the potential of analytics to provide new insights, with many decisions taken solely on the basis of the knowledge and experience of individual tax officials (which may not be consistent across the administration).

Progressing: While the use of analytics is increasing, the development and use of analytics services are generally driven by individual business units, including on an on-demand basis, without cross-administration strategy-based coordination. This can lead to analytics work being carried out in silos and hinder the benefits of coordinated analytics.

Established: While the strategy emphasises the importance of coordinated analytics services and improved analytics capability (including as regards the availability and use of data), analytics functions are not yet fully embedded in all business areas, which can affect prioritisation decisions.

Leading: The analytics strategy is informed by extensive internal feedback and external research and is actively supported by senior management. There is increasing engagement with external stakeholders on the development of analytical capabilities to assure system integrity.

Aspirational: The strategy for use of analytics within the administration and within taxpayers' natural systems is co-designed by the administration and external stakeholders, with the aim of enhancing trust and confidence in the integrity of the tax system.

Governance

Emerging: Analytics services governance arrangements differ between units because the overarching governance framework is lacking or ineffectual. Oversight is usually provided by the relevant manager without reference to administration-wide governance principles and with limited visibility at senior management level outside of major reform processes.

Progressing: While a high-level and principles-based governance framework for coordinated analytics services is in place and supported by senior management, there is no centralised follow-up to ensure that this happens consistently, and there is a lack of guidance and support.

Established: An analytics services governance team is supervised at senior management level, in close consultation with business units. Clear guidance is in place and being followed for the prioritisation of analytics services development, ensuring that the needs of all business units are considered and that reuse and multiuse opportunities are utilised.

Leading: An analytics governance board is in place and end-to-end analytics services governance processes are defined, rigorously applied and monitored, to ensure alignment with the administration's business objectives. Analytics governance is integrated with the governance of other IT services, ensuring optimal resource use and prioritisation across the administration.

Aspirational: The analytics governance board incorporates external members to ensure alignment between the administration's analytics processes and those used by other government units and taxpayers. This contributes to seamless taxation systems and integration of tax processes in taxpayers' natural systems.

Emerging: Prioritisation between analytics projects takes place at the business unit level without consideration for administration-wide analytical needs. While informal networks of analysts may exist, there are no formal processes in place to drive coordination across the administration.

Progressing: Principles for prioritising and coordinating analytics projects are in place. Typically, though, analytical projects are initiated by organisational units to meet their own priorities rather than those of the administration as a whole, and often start because there is data available.

Established: Detailed governance processes are in place to help ensure that analytics projects deliver maximum value to the administration as a whole. In practice, though, prioritisation decisions on some projects may be taken at individual business unit level.

Leading: There is co-ordinated oversight of analytics projects which ensures high value by prioritising projects in line with the overall administration strategy. This is a transparent and well-documented process.

Aspirational: Analytics strategy and delivery is subject to regular independent expert review, including by parties outside the tax administration.

Emerging: Beyond compliance with privacy legislation, ethical and transparency considerations receive limited attention.

Progressing: Ethical and transparency matters arising from analytics activities receive some attention from some analysts and managers.

Established: Ethical and transparency issues arising from analytics activities are usually considered by the analysts and their managers, but no consistent process is in place.

Leading: A comprehensive framework for considering the ethical and transparency dimensions of analytics activities is in place and well adhered to.

Aspirational: Adherence to the ethical framework is routinely monitored, including through the use of AI, and subject to independent external review.

Culture

Emerging: Some individual teams and business units actively consider ways to make more effective use of analytics. However, there is no shared culture within the tax administration appreciating the benefits of analytics for the administration and taxpayers, and the benefits are not actively promoted by senior management.

Progressing: There is a growing appreciation across the administration for the potential benefits of analytics, particularly at the senior management level. However, many operational staff remain reluctant to engage with analytical solutions or use results from analytics. There is limited understanding of where value can be added outside of risk management and audit.

Established: The value of analysing data is actively promoted by senior management and supported by the dissemination of examples, staff training and increased collaboration between analysts and business units on opportunities and results. There is noticeable appetite for analytical solutions in most business units.

Leading: Managers at all levels see themselves as champions of digital transformation, and there is an active programme in place to motivate staff and foster a culture of innovation and change underpinned by the use of analytics across the administration.

Aspirational: The critical importance of analytics to seamless tax administration is embedded in core administration professional values. The consequences of this attention are visible in day-to-day behaviours and in an organisational culture focused on innovation.

Emerging: With a generally low level of data literacy and few programmes in place to improve the level, there is only intermittent understanding of data as a valuable asset and the role of analytics in tax administration.

Progressing: Data literacy is improving across the administration through basic training programmes, establishment of informal networks of analysts, and increased collaboration between analysts and other staff.

Established: The general level of appreciation for data as a valuable asset is high among staff at all levels, with a culture of networking, cooperation and knowledge sharing across the administration in general. There is manifest emphasis on, interest in and understanding of the use of analytics for achieving tax administration objectives.

Leading: A strong and cooperative culture is in place across the administration for valuing analytics as part of the range of tools for enhancing tax administration processes, reducing burdens and improving the effectiveness of the tax administration. Data literacy is strong, supported by both basic and advanced training in the use of analytics.

Aspirational: All levels of the organisation understand the analytical process and will identify opportunities for using analytics to ensure the optimisation of the tax system. This culture is supported through continuous training and development which meets the needs of advanced analysts, ad hoc analysts and users.

Budget setting

Emerging: Budget planning for analytics investment and spending tends to occur on a project or business unit level based on the previous year's outcomes, with little consideration for current or future administration-wide needs.

Progressing: Coordination of budget planning for analytics investment and spending across the administration is generally limited to significant analytics projects. There is some analysis of the holistic impacts of budget changes.

Established: The analytics governance team carries out analysis to inform budget planning through engagement with business units, taking into account cross-cutting objectives. Consideration is given to the impacts of investments and spending to enhance analytics capabilities, largely focused on medium-term objectives.

Leading: The analytics governance team considers the costs and benefits of long-term strategic investment and spending in enhanced analytics capabilities. The budget planning process is coordinated with other IT-related functions, ensuring harmonisation with the administration's longer-term objectives for digital transformation.

Aspirational: The budget planning process for analytics investment and spending is fully integrated into administration-wide budget setting processes, taking account of and supporting the integration of tax compliance analytics functions embedded in taxpayers' natural systems.

Operational perspective

This part of the model examines how management and staff in the administration choose to act as the strategy and governance framework develops. The effects of the strategic approach on operations are examined through a range of factors, which can largely be grouped into the technological foundation of analytics and the use of analytics. The evolution in maturity is reflected in the increased quality and scope of the technological foundation for analytics, growing support among management and staff both inside and outside the analytics teams for the use of analytics, increased professionalism in organising analytics work and using its results, and an increasing range of areas benefiting from analytics.

Descriptor

Emerging: Data sources are only partially digitised, the administration lacks a common infrastructure for analytics services, and there are recurrent issues with data quality. Most analytical work is undertaken at the initiative of the analyst or in response to requests from individual users. Analytics tools and techniques are rudimentary. Analytics projects generally do not involve the operational staff.

Progressing: A common analytics services infrastructure is in place, but it is not well maintained and has limited analytics tools. Most internal data is digitised, and some of it is available in a centralised repository, although data quality varies. Analytical needs and opportunities are sometimes considered in the purchase and development of IT systems. Analytics is not yet seen as a core function within the administration. While analysts generally have good basic skills, some opportunities for training and access to basic analytics tools, there is a lack of engagement by most business units, resulting in underuse of analytics.

Established: The common analytics services infrastructure is well maintained, and necessary analytics tools are provided. All significant data sources are digitised, and there is easy access to most data used for analytics, including third party sources, with acceptable matching levels. There is increasing proactive cooperation between analysts and operational staff. Advanced analysts have a good understanding of statistical thinking and key modelling techniques.

Leading: Analytics services are frequently enhanced by emerging technology. All core datasets are comprehensively documented. There is increasing use of unstructured data and big data. Users have access to good-quality operational datasets, complemented by a wide range of third-party sources. The administration utilises analytics tools and advanced techniques effectively across the administration. Analysts work proactively with operational managers to identify business problems and to design and communicate practical solutions, using a broad range of modelling and exploration techniques. Operational users are fully involved at all stages of analytics projects. Analytics informs all tax administration functions, including through a growing number of automated analytics processes.

Aspirational: The administration uses the latest analytics tools, all available structured and unstructured data and, where applicable, agile techniques to maximise tax compliance and minimise burdens. Data is clean and fully documented, increasingly available in real-time if relevant. Analytics and project management capabilities are maintained at the cutting edge, with a strong focus on enabling analysts and operational users to take maximum advantage of the opportunities available. Advanced analysts are trained to postgraduate level in statistical modelling and machine learning, and use a full suite of data visualisation, natural language processing and artificial intelligence tools.

Indicative attributes

IT infrastructure, system development and tools

Emerging: The administration lacks a common infrastructure for analytics services; most data systems are separate, and there is no central repository for data exploration. New opportunities offered by emerging technology are not considered. IT system development generally does not consider analytical needs and opportunities.

Progressing: There is a common infrastructure and central repository for analytics services, but it frequently fails as it is not aligned with other IT systems. Opportunities to improve analytics services through technological development are generally not exploited. Analytical needs and opportunities are sometimes considered in the purchase and development of operational and administrative IT systems but are generally given low priority.

Established: The common analytics services infrastructure is well designed and maintained. A formal routine for synchronising changes with source systems is in place and catches most changes. The tax administration exploits some of the new opportunities offered by emerging technology. Changes to other IT systems only proceed after consideration of analytical needs and opportunities, but these are not necessarily given high priority.

Leading: The common analytics services infrastructure incorporates leading architecture solutions like cloud services as appropriate, and is frequently enhanced based on emerging technology. Automated change information flows from source systems to the analytics services infrastructure, ensuring that changes are implemented in time. Analytical needs and opportunities are a significant factor in decisions regarding IT system development.

Aspirational: The analytics services infrastructure is an integrated part of the wider internal and external network constituting taxpayers' natural systems, executing tax processes and evolving along with the other systems.

Emerging: There is limited access to analytics tools. Most analysis is conducted on spreadsheets after manual extraction.

Progressing: Tools are available for joining and visualising data, but with limited flexibility and reproducibility.

Established: Users have access to a limited range of analytical tools. Where opportunities arise, analytical tools are evaluated against next-best alternatives.

Leading: Users have access to a full suite of analytical tools, including tools for network analysis.

Aspirational: The administration collaborates with external partners in testing emerging analytics technology.

Data management

Emerging: A significant number of data sources are not yet digitised. Many of the digitised data sources are maintained in separate systems with no central repository, leading to difficulties with matching and exploration.

Progressing: Most data sources are digitised, and some data is made available in a centralised repository for reporting purposes. Some of the sources share a common taxpayer identifier. A small number of third party data sources are available, but with significant matching and quality issues.

Established: All significant data sources are digitised. A central repository for most data used for analytics, including third party sources, is in place, with acceptable matching levels supported by a secure digital identity shared by all internal sources. Load frequency and preparation levels largely match analyst needs.

Leading: Operational datasets are comprehensively documented. Users have access to a wide range of third-party sources and unstructured data, increasingly in real-time. This is underpinned by the use of digital identity which is shared across society. Some representative datasets are available for development purposes. Load frequency and adaptation levels match analyst needs.

Aspirational: The comprehensive central analytics repository is increasingly shared with other agencies. Analysts have near-real-time access to data in taxpayer and third-party systems as necessary, and large representative datasets are available for development purposes. Internationally compatible digital identity supports all taxation processes.

Emerging: There is limited awareness of the importance of a common ontology, and no common systems or processes are in place for creating and maintaining metadata.

Progressing: The administration is aware of the need for a common ontology where concepts, terms and structures for analytics source systems are described and harmonised, but this has not been consistently implemented. Creation of metadata is inconsistent.

Established: The administration has implemented a common ontology for core systems. Processes are in place to create and maintain the ontology catalogue, and are largely followed. The ontology catalogue is partially integrated with the central analytics repository and the analytics tools.

Leading: Maintenance of the common ontology for most internal systems is largely automated. Integration between the ontology catalogue, the central repository and the analytics tools is improving.

Aspirational: Maintenance of the common ontology for all internal systems is fully automated, and translation rules exist for all external systems available for analytics. The ontology catalogue is integrated with the central repository and the analytics tools.

Emerging: The organisation as a whole has little awareness of the importance of data quality. Data documentation is generally limited and of varying quality. The data used for analytics has many missing values and errors.

Progressing: There is awareness of the importance of data quality in parts of the organisation, but there is no systematic monitoring of data quality, and error correction is usually carried out manually in an ad hoc manner.

Established: There is a general understanding of the importance of data quality in parts of the administration. Data quality monitoring is largely automated. Error correction, although well organised, is mostly done manually.

Leading: Data quality is increasingly an integral part of the overall business strategy. Data quality monitoring and error correction are largely automated.

Aspirational: Data quality is a central part of the overall business strategy, and there is widespread understanding of the importance of this. Data quality monitoring and error correction are fully automated and happen in real-time.

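The automated data quality monitoring and error correction described at the higher maturity levels can be illustrated with a minimal sketch. This is a hypothetical example, not part of the maturity model itself: the dataset, the field names (`taxpayer_id`, `declared_income`) and the validation rules are invented for illustration, and a real administration would apply far richer rule sets through dedicated tooling.

```python
# Hypothetical sketch of automated data quality monitoring: each record
# is checked against simple rules, and failures are collected into a
# report that could feed an automated error-correction queue.

def check_record(record):
    """Return a list of data quality issues found in one record."""
    issues = []
    if not record.get("taxpayer_id"):
        issues.append("missing taxpayer_id")
    if record.get("declared_income") is None:
        issues.append("missing declared_income")
    elif record["declared_income"] < 0:
        issues.append("negative declared_income")
    return issues

def quality_report(records):
    """Summarise issues across a dataset, keyed by record position."""
    return {i: probs for i, rec in enumerate(records)
            if (probs := check_record(rec))}

records = [
    {"taxpayer_id": "A1", "declared_income": 52000},
    {"taxpayer_id": "",   "declared_income": 41000},
    {"taxpayer_id": "C3", "declared_income": -100},
]
print(quality_report(records))
# {1: ['missing taxpayer_id'], 2: ['negative declared_income']}
```

Running such checks on every data load, rather than manually and ad hoc, is the difference the model describes between the Progressing and Established levels.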

Data security

Emerging: Poor data security management is often observed, increasing the risk of data leaks or the alteration of data.

Progressing: Some security measures are in place, and it is generally possible to trace access and changes to data to identified individuals. However, unauthorised transfer of data (for example to an external drive) is not automatically prevented or detected.

Established: The data security solution ensures individual approval-based access as well as adherence to privacy laws. Disclosure standards, regulations and policies are being established to ensure that security and data risks are well-managed and allow for timely detection of data breach incidents or any cybersecurity threats.

Leading: The fine-grained data security solution allows for flexible and secure sharing of data between analysts. Where access to or use of data goes beyond permissions, this is flagged in real-time as a potential breach and integrity risk to data management and data protection officers.

Aspirational: There is real time management of data protection risks through AI applications which ensure that data cannot be accessed or used without appropriate permission and which automatically restrict access to data and issue real-time reports to management when potential misuse of data is identified.

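The approval-based access with real-time flagging described at the Established and Leading levels can be sketched as follows. This is an invented illustration: the user names, dataset names and the `APPROVALS` structure are hypothetical, and a production system would integrate with the administration's identity and alerting infrastructure rather than an in-memory dictionary.

```python
# Hypothetical sketch of approval-based data access with flagging:
# access outside a user's approved datasets is denied and recorded as
# a potential breach for data management and data protection officers.

APPROVALS = {
    "analyst_a": {"vat_returns"},
    "analyst_b": {"vat_returns", "payroll"},
}

breach_log = []  # in practice this would alert officers in real time

def access(user, dataset):
    """Allow access only to approved datasets; flag anything else."""
    if dataset in APPROVALS.get(user, set()):
        return True
    breach_log.append((user, dataset))
    return False

access("analyst_a", "vat_returns")   # permitted
access("analyst_a", "payroll")       # denied and flagged
print(breach_log)
# [('analyst_a', 'payroll')]
```

The design choice the model points to is that denial and flagging happen at access time, not in after-the-fact audits.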

Talent management

Emerging: Dedicated analyst positions are sometimes advertised, but recruitment requirements generally only include basic analytical skills, and no systematic steps are taken to improve the analytical capability of the administration or to promote career opportunities for analysts.

Progressing: The core competencies needed for analyst positions have been identified, and recruitment is increasingly tailored towards improving the analytics capability of the administration. However, the career path for analysts is unclear, making it difficult to retain highly skilled analysts.

Established: Analysts are generally recruited through a dedicated process (which may be shared with other government agencies), and there is a proactive advertising strategy. Analytical capabilities are given increasing weight in recruitment where relevant.

Leading: The administration has developed a reputation as a popular employer of analysts, and career paths are defined in relevant business units. Links have been established with universities and similar bodies, and there is a good understanding within HR of skills needed for analysts.

Aspirational: The administration is recognised as a leading employer of analysts and provides strong career opportunities with staff able to progress to management levels. There is a close relationship with universities and similar bodies to provide a pathway to a career in public service.

Emerging: Analytics training is generally done through mentoring and self-learning. While analysts may be sent on ad hoc training by individual business units, the formal identification of skills gaps and programmes to upskill analysts are inadequate or missing.

Progressing: Limited training is available to fill analysts' skills gaps, although upskilling is encouraged by management. Informal networks for analysts are encouraged.

Established: Formal training opportunities for analysts are offered, and analysts are encouraged to undertake training opportunities. The analytics services governance team organises networks for analysts, increasing cross-unit exchange of analytics skills and experience.

Leading: There is structured training in place for analysts, and staff are encouraged to undertake self-guided learning on the latest technologies and tools. There is some support for external advanced courses and a management culture supportive of continuous learning and development.

Aspirational: Opportunities are routinely available for analysts to undertake professional courses and for continuous multifaceted learning, including in other business areas.
Business feedback and evaluation

Emerging: Analytics outputs are occasionally subject to evaluation. When feedback is sought from users, it happens on an ad hoc and informal basis rather than through formal mechanisms, meaning that learning is often not captured and applied to future projects.

Progressing: Analytics outputs are subject to evaluation after completion, although not consistently. Users provide formal requirements at the outset of a project. Formal feedback is provided by users, although the feedback may not always be captured and applied to improve future projects.

Established: Analytics outputs are tested and reviewed by users as they are developed. Feedback is treated as a key part of delivery of a project. Learnings are generally agreed through the governance processes and applied to improve future projects.

Leading: Analytics outputs are tested and evaluated according to pre-defined protocols on a regular basis. Users provide timely, thorough, and structured quantitative and qualitative feedback for each project, and results are consistently used to improve future projects. Some analytical work is subject to expert external review.

Aspirational: AI is used to separate the impact of analytics from the effects of other factors. Analytical models are monitored on a continuous and real-time basis, and recommendations for adjustments are made where appropriate.

Analytics process and project management

Emerging: Each piece of analytical work follows a different approach according to analyst experience and capabilities, with few formal processes in place.

Progressing: Although some work is agreed with operational users, analytics projects are often undertaken without ongoing business engagement due to limited resources and capabilities.

Established: Operational users are involved in the project, but often see their role as reactive rather than pro-active.

Leading: Operational users are fully involved at all stages of the advanced analytics project, suggesting new ideas and ensuring that what is delivered meets real operational needs.

Aspirational: Operational users, dedicated project management experts and analysts work as a single team.

Emerging: Analytics projects are usually carried out by analysts and often do not involve the business side, although there may be some informal engagement. Follow-through to ensure appropriate changes in work processes and procedures is often inconsistent.

Progressing: Selected aspects of project management are followed in some analytics projects.

Established: Standardised processes are in place covering business engagement and collaboration, project management, and testing. Projects follow a mix of waterfall and agile methodologies. High-level programming standards are in place.

Leading: Principles-based approaches are in place for all aspects of analytical work, and analysts have the experience and know-how to tailor the application of these principles as required. Projects follow an agile or similar iterative and flexible methodology where applicable.

Aspirational: Rigorous processes are in place covering the full suite of analytics applications, including AI, natural language processing, real-time deployment, etc. The end-to-end process for analytical projects is subject to regular external peer review and validation as well as appropriate benchmarking with leading external organisations.
Analytics capabilities

Emerging: Most work is based on hypothesis-driven data analysis, meaning that the assumptions on which the analysis is based limit the scope and potential results of the analysis, potentially allowing for incorrect or inaccurate conclusions.

Progressing: Work is a mixture of hypothesis-driven analysis, data exploration and basic modelling, allowing analysts to increasingly uncover unexpected or previously unknown patterns.

Established: Most work is based on data exploration and modelling using a variety of statistical techniques; analysts carry out systematic tests of code and data accuracy. Cross-validation or similar methods are used to test the reliability of findings.

Leading: Analysts use a mix of structured and unstructured data, as well as big data, and exploit a wide range of statistical techniques, including increasing use of machine learning and other variations of artificial intelligence.

Aspirational: Analysts utilise a full suite of data visualisation, natural language processing, machine-learning and other variations of artificial intelligence tools. Substantial parts of the process are automated.

Emerging: Analysts mainly rely on basic data manipulation and visualisation skills, whereas statistical methodology and advanced modelling techniques are largely untouched, limiting the potential for new insight.

Progressing: Advanced analysts have some modelling skills and a basic understanding of the wider tax system, affording them basic insight into how their work can add value to the administration's work.

Established: Advanced analysts have a good command of statistical thinking and some key modelling techniques in addition to strong capabilities in data manipulation and visualisation. They have a good appreciation of how they can support business decisions in general.

Leading: Advanced analysts have developed strong statistical thinking skills, are comfortable using a broad range of modelling techniques, are developing graph analytics skills, and are highly skilled in creating visualisations both to explore data and present insights. They work cooperatively and proactively with operational managers to identify business problems and design and communicate practical solutions.

Aspirational: All advanced analysts have a thorough understanding of the statistical theory and mechanics of a wide range of advanced techniques, including AI, natural language processing, and advanced graph analysis. They are highly skilled in the effective application of statistical techniques to frame and to answer business problems, and have a deep understanding of business strategy and operational challenges.

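Cross-validation, which the Established level mentions as a way to test the reliability of findings, can be sketched in a few lines. This illustration is hypothetical: the dataset is invented and the "model" is a simple mean predictor; real analytical work would apply the same idea to proper statistical models using dedicated libraries.

```python
# Minimal sketch of k-fold cross-validation: the data is split into k
# folds; each fold in turn is held out for testing while the rest is
# used for fitting, and the test errors are averaged to estimate how
# well findings generalise beyond the data used to produce them.

def k_fold_mse(values, k=3):
    """Cross-validated mean squared error of a mean predictor."""
    folds = [values[i::k] for i in range(k)]
    errors = []
    for i, test in enumerate(folds):
        train = [v for j, fold in enumerate(folds) if j != i for v in fold]
        prediction = sum(train) / len(train)   # "model" fitted on train only
        errors += [(v - prediction) ** 2 for v in test]
    return sum(errors) / len(errors)

values = [10, 12, 9, 11, 10, 8]
print(round(k_fold_mse(values), 2))  # prints 3.12
```

Because every error is measured on data the "model" never saw, the estimate guards against the over-optimistic conclusions that hypothesis-driven analysis alone can produce.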
Usage areas

Emerging: Analytics usage areas are limited and only partially adaptable to the changing tax administration environment.

Progressing: While somewhat patchy across the administration, in some units professional data analysts are using combinations of data sources to support the tax administration mandate, for instance by assessing taxpayer risk profiles for auditing and uncovering major anomalies.

Established: Sophisticated data analysis enables the administration to detect anomalies, risks and potential underlying problems with tax law, with an increasing use of automation to flag issues for further investigation.

Leading: Analytics are built into a wide range of business processes within the tax administration, increasingly supported by AI applications, allowing the administration to identify issues and to take automatic actions (such as taxpayer prompts) or make recommendations for actions by tax officials.

Aspirational: Data analytics has become an integrated part of taxpayer natural systems, simplifying compliance and reducing cost for the tax administration and taxpayers.

5 Glossary of terms
Advanced technology: What is considered advanced technology will vary over time and with context; at the time of publishing, and in the context of tax administration analytics, machine learning and other forms of artificial intelligence are likely to be considered advanced.
Advanced analyst: A person using advanced analytics in a professional capacity.
Advanced analytics: Analysing data using statistical techniques and practices to gain understanding and insight, make predictions and draw inferences about cause and effect.
Agile project methodology: A methodology, often based on the Agile Manifesto [3], which, amongst other differences from traditional methodologies, focuses more on responding to the need for change than on following a predefined plan.
Analyst: A person using analytics in a professional capacity.
Analytics: Discovery, interpretation and communication of meaningful patterns in data. This includes
reporting, risk modelling, advanced analytics and other variations of using data to gain insight. All variations
of analytics depend on the Analytics services made available by the tax administration to its staff.
Analytics outputs: Results from analytics work. These can vary as much as the field itself; examples
include a dashboard and a risk rating for a taxpayer. Users of analytics outputs can evaluate and give
feedback to the analyst regarding the usefulness of the outputs.
Analytics services infrastructure: Computing infrastructure, software and data used for analytics. The
data is often prepared to be more immediately usable.
Analytics services: The combination of an analytics services infrastructure, analytics management,
analytics prioritisation procedures and analytics support personnel in IT and business making it possible
for analysts to perform their work effectively and efficiently.
Analytics services governance: Managing analytics services in order to maximise the benefits and
balance the needs of the different teams using analytics services.
Artificial intelligence (AI): The ability of computers to acquire and apply knowledge, including by
performing tasks like sensing, pattern recognition, learning, and decision making. Machine learning is a
sub-category of AI in which the algorithms used may be changed by the computer itself. Natural language processing is a branch of AI that seeks to enable computers to process and interpret human language in a manner similar to humans.
Big Data: The term usually refers to data sets that are too large or complex to be processed with traditional methods and tools. Big Data sets are often described using the “V”s:
• Volume: The amount of data is much larger than usual.
• Velocity: The rate at which the data is produced or received is much faster than usual.
• Variety: The data sets contain data in a variety of formats, such as audio, video streams and images.

• Veracity: The possibility of verifying that the data is correct may vary considerably or be quite low.

[3] http://agilemanifesto.org/
Business units: The parts of a tax administration where operational tasks such as compliance
interventions, customer service, or debt management interventions are carried out. The structure and
responsibility of the business units will vary according to local arrangements.
Champion/ Challenger concept: This concept is based on identifying the current approach as the
Champion, and developing a set of Challenger approaches that differ from the Champion in measurable
and defined ways, so that they will deliver different results. Testing the approaches with real transactions
will show whether any of the Challengers give better results than the Champion does. [4]
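As a minimal sketch of the concept, a Champion rule can be run alongside a Challenger on the same cases and compared on a measurable outcome; all rule names, thresholds and case data below are invented for illustration.

```python
# Minimal Champion/Challenger sketch: two risk-flagging rules are applied to
# the same cases and compared on hit rate (share of flagged cases that were
# actually non-compliant). All thresholds and data are invented.

def champion(case):
    # Current approach: flag cases with a turnover gap above 30%.
    return case["turnover_gap"] > 0.30

def challenger(case):
    # Variant that differs in a defined, measurable way: lower threshold,
    # but only for cash-intensive businesses.
    return case["turnover_gap"] > 0.20 and case["cash_intensive"]

def hit_rate(rule, cases):
    """Share of flagged cases that turned out to be non-compliant."""
    flagged = [c for c in cases if rule(c)]
    if not flagged:
        return 0.0
    return sum(c["non_compliant"] for c in flagged) / len(flagged)

cases = [
    {"turnover_gap": 0.35, "cash_intensive": True,  "non_compliant": True},
    {"turnover_gap": 0.25, "cash_intensive": True,  "non_compliant": True},
    {"turnover_gap": 0.25, "cash_intensive": False, "non_compliant": False},
    {"turnover_gap": 0.40, "cash_intensive": False, "non_compliant": False},
    {"turnover_gap": 0.10, "cash_intensive": True,  "non_compliant": False},
]

results = {"champion": hit_rate(champion, cases),
           "challenger": hit_rate(challenger, cases)}
```

If a Challenger consistently outperforms the Champion on real cases, it would become the new Champion.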
Cross-validation: Validating the accuracy of a finding with different sets of data.
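A common concrete form of this is k-fold cross-validation; the sketch below is one assumed way the term is applied in practice, using a trivial mean-predictor as a stand-in for a real model.

```python
# k-fold cross-validation sketch: the data is split into k folds; the model
# is fitted on k-1 folds and validated on the held-out fold, so every
# observation is used for validation exactly once. The "model" here is
# simply the training-set mean (an invented stand-in for a real model).

def k_fold_indices(n, k):
    """Yield (train, test) index lists for k roughly equal contiguous folds."""
    fold_sizes = [n // k + (1 if i < n % k else 0) for i in range(k)]
    start = 0
    for size in fold_sizes:
        test = list(range(start, start + size))
        train = [i for i in range(n) if i < start or i >= start + size]
        yield train, test
        start += size

def cross_validate_mean(values, k=3):
    """Mean absolute error of a mean-predictor, averaged over held-out items."""
    errors = []
    for train, test in k_fold_indices(len(values), k):
        prediction = sum(values[i] for i in train) / len(train)
        errors.extend(abs(values[i] - prediction) for i in test)
    return sum(errors) / len(errors)

score = cross_validate_mean([10, 12, 11, 9, 13, 10], k=3)
```

Because each fold is validated against data the model did not see, the resulting error estimate is less likely to be flattered by overfitting than an estimate computed on the fitting data itself.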
Data literacy: This can be understood as the ability to use data and to understand its usefulness.
Data mining: This term often refers to the process of searching for patterns in large data sets.
Fine-grained data security: Data security measures that regulate access to individual data sets or parts
of these. For instance, one user may have access to the entire data set while another user only has access
to particular columns or rows in the data set.
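A hypothetical sketch of the idea in code: each user's grant limits both the columns and the rows returned by a query (all user names, fields and rules below are invented).

```python
# Fine-grained data security sketch: a grant restricts which columns and
# which rows of a data set a given user may see. All identifiers invented.

RECORDS = [
    {"tin": "A1", "region": "North", "turnover": 500, "risk_score": 0.9},
    {"tin": "B2", "region": "South", "turnover": 300, "risk_score": 0.2},
]

GRANTS = {
    # An auditor may see identifying data, but only for their own region.
    "auditor_north": {"columns": {"tin", "region", "risk_score"},
                      "row_filter": lambda r: r["region"] == "North"},
    # A statistician may see every row, but no identifying columns.
    "statistician": {"columns": {"region", "turnover"},
                     "row_filter": lambda r: True},
}

def query(user, records):
    """Return only the rows and columns the user's grant allows."""
    grant = GRANTS[user]
    return [{k: v for k, v in r.items() if k in grant["columns"]}
            for r in records if grant["row_filter"](r)]
```

Here query("auditor_north", RECORDS) returns a single Northern row without the turnover column, while query("statistician", RECORDS) returns both rows restricted to region and turnover.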
Machine learning: See Artificial Intelligence.
Metadata: Information about data elements. The metadata may for instance include structural information
like data type and number of records; quality information like validation rules, data quality and data
density [5]; and relational information like possible integration with data in other systems.
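As a hypothetical illustration, a metadata record for a single field might combine structural and quality information, with data density computed as the share of records in which the field contains a value (the field name and data set are invented).

```python
# Sketch of a metadata record for one field, including a computed data
# density (share of records in which the field is filled). All names invented.

records = [
    {"business_category": "retail"},
    {"business_category": None},       # field not filled in this record
    {"business_category": "transport"},
    {"business_category": "retail"},
]

filled = sum(1 for r in records if r.get("business_category") is not None)
metadata = {
    "field": "business_category",
    "data_type": "string",
    "record_count": len(records),
    "data_density": filled / len(records),  # filled in 3 of 4 records
    "validation_rule": "must match the administration's category code list",
}
```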
Modelling: Administrations use this term in different ways, but generally modelling in tax analytics involves
using software to create a mathematical or other form of model that represents an aspect of reality and
can be used to answer questions, test concepts or uncover new information.
Natural language processing: See Artificial Intelligence.
Ontology: Overview of common concepts, terms and structures (i.e. metadata) used in the tax
administration. For instance, officials and IT systems in the tax administration should use a single definition
of taxpayer; this would be defined in the ontology.
Operational manager: Any manager with responsibility for operations (e.g. compliance management, debt
management, customer service).
Operational user: In this context, a user that utilises the results of analytics for operational purposes. For
instance, an auditor may perform an audit on a company because analytics results show a high risk of
fraud.
Regular: In the context of activities performed, this means that the activity is planned and happens
repeatedly at some predefined interval, as opposed to ad hoc activities.
Seamless tax administration: A tax administration that ensures that taxation happens in the background
in seamless and frictionless processes, with little or no effort on the part of the taxpayer. [6]

[4] Definition loosely based on https://www.fico.com/blogs/adaptive-control-championchallenger
[5] Data density describes whether a field in a record is hardly ever, sometimes or almost always filled out. For instance, a field for business category in the record describing a business is only useful for analysis if it almost always contains a value.
[6] Definition loosely based on https://www.oecd.org/tax/forum-on-tax-administration/publications-and-products/tax-administration-3-0-the-digital-transformation-of-tax-administration.htm




Senior management: Different administrations use different terminology, but this should generally be
taken to mean Commissioner, Assistant Commissioner, Head of Division, Head of Branch and similar
positions as well as their immediate subordinate managers.
Taxpayer natural systems: These are sometimes called ecosystems; they are the interconnected
systems that taxpayers use to run their businesses, undertake transactions and communicate, including
for instance business accounting systems, financial service systems, and sharing and gig economy
platforms.
Unstructured data: Data which is not structured in a predefined manner. Examples include image files,
audio files, video files and text files.
Waterfall project methodology: The name of the methodology comes from the fact that once a project phase has been completed, it cannot be revisited; water only falls down. With waterfall development, the user representative normally signs off on a set of requirements for the software that is to be developed. The developers then design, develop and test the software internally in their organisation, according to their interpretation of the requirements. The users then test that the software fulfils the agreed requirements, after which the software goes into production. Any need for changes that arises during design, development and testing has to be handled through a formal change request with a corresponding budgetary adjustment (usually an increase).




Annex A. Self-assessment record sheet

Please only include one “X” per row in the self-assessment record – the one that best fits your
administration’s level of maturity.
Please send the completed self-assessment record sheet to the Forum on Tax Administration Secretariat
at fta@oecd.org.

Process-related questions

Please see Recommendations for the self-assessment process for more information.

Jurisdiction name

Contact person

Appointment of facilitator (Y/N)?


Number of staff working with analytics or analytics services in the self-assessment group

Appropriate range of staff involved in the discussions (Y/N)?

Involvement of official(s) from other areas of the tax administration (Y/N)? Please comment.

Time taken in hours to complete the self-assessment




Self-assessment record

Strategic perspective

Indicative attribute \ Maturity levels Emerging Progressing Established Leading Aspirational


Strategy
Governance
Culture
Budget setting

Operational perspective

Indicative attribute \ Maturity levels Emerging Progressing Established Leading Aspirational


IT infrastructure, system development and tools
Data management
Talent management
Business feedback and evaluation
Analytics process and project management
Analytics capabilities
Usage areas

Additional considerations

1. Are there particular elements within one or more indicative attributes where you assess your
administration to be substantially more or less mature compared with your overall assessment for the
attribute?

2. Are there areas where you think there is a lack of clarity as regards the difference between adjacent
maturity levels?

3. Are there areas where you think the language is unclear or ambiguous?

4. Would you like to suggest additional terms to include in the Glossary?



FORUM ON TAX ADMINISTRATION
OECD Tax Administration Maturity Model Series

Analytics Maturity Model


The OECD Tax Administration Maturity Model Series sets out descriptions of capabilities and
performance in particular functions or sets of activities carried out by tax administrations across
five discrete maturity levels. The intention of this series is to provide tax administrations globally
with a tool to allow them to self-assess their current level of maturity and to facilitate consideration
of future strategy, depending on a tax administration’s unique circumstances and priorities.

Analytics is increasingly becoming a common and integrated part of tax administrations across the
world, in developed and developing countries alike, being used in strategic as well as operational
usage areas. The FTA Analytics Community of Interest and the FTA Secretariat have therefore
developed the Analytics Maturity Model. The model can aid tax administrations in assessing their
analytics usage and capability, providing insight into current status and identifying areas of
weakness as well as strength.

The model is organised around the strategic and operational perspectives of analytics. To assist
in the understanding of what a given level of maturity means, a set of indicative attributes is
contained under each maturity level. In addition to the model itself, the report offers guidance
on how to perform a self-assessment based on the model. It also summarises the anonymised
results from the over forty administrations that have participated in the piloting process, as an
aid to understanding the current status of analytics use and capabilities in tax administrations.

www.oecd.org/tax/forum-on-tax-administration/
