Data and Information Practice Assignment

The document describes a term project for a data management course where students assess the data management maturity of a real organization. It outlines three phases for the project: planning the assessment activities including defining scope and approach, performing the maturity assessment by gathering information, and interpreting the results. Key deliverables include an organizational overview, details of assessment planning and performance, results visualization, and conclusions.


CIS 416: Data and Information Management

Instructor: Dr. Samiha BRAHIMI

Term Project: Data management maturity assessment


Goals of the assignment
✓ Functional: practice the data management concepts learnt in class by assessing the
maturity of a real data-driven organization.
✓ Non-functional: this assignment is a community-service curricular activity.
Introduction and background
As data continues to grow exponentially, business leaders have access to more raw
performance data than ever before. However, many organizations have no idea how to harness
the power of data for their business. The DMM Program provides the best practices roadmap
and services to help organizations build, improve, and measure their enterprise data
management function and staff.
The program centers on the Data Management Maturity (DMM) framework (an example is
shown in Figure 1), a comprehensive framework of data management practices in six key
categories that helps organizations benchmark their capabilities, identify strengths and gaps,
and leverage their data assets to improve business performance.
CMMs usually define five or six levels of maturity, each with its own characteristics,
spanning from non-existent or ad hoc to optimized or high performance. See Figure 2 (Ref:
textbook) for a sample visualization.
Figure 1: Data management maturity framework

Figure 2: Data management maturity levels

At any level, assessment criteria are evaluated along a scale, such as 1 – Not started, 2 –
In process, 3 – Functional, 4 – Effective, showing progress within that level and movement
toward the next level. Scores can be combined or visually displayed to show the variance
between the current and desired states. When assessing with a model that can be mapped to
a DAMA-DMBOK Data Management Knowledge Area, criteria can be formulated based on
the categories in the Context Diagram, namely Activities, Tools, Standards, and People and
Resources. See the example in Figure 3 (Ref: chapter 15).
Figure 3: example DMMA visualization
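To make the scoring concrete, here is a minimal sketch in Python of how criterion ratings on the 1–4 scale could be averaged per Context Diagram category and compared against a desired state. The ratings and the target value below are entirely hypothetical placeholders, not taken from any real assessment:

```python
# Hypothetical ratings on the 1-4 scale (1 Not started ... 4 Effective),
# grouped by the Context Diagram categories mentioned above.
ratings = {
    "Activities": [3, 2, 3],
    "Tools": [2, 2, 1],
    "Standards": [1, 2, 2],
    "People and Resources": [3, 3, 2],
}
TARGET = 4  # desired state: "Effective" on every criterion


def category_scores(ratings, target=TARGET):
    """Average the criterion ratings per category and compute the gap to target."""
    return {
        cat: {"score": round(sum(vals) / len(vals), 2),
              "gap": round(target - sum(vals) / len(vals), 2)}
        for cat, vals in ratings.items()
    }


scores = category_scores(ratings)
for cat, s in scores.items():
    print(f"{cat:22s} score={s['score']:.2f} gap={s['gap']:.2f}")
```

The per-category gaps produced this way are exactly the "variance between current and desired state" that the visualization in Figure 3 is meant to convey.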

This document describes the students' term project for CIS 416: Data and Information
Management.
Description:
In this assignment, students will pick a data-driven organization, evaluate its data
management maturity according to some selected aspects (data governance and data
architecture) and recommend improvement techniques based on the studied material and
their own research efforts. Students will choose the appropriate interaction approach to
collect the information required for the assessment. Related documents could be provided
to the students as per the organization's regulations.
Workflow:
Students are requested to read this document and the recommended reading material
before starting the project. The project is divided into three main phases:

Phase I: Plan Assessment Activities (details in chapter 15)


✓ Define Organizational Scope: For a first assessment, it is usually best to define a
manageable scope, such as a single business area or program. As for this project,
students are requested to deal with only two data management functions. Hence, the
scope should be identified accordingly.
✓ Define Interaction Approach: In conducting a DMMA, an organization should
follow recommendations for the selected model. Information gathering activities may
include workshops, interviews, surveys, and artifact reviews. Employ methods that
work well within the organizational culture, minimize the time commitment from
participants, and enable the assessment to be completed quickly so that actions from
the assessment can be defined while the process is fresh in participants’ minds. You
can draw inspiration from the questionnaires and surveys available online.
✓ Plan Communications: Ensure participants understand the assessment model, as well
as how the findings will be used. Before the assessment begins, stakeholders should
be informed about expectations for the assessment (The purpose of the DMMA, How it
will be conducted, What their involvement may be, The schedule of assessment activities)
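One way to organize the information-gathering instrument is to store each question tagged by data management function and Context Diagram category, so that answers can later be rolled up per category. The sketch below is illustrative only; the areas, questions, and scale labels are placeholders, and real criteria should come from the selected maturity model:

```python
# Rating scale applied to each question (the 1-4 scale described earlier).
SCALE = {1: "Not started", 2: "In process", 3: "Functional", 4: "Effective"}

# Illustrative questionnaire items; real criteria should be derived from the
# selected model and the two chosen data management functions.
survey = [
    {"area": "Data Governance", "category": "Standards",
     "question": "Are data policies documented and formally approved?"},
    {"area": "Data Governance", "category": "People and Resources",
     "question": "Are data stewardship roles assigned?"},
    {"area": "Data Architecture", "category": "Tools",
     "question": "Is an enterprise data model maintained in a modeling tool?"},
]


def by_area(items, area):
    """Select the questionnaire items for one data management function."""
    return [q for q in items if q["area"] == area]


print(len(by_area(survey, "Data Governance")))  # prints 2
```

Keeping the instrument in a structured form like this also makes it easy to verify, before the interviews start, that every category has at least one question per function.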

Phase II: Perform Maturity Assessment (details in chapter 15)


✓ Gather Information: gather appropriate inputs for the assessment, based on the
interaction model. At a minimum, the information gathered will include formal
ratings of assessment criteria.
✓ Perform the Assessment: The overall rating assignments and interpretation are
typically multi-phased. Participants will have different opinions generating different
ratings across the assessment topics. Discussion and rationalization will be needed to
reconcile the ratings. Input is provided by the participants and then refined through
artifact reviews or examinations by the assessment team. The goal is to reach a
consensus view of the current state.
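The reconciliation step above can be sketched as follows: take the median rating as a provisional consensus and flag any criterion where the spread of ratings is wide enough to warrant discussion. The criteria, votes, and the disagreement threshold in this Python sketch are hypothetical:

```python
from statistics import median

# Hypothetical ratings (1-4 scale) from three participants per criterion.
participant_ratings = {
    "Policies documented": [2, 3, 2],
    "Metadata repository in use": [1, 4, 2],
    "Stewardship roles assigned": [3, 3, 3],
}


def reconcile(ratings, spread_threshold=2):
    """Use the median as the provisional consensus rating and flag criteria
    whose spread (max - min) suggests a reconciliation discussion is needed."""
    consensus, to_discuss = {}, []
    for criterion, votes in ratings.items():
        consensus[criterion] = median(votes)
        if max(votes) - min(votes) >= spread_threshold:
            to_discuss.append(criterion)
    return consensus, to_discuss


consensus, to_discuss = reconcile(participant_ratings)
print(to_discuss)  # prints ['Metadata repository in use']
```

The flagged criteria are the ones to revisit through discussion and artifact review; the medians are only a starting point, not a substitute for the consensus process.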

Phase III: Interpret Results (details in chapter 15)

Report Assessment Results The assessment report should include:


✓ Business drivers for the assessment
✓ Overall results of the assessment
✓ Ratings by topic with gaps indicated
✓ A recommended approach to close gaps
✓ Strengths of the organization as observed
✓ Risks to progress
Required Components for the Final Paper:
Please make sure to label each section with either a standard section title (e.g., literature
review) or a title that communicates the content of the section (e.g., previous research on
culture keeping). This structure is not mandatory; sections may be removed or added with a
valid justification.

1. Cover Page: The first page of your paper should be a cover sheet that includes a title
that communicates the content of your paper, your name, date, title of the class, and
any other information you feel is necessary.
2. Abstract: A revised abstract for the paper, no longer than 250 words. It should
be single-spaced and placed immediately before the introduction. The abstract
should summarize the whole work, including the methodology and the results.
3. Introduction: This section should contain:

✓ An overview of the organization (may include a definition of the organization
and its activity, the business model it follows, its enterprise data
architecture, etc.).
✓ The scope of the assessment, along with a valid justification of the scoping.
✓ At the end of the section, an explanation of how the rest of the document is
structured.

4. Assessment planning:

This section reports all the assessment planning activities above in detail. The
interaction approach materials (interviews, questionnaires, meeting agendas, etc.)
should go in the appendices.

5. Perform Maturity Assessment: this section reports in detail the activities
performed, as explained above.
6. Result interpretation: the assessment results are reported here as explained
above; use a data visualization tool to ensure the readability of the results.
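As one example of such a visualization, the Python sketch below draws a radar chart comparing current ratings against the desired state, similar in spirit to the DMMA visualizations in chapter 15. The category names and scores are placeholders to be replaced with the assessment's actual results, and matplotlib is only one possible tool:

```python
import math

import matplotlib
matplotlib.use("Agg")  # render off-screen; no display needed
import matplotlib.pyplot as plt

# Placeholder scores on the 1-4 scale; replace with your assessment results.
categories = ["Activities", "Tools", "Standards", "People and Resources"]
current = [2.7, 1.7, 1.7, 2.7]
desired = [4.0, 4.0, 4.0, 4.0]

# One angle per category, then repeat the first point to close the polygons.
angles = [2 * math.pi * i / len(categories) for i in range(len(categories))]
angles += angles[:1]
current += current[:1]
desired += desired[:1]

fig, ax = plt.subplots(subplot_kw={"projection": "polar"})
ax.plot(angles, current, label="Current state")
ax.fill(angles, current, alpha=0.2)
ax.plot(angles, desired, linestyle="--", label="Desired state")
ax.set_xticks(angles[:-1])
ax.set_xticklabels(categories)
ax.set_ylim(0, 4)
ax.legend(loc="upper right")
fig.savefig("dmma_radar.png")
```

A radar (spider) chart works well here because the gap between the two polygons shows, at a glance, which categories are furthest from the desired state.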
7. Discussion and Conclusion: In this section, summarize the argument made in the
paper. Lessons learned and challenges faced are also explained here.
8. Appendices: If you conducted interviews or a survey, you must include an appendix
with your questions, and refer to the appendix in the relevant section. You may
also include appendices with additional information (e.g., coding, statistics) if
you feel it is necessary. An important appendix you should include is a certificate
from the company that delivered the requested documents. The appendices do not
count toward the page count.
9. Bibliography/Citations: Cite any guides or tools you have used. References
should be cited in the main text.

Submission guidelines
Reports will be submitted via Blackboard according to the following table:

Milestone      Grade   Sections included    Deadline

1              6       1, 3, 4, 8 and 9     Week 7
2              5       All *                Week 11
Presentation   4       All                  Week 13

* Sections already included in Milestone 1 should be updated according to the
instructor's feedback. Students may also update the document if they find it
necessary.

Guidelines for your presentation:

1. Your presentation should be about 10 minutes. Please practice ahead of time to
make sure you can fit what you want to say into this period.
2. You should briefly explain all the performed activities.
3. After your presentation, the class will ask you and your panel questions. Please
come prepared to discuss your project in depth.

Evaluation rubrics
Milestone I (6 marks )

Meeting the deadline: 0.5 mark (-0.25 deducted for each day of delay)

Teamwork, structure, and language (0.5)

Organizational description: 0.5 mark

• Students present the organization and show how it is data driven (0.5 mark)
• Students present the organization without covering the data aspect (0.25)
• No organization presentation (0)

Phase I: Plan Assessment Activities


1. Define Scope and Objectives: (1 mark)
• Students clearly define the objectives, the project scope, and the departmental
scope (1 mark)
• Students miss (or do incorrectly) either the objectives, the departmental scope,
or the project scope (0.5-0.75)
• Students miss (or do incorrectly) the objectives, the departmental scope, and
the project scope (0 mark)

2. Define Interaction Approach: (0.5 mark)


• Interaction approach well defined (0.5 mark)
• Interaction approach incorrectly defined or not at all mentioned (0 mark)

3. Plan Communications: (0.25 mark)


• Plan communications strategy well defined (0.25)
• Plan communications strategy incorrectly defined or not at all mentioned (0)
4. Appendix (3 marks) 1 mark for each function
• Interaction approach questions are covering the minimum assessment criteria (3
marks)
• Interaction approach questions partially cover the minimum assessment criteria
(0.5-2.5)
• Interaction approach questions not done (0 mark)

Milestone II (6 marks)

Meeting the deadline: 0.5 mark (-0.25 deducted for each day of delay)

Teamwork, structure, and language (0.5)

Phase II: Perform Maturity Assessment


1. Perform the Assessment: (0.5 mark)
• Data is collected properly (0.5)
• Data is collected in a way that may not answer all requirements (0-0.25)
• Data is not collected at all (0 for the whole milestone)

Phase III: Interpret Results
2. Business drivers for the assessment (0.25 mark)
• Business drivers explained correctly (0.25 mark)
• Business drivers explained incorrectly or not mentioned (0 mark)
3. Overall results of the assessment (2 marks)
• Results are generated following a clear framework and visualized in an effective
and ethical way (2)
• Results are generated in a partially correct way (0.5-1.5)
• Results generation is completely wrong or not mentioned at all (0)
4. Ratings by topic with gaps indicated (1 mark)
• Gaps are indicated consistently with the ratings provided (if there are no gaps,
this should be explained in light of the ratings) (1)
• Some inconsistency is present in explaining the gaps (0.25-0.5)
• Gaps are completely inconsistent or not mentioned (0)
5. A recommended approach to close gaps (if no gaps, the mark of this section is added
to the previous one) (0.5 mark)
• Proper recommendations are provided to close the gaps (0.5)
• Recommendations are not covered or all irrelevant (0)
6. Strengths of the organization as observed (0.25 mark)
• Strong points are well explained in light of the ratings (0.25)
• No strong points are identified although the ratings are high, or the identified
points are inconsistent with the reported maturity level (0)
7. Risks to progress (0.5 mark)
• Risks are explained in terms of the collected data (0.5)
• Risks are not mentioned or not related to the collected data (0-0.25)
Recommended reading

✓ Textbook: Chapter 15


✓ http://www.dataversity.net/assessing-data-management-maturity-using-the-dama-dmbok-
framework-%E2%80%93-part-1/
✓ http://www.dataversity.net/assessing-data-management-maturity-using-the-dama-data-
management-book-of-knowledge-dmbok-framework-part-2/

Additional reading

✓ https://www.ewsolutions.com/data-management-maturity-overview/
