HANDOUT 1 - Fundamentals of Data Processing
COMPETENCE/S: Electrical, electronic and control engineering at the operational level
COURSE OUTCOME: Effectively use computer applications for documents used onboard ship.
KNOWLEDGE, UNDERSTANDING, PROFICIENCY: Understanding of the Main Features of Data Processing
INTRODUCTION
The invention of the computer was one of the most important
technological developments of all time. As computer technology has
improved and become easier to use, it has spread into the hands of
many. Data processing has likewise become widespread because
computer systems make data far easier to handle. With multiple
industries driving the economies of countries around the world,
data processing now has numerous applications in fields such as
business, education, healthcare, and research. Its importance keeps
growing with advances in areas such as data science, machine
learning, artificial intelligence, data quality, and data security.
ENGAGEMENT
ASSESSING PRIOR KNOWLEDGE
Watch the video “How Computers Work- Binary and Data” Available
at: https://www.youtube.com/watch?v=USCBCmwMCDA
EXPLORATION
Anything we input to the computer is converted into a series of
numbers represented by ones and zeros. These come from signals
transmitted through wires that can be on or off; yes or no; true
or false; 1 or 0. Each wire corresponds to a bit – the smallest
piece of data the computer can store. Although we are not obliged
to study this number system, because our computers do the
conversion for us, it is still good to know how our data travels
from one computer to another and how it changes form.
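The short Python sketch below is only an illustration of this idea: it shows how a few typed characters can be viewed as the bit patterns (ones and zeros) that the computer actually stores. The function name char_to_bits and the sample word are invented for this example.

# Minimal sketch: viewing keyboard input as the bits the computer stores.

def char_to_bits(character: str) -> str:
    """Return the 8-bit pattern for a single character."""
    code = ord(character)        # the character's numeric code, e.g. 'A' -> 65
    return format(code, "08b")   # that number written as eight bits, e.g. '01000001'

for letter in "SHIP":
    print(letter, "->", char_to_bits(letter))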
TERMS DEFINITION
1. CPU – Central Processing Unit; serves as the brain
of the computer where all processing happens.
2. Input Devices – devices used to input data into the
computer.
PRESENTATION:
Data Processing Cycle
The data processing cycle, as the term suggests, is a sequence of
steps or operations for processing data, i.e., for converting raw
data into a usable form. The processing itself can be carried out
by a number of data processing methods.
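As a simplified illustration, the Python sketch below walks a few raw readings through the cycle: raw data goes in, a processing step cleans and summarizes it, and usable information comes out. The sample readings and the process function are invented for this example.

# Illustrative sketch of the data processing cycle: input -> process -> output.

raw_data = ["23.5", "24.1", "invalid", "22.8"]   # input: raw readings captured as text

def process(readings):
    """Clean the raw data and compute a simple summary (the processing step)."""
    values = []
    for item in readings:
        try:
            values.append(float(item))   # convert usable entries to numbers
        except ValueError:
            continue                     # discard entries that cannot be processed
    return sum(values) / len(values)     # information in usable form: the average

average = process(raw_data)              # output: information derived from raw data
print(f"Average reading: {average:.2f}")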
Other formats / raw files – These are software-specific file
formats that can be used and processed only by specialized
software. The output files may not be a finished product and may
require further processing, so some steps may need to be performed
multiple times.
Central Processing Unit – The list below shows how the main
components of the CPU divide the tasks they process.
a) Input/Output Operations
A computer can accept data (input) from, and supply
processed data (output) to, a wide range of input/output
devices. These devices, such as keyboards, display
screens, and printers, make human-machine communication
possible.
b) Arithmetic Operations
A computer can perform arithmetic operations such as
addition, subtraction, multiplication, and division.
c) Logic/Comparison Operations
A computer also possesses the ability to perform logical
operations, comparing values and acting on the result
(see the short sketch after this list).
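The short Python sketch below simply expresses these operation types in ordinary Python rather than machine instructions; the values and the prompt text are invented for the example.

# Sketch of the kinds of operations described above, written in Python.

a, b = 12, 5

# Arithmetic operations
print(a + b, a - b, a * b, a / b)

# Logic/comparison operations: each result is either True or False
print(a > b)             # comparison
print(a == b)            # equality test
print(a > 0 and b > 0)   # logical combination of two comparisons

# Input/output operations
name = input("Enter your name: ")   # input from the keyboard
print("Hello,", name)               # output to the screen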
INTRODUCTION
Various data processing methods are used to convert raw data into
meaningful information through a process. Data is manipulated to
produce results that lead to the resolution of a problem or an
improvement in an existing situation. Similar to a production
process, data processing follows a cycle in which inputs (raw data)
are fed into a process (computer systems, software, etc.) to produce
output (information and insights).
EXPLORATION
Collecting the right data is essential for many purposes.
Facebook, for example, uses our data so effectively that it can
suggest people we may know, products we may want to buy, and places
we might like to go, without asking us directly. Often this makes
the platform more effective and efficient to use. However, the same
data can also be used against us by anyone who intends to steal,
bully, or commit other cybercrime. Therefore, it is not just about
how data is collected but how it is used. Facebook makes use of our
data by applying different processing methods according to its
specific needs.
TERMS DEFINITION
1. Big Data – data sets so large or complex that they cannot be
processed by traditional systems.
PRESENTATION:
Data Processing Methods and Types of Data Processing
1. Manual Data Processing
3. Online Processing
This processing method is a part of the automatic processing
method. It is sometimes known as direct or random-access
processing. Under this method, a job received by the system is
processed at the same time it is received. It is often compared
with, and sometimes confused with, real-time processing. This
system features rapid, random input of transactions and
user-demanded direct access to databases/content when needed.
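As a simplified illustration of online (direct) processing, the Python sketch below handles each incoming transaction the moment it is received instead of storing it for a later batch run. The transaction records and the process_transaction function are invented for this example.

# Sketch of online processing: each transaction is processed on receipt.

database = {}   # stands in for direct-access storage

def process_transaction(account: str, amount: float) -> None:
    """Update the record immediately when the transaction arrives."""
    database[account] = database.get(account, 0.0) + amount
    print(f"{account}: applied {amount:+.2f}, new balance {database[account]:.2f}")

# Transactions arrive one at a time and are processed at the moment of receipt.
incoming = [("ACC-001", 150.00), ("ACC-002", -40.25), ("ACC-001", 10.00)]
for account, amount in incoming:
    process_transaction(account, amount)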
4. Distributed Processing
Distributed systems use multiple central processors to
serve multiple real-time applications and multiple users.
Data processing jobs are distributed among the processors
accordingly. This method is commonly utilized by remote
workstations connected to one big central workstation or
server.
The processors communicate with one another through various
communication lines (such as high-speed buses or telephone
lines). Such systems are referred to as loosely coupled or
distributed systems. The processors in a distributed system may
vary in size and function, and are referred to as sites, nodes,
computers, and so on. ATMs are a good example of this data
processing method: all the end machines run fixed software located
at a particular place and make use of exactly the same information
and sets of instructions.
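The toy Python sketch below only mimics the idea of distributing jobs among processing nodes: real distributed systems communicate over networks, whereas here the nodes are plain objects and the job names are invented for the example.

# Toy sketch: handing jobs out to several processing nodes in turn.

from itertools import cycle

class Node:
    def __init__(self, name: str):
        self.name = name

    def run(self, job: str) -> str:
        return f"{self.name} completed job '{job}'"

nodes = [Node("node-1"), Node("node-2"), Node("node-3")]
jobs = ["update ledger", "print receipt", "verify card", "dispense cash"]

# A simple round-robin distribution of the jobs among the nodes.
for node, job in zip(cycle(nodes), jobs):
    print(node.run(job))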
5. Multiprocessing
This is perhaps the most widely used type of data processing. It
is used almost everywhere and forms the basis of all computing
devices that rely on processors. Multiprocessing makes use of more
than one CPU. The tasks or sets of operations are divided between
the available CPUs simultaneously, which increases efficiency and
throughput. The jobs to be performed are broken down and sent to
different CPUs working in parallel within the mainframe. The
result and benefit of this type of processing is a reduction in
the time required and an increase in output. Moreover, the CPUs
work independently, as they are not dependent on one another.
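The minimal Python sketch below shows the multiprocessing idea using the standard multiprocessing module: a small set of tasks is divided among several worker processes. The square function and the sample numbers are invented for the example.

# Minimal multiprocessing sketch: tasks divided among several worker processes.

from multiprocessing import Pool

def square(n: int) -> int:
    """A small unit of work handled by whichever worker is free."""
    return n * n

if __name__ == "__main__":
    numbers = list(range(10))
    with Pool(processes=4) as pool:          # four worker processes
        results = pool.map(square, numbers)  # the work is divided among them
    print(results)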
6. Time sharing
A time-sharing system allows many users to share the computer's
resources simultaneously. In other words, time sharing refers to
the allocation of computer resources in time slots to several
programs at the same time. For example, a mainframe computer may
have many users logged on to it, each using the mainframe's
resources (memory, CPU, etc.). Each user feels like the exclusive
user of the CPU, even though in reality the single CPU is being
shared among all of them.
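The toy Python sketch below illustrates the time-sharing idea: one loop (standing in for the CPU) gives each user program a short time slice in turn, so every program makes progress as if it had the machine to itself. The user programs are simple generators invented for the example.

# Toy illustration of time sharing: each program gets a time slice in turn.

def user_program(name: str, steps: int):
    for step in range(1, steps + 1):
        yield f"{name}: step {step} of {steps}"

queue = [user_program("user-A", 3), user_program("user-B", 2), user_program("user-C", 3)]

while queue:
    current = queue.pop(0)        # take the next program waiting for the CPU
    try:
        print(next(current))      # let it run for one time slice
        queue.append(current)     # then send it to the back of the queue
    except StopIteration:
        pass                      # this program has finished its work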
Sharing a dataset with a third party must be done carefully and
under a written agreement and service agreement. This prevents
data theft, misuse, and loss of data.
Data in any form and of any type requires processing most of the
time. These data can be categorized as personal information,
financial transactions, tax credits, banking details, computational
data, images, and almost anything else you can think of. The amount
of processing required will depend on the specialized processing
the data needs and, subsequently, on the output you require. With
the increase in demand for such services, a competitive market for
data services has emerged.