PROFESSIONS
Diploma in Computer
Science
Contents
Lesson Outcomes
Introduction
Defining Computer Science
The History of Computers
Applications of Computing
Lesson outcomes
By the end of this lesson, you should be able to:
● Define Computer Science and describe its uses
● Understand the road to modern-day computing
● Know the applications of computing
● Recognise the opportunities available in the field of Computer Science
Introduction
In this course, we explore the foundations of computing and mathematics and their implications for 21st-century innovation.
As a practical component, you will also be introduced to the basics of computer programming. This will serve as a solid
foundation for any programming journey that you choose to take up after this course. Computing is a dynamic, fast-paced
field, and it is imperative that you stay up to date with the latest trends. This course explores fundamental principles and
concepts in computing used in modern business and society.
Defining Computer Science
Understanding Computer Science
Computer science is the study of computational systems. In simple terms, we study how computers “think” and write
programs that direct them to solve problems for us. Computer science is at the heart of software and software systems.
● It starts with theory
● Moves into development of the system
● Ends at application - the actual deployment of the system
It doesn’t truly end here though, as systems are constantly improved and perfected. Think of how your phone or computer
has received several software updates since you started using it.
Computer Science applies to many different fields, for example:
● Artificial Intelligence
● Computer Systems and Networks
● Database Systems
● Human Computer Interaction
● Vision and Graphics
● Software Engineering
● Bioinformatics
● Theory of Computation
Key Terms
Programming
● Creation of instructions that a computer can understand
Coding
● Same meaning as programming
Language
● Specific method for communication
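These three terms come together in even the smallest program. The sketch below uses Python, one of many languages you will meet later in the course, purely as an illustration:

```python
# A program is a set of instructions the computer can follow.
# Writing these instructions is "programming" (or "coding"),
# and Python is the "language" used to express them.

greeting = "Hello, world"   # store a piece of data
print(greeting)             # instruct the computer to display it
```

Running this program simply displays the stored text, but it already demonstrates all three key terms at once.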
Why do we need computers?
They make repetitive tasks easier
● What if a cashier had to calculate groceries manually?
Complex tasks are achieved faster
● Could you find a topic in a 2000-page PDF in minutes?
Accuracy is greater
● Note that accuracy depends on the data the computer is given: if the input data is flawed, the output will not be accurate
(“garbage in, garbage out”). However, computers are becoming smarter and can now reject many forms of “bad” data before it is processed.
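The 2000-page PDF question above can be made concrete with a short sketch. Here a large body of text (made up for illustration) is scanned line by line for a topic, something a computer finishes in a fraction of a second:

```python
# Build a large "document" of 100,000 lines and hide one topic inside it.
pages = ["Nothing interesting on line %d." % i for i in range(100_000)]
pages[73_214] = "Chapter 12: The History of Computers"

# Scan every line for the search term and collect the matching line numbers.
matches = [i for i, line in enumerate(pages) if "History of Computers" in line]
print(matches)  # → [73214]
```

A person paging through the same text by hand would need hours; the computer performs the identical repetitive check 100,000 times almost instantly.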
Computing and Math
Scientists throughout history have observed that nature is mathematical: many things in nature can be described using
mathematical expressions. Arguably the most famous example is the Fibonacci sequence, covered in module 2.
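As a preview of module 2, the Fibonacci sequence is simple to generate in code: each number is the sum of the two before it.

```python
def fibonacci(n):
    """Return the first n Fibonacci numbers: 0, 1, 1, 2, 3, 5, ..."""
    sequence = []
    a, b = 0, 1
    for _ in range(n):
        sequence.append(a)
        a, b = b, a + b   # each new term is the sum of the previous two
    return sequence

print(fibonacci(10))  # → [0, 1, 1, 2, 3, 5, 8, 13, 21, 34]
```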
An example - playing video games:
Everything in a video game is a mathematical expression. The scenes, trees, road, cars, people, voices...
Game development is a specialisation which could be taken up after understanding computer science.
Other examples:
● Where a ball lands
● The top speed of a car
● The shape of leaves
● The motion sequence of flapping wings
● The path an ant follows
● Weight ratio and bones
● Arm and leg proportions
● Patterns on your fingerprints
● Features of the face
Computers help us understand these models and it is the job of a computer scientist to formulate them and take advantage
of the speed and power that computers offer.
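One of the examples above, where a ball lands, can be turned directly into such a model. The sketch below uses the standard projectile-motion formula, range = v² · sin(2θ) / g, with made-up launch values chosen purely for illustration:

```python
import math

def landing_distance(speed, angle_degrees, gravity=9.81):
    """Horizontal distance a ball travels when launched from the ground,
    ignoring air resistance: range = v^2 * sin(2*theta) / g."""
    angle = math.radians(angle_degrees)
    return speed ** 2 * math.sin(2 * angle) / gravity

# A ball thrown at 20 m/s at a 45-degree angle (illustrative values).
print(round(landing_distance(20, 45), 1))  # → 40.8 (metres)
```

This is exactly the job of the computer scientist described above: formulate the mathematical model, then let the computer evaluate it quickly for any input.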
The History of Computers
A detour into the museum
Computers existed thousands of years ago in the form of the abacus, used by the Mesopotamians for simple calculations.
Charles Babbage
Charles Babbage, a British mathematician, inventor, engineer, and philosopher, is considered the father of the computer.
● He built the “Difference Engine” - a massive automatic mechanical calculator that could perform mathematical
operations
● It mechanised a series of calculations on several variables to solve a complex problem
● This Engine had storage, designed to stamp its output on to a soft material
We would consider this archaic now, but this was the beginning of the modern computer.
First Generation
The Electronic Numerical Integrator and Computer (ENIAC) was the world’s first general-purpose electronic computer.
● It weighed 30 tonnes and consisted of 18,000 vacuum tubes, the switching elements used before transistors were
invented
● It used 160 kilowatts of power and occupied 167 square metres
● It was exciting because it was programmable, bringing about computer science
● It was used for weather predictions, cosmic ray studies, random number studies, wind tunnel design and much
more
The ENIAC could perform 5,000 additions, 357 multiplications, or 38 divisions per second.
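For perspective, the rough sketch below counts how many additions an ordinary present-day computer completes in one second and compares the total with ENIAC's 5,000. The exact figure will vary from machine to machine:

```python
import time

# Count how many additions complete in one second of wall-clock time.
total = 0
additions = 0
deadline = time.perf_counter() + 1.0
while time.perf_counter() < deadline:
    total = total + 1     # one addition, the operation ENIAC performed
    additions += 1

print(f"{additions:,} additions per second vs ENIAC's 5,000")
```

Even an interpreted language on modest hardware outpaces the ENIAC by several orders of magnitude.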
John Von Neumann
Von Neumann made changes to the ENIAC, making it easier to program. He also devised the Von Neumann architecture,
which remains the base design of all modern computers.
Second Generation
The invention of the transistor brought better and smaller computers.
● A transistor is a device used to amplify or switch electronic signals and electrical power
The Universal Automatic Computer (UNIVAC) was introduced in 1951 and is particularly famous for correctly predicting
the outcome of the 1952 US presidential election.
● It was a stored-program computer, a design that was a catalyst for many new programming languages
● It also ran an Operating System (OS)
● A programming language called Common Business Oriented Language (COBOL) was used to develop many
business systems
● New terms such as “bits” and “bytes” became commonplace in describing data
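The terms “bits” and “bytes” can be made concrete with a short sketch: every character a computer stores is ultimately a pattern of bits (binary digits), grouped into bytes of eight.

```python
# The letter "A" is stored as the number 65, which fits in one byte
# (8 bits). format(..., "08b") writes that number as 8 binary digits.
letter = "A"
code = ord(letter)            # numeric code of the character
bits = format(code, "08b")    # the same number written as 8 bits

print(letter, code, bits)     # → A 65 01000001
```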
Third, Fourth and Fifth Generation
● New features such as parallel computing, multitasking, graphics, and sound arrived, making computers more
capable and more useful to the average person
● Programming became a lot easier and the applications of computers took off in various directions
● Integrated Circuits (ICs) meant computers could be smaller and achieve much more
The Future
Artificial Intelligence
Artificial intelligence involves programming a computer with the ability to mimic human intelligence.
Alan Turing, commonly referred to as the father of computer science, devised a test to determine a computer’s ability to
exhibit intelligent behaviour that is equivalent to, or indistinguishable from that of a human.
In this test, a human evaluator communicates with subjects A and B, one a machine and one a human being, via a text-only
channel. The evaluator then judges the responses and tries to tell the machine apart from the human, based on the
machine’s ability to give answers that closely resemble human answers.
In 2014, for the first time in history, a computer program called Eugene Goostman, which simulates a 13-year-old Ukrainian
boy, was said to have passed the Turing test at an event organised by the University of Reading.
The Turing Test is successfully passed if a computer is mistaken for a human more than 30% of the time during a series of
five-minute keyboard conversations.
Other AI advancements include:
● Self-driving cars & other autonomous vehicles
● Advances in communications enable computers to gather staggering amounts of information, process it, and produce
output at high speed
Internet Of Things (IOT)
The Internet Of Things refers to computing being embedded in everyday objects.
IOT has made it possible to build smart gadgets and smart wearables that interact with your body and with the environment.
Some examples of Internet Of Things devices:
● Smart speakers like Alexa and Google Home
● Smart watches like Apple Watch and Garmin
● Internet-connected monitors and biometrics
Quantum Computing
Quantum computing is an area of computing focused on developing computer technology based on the principles of
quantum theory, which attempts to explain the behaviour of energy and matter at the atomic and subatomic levels.
Quantum computing promises to dwarf current computing capacity many times over. Recent years have seen intensive
research on quantum computers, which promise to speed up computing by adding a new twist to the ordinary binary
(two-state) system that runs every computer in the world today.
To date it still exists mostly in theory, although working demonstrations have been produced.
Applications of Computing
Business Operations
Computers are now at the centre of business operations:
● Stock markets
● Payroll calculations
● Budgeting
● Sales analysis
● Financial analysis
● Performance analysis
All of these are reliant on computing.
● ATMs are completely automated
● Most banking operations can be done in real-time
● Global payment partners (e.g. PayPal) enable seamless cross-border transactions
● Business communications are almost entirely digital
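Payroll calculation, one of the operations listed above, is a natural fit for a program. The sketch below uses invented employees and a flat tax rate purely for illustration:

```python
# Hypothetical staff records: (name, hours worked, hourly rate).
staff = [
    ("Amina", 160, 25.0),
    ("Ben",   150, 30.0),
]
TAX_RATE = 0.20  # illustrative flat tax rate

def net_pay(hours, rate, tax_rate=TAX_RATE):
    """Gross pay minus a flat tax deduction."""
    gross = hours * rate
    return gross * (1 - tax_rate)

# Compute every employee's take-home pay in one pass.
payroll = {name: net_pay(hours, rate) for name, hours, rate in staff}
print(payroll)  # → {'Amina': 3200.0, 'Ben': 3600.0}
```

A real payroll system adds tax brackets, benefits, and auditing, but the core is the same repetitive calculation that computers excel at.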
The reliance on computing offers several career opportunities:
● Systems development
● Data Analysis
● Mobile front-end development
● Mobile back-end development
Consumer Electronics
The most notable development in this regard is the Internet Of Things - the interconnection of practically everything, with
some form of intelligence built in.
You may have seen the word “smart” pop up a lot in recent years. This usually means the device in question has some sort
of data-gathering and/or processing ability.
Consumer electronics is one of the fastest-growing industries, with Apple Inc. becoming the first trillion-dollar
company in the world. Other companies like YouTube, Facebook and TikTok have seen rapid growth in recent years because
of the increase in available consumer computing devices.
Smartphones are the centre of attention:
● Evolved from simple “bricks” to the single most used type of gadget for content consumption.
● There is constant focus on improving the user experience.
● Some devices now ship with AI chips built-in to further enhance the computing experience.
With recent advances in processing and interconnectivity, electronic items such as televisions, mobile phones, soundbars
and headphones are now able to respond to natural language commands.
● Many electronic devices are now available as assistive technologies for differently-abled individuals.
● Recent electronic shows had robotic arms and other contraptions that can be controlled by thoughts alone.
● There is a lot of research going on around tech that lives inside the body.
Science and Research
Science owes many of its achievements to computing.
● Computing allows scientists to accurately model difficult and complex scenarios and analyse them in fine detail
● Since the 1960s, computers have been used in research in conjunction with mathematical models to produce
simulations, models, and projections.
● Weather forecasts have now become more accurate with the use of carefully designed modelling tools and data
analysis.
● Humanity has been able to safely leave Earth and return by simulating never-before-seen environments and building
suitable equipment.
● Computers are also used in the research and fabrication of new, faster, and better computers.
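The simulations described above follow a common pattern: apply a mathematical model step by step and record the results. The sketch below simulates a cooling cup of coffee with Newton's law of cooling; the temperatures and cooling constant are made up for illustration:

```python
# Newton's law of cooling, stepped forward in time: each minute the
# temperature moves a fixed fraction (k) of the way towards the
# surrounding room temperature.
room_temp = 20.0   # degrees Celsius (illustrative)
temp = 90.0        # initial coffee temperature (illustrative)
k = 0.1            # cooling constant per minute (illustrative)

history = [temp]
for minute in range(30):
    temp = temp + k * (room_temp - temp)   # move towards room temperature
    history.append(temp)

print(round(history[-1], 1))  # temperature after 30 simulated minutes
```

Weather forecasting and spacecraft design use the same idea with vastly more variables and far more detailed physics, which is why they demand so much computing power.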