
Week-1-Lect 1-2

The document provides an overview of computers, defining them as electronic devices that process data and execute tasks. It discusses the evolution of computing from mechanical devices to modern technologies like cloud, edge, and quantum computing, as well as the role of artificial intelligence and machine learning in enhancing computer capabilities. Key concepts include the distinction between data and information, types of computers, and how AI systems learn and improve over time.

Uploaded by

Hina Khan

Lecture 1-2

Week 1: Introduction
What is a Computer?
• A computer is an electronic device that processes data,
performs calculations, and executes instructions to
complete tasks.
• It can store, retrieve, and manage information, enabling
various activities like writing, browsing the internet, and
running applications efficiently.
Data and Information
• Data is raw, unprocessed facts or figures without context,
like numbers or text.
• Information is processed data that has meaning and
context, making it useful for decision-making or
understanding.
• Data becomes information when it's organized and
interpreted.
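The distinction above can be illustrated with a short Python sketch (the sales figures and month labels are hypothetical):

```python
# Raw data: bare numbers with no context
data = [120, 95, 143, 88]

# Processing: attach context (month labels) and interpret the figures
months = ["Jan", "Feb", "Mar", "Apr"]
sales_by_month = dict(zip(months, data))
best_month = max(sales_by_month, key=sales_by_month.get)

# Information: organized and interpreted, useful for a decision
print(f"Best month: {best_month} ({sales_by_month[best_month]} sales)")
```

The list `data` on its own is just figures; once paired with months and summarized, it answers a question a person might actually ask.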
Evolution of Computing (from
mechanical to modern computing)
• Mechanical Era:
• Abacus: Early tool for basic arithmetic operations.
• Charles Babbage: Invented the Analytical Engine, a mechanical
general-purpose computer
• Electromechanical Era:
• Punch Cards: Used in early computing for data input and storage.
• ENIAC (1940s): One of the first electronic computers, using
vacuum tubes.
Abacus & Analytical Engine
ENIAC
Evolution of Computing
• Transistor Era:
• Transistors (1950s): Replaced vacuum tubes, making
computers smaller and faster.
• Integrated Circuits (IC):
• Microchips (1960s): Multiple transistors on a single chip,
further reducing computer size and cost.
Transistor / Microprocessor
Evolution of Computing
• Microprocessor Era:
• Personal Computers (1970s):
• Microprocessors enabled the development of PCs like the
Apple II and the IBM PC.
• Modern Computing:
• Smartphones & AI (2000s-Present):
• Advanced microchips, cloud computing, and artificial
intelligence power today's powerful, portable devices.
Types of Computers
• Personal Computer (PC):
• A personal computer is designed for individual use,
handling everyday tasks like browsing, gaming, and
document editing. Examples include Apple MacBook,
Dell XPS, and HP Pavilion.
Types of Computers
• Cloud Computing:
• Cloud computing is a technology that allows users to
access and store data, applications, and services over the
internet instead of using local computers.
• It offers flexibility, scalability, and cost savings by
providing resources like servers, storage, and databases
remotely, managed by providers like Google, AWS, or
Microsoft.
Types of Computers
• Edge Computing:
• Edge computing processes data closer to where it's generated, like
on devices or local servers, instead of sending it to a distant cloud.
• This reduces latency and improves response times, making it ideal
for real-time applications like autonomous vehicles, IoT (Internet of
Things) devices, and smart cities, where quick decisions are crucial.
• Examples: NVIDIA Jetson, Cisco Edge Compute, and Google Edge
TPU.
Types of Computers
• Quantum Computing:
• Quantum computing uses quantum bits (qubits) to perform certain
calculations far faster than traditional computers.
• Unlike classical bits, which represent 0 or 1, qubits can represent
both at once, allowing for complex problem-solving.
• This technology is powerful for tasks like cryptography, simulations,
and optimizing complex systems, revolutionizing computing potential.
• Examples: IBM Q System One, Google Sycamore, and D-Wave
Systems.
Quantum bits
• Quantum bits (qubits) are the basic units of information in quantum
computing. Unlike regular bits that can only be 0 or 1, qubits can be
both at the same time. This allows quantum computers to solve
problems faster than regular computers.
• Example:
• Imagine you have a light switch. A regular bit is like the switch being
either off (0) or on (1).
• Now, think of a qubit as a special switch that can be both off and on at
the same time. This lets it do many things at once, making it super
smart!
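The "both at once" idea has a precise form in standard quantum notation (this equation is supplementary to the slides): a qubit's state is a superposition of the two basis states,

```latex
|\psi\rangle = \alpha\,|0\rangle + \beta\,|1\rangle,
\qquad |\alpha|^2 + |\beta|^2 = 1
```

Measuring the qubit gives 0 with probability |α|² and 1 with probability |β|², so the "special switch" is in a weighted blend of off and on until it is observed.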
What is AI?
• Artificial Intelligence (AI) is the simulation of human
intelligence in machines, enabling them to perform tasks
like reasoning, learning, problem-solving, and decision-
making.
• AI systems use algorithms and data to recognize patterns,
adapt, and improve over time, driving applications like
robotics, natural language processing, and autonomous
vehicles.
How AI Works
• AI works by using algorithms and large amounts of data to
mimic human-like thinking and decision-making.
• Data Collection: AI systems gather vast amounts of data
from various sources, including text, images, and videos.
• Data Processing: This data is cleaned and organized to
ensure quality. Algorithms are applied to identify patterns
and trends.
How AI Works
• Training: Machine learning algorithms train on this data
to learn from examples. For instance, if training an AI to
recognize cats, it analyzes many images of cats and learns
their features.
• Model Creation: The trained algorithm becomes a model
that can make predictions or decisions based on new data.
How AI Works
• Inference: When given new input, the AI model processes
it, applying what it learned during training to provide an
output or response.
• Feedback Loop: AI systems can improve over time by
receiving feedback on their predictions, allowing them to
refine their models and increase accuracy.
Machine Learning
• Machine learning is a part of artificial intelligence that helps computers learn from data.
Instead of being programmed with specific instructions, they use examples to recognize
patterns and make decisions. Over time, they improve their accuracy by learning from their
experiences.
• Here are two common examples of machine learning:
• Email Spam Detection:
• Machine learning algorithms analyze emails to identify patterns that indicate spam. They
learn from labeled examples (spam and non-spam) to classify new incoming emails accurately.
• Recommendation Systems:
• Services like Netflix or Amazon use machine learning to suggest movies or products. They
analyze user behavior and preferences to recommend items you might like based on what
similar users enjoyed.
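The spam-detection example can be sketched as a tiny word-count classifier that learns from labeled emails. The emails and words below are made up, and real spam filters use far richer features:

```python
from collections import Counter

# Labeled training examples (hypothetical)
labeled = [
    ("win free money now", "spam"),
    ("claim your free prize", "spam"),
    ("meeting notes for monday", "ham"),
    ("lunch plans this week", "ham"),
]

# "Training": count how often each word appears in spam vs. non-spam
counts = {"spam": Counter(), "ham": Counter()}
for text, label in labeled:
    counts[label].update(text.split())

def classify(text):
    # Score a new email by which class its words occurred in more often
    spam_score = sum(counts["spam"][w] for w in text.split())
    ham_score = sum(counts["ham"][w] for w in text.split())
    return "spam" if spam_score > ham_score else "ham"
```

A new email like "free money prize" scores high on words seen in spam examples and gets classified as spam, even though that exact email was never in the training data.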