Unit 1
Introduction to Machine Learning

If you think about computers, they traditionally carry out computation by following explicit instructions. In machine learning, by contrast, patterns are learned from historical data, and those learned patterns are then used to make predictions on new data. This learning can happen with or without a supervisor, as discussed in the following sections.
(Figure: two panels, Traditional Programming and Machine Learning, each showing Data and Output.)
Figure 1.3: Machine learning versus traditional programming
In traditional programming, the computer follows a set of predefined rules to process the input data and produce the outcome. In machine learning, the computer tries to mimic human thinking. It interacts with the input data, expected outcome, and environment, and it derives patterns that are represented by one or more mathematical models. The models are then used to interact with future input data and to generate outcomes. Unlike in automation, the computer in a machine learning setting doesn't receive explicit and instructive coding.
Suppose I show you an object and say it is an apple, then another object that is also an apple, then one that is a cupcake, and so on. Basically, you can call me your supervisor. This kind of learning happens by showing each object together with its category or label (x1: apple, x2: cupcake, x3: apple, and so on). This is basically called supervised learning.
In supervised learning, the learning happens with the help of a supervisor: the model is trained on examples whose correct labels are already known, and the supervised learner uses those labeled examples to learn the mapping from inputs to labels. A minimal sketch is shown below.
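As a rough illustration, here is a minimal supervised-learning sketch in Python, assuming scikit-learn is available; the features and labels are made-up toy data for the apple/cupcake example above.

from sklearn.tree import DecisionTreeClassifier

# Each row is one object: [weight_in_grams, sugar_content] -- hypothetical features
X_train = [[150, 10], [170, 12], [90, 35], [160, 11], [95, 40]]
y_train = ["apple", "apple", "cupcake", "apple", "cupcake"]  # labels given by the supervisor

model = DecisionTreeClassifier()
model.fit(X_train, y_train)        # learn from the labeled examples

print(model.predict([[155, 11]]))  # predict the label of an unseen object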
Unsupervised Learning
(Figure: a collection of unlabeled objects, with similar objects grouped together.)
Unlike the above, nobody is supervising you with categories (labels). Let's see what kind of learning happens in the absence of a supervisor. Look at the above picture: there are so many objects, and I don't know the class label for any of them, i.e., no supervisor can tell me the class labels. But I can still divide the objects into groups so that similar objects are together. Which objects are similar? That is what the algorithm has to work out on its own.
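As a rough illustration, here is a minimal unsupervised-learning sketch in Python using k-means clustering (scikit-learn assumed); the points are made up, and no labels are given to the algorithm.

from sklearn.cluster import KMeans

# Made-up 2-D points forming two loose groups, with no class labels attached
X = [[1.0, 1.2], [0.8, 1.0], [1.1, 0.9],
     [5.0, 5.2], [5.3, 4.9], [4.8, 5.1]]

kmeans = KMeans(n_clusters=2, n_init=10, random_state=0).fit(X)
print(kmeans.labels_)  # cluster assignment per point, e.g. [0 0 0 1 1 1]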
Reinforcement Learning
In reinforcement learning there is an environment and a learner called an agent. The agent takes an action, observes the result, and chooses its next action based on the results of the previous actions. For example, suppose you are in a maze at some position: from that position you have to take an action, observe where it leads, and keep taking actions based on the outcomes until you find the exit. A toy sketch follows.
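As a rough illustration, here is a toy sketch of the agent-environment loop in Python; the 1-D corridor stands in for the maze, and all names are illustrative rather than any standard API. This agent acts randomly; a real RL agent would improve its action choices from the rewards it collects.

import random

position, exit_pos = 0, 5                      # agent starts at cell 0; the exit is cell 5

for step in range(1000):
    action = random.choice([-1, +1])           # the agent picks an action (move left/right)
    position = max(0, position + action)       # the environment updates the agent's state
    reward = 1 if position == exit_pos else 0  # feedback from the environment
    if reward:
        print(f"Reached the exit after {step + 1} steps")
        break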
Why machine learning
Course introduction
Machine learning powers many everyday applications, for example:
+ Image recognition: identifying what an input image contains
+ Automated language translation: with ML we can translate text from one language to another
+ Traffic prediction

Cross-validation
As a gentle reminder, you will see cross-validation in action multiple times later in this book. So don't panic if you ever find this section difficult to understand, as you will become an expert of it very soon.
When the training size is very large, it's often sufficient to split it into training, validation, and testing (three subsets) and conduct a performance check on the latter two. Cross-validation is less preferable in this case since it's computationally costly to train a model for each single round. But if you can afford it, there's no reason not to use cross-validation. When the size isn't so large, cross-validation is definitely a good choice.
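As a quick illustration of the three-subset split, here is a minimal sketch assuming scikit-learn; two successive calls to train_test_split carve a toy dataset into roughly 60% / 20% / 20% parts.

from sklearn.model_selection import train_test_split

X = [[i] for i in range(100)]    # toy features, one per sample
y = [i % 2 for i in range(100)]  # toy labels

X_train, X_rest, y_train, y_rest = train_test_split(X, y, test_size=0.4, random_state=42)
X_val, X_test, y_val, y_test = train_test_split(X_rest, y_rest, test_size=0.5, random_state=42)

print(len(X_train), len(X_val), len(X_test))  # 60 20 20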
There are mainly two cross-validation schemes in use: exhaustive and non-exhaustive. In the exhaustive scheme, we leave out a fixed number of observations in each round as testing (or validation) samples and use the remaining observations as training samples. This process is repeated until all possible different subsets of samples are used for testing once. For instance, we can apply Leave-One-Out Cross-Validation (LOOCV), which lets each sample be in the testing set once. For a dataset of size n, LOOCV requires n rounds of cross-validation. This can be slow when n gets large. The following diagram presents the workflow of LOOCV:
(Diagram: LOOCV workflow over N samples; in each round, a single sample forms the testing set and the rest form the training set.)
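As a rough illustration, here is a minimal LOOCV sketch assuming scikit-learn's LeaveOneOut splitter; each of the n samples serves as the testing set exactly once.

from sklearn.model_selection import LeaveOneOut

X = [[1], [2], [3], [4]]  # toy dataset, n = 4 samples

loo = LeaveOneOut()
for train_idx, test_idx in loo.split(X):
    print("train:", train_idx, "test:", test_idx)
# prints 4 rounds, one per held-out sample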
A non-exhaustive scheme, on the other hand, doesn't try out all possible partitions. The most widely used type of this scheme is k-fold cross-validation, where we first randomly split the original data into k equal-sized folds. In each trial, one of these folds becomes the testing set, and the rest of the samples become the training set. We repeat this process k times, with each fold being the designated testing set once. Finally, we average the k sets of test results for the purpose of evaluation. Common values for k are 3, 5, and 10. The following table illustrates the setup for five-fold:

Round | Fold 1   | Fold 2   | Fold 3   | Fold 4   | Fold 5
1     | Testing  | Training | Training | Training | Training
2     | Training | Testing  | Training | Training | Training
3     | Training | Training | Testing  | Training | Training
4     | Training | Training | Training | Testing  | Training
5     | Training | Training | Training | Training | Testing
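As a rough illustration, here is a minimal five-fold sketch assuming scikit-learn's KFold splitter, mirroring the table above: each fold serves as the testing set once.

from sklearn.model_selection import KFold

X = [[i] for i in range(10)]  # toy dataset with 10 samples

kf = KFold(n_splits=5, shuffle=True, random_state=42)
for round_no, (train_idx, test_idx) in enumerate(kf.split(X), start=1):
    print(f"Round {round_no}: test samples {test_idx}")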
Parametric Machine Learning Algorithms
Assumptions can greatly simplify the learning process, but they can also limit what can be learned. Algorithms that simplify the function to a known form are called parametric machine learning algorithms.
Parametric machine learning algorithms make assumptions about the mapping function and have a fixed number of parameters. No matter how much data is used to learn the model, this will not change how many parameters the algorithm has.
With a parametric algorithm, we are selecting the form of the function and then learning its coefficients using the training data. The algorithms involve two steps:
1. Select a form for the function.
2. Learn the coefficients for the function from the training data.
For example, the assumed form can be a line, as used in linear regression:
B0 + B1 * X1 + B2 * X2 = 0
This assumption greatly simplifies the learning process: after assuming the initial function, the remaining problem is simply to estimate the coefficients B0, B1, and B2 based on the values of the input variables X1 and X2.
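As a rough illustration, here is a minimal sketch assuming scikit-learn. Logistic regression is a parametric algorithm whose decision boundary has exactly the linear form above, so no matter how much data we train on, the fitted model is fully described by the fixed coefficients B0, B1, and B2.

from sklearn.linear_model import LogisticRegression

# Toy data with two input variables X1 and X2
X = [[1.0, 2.0], [2.0, 1.0], [3.0, 4.0], [4.0, 3.0]]
y = [0, 0, 1, 1]

model = LogisticRegression().fit(X, y)
print(model.intercept_)  # B0
print(model.coef_)       # [B1, B2] -- one coefficient per input variable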
Some more examples of parametric machine learning algorithms include:
+ Logistic Regression
+ Linear Discriminant Analysis
+ Perceptron
+ Naive Bayes
+ Simple Neural Networks
Non-Parametric Machine Learning Algorithms
Non-parametric algorithms do not make assumptions regarding the form of the mapping function between input data and output. Consequently, they are free to learn any functional form from the training data.
A simple example is the k-nearest neighbors (KNN) algorithm. KNN does not make any assumptions about the functional form, but instead uses the pattern that points tend to have similar output when they are close.
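As a rough illustration, here is a minimal KNN sketch assuming scikit-learn; no functional form is assumed, and predictions simply follow the labels of the closest training points.

from sklearn.neighbors import KNeighborsClassifier

X = [[1.0], [1.2], [3.8], [4.0]]  # toy 1-D training points
y = [0, 0, 1, 1]

knn = KNeighborsClassifier(n_neighbors=3).fit(X, y)
print(knn.predict([[1.1], [3.9]]))  # the nearest neighbors vote: [0 1]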
Some more examples of popular non-parametric machine learning algorithms are:
+ k-Nearest Neighbors
+ Decision Trees like CART and C4.5
+ Support Vector Machines
Regression
Regression is a very commonly used technique whose output is a real-valued (continuous) variable. It is widely used across various industries, such as healthcare, engineering, education, and sports & entertainment. A regression model is trained on collected data and is then used to predict the output for new inputs from the same domain.
Given a data set containing input values along with the corresponding real-valued outputs, we can train a model and then use it to predict the output for new inputs. Typical use cases include:
+ Sales prediction
+ Price prediction, etc.
These are regression problems.
Regression is part of supervised learning: here, basically, you are given the labels, and the label is a continuous value rather than a class. Regression makes predictions of such continuous values. Let's start with simple linear regression.
Suppose there is a marketing company, and they have collected some data containing the amount spent on marketing and the sales they got. They have noticed that when they spend more on marketing they tend to get better sales, and when they spend less they get less. Now they want something more concrete: can we get a model which, given the marketing budget as input, predicts the sales as output? This is exactly what we want, and we can achieve it with a simple linear regression model.
Let's assume we have plotted all the data points, with marketing spend on one axis and sales on the other. Now, the quantity we want to predict is Y, the sales.
ae Fleugs whet
aly: Sealed om ml ws will be Calling 4 cepted
variable oF targa vwyiable, a4 At, Vartoble, (martertg
predict calls 1A Called of de
budget) usivg which &e
Jud variable o- preddor. Ix Sinple LR we
f vorrabl.c
waleat
sd tadopendtrt yarlable , led bo pred dopisatt
Peat Nek ea ty hy ee vrelaffoud.ge ble
oS al . ‘ Pec
aor ory aud Salen ott dito Wor Hue 5 Str
eels db ptiws We Gu uuderd rad AG
ale Prevensly So Let'4
ag em a rant
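As a rough illustration, here is a minimal simple-linear-regression sketch for the marketing example, assuming scikit-learn; the budget and sales figures are made up.

from sklearn.linear_model import LinearRegression

budget = [[10], [20], [30], [40], [50]]  # marketing budget (independent variable)
sales = [25, 45, 62, 85, 105]            # observed sales (dependent variable)

model = LinearRegression().fit(budget, sales)
print(model.intercept_, model.coef_)  # fitted line: sales = b0 + b1 * budget
print(model.predict([[35]]))          # predicted sales for a budget of 35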