
{

"cells": [
{
"cell_type": "markdown",
"id": "567a91ff-56b2-4cf0-864d-b8b21b273225",
"metadata": {},
"source": [
"### Tensors"
]
},
{
"cell_type": "markdown",
"id": "ecc24ae8-1ad9-46d7-8803-5f39557daa49",
"metadata": {},
"source": [
"Tensors are a specialized data structure that are very similar to arrays and matrices. In PyTorch, we use tensors to encode the inputs and outputs of a model, as well as the model’s parameters.\n",
"\n",
"Tensors are similar to NumPy’s ndarrays, except that tensors can run on GPUs or other hardware accelerators. In fact, tensors and NumPy arrays can often share the same underlying memory, eliminating the need to copy data (see Bridge with NumPy). Tensors are also optimized for automatic differentiation (we’ll see more about that later in the Autograd section). If you’re familiar with ndarrays, you’ll be right at home with the Tensor API. If not, follow along!"
]
},
{
"cell_type": "code",
"execution_count": 2,
"id": "7cdc8114-b2fd-4e27-810b-2975597d86de",
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"import numpy as np"
]
},
{
"cell_type": "markdown",
"id": "f72bda67-ba3e-4bf7-b646-bb2b12adf26d",
"metadata": {},
"source": [
"### Initializing a Tensor\n",
"Tensors can be initialized in various ways. Take a look at the following examples:"
]
},
{
"cell_type": "markdown",
"id": "edcbcb07-72f1-4856-b636-44e4ecb5e4fa",
"metadata": {},
"source": [
"#### Directly from data\n",
"\n",
"Tensors can be created directly from data. The data type is automatically inferred."
]
},
{
"cell_type": "code",
"execution_count": 3,
"id": "77c5a32e-99ff-41a0-a55a-4b6547b5155b",
"metadata": {},
"outputs": [],
"source": [
"data = [[1, 2],[3, 4]]\n",
"x_data = torch.tensor(data)"
]
},
{
"cell_type": "code",
"execution_count": 4,
"id": "a46118fe-68ba-47d5-8931-f76fb60355d4",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1, 2],\n",
" [3, 4]])"
]
},
"execution_count": 4,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x_data"
]
},
{
"cell_type": "code",
"execution_count": 5,
"id": "a16cb73a-736c-4817-9251-cd10ee7e2f8a",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"torch.Tensor"
]
},
"execution_count": 5,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"type(x_data)"
]
},
{
"cell_type": "markdown",
"id": "a60b318f-6ade-4f8d-b742-507a75176693",
"metadata": {},
"source": [
"#### From a NumPy array\n",
"\n",
"Tensors can be created from NumPy arrays (and vice versa - see Bridge with NumPy)."
]
},
{
"cell_type": "code",
"execution_count": 6,
"id": "b10aea8e-f09e-47c0-b1d5-6e1fe40f33b7",
"metadata": {},
"outputs": [],
"source": [
"np_array = np.array(data)\n",
"x_np = torch.from_numpy(np_array)"
]
},
{
"cell_type": "code",
"execution_count": 9,
"id": "cfc46b0f-1ff0-412e-a0d3-4804b547bb06",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1, 2],\n",
" [3, 4]], dtype=torch.int32)"
]
},
"execution_count": 9,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"x_np"
]
},
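{
"cell_type": "markdown",
"id": "3f2a9c10-1a2b-4c3d-8e4f-0123456789ab",
"metadata": {},
"source": [
"A minimal sketch of the memory sharing mentioned above: a tensor created with `torch.from_numpy` shares its storage with the source array, so an in-place change to one is visible through the other.\n",
"\n",
"```python\n",
"import numpy as np\n",
"import torch\n",
"\n",
"arr = np.ones(3)\n",
"t = torch.from_numpy(arr)   # t shares memory with arr\n",
"arr += 1                    # mutate the NumPy array in place\n",
"assert t[0].item() == 2.0   # the change is visible through the tensor\n",
"```"
]
},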
{
"cell_type": "markdown",
"id": "842ef899-fc19-498e-a4d0-76fb2560ba11",
"metadata": {},
"source": [
"#### From another tensor:\n",
"\n",
"The new tensor retains the properties (shape, datatype) of the argument tensor, unless explicitly overridden."
]
},
{
"cell_type": "code",
"execution_count": 10,
"id": "613cdeff-5e87-4f34-b3e8-fef724dcf44d",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Ones Tensor: \n",
" tensor([[1, 1],\n",
" [1, 1]]) \n",
"\n"
]
}
],
"source": [
"x_ones = torch.ones_like(x_data) # retains the properties of x_data\n",
"print(f\"Ones Tensor: \\n {x_ones} \\n\")"
]
},
{
"cell_type": "code",
"execution_count": 11,
"id": "65fb6215-c674-494d-9233-dd9a328bcf2b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Random Tensor: \n",
" tensor([[0.9384, 0.7922],\n",
" [0.6407, 0.7233]]) \n",
"\n"
]
}
],
"source": [
"x_rand = torch.rand_like(x_data, dtype=torch.float) # overrides the datatype of x_data\n",
"print(f\"Random Tensor: \\n {x_rand} \\n\")"
]
},
{
"cell_type": "markdown",
"id": "9d29d07d-754b-4dd1-a07d-09c00f98d8f0",
"metadata": {},
"source": [
"#### With random or constant values:\n",
"\n",
"shape is a tuple of tensor dimensions. In the functions below, it determines the dimensionality of the output tensor."
]
},
{
"cell_type": "code",
"execution_count": 12,
"id": "9c9bd153-4fee-4a07-9248-4b8b2bcac457",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Random Tensor: \n",
" tensor([[0.1277, 0.8025, 0.9299],\n",
" [0.2443, 0.6467, 0.4627]]) \n",
"\n",
"Ones Tensor: \n",
" tensor([[1., 1., 1.],\n",
" [1., 1., 1.]]) \n",
"\n",
"Zeros Tensor: \n",
" tensor([[0., 0., 0.],\n",
" [0., 0., 0.]])\n"
]
}
],
"source": [
"shape = (2,3,)\n",
"rand_tensor = torch.rand(shape)\n",
"ones_tensor = torch.ones(shape)\n",
"zeros_tensor = torch.zeros(shape)\n",
"\n",
"print(f\"Random Tensor: \\n {rand_tensor} \\n\")\n",
"print(f\"Ones Tensor: \\n {ones_tensor} \\n\")\n",
"print(f\"Zeros Tensor: \\n {zeros_tensor}\")"
]
},
{
"cell_type": "markdown",
"id": "90eee245-350c-4ab5-870b-8ba5e49d1f27",
"metadata": {},
"source": [
"### Attributes of a Tensor\n",
"Tensor attributes describe their shape, datatype, and the device on which they are stored."
]
},
{
"cell_type": "code",
"execution_count": 13,
"id": "86007cb2-f280-4ace-9019-ff0118d88981",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Shape of tensor: torch.Size([3, 4])\n",
"Datatype of tensor: torch.float32\n",
"Device tensor is stored on: cpu\n"
]
}
],
"source": [
"tensor = torch.rand(3,4)\n",
"\n",
"print(f\"Shape of tensor: {tensor.shape}\")\n",
"print(f\"Datatype of tensor: {tensor.dtype}\")\n",
"print(f\"Device tensor is stored on: {tensor.device}\")"
]
},
{
"cell_type": "code",
"execution_count": 14,
"id": "0643067a-a667-4973-bfd8-68da1d8c902f",
"metadata": {
"collapsed": true,
"jupyter": {
"outputs_hidden": true
},
"scrolled": true
},
"outputs": [
{
"data": {
"text/plain": [
"['H',\n",
" 'T',\n",
" '__abs__',\n",
" '__add__',\n",
" '__and__',\n",
" '__array__',\n",
" '__array_priority__',\n",
" '__array_wrap__',\n",
" '__bool__',\n",
" '__class__',\n",
" '__complex__',\n",
" '__contains__',\n",
" '__deepcopy__',\n",
" '__delattr__',\n",
" '__delitem__',\n",
" '__dict__',\n",
" '__dir__',\n",
" '__div__',\n",
" '__dlpack__',\n",
" '__dlpack_device__',\n",
" '__doc__',\n",
" '__eq__',\n",
" '__float__',\n",
" '__floordiv__',\n",
" '__format__',\n",
" '__ge__',\n",
" '__getattribute__',\n",
" '__getitem__',\n",
" '__getstate__',\n",
" '__gt__',\n",
" '__hash__',\n",
" '__iadd__',\n",
" '__iand__',\n",
" '__idiv__',\n",
" '__ifloordiv__',\n",
" '__ilshift__',\n",
" '__imod__',\n",
" '__imul__',\n",
" '__index__',\n",
" '__init__',\n",
" '__init_subclass__',\n",
" '__int__',\n",
" '__invert__',\n",
" '__ior__',\n",
" '__ipow__',\n",
" '__irshift__',\n",
" '__isub__',\n",
" '__iter__',\n",
" '__itruediv__',\n",
" '__ixor__',\n",
" '__le__',\n",
" '__len__',\n",
" '__long__',\n",
" '__lshift__',\n",
" '__lt__',\n",
" '__matmul__',\n",
" '__mod__',\n",
" '__module__',\n",
" '__mul__',\n",
" '__ne__',\n",
" '__neg__',\n",
" '__new__',\n",
" '__nonzero__',\n",
" '__or__',\n",
" '__pos__',\n",
" '__pow__',\n",
" '__radd__',\n",
" '__rand__',\n",
" '__rdiv__',\n",
" '__reduce__',\n",
" '__reduce_ex__',\n",
" '__repr__',\n",
" '__reversed__',\n",
" '__rfloordiv__',\n",
" '__rlshift__',\n",
" '__rmatmul__',\n",
" '__rmod__',\n",
" '__rmul__',\n",
" '__ror__',\n",
" '__rpow__',\n",
" '__rrshift__',\n",
" '__rshift__',\n",
" '__rsub__',\n",
" '__rtruediv__',\n",
" '__rxor__',\n",
" '__setattr__',\n",
" '__setitem__',\n",
" '__setstate__',\n",
" '__sizeof__',\n",
" '__str__',\n",
" '__sub__',\n",
" '__subclasshook__',\n",
" '__torch_dispatch__',\n",
" '__torch_function__',\n",
" '__truediv__',\n",
" '__weakref__',\n",
" '__xor__',\n",
" '_addmm_activation',\n",
" '_autocast_to_full_precision',\n",
" '_autocast_to_reduced_precision',\n",
" '_backward_hooks',\n",
" '_base',\n",
" '_cdata',\n",
" '_coalesced_',\n",
" '_conj',\n",
" '_conj_physical',\n",
" '_dimI',\n",
" '_dimV',\n",
" '_fix_weakref',\n",
" '_grad',\n",
" '_grad_fn',\n",
" '_has_symbolic_sizes_strides',\n",
" '_indices',\n",
" '_is_all_true',\n",
" '_is_any_true',\n",
" '_is_view',\n",
" '_is_zerotensor',\n",
" '_lazy_clone',\n",
" '_make_subclass',\n",
" '_make_wrapper_subclass',\n",
" '_neg_view',\n",
" '_nested_tensor_size',\n",
" '_nested_tensor_storage_offsets',\n",
" '_nested_tensor_strides',\n",
" '_nnz',\n",
" '_post_accumulate_grad_hooks',\n",
" '_python_dispatch',\n",
" '_reduce_ex_internal',\n",
" '_rev_view_func_unsafe',\n",
" '_sparse_mask_projection',\n",
" '_to_dense',\n",
" '_to_sparse',\n",
" '_to_sparse_bsc',\n",
" '_to_sparse_bsr',\n",
" '_to_sparse_csc',\n",
" '_to_sparse_csr',\n",
" '_typed_storage',\n",
" '_update_names',\n",
" '_values',\n",
" '_version',\n",
" '_view_func',\n",
" '_view_func_unsafe',\n",
" 'abs',\n",
" 'abs_',\n",
" 'absolute',\n",
" 'absolute_',\n",
" 'acos',\n",
" 'acos_',\n",
" 'acosh',\n",
" 'acosh_',\n",
" 'add',\n",
" 'add_',\n",
" 'addbmm',\n",
" 'addbmm_',\n",
" 'addcdiv',\n",
" 'addcdiv_',\n",
" 'addcmul',\n",
" 'addcmul_',\n",
" 'addmm',\n",
" 'addmm_',\n",
" 'addmv',\n",
" 'addmv_',\n",
" 'addr',\n",
" 'addr_',\n",
" 'adjoint',\n",
" 'align_as',\n",
" 'align_to',\n",
" 'all',\n",
" 'allclose',\n",
" 'amax',\n",
" 'amin',\n",
" 'aminmax',\n",
" 'angle',\n",
" 'any',\n",
" 'apply_',\n",
" 'arccos',\n",
" 'arccos_',\n",
" 'arccosh',\n",
" 'arccosh_',\n",
" 'arcsin',\n",
" 'arcsin_',\n",
" 'arcsinh',\n",
" 'arcsinh_',\n",
" 'arctan',\n",
" 'arctan2',\n",
" 'arctan2_',\n",
" 'arctan_',\n",
" 'arctanh',\n",
" 'arctanh_',\n",
" 'argmax',\n",
" 'argmin',\n",
" 'argsort',\n",
" 'argwhere',\n",
" 'as_strided',\n",
" 'as_strided_',\n",
" 'as_strided_scatter',\n",
" 'as_subclass',\n",
" 'asin',\n",
" 'asin_',\n",
" 'asinh',\n",
" 'asinh_',\n",
" 'atan',\n",
" 'atan2',\n",
" 'atan2_',\n",
" 'atan_',\n",
" 'atanh',\n",
" 'atanh_',\n",
" 'backward',\n",
" 'baddbmm',\n",
" 'baddbmm_',\n",
" 'bernoulli',\n",
" 'bernoulli_',\n",
" 'bfloat16',\n",
" 'bincount',\n",
" 'bitwise_and',\n",
" 'bitwise_and_',\n",
" 'bitwise_left_shift',\n",
" 'bitwise_left_shift_',\n",
" 'bitwise_not',\n",
" 'bitwise_not_',\n",
" 'bitwise_or',\n",
" 'bitwise_or_',\n",
" 'bitwise_right_shift',\n",
" 'bitwise_right_shift_',\n",
" 'bitwise_xor',\n",
" 'bitwise_xor_',\n",
" 'bmm',\n",
" 'bool',\n",
" 'broadcast_to',\n",
" 'byte',\n",
" 'cauchy_',\n",
" 'ccol_indices',\n",
" 'cdouble',\n",
" 'ceil',\n",
" 'ceil_',\n",
" 'cfloat',\n",
" 'chalf',\n",
" 'char',\n",
" 'cholesky',\n",
" 'cholesky_inverse',\n",
" 'cholesky_solve',\n",
" 'chunk',\n",
" 'clamp',\n",
" 'clamp_',\n",
" 'clamp_max',\n",
" 'clamp_max_',\n",
" 'clamp_min',\n",
" 'clamp_min_',\n",
" 'clip',\n",
" 'clip_',\n",
" 'clone',\n",
" 'coalesce',\n",
" 'col_indices',\n",
" 'conj',\n",
" 'conj_physical',\n",
" 'conj_physical_',\n",
" 'contiguous',\n",
" 'copy_',\n",
" 'copysign',\n",
" 'copysign_',\n",
" 'corrcoef',\n",
" 'cos',\n",
" 'cos_',\n",
" 'cosh',\n",
" 'cosh_',\n",
" 'count_nonzero',\n",
" 'cov',\n",
" 'cpu',\n",
" 'cross',\n",
" 'crow_indices',\n",
" 'cuda',\n",
" 'cummax',\n",
" 'cummin',\n",
" 'cumprod',\n",
" 'cumprod_',\n",
" 'cumsum',\n",
" 'cumsum_',\n",
" 'data',\n",
" 'data_ptr',\n",
" 'deg2rad',\n",
" 'deg2rad_',\n",
" 'dense_dim',\n",
" 'dequantize',\n",
" 'det',\n",
" 'detach',\n",
" 'detach_',\n",
" 'device',\n",
" 'diag',\n",
" 'diag_embed',\n",
" 'diagflat',\n",
" 'diagonal',\n",
" 'diagonal_scatter',\n",
" 'diff',\n",
" 'digamma',\n",
" 'digamma_',\n",
" 'dim',\n",
" 'dim_order',\n",
" 'dist',\n",
" 'div',\n",
" 'div_',\n",
" 'divide',\n",
" 'divide_',\n",
" 'dot',\n",
" 'double',\n",
" 'dsplit',\n",
" 'dtype',\n",
" 'eig',\n",
" 'element_size',\n",
" 'eq',\n",
" 'eq_',\n",
" 'equal',\n",
" 'erf',\n",
" 'erf_',\n",
" 'erfc',\n",
" 'erfc_',\n",
" 'erfinv',\n",
" 'erfinv_',\n",
" 'exp',\n",
" 'exp2',\n",
" 'exp2_',\n",
" 'exp_',\n",
" 'expand',\n",
" 'expand_as',\n",
" 'expm1',\n",
" 'expm1_',\n",
" 'exponential_',\n",
" 'fill_',\n",
" 'fill_diagonal_',\n",
" 'fix',\n",
" 'fix_',\n",
" 'flatten',\n",
" 'flip',\n",
" 'fliplr',\n",
" 'flipud',\n",
" 'float',\n",
" 'float_power',\n",
" 'float_power_',\n",
" 'floor',\n",
" 'floor_',\n",
" 'floor_divide',\n",
" 'floor_divide_',\n",
" 'fmax',\n",
" 'fmin',\n",
" 'fmod',\n",
" 'fmod_',\n",
" 'frac',\n",
" 'frac_',\n",
" 'frexp',\n",
" 'gather',\n",
" 'gcd',\n",
" 'gcd_',\n",
" 'ge',\n",
" 'ge_',\n",
" 'geometric_',\n",
" 'geqrf',\n",
" 'ger',\n",
" 'get_device',\n",
" 'grad',\n",
" 'grad_fn',\n",
" 'greater',\n",
" 'greater_',\n",
" 'greater_equal',\n",
" 'greater_equal_',\n",
" 'gt',\n",
" 'gt_',\n",
" 'half',\n",
" 'hardshrink',\n",
" 'has_names',\n",
" 'heaviside',\n",
" 'heaviside_',\n",
" 'histc',\n",
" 'histogram',\n",
" 'hsplit',\n",
" 'hypot',\n",
" 'hypot_',\n",
" 'i0',\n",
" 'i0_',\n",
" 'igamma',\n",
" 'igamma_',\n",
" 'igammac',\n",
" 'igammac_',\n",
" 'imag',\n",
" 'index_add',\n",
" 'index_add_',\n",
" 'index_copy',\n",
" 'index_copy_',\n",
" 'index_fill',\n",
" 'index_fill_',\n",
" 'index_put',\n",
" 'index_put_',\n",
" 'index_reduce',\n",
" 'index_reduce_',\n",
" 'index_select',\n",
" 'indices',\n",
" 'inner',\n",
" 'int',\n",
" 'int_repr',\n",
" 'inverse',\n",
" 'ipu',\n",
" 'is_coalesced',\n",
" 'is_complex',\n",
" 'is_conj',\n",
" 'is_contiguous',\n",
" 'is_cpu',\n",
" 'is_cuda',\n",
" 'is_distributed',\n",
" 'is_floating_point',\n",
" 'is_inference',\n",
" 'is_ipu',\n",
" 'is_leaf',\n",
" 'is_meta',\n",
" 'is_mkldnn',\n",
" 'is_mps',\n",
" 'is_mtia',\n",
" 'is_neg',\n",
" 'is_nested',\n",
" 'is_nonzero',\n",
" 'is_ort',\n",
" 'is_pinned',\n",
" 'is_quantized',\n",
" 'is_same_size',\n",
" 'is_set_to',\n",
" 'is_shared',\n",
" 'is_signed',\n",
" 'is_sparse',\n",
" 'is_sparse_csr',\n",
" 'is_vulkan',\n",
" 'is_xla',\n",
" 'is_xpu',\n",
" 'isclose',\n",
" 'isfinite',\n",
" 'isinf',\n",
" 'isnan',\n",
" 'isneginf',\n",
" 'isposinf',\n",
" 'isreal',\n",
" 'istft',\n",
" 'item',\n",
" 'itemsize',\n",
" 'kron',\n",
" 'kthvalue',\n",
" 'layout',\n",
" 'lcm',\n",
" 'lcm_',\n",
" 'ldexp',\n",
" 'ldexp_',\n",
" 'le',\n",
" 'le_',\n",
" 'lerp',\n",
" 'lerp_',\n",
" 'less',\n",
" 'less_',\n",
" 'less_equal',\n",
" 'less_equal_',\n",
" 'lgamma',\n",
" 'lgamma_',\n",
" 'log',\n",
" 'log10',\n",
" 'log10_',\n",
" 'log1p',\n",
" 'log1p_',\n",
" 'log2',\n",
" 'log2_',\n",
" 'log_',\n",
" 'log_normal_',\n",
" 'log_softmax',\n",
" 'logaddexp',\n",
" 'logaddexp2',\n",
" 'logcumsumexp',\n",
" 'logdet',\n",
" 'logical_and',\n",
" 'logical_and_',\n",
" 'logical_not',\n",
" 'logical_not_',\n",
" 'logical_or',\n",
" 'logical_or_',\n",
" 'logical_xor',\n",
" 'logical_xor_',\n",
" 'logit',\n",
" 'logit_',\n",
" 'logsumexp',\n",
" 'long',\n",
" 'lstsq',\n",
" 'lt',\n",
" 'lt_',\n",
" 'lu',\n",
" 'lu_solve',\n",
" 'mH',\n",
" 'mT',\n",
" 'map2_',\n",
" 'map_',\n",
" 'masked_fill',\n",
" 'masked_fill_',\n",
" 'masked_scatter',\n",
" 'masked_scatter_',\n",
" 'masked_select',\n",
" 'matmul',\n",
" 'matrix_exp',\n",
" 'matrix_power',\n",
" 'max',\n",
" 'maximum',\n",
" 'mean',\n",
" 'median',\n",
" 'min',\n",
" 'minimum',\n",
" 'mm',\n",
" 'mode',\n",
" 'module_load',\n",
" 'moveaxis',\n",
" 'movedim',\n",
" 'msort',\n",
" 'mul',\n",
" 'mul_',\n",
" 'multinomial',\n",
" 'multiply',\n",
" 'multiply_',\n",
" 'mv',\n",
" 'mvlgamma',\n",
" 'mvlgamma_',\n",
" 'name',\n",
" 'names',\n",
" 'nan_to_num',\n",
" 'nan_to_num_',\n",
" 'nanmean',\n",
" 'nanmedian',\n",
" 'nanquantile',\n",
" 'nansum',\n",
" 'narrow',\n",
" 'narrow_copy',\n",
" 'nbytes',\n",
" 'ndim',\n",
" 'ndimension',\n",
" 'ne',\n",
" 'ne_',\n",
" 'neg',\n",
" 'neg_',\n",
" 'negative',\n",
" 'negative_',\n",
" 'nelement',\n",
" 'new',\n",
" 'new_empty',\n",
" 'new_empty_strided',\n",
" 'new_full',\n",
" 'new_ones',\n",
" 'new_tensor',\n",
" 'new_zeros',\n",
" 'nextafter',\n",
" 'nextafter_',\n",
" 'nonzero',\n",
" 'nonzero_static',\n",
" 'norm',\n",
" 'normal_',\n",
" 'not_equal',\n",
" 'not_equal_',\n",
" 'numel',\n",
" 'numpy',\n",
" 'orgqr',\n",
" 'ormqr',\n",
" 'outer',\n",
" 'output_nr',\n",
" 'permute',\n",
" 'pin_memory',\n",
" 'pinverse',\n",
" 'polygamma',\n",
" 'polygamma_',\n",
" 'positive',\n",
" 'pow',\n",
" 'pow_',\n",
" 'prelu',\n",
" 'prod',\n",
" 'put',\n",
" 'put_',\n",
" 'q_per_channel_axis',\n",
" 'q_per_channel_scales',\n",
" 'q_per_channel_zero_points',\n",
" 'q_scale',\n",
" 'q_zero_point',\n",
" 'qr',\n",
" 'qscheme',\n",
" 'quantile',\n",
" 'rad2deg',\n",
" 'rad2deg_',\n",
" 'random_',\n",
" 'ravel',\n",
" 'real',\n",
" 'reciprocal',\n",
" 'reciprocal_',\n",
" 'record_stream',\n",
" 'refine_names',\n",
" 'register_hook',\n",
" 'register_post_accumulate_grad_hook',\n",
" 'reinforce',\n",
" 'relu',\n",
" 'relu_',\n",
" 'remainder',\n",
" 'remainder_',\n",
" 'rename',\n",
" 'rename_',\n",
" 'renorm',\n",
" 'renorm_',\n",
" 'repeat',\n",
" 'repeat_interleave',\n",
" 'requires_grad',\n",
" 'requires_grad_',\n",
" 'reshape',\n",
" 'reshape_as',\n",
" 'resize',\n",
" 'resize_',\n",
" 'resize_as',\n",
" 'resize_as_',\n",
" 'resize_as_sparse_',\n",
" 'resolve_conj',\n",
" 'resolve_neg',\n",
" 'retain_grad',\n",
" 'retains_grad',\n",
" 'roll',\n",
" 'rot90',\n",
" 'round',\n",
" 'round_',\n",
" 'row_indices',\n",
" 'rsqrt',\n",
" 'rsqrt_',\n",
" 'scatter',\n",
" 'scatter_',\n",
" 'scatter_add',\n",
" 'scatter_add_',\n",
" 'scatter_reduce',\n",
" 'scatter_reduce_',\n",
" 'select',\n",
" 'select_scatter',\n",
" 'set_',\n",
" 'sgn',\n",
" 'sgn_',\n",
" 'shape',\n",
" 'share_memory_',\n",
" 'short',\n",
" 'sigmoid',\n",
" 'sigmoid_',\n",
" 'sign',\n",
" 'sign_',\n",
" 'signbit',\n",
" 'sin',\n",
" 'sin_',\n",
" 'sinc',\n",
" 'sinc_',\n",
" 'sinh',\n",
" 'sinh_',\n",
" 'size',\n",
" 'slice_inverse',\n",
" 'slice_scatter',\n",
" 'slogdet',\n",
" 'smm',\n",
" 'softmax',\n",
" 'solve',\n",
" 'sort',\n",
" 'sparse_dim',\n",
" 'sparse_mask',\n",
" 'sparse_resize_',\n",
" 'sparse_resize_and_clear_',\n",
" 'split',\n",
" 'split_with_sizes',\n",
" 'sqrt',\n",
" 'sqrt_',\n",
" 'square',\n",
" 'square_',\n",
" 'squeeze',\n",
" 'squeeze_',\n",
" 'sspaddmm',\n",
" 'std',\n",
" 'stft',\n",
" 'storage',\n",
" 'storage_offset',\n",
" 'storage_type',\n",
" 'stride',\n",
" 'sub',\n",
" 'sub_',\n",
" 'subtract',\n",
" 'subtract_',\n",
" 'sum',\n",
" 'sum_to_size',\n",
" 'svd',\n",
" 'swapaxes',\n",
" 'swapaxes_',\n",
" 'swapdims',\n",
" 'swapdims_',\n",
" 'symeig',\n",
" 't',\n",
" 't_',\n",
" 'take',\n",
" 'take_along_dim',\n",
" 'tan',\n",
" 'tan_',\n",
" 'tanh',\n",
" 'tanh_',\n",
" 'tensor_split',\n",
" 'tile',\n",
" 'to',\n",
" 'to_dense',\n",
" 'to_mkldnn',\n",
" 'to_padded_tensor',\n",
" 'to_sparse',\n",
" 'to_sparse_bsc',\n",
" 'to_sparse_bsr',\n",
" 'to_sparse_coo',\n",
" 'to_sparse_csc',\n",
" 'to_sparse_csr',\n",
" 'tolist',\n",
" 'topk',\n",
" 'trace',\n",
" 'transpose',\n",
" 'transpose_',\n",
" 'triangular_solve',\n",
" 'tril',\n",
" 'tril_',\n",
" 'triu',\n",
" 'triu_',\n",
" 'true_divide',\n",
" 'true_divide_',\n",
" 'trunc',\n",
" 'trunc_',\n",
" 'type',\n",
" 'type_as',\n",
" 'unbind',\n",
" 'unflatten',\n",
" 'unfold',\n",
" 'uniform_',\n",
" 'unique',\n",
" 'unique_consecutive',\n",
" 'unsafe_chunk',\n",
" 'unsafe_split',\n",
" 'unsafe_split_with_sizes',\n",
" 'unsqueeze',\n",
" 'unsqueeze_',\n",
" 'untyped_storage',\n",
" 'values',\n",
" 'var',\n",
" 'vdot',\n",
" 'view',\n",
" 'view_as',\n",
" 'vsplit',\n",
" 'where',\n",
" 'xlogy',\n",
" 'xlogy_',\n",
" 'xpu',\n",
" 'zero_']"
]
},
"execution_count": 14,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"dir(tensor)"
]
},
{
"cell_type": "markdown",
"id": "ca881570-863e-408c-afce-b8bdb9f87641",
"metadata": {},
"source": [
"### Operations on Tensors\n",
"Over 100 tensor operations, including arithmetic, linear algebra, matrix manipulation (transposing, indexing, slicing), sampling and more are comprehensively described [here](https://pytorch.org/docs/stable/torch.html)."
]
},
{
"cell_type": "markdown",
"id": "95b15974-ec75-45af-a810-6d5fd0646da2",
"metadata": {},
"source": [
"Each of these operations can be run on the GPU (at typically higher speeds than on a CPU). If you’re using Colab, allocate a GPU by going to Runtime > Change runtime type > GPU.\n",
"\n",
"By default, tensors are created on the CPU. We need to explicitly move tensors to the GPU using the .to method (after checking for GPU availability). Keep in mind that copying large tensors across devices can be expensive in terms of time and memory!"
]
},
{
"cell_type": "code",
"execution_count": 16,
"id": "317a0a8b-ebe0-435a-9e6e-08452522dc48",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"False"
]
},
"execution_count": 16,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"torch.cuda.is_available()"
]
},
{
"cell_type": "code",
"execution_count": 15,
"id": "690cc03b-aa70-41fd-95f8-3a2c7a14fc8b",
"metadata": {},
"outputs": [],
"source": [
"# We move our tensor to the GPU if available\n",
"if torch.cuda.is_available():\n",
" tensor = tensor.to(\"cuda\")"
]
},
{
"cell_type": "markdown",
"id": "1717e7f9-ed30-4125-91d5-b2d313950223",
"metadata": {},
"source": [
"Try out some of the operations from the list. If you’re familiar with the NumPy API, you’ll find the Tensor API a breeze to use.\n",
"\n",
"**Standard numpy-like indexing and slicing:**"
]
},
{
"cell_type": "code",
"execution_count": 19,
"id": "6db0ca9a-8890-45d3-a28d-382875bcb63c",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.7970, 0.9522, 0.7247, 0.7570],\n",
" [0.1112, 0.5820, 0.8257, 0.9338],\n",
" [0.3646, 0.5168, 0.4983, 0.9551],\n",
" [0.0712, 0.2866, 0.9051, 0.8363]])\n"
]
}
],
"source": [
"tensor = torch.rand(4, 4)\n",
"print(tensor)"
]
},
{
"cell_type": "code",
"execution_count": 20,
"id": "2dcb2aef-6b9d-4a5f-97ee-265c913126e3",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"First row: tensor([0.7970, 0.9522, 0.7247, 0.7570])\n",
"First column: tensor([0.7970, 0.1112, 0.3646, 0.0712])\n",
"Last column: tensor([0.7570, 0.9338, 0.9551, 0.8363])\n"
]
}
],
"source": [
"print(f\"First row: {tensor[0]}\")\n",
"print(f\"First column: {tensor[:, 0]}\")\n",
"print(f\"Last column: {tensor[..., -1]}\")"
]
},
{
"cell_type": "code",
"execution_count": 21,
"id": "aad85e71-5470-4eb2-a4c9-1bb207e82614",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.7970, 0.0000, 0.7247, 0.7570],\n",
" [0.1112, 0.0000, 0.8257, 0.9338],\n",
" [0.3646, 0.0000, 0.4983, 0.9551],\n",
" [0.0712, 0.0000, 0.9051, 0.8363]])\n"
]
}
],
"source": [
"tensor[:,1] = 0\n",
"print(tensor)"
]
},
{
"cell_type": "markdown",
"id": "51940028-f58f-40de-8f54-a942d786673b",
"metadata": {},
"source": [
"**Joining tensors** You can use torch.cat to concatenate a sequence of tensors along a given dimension. See also [torch.stack](https://pytorch.org/docs/stable/generated/torch.stack.html), another tensor joining operator that is subtly different from torch.cat."
]
},
{
"cell_type": "code",
"execution_count": 22,
"id": "2b370287-20e9-42e1-a139-f2d701c51e3f",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.7970, 0.0000, 0.7247, 0.7570, 0.7970, 0.0000, 0.7247, 0.7570, 0.7970,\n",
"         0.0000, 0.7247, 0.7570],\n",
"        [0.1112, 0.0000, 0.8257, 0.9338, 0.1112, 0.0000, 0.8257, 0.9338, 0.1112,\n",
"         0.0000, 0.8257, 0.9338],\n",
"        [0.3646, 0.0000, 0.4983, 0.9551, 0.3646, 0.0000, 0.4983, 0.9551, 0.3646,\n",
"         0.0000, 0.4983, 0.9551],\n",
"        [0.0712, 0.0000, 0.9051, 0.8363, 0.0712, 0.0000, 0.9051, 0.8363, 0.0712,\n",
"         0.0000, 0.9051, 0.8363]])\n"
]
}
],
"source": [
"t1 = torch.cat([tensor, tensor, tensor], dim=1)\n",
"print(t1)"
]
},
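{
"cell_type": "markdown",
"id": "5b8d7e21-2c3d-4e5f-9a6b-0123456789cd",
"metadata": {},
"source": [
"A minimal sketch of the difference: `torch.cat` joins tensors along an existing dimension, while `torch.stack` inserts a new dimension.\n",
"\n",
"```python\n",
"import torch\n",
"\n",
"a = torch.zeros(4, 4)\n",
"cat_result = torch.cat([a, a], dim=0)      # existing dim grows: shape (8, 4)\n",
"stack_result = torch.stack([a, a], dim=0)  # new leading dim: shape (2, 4, 4)\n",
"assert cat_result.shape == (8, 4)\n",
"assert stack_result.shape == (2, 4, 4)\n",
"```"
]
},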
{
"cell_type": "markdown",
"id": "fc888309-dd38-4f90-adac-c2ab5f0cff60",
"metadata": {},
"source": [
"**Arithmetic operations**"
]
},
{
"cell_type": "code",
"execution_count": 23,
"id": "56a3401a-2acb-419d-9b1b-11c5f7ea570c",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[1.7334, 1.3939, 1.3748, 1.3458],\n",
" [1.3939, 1.5661, 1.3439, 1.5362],\n",
" [1.3748, 1.3439, 1.2935, 1.2757],\n",
" [1.3458, 1.5362, 1.2757, 1.5237]])"
]
},
"execution_count": 23,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# This computes the matrix multiplication between two tensors. y1, y2, y3 will have the same value\n",
"# ``tensor.T`` returns the transpose of a tensor\n",
"y1 = tensor @ tensor.T\n",
"y2 = tensor.matmul(tensor.T)\n",
"\n",
"y3 = torch.rand_like(y1)\n",
"torch.matmul(tensor, tensor.T, out=y3)"
]
},
{
"cell_type": "code",
"execution_count": 24,
"id": "8d4e0ef5-cf47-4e34-8512-db7274fab512",
"metadata": {},
"outputs": [
{
"data": {
"text/plain": [
"tensor([[0.6352, 0.0000, 0.5252, 0.5731],\n",
" [0.0124, 0.0000, 0.6817, 0.8720],\n",
" [0.1330, 0.0000, 0.2483, 0.9123],\n",
" [0.0051, 0.0000, 0.8192, 0.6994]])"
]
},
"execution_count": 24,
"metadata": {},
"output_type": "execute_result"
}
],
"source": [
"# This computes the element-wise product. z1, z2, z3 will have the same value\n",
"z1 = tensor * tensor\n",
"z2 = tensor.mul(tensor)\n",
"\n",
"z3 = torch.rand_like(tensor)\n",
"torch.mul(tensor, tensor, out=z3)"
]
},
{
"cell_type": "markdown",
"id": "be4fb2b9-6f09-4e6e-9bac-dfdb57bbcff4",
"metadata": {},
"source": [
"**Single-element tensors** If you have a one-element tensor, for example by aggregating all values of a tensor into one value, you can convert it to a Python numerical value using item():"
]
},
{
"cell_type": "code",
"execution_count": 25,
"id": "f69a6fe4-4f88-45ec-9d13-64bb3da70b20",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"7.780040740966797 <class 'float'>\n"
]
}
],
"source": [
"agg = tensor.sum()\n",
"agg_item = agg.item()\n",
"print(agg_item, type(agg_item))"
]
},
{
"cell_type": "markdown",
"id": "8f05e470-20db-404d-94a9-a6337591215c",
"metadata": {},
"source": [
"**In-place operations:** Operations that store the result into the operand are called in-place. They are denoted by a _ suffix. For example: x.copy_(y) and x.t_() will change x."
]
},
{
"cell_type": "code",
"execution_count": 27,
"id": "37119d25-893d-4a82-abaf-a8f9f8073649",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.7970, 0.0000, 0.7247, 0.7570],\n",
" [0.1112, 0.0000, 0.8257, 0.9338],\n",
" [0.3646, 0.0000, 0.4983, 0.9551],\n",
" [0.0712, 0.0000, 0.9051, 0.8363]]) \n",
"\n",
"tensor([[5.7970, 5.0000, 5.7247, 5.7570],\n",
" [5.1112, 5.0000, 5.8257, 5.9338],\n",
" [5.3646, 5.0000, 5.4983, 5.9551],\n",
" [5.0712, 5.0000, 5.9051, 5.8363]])\n"
]
}
],
"source": [
"print(f\"{tensor} \\n\")\n",
"tensor.add_(5)\n",
"print(tensor)"
]
},
{
"cell_type": "markdown",
"id": "f1d9f69f-f9d3-408c-a1f4-973cf2db4eb4",
"metadata": {},
"source": [
"**NOTE:** In-place operations save some memory, but can be problematic when computing derivatives because of an immediate loss of history. Hence, their use is discouraged."
]
},
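{
"cell_type": "markdown",
"id": "7c9e8f32-3d4e-4f60-ab7c-0123456789ef",
"metadata": {},
"source": [
"A minimal sketch of the problem: `torch.exp` saves its output for the backward pass, so an in-place update of that output destroys the history autograd needs, and `backward()` raises a `RuntimeError`.\n",
"\n",
"```python\n",
"import torch\n",
"\n",
"x = torch.ones(3, requires_grad=True)\n",
"y = x.exp()    # exp saves its output for use in backward\n",
"y.add_(1)      # in-place update overwrites the saved output\n",
"failed = False\n",
"try:\n",
"    y.sum().backward()\n",
"except RuntimeError:\n",
"    failed = True\n",
"assert failed  # autograd detects the modified history\n",
"```"
]
},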
{
"cell_type": "markdown",
"id": "df1c9d91-7d48-4517-8f6e-284e707c3679",
"metadata": {},
"source": [
"### Datasets & DataLoaders"
]
},
{
"cell_type": "markdown",
"id": "f06247e3-8643-45d4-84ca-bb9dd4e84391",
"metadata": {},
"source": [
"Code for processing data samples can get messy and hard to maintain; we ideally want our dataset code to be decoupled from our model training code for better readability and modularity. PyTorch provides two data primitives: torch.utils.data.DataLoader and torch.utils.data.Dataset that allow you to use pre-loaded datasets as well as your own data. Dataset stores the samples and their corresponding labels, and DataLoader wraps an iterable around the Dataset to enable easy access to the samples.\n",
"\n",
"PyTorch domain libraries provide a number of pre-loaded datasets (such as FashionMNIST) that subclass torch.utils.data.Dataset and implement functions specific to the particular data. They can be used to prototype and benchmark your model. You can find them here: Image Datasets, Text Datasets, and Audio Datasets"
]
},
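{
"cell_type": "markdown",
"id": "9daf0a43-4e5f-4071-bc8d-0123456789fe",
"metadata": {},
"source": [
"A minimal sketch of the two primitives using the built-in `TensorDataset` as a stand-in for a real dataset: the `Dataset` holds the (sample, label) pairs, and the `DataLoader` batches and shuffles them.\n",
"\n",
"```python\n",
"import torch\n",
"from torch.utils.data import DataLoader, TensorDataset\n",
"\n",
"features = torch.randn(10, 3)          # 10 samples, 3 features each\n",
"labels = torch.randint(0, 2, (10,))    # one binary label per sample\n",
"dataset = TensorDataset(features, labels)  # stores samples and labels\n",
"loader = DataLoader(dataset, batch_size=4, shuffle=True)  # iterates in batches\n",
"xb, yb = next(iter(loader))\n",
"assert xb.shape == (4, 3) and yb.shape == (4,)\n",
"```"
]
},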
{
"cell_type": "markdown",
"id": "513d4bd0-7aef-46cc-9cd6-b4242864d3a5",
"metadata": {},
"source": [
"#### Loading a Dataset"
]
},
{
"cell_type": "markdown",
"id": "62b8c200-00fd-4cd7-9ea0-181adef2f23b",
"metadata": {},
"source": [
"Here is an example of how to load the Fashion-MNIST dataset from TorchVision. Fashion-MNIST is a dataset of Zalando’s article images consisting of 60,000 training examples and 10,000 test examples. Each example comprises a 28×28 grayscale image and an associated label from one of 10 classes.\n",
"\n",
"We load the FashionMNIST Dataset with the following parameters:\n",
"\n",
"- root is the path where the train/test data is stored,\n",
"- train specifies training or test dataset,\n",
"- download=True downloads the data from the internet if it’s not available at root,\n",
"- transform and target_transform specify the feature and label transformations"
]
},
{
"cell_type": "code",
"execution_count": 29,
"id": "7ecb8c3e-7bc4-49b6-b940-569b2135bc5b",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Collecting matplotlib\n",
"  Downloading matplotlib-3.9.0-cp311-cp311-win_amd64.whl.metadata (11 kB)\n",
"Collecting contourpy>=1.0.1 (from matplotlib)\n",
"  Downloading contourpy-1.2.1-cp311-cp311-win_amd64.whl.metadata (5.8 kB)\n",
"Collecting cycler>=0.10 (from matplotlib)\n",
"  Using cached cycler-0.12.1-py3-none-any.whl.metadata (3.8 kB)\n",
"Collecting fonttools>=4.22.0 (from matplotlib)\n",
"  Downloading fonttools-4.51.0-cp311-cp311-win_amd64.whl.metadata (162 kB)\n",
"     ---------------------------------------- 0.0/162.8 kB ? eta -:--:--\n",
"     -------------- ------------------------ 61.4/162.8 kB 1.6 MB/s eta 0:00:01\n",
"     ------------------- ------------------- 81.9/162.8 kB 1.5 MB/s eta 0:00:01\n",
"     ------------------- ------------------- 81.9/162.8 kB 1.5 MB/s eta 0:00:01\n",
"     ------------------------------------ 162.8/162.8 kB 886.7 kB/s eta 0:00:00\n",
"Collecting kiwisolver>=1.3.1 (from matplotlib)\n",
"  Using cached kiwisolver-1.4.5-cp311-cp311-win_amd64.whl.metadata (6.5 kB)\n",
"Requirement already satisfied: numpy>=1.23 in c:\\users\\satya\\downloads\\venvs\\py311_venvs\\311_torch_venv\\lib\\site-packages (from matplotlib) (1.26.4)\n",
"Requirement already satisfied: packaging>=20.0 in c:\\users\\satya\\downloads\\venvs\\py311_venvs\\311_torch_venv\\lib\\site-packages (from matplotlib) (24.0)\n",
"Requirement already satisfied: pillow>=8 in c:\\users\\satya\\downloads\\venvs\\py311_venvs\\311_torch_venv\\lib\\site-packages (from matplotlib) (10.3.0)\n",
"Collecting pyparsing>=2.3.1 (from matplotlib)\n",
"  Using cached pyparsing-3.1.2-py3-none-any.whl.metadata (5.1 kB)\n",
"Requirement already satisfied: python-dateutil>=2.7 in c:\\users\\satya\\downloads\\venvs\\py311_venvs\\311_torch_venv\\lib\\site-packages (from matplotlib) (2.9.0.post0)\n",
"Requirement already satisfied: six>=1.5 in c:\\users\\satya\\downloads\\venvs\\py311_venvs\\311_torch_venv\\lib\\site-packages (from python-dateutil>=2.7->matplotlib) (1.16.0)\n",
"Downloading matplotlib-3.9.0-cp311-cp311-win_amd64.whl (8.0 MB)\n",
"   ---------------------------------------- 0.0/8.0 MB ? eta -:--:--\n",
"   --------------------------------------- 0.2/8.0 MB 5.1 MB/s eta 0:00:02\n",
"   --------------------------------------- 0.2/8.0 MB 2.3 MB/s eta 0:00:04\n",
"   -- ------------------------------------- 0.5/8.0 MB 4.2 MB/s eta 0:00:02\n",
"   -- ------------------------------------- 0.5/8.0 MB 3.6 MB/s eta 0:00:03\n",
"   --- ------------------------------------ 0.7/8.0 MB 3.0 MB/s eta 0:00:03\n",
"   ---- ----------------------------------- 1.0/8.0 MB 3.6 MB/s eta 0:00:02\n",
"   ----- ---------------------------------- 1.2/8.0 MB 3.7 MB/s eta 0:00:02\n",
"   ------- -------------------------------- 1.5/8.0 MB 4.3 MB/s eta 0:00:02\n",
"   -------- ------------------------------- 1.7/8.0 MB 4.2 MB/s eta 0:00:02\n",
"   --------- ------------------------------ 1.9/8.0 MB 4.1 MB/s eta 0:00:02\n",
"   --------- ------------------------------ 2.0/8.0 MB 3.9 MB/s eta 0:00:02\n",
"   ----------- ---------------------------- 2.3/8.0 MB 4.2 MB/s eta 0:00:02\n",
"   ------------ --------------------------- 2.5/8.0 MB 4.3 MB/s eta 0:00:02\n",
"   ------------- -------------------------- 2.7/8.0 MB 4.2 MB/s eta 0:00:02\n",
"   --------------- ------------------------ 3.1/8.0 MB 4.5 MB/s eta 0:00:02\n",
"   --------------- ------------------------ 3.1/8.0 MB 4.3 MB/s eta 0:00:02\n",
"   ----------------- ---------------------- 3.6/8.0 MB 4.6 MB/s eta 0:00:01\n",
"   ------------------- -------------------- 3.9/8.0 MB 4.8 MB/s eta 0:00:01\n",
"   -------------------- ------------------- 4.1/8.0 MB 4.7 MB/s eta 0:00:01\n",
"   --------------------- ------------------ 4.2/8.0 MB 4.8 MB/s eta 0:00:01\n",
"   ----------------------- ---------------- 4.7/8.0 MB 4.9 MB/s eta 0:00:01\n",
"   ------------------------ --------------- 4.8/8.0 MB 4.8 MB/s eta 0:00:01\n",
"   ------------------------- -------------- 5.0/8.0 MB 4.8 MB/s eta 0:00:01\n",
"   ------------------------- -------------- 5.0/8.0 MB 4.8 MB/s eta 0:00:01\n",
"   ------------------------- -------------- 5.1/8.0 MB 4.4 MB/s eta 0:00:01\n",
"   --------------------------- ------------ 5.4/8.0 MB 4.5 MB/s eta 0:00:01\n",
" --------------------------- ------------ 5.4/8.0 MB 4.5 MB/s eta 0:00:01\
n",
" ----------------------------- ---------- 5.8/8.0 MB 4.5 MB/s eta 0:00:01\
n",
" ----------------------------- ---------- 5.9/8.0 MB 4.5 MB/s eta 0:00:01\
n",
" ------------------------------- -------- 6.2/8.0 MB 4.6 MB/s eta 0:00:01\
n",
" ------------------------------- -------- 6.3/8.0 MB 4.4 MB/s eta 0:00:01\
n",
" -------------------------------- ------- 6.6/8.0 MB 4.5 MB/s eta 0:00:01\
n",
" --------------------------------- ------ 6.7/8.0 MB 4.6 MB/s eta 0:00:01\
n",
" --------------------------------- ------ 6.7/8.0 MB 4.6 MB/s eta 0:00:01\
n",
" --------------------------------- ------ 6.7/8.0 MB 4.6 MB/s eta 0:00:01\
n",
" --------------------------------- ------ 6.7/8.0 MB 4.6 MB/s eta 0:00:01\
n",
" ---------------------------------- ----- 6.9/8.0 MB 4.0 MB/s eta 0:00:01\
n",
" ------------------------------------ --- 7.2/8.0 MB 4.1 MB/s eta 0:00:01\
n",
" ------------------------------------ --- 7.4/8.0 MB 4.1 MB/s eta 0:00:01\
n",
" ------------------------------------- -- 7.5/8.0 MB 4.1 MB/s eta 0:00:01\
n",
" --------------------------------------- 7.8/8.0 MB 4.2 MB/s eta 0:00:01\
n",
" --------------------------------------- 8.0/8.0 MB 4.1 MB/s eta 0:00:01\
n",
" ---------------------------------------- 8.0/8.0 MB 4.1 MB/s eta 0:00:00\
n",
"Downloading contourpy-1.2.1-cp311-cp311-win_amd64.whl (188 kB)\n",
" ---------------------------------------- 0.0/188.2 kB ? eta -:--:--\n",
" --------------------------------------- 184.3/188.2 kB 5.6 MB/s eta
0:00:01\n",
" ---------------------------------------- 188.2/188.2 kB 3.8 MB/s eta
0:00:00\n",
"Using cached cycler-0.12.1-py3-none-any.whl (8.3 kB)\n",
"Downloading fonttools-4.51.0-cp311-cp311-win_amd64.whl (2.2 MB)\n",
" ---------------------------------------- 0.0/2.2 MB ? eta -:--:--\n",
" --- ------------------------------------ 0.2/2.2 MB 4.1 MB/s eta 0:00:01\
n",
" ----- ---------------------------------- 0.3/2.2 MB 3.1 MB/s eta 0:00:01\
n",
" ------- -------------------------------- 0.4/2.2 MB 3.7 MB/s eta 0:00:01\
n",
" ------- -------------------------------- 0.4/2.2 MB 3.7 MB/s eta 0:00:01\
n",
" --------- ------------------------------ 0.5/2.2 MB 2.2 MB/s eta 0:00:01\
n",
" ----------------- ---------------------- 1.0/2.2 MB 3.4 MB/s eta 0:00:01\
n",
" --------------------- ------------------ 1.2/2.2 MB 3.8 MB/s eta 0:00:01\
n",
" -------------------------- ------------- 1.5/2.2 MB 3.9 MB/s eta 0:00:01\
n",
" -------------------------------- ------- 1.8/2.2 MB 4.2 MB/s eta 0:00:01\
n",
" ------------------------------------- -- 2.0/2.2 MB 4.3 MB/s eta 0:00:01\
n",
" ---------------------------------------- 2.2/2.2 MB 4.4 MB/s eta 0:00:00\
n",
"Using cached kiwisolver-1.4.5-cp311-cp311-win_amd64.whl (56 kB)\n",
"Using cached pyparsing-3.1.2-py3-none-any.whl (103 kB)\n",
"Installing collected packages: pyparsing, kiwisolver, fonttools, cycler,
contourpy, matplotlib\n",
"Successfully installed contourpy-1.2.1 cycler-0.12.1 fonttools-4.51.0
kiwisolver-1.4.5 matplotlib-3.9.0 pyparsing-3.1.2\n"
]
}
],
"source": [
"!pip install matplotlib"
]
},
{
"cell_type": "code",
"execution_count": 30,
"id": "10fd2d42-76cc-4b71-aba0-adcc40c46309",
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"from torch.utils.data import Dataset\n",
"from torchvision import datasets\n",
"from torchvision.transforms import ToTensor\n",
"import matplotlib.pyplot as plt"
]
},
{
"cell_type": "code",
"execution_count": 31,
"id": "e58b75c2-fbae-4840-bcc8-1dd43e4314b0",
"metadata": {
"collapsed": true,
"jupyter": {
"outputs_hidden": true
}
},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Downloading
http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-
ubyte.gz\n",
"Downloading
http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-images-idx3-
ubyte.gz to data\\FashionMNIST\\raw\\train-images-idx3-ubyte.gz\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"100.0%\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Extracting data\\FashionMNIST\\raw\\train-images-idx3-ubyte.gz to data\\
FashionMNIST\\raw\n",
"\n",
"Downloading
http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-
ubyte.gz\n",
"Downloading
http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/train-labels-idx1-
ubyte.gz to data\\FashionMNIST\\raw\\train-labels-idx1-ubyte.gz\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"100.0%\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Extracting data\\FashionMNIST\\raw\\train-labels-idx1-ubyte.gz to data\\
FashionMNIST\\raw\n",
"\n",
"Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-
images-idx3-ubyte.gz\n",
"Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-
images-idx3-ubyte.gz to data\\FashionMNIST\\raw\\t10k-images-idx3-ubyte.gz\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"100.0%\n"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Extracting data\\FashionMNIST\\raw\\t10k-images-idx3-ubyte.gz to data\\
FashionMNIST\\raw\n",
"\n",
"Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-
labels-idx1-ubyte.gz\n",
"Downloading http://fashion-mnist.s3-website.eu-central-1.amazonaws.com/t10k-
labels-idx1-ubyte.gz to data\\FashionMNIST\\raw\\t10k-labels-idx1-ubyte.gz\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"100.0%"
]
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Extracting data\\FashionMNIST\\raw\\t10k-labels-idx1-ubyte.gz to data\\
FashionMNIST\\raw\n",
"\n"
]
},
{
"name": "stderr",
"output_type": "stream",
"text": [
"\n"
]
}
],
"source": [
"training_data = datasets.FashionMNIST(\n",
" root=\"data\",\n",
" train=True,\n",
" download=True,\n",
" transform=ToTensor()\n",
")\n",
"\n",
"test_data = datasets.FashionMNIST(\n",
" root=\"data\",\n",
" train=False,\n",
" download=True,\n",
" transform=ToTensor()\n",
")"
]
},
{
"cell_type": "markdown",
"id": "27e3012d-2da6-4f41-a1b3-2f07e73a0dd2",
"metadata": {},
"source": [
"#### Iterating and Visualizing the Dataset\n",
"\n",
"We can index Datasets manually like a list: training_data[index]. We use
matplotlib to visualize some samples in our training data."
]
},
{
"cell_type": "code",
"execution_count": 32,
"id": "5e135ff8-bd8c-4fdb-af0b-8dc51c9dffb3",
"metadata": {},
"outputs": [
{
"data": {
"image/png":
"iVBORw0KGgoAAAANSUhEUgAAAn4AAAKSCAYAAABMVtaZAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIH
ZlcnNpb24zLjkuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy80BEi2AAAACXBIWXMAAA9hAAAPYQGoP6dpA
ABpUklEQVR4nO3dd3hVZdb4/
RVCGqkkhBYgobeIKNVCFWQERFBQHAtYQMf2OPM4M05FHcuMMoryqOiMA46gIBqwBkQHCwIDIiCCICVRpIUQ
0htJ9vuHP/Iac68bzjGV+/
u5Lq4Z1j7r7H1OdllustYO8DzPEwAAAJzxmtT3BgAAAKBuUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFH
wAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBgEF6eroEBATI7Nmz63tTAKDGUPjVkgULFkhAQECVPy1btpQRI0ZIampq
fW8e0CBs27ZNJk+eLImJiRIaGioJCQkyevRomTt3bn1vGoAf4bp2Zmha3xtwpnvggQekY8eO4nmeHDlyRBY
sWCBjx46Vt956S8aPH1/fmwfUm7Vr18qIESOkQ4cOMmPGDGndurXs379f1q9fL08+
+aTceeed9b2JAAy4rjVuFH617JJLLpH+/ftX/
v2mm26SVq1aySuvvMIBAqc99NBDEh0dLRs3bpSYmJgqyzIyMupno+pYYWGhNGvWrL43A/AJ17XGjX/
qrWMxMTESFhYmTZv+/
zX37Nmz5fzzz5e4uDgJCwuTfv36yWuvvVYtt6ioSO666y5p0aKFREZGyoQJE+TAgQMSEBAg9913Xx1+CuCn
27t3r/Tu3bta0Sci0rJly8r/
HxAQIHfccYcsX75ckpOTJSQkRHr37i0rVqyolnfgwAG58cYbpVWrVpWv+9e//lXlNaWlpfLnP/
9Z+vXrJ9HR0RIeHi5DhgyR1atXn3KbPc+TmTNnSnBwsKSkpFTGFy5cKP369ZOwsDCJjY2VqVOnyv79+6vkD
h8+XJKTk2XTpk0ydOhQadasmfz+978/5TqBho7rWuPCHb9alpOTI5mZmeJ5nmRkZMjcuXMlPz9frr322srX
PPnkkzJhwgS55pprpLS0VBYvXixTpkyRt99+W8aNG1f5uunTp8urr74q1113nQwePFg+
+uijKsuBxiQxMVHWrVsnX375pSQnJ1tfu2bNGklJSZHbbrtNIiMj5amnnpIrrrhCvv32W4mLixMRkSNHjsj
gwYMrC8X4+HhJTU2Vm266SXJzc+Xuu+8WEZHc3Fz55z//KVdffbXMmDFD8vLy5IUXXpAxY8bIhg0bpG/
fvsZtKC8vlxtvvFGWLFkiy5Ytqzz2HnroIfnTn/
4kV155pdx8881y9OhRmTt3rgwdOlQ2b95cpbA9duyYXHLJJTJ16lS59tprpVWrVj/5ewTqGte1Rs5DrZg/
f74nItX+hISEeAsWLKjy2sLCwip/
Ly0t9ZKTk72RI0dWxjZt2uSJiHf33XdXee306dM9EfFmzZpVa58FqA3vvfeeFxgY6AUGBnrnnXee95vf/
MZbuXKlV1paWuV1IuIFBwd7e/
bsqYxt3brVExFv7ty5lbGbbrrJa9OmjZeZmVklf+rUqV50dHTlcVZWVuaVlJRUec3x48e9Vq1aeTfeeGNlL
C0tzRMR77HHHvNOnDjhXXXVVV5YWJi3cuXKytekp6d7gYGB3kMPPVTl/
bZt2+Y1bdq0SnzYsGGeiHjz5s3z9asCGgSua2cG/qm3lj399NOyatUqWbVqlSxcuFBGjBghN998c5V/
JgoLC6v8/8ePH5ecnBwZMmSIfP7555Xxk/
+sddttt1V5f34BHo3V6NGjZd26dTJhwgTZunWrPProozJmzBhJSEiQN998s8prR40aJZ07d678e58+fSQqK
kr27dsnIt//E+zrr78ul156qXieJ5mZmZV/xowZIzk5OZXHU2BgoAQHB4uISEVFhWRlZUlZWZn079+/
yjF3UmlpaeWdinfffVcuvvjiymUpKSlSUVEhV155ZZV1tm7dWrp27Vrtn49DQkLkhhtuqJkvEKgnXNcaufq
uPM9UJ//LaOPGjVXi5eXlXp8+fbw2bdpU3nV46623vEGDBnkhISFV/isqICCgMm/
mzJlekyZNvBMnTlR5v5ycHP7LCI1eSUmJt2HDBu93v/udFxoa6gUFBXnbt2/3PO/
7O3633nprtZzExERv+vTpnud53pEjR4x3In74JyUlpTJ3wYIF3llnneUFBQVVeU3Hjh0rX3Pyjl9ERIQnIl
5qamq1bfjFL35hXWefPn0qXzts2DCvU6dONfadAXWN69qZgd/xq2NNmjSRESNGyJNPPim7d+
+WrKwsmTBhggwdOlSeeeYZadOmjQQFBcn8+fPl5Zdfru/
NBepEcHCwDBgwQAYMGCDdunWTG264QZYuXSqzZs0Ske/
v0pl4nici39+5ExG59tprZdq0acbX9unTR0S+b8SYPn26TJw4UX79619Ly5YtJTAwUB555BHZu3dvtbwxY8
bIihUr5NFHH5Xhw4dLaGho5bKKigoJCAiQ1NRU4zZGRERU+fsP74IAZwqua40LhV89KCsrExGR/
Px8ef311yU0NFRWrlwpISEhla+ZP39+lZzExESpqKiQtLQ06dq1a2V8z549dbPRQB05OSbi0KFDp50THx8v
kZGRUl5eLqNGjbK+9rXXXpNOnTpJSkqKBAQEVMZPFpk/NnjwYLn11ltl/
PjxMmXKFFm2bFll92Lnzp3F8zzp2LGjdOvW7bS3FzjTcF1rPPgdvzp24sQJee+99yQ4OFh69uwpgYGBEhAQ
IOXl5ZWvSU9Pl+XLl1fJGzNmjIiIPPPMM1XiPOEAjdXq1asr79j90LvvvisiIt27dz/
t9woMDJQrrrhCXn/9dfnyyy+rLT969GiV14pIlXX/97//
lXXr1qnvP2rUKFm8eLGsWLFCrrvuuso7jJdffrkEBgbK/
fffX+2zeJ4nx44dO+3PADRWXNcaF+741bLU1FTZuXOniHw/
lPbll1+W3bt3y7333itRUVEybtw4efzxx+VnP/uZ/PznP5eMjAx5+umnpUuXLvLFF19Uvk+/
fv3kiiuukDlz5sixY8cq296//vprEZEqdy6AxuDOO+
+UwsJCmTRpkvTo0UNKS0tl7dq1smTJEklKSvK5CeKvf/2rrF69WgYNGiQzZsyQXr16SVZWlnz++efy/
vvvS1ZWloiIjB8/XlJSUmTSpEkybtw4SUtLk3nz5kmvXr0kPz9fff+JEyfK/
Pnz5frrr5eoqCh57rnnpHPnzvLggw/K7373O0lPT5eJEydKZGSkpKWlybJly2TmzJlyzz33/
KTvCWhouK41cvX5C4ZnMlPbe2hoqNe3b1/
v2Wef9SoqKipf+8ILL3hdu3b1QkJCvB49enjz58/3Zs2a5f34x1NQUODdfvvtXmxsrBcREeFNnDjR27Vrly
ci3l//+te6/
ojAT5KamurdeOONXo8ePbyIiAgvODjY69Kli3fnnXd6R44cqXydiHi33357tfzExERv2rRpVWJHjhzxbr/
9dq99+/ZeUFCQ17p1a++iiy7ynn/++crXVFRUeA8//LCXmJjohYSEeOecc4739ttve9OmTfMSExMrX/
fDcS4/9Mwzz3gi4t1zzz2Vsddff9278MILvfDwcC88PNzr0aOHd/vtt3u7du2qfM2wYcO83r17+/
t1AfWO69qZIcDzDP/WgkZjy5Ytcs4558jChQvlmmuuqe/
NAQDgJ+G6Vrv4Hb9GpKioqFpszpw50qRJExk6dGg9bBEAAP7julb3+B2/
RuTRRx+VTZs2yYgRI6Rp06aSmpoqqampMnPmTGnfvn19bx4AAD7hulb3+KfeRmTVqlVy//33y44dOyQ/
P186dOgg1113nfzhD3+o8nBsAAAaA65rdY/
CDwAAwBH8jh8AAIAjKPwAAAAcQeEHAADgiNP+zUkmaOuCg4PVZaWlpcZ48+bN1ZynnnrKGH/
jjTfUnNdee01dBl1D/
BVXjjWciTjWgLpxqmONO34AAACOoPADAABwBIUfAACAIyj8AAAAHMFYbB80aWKuk7UGDhGRsLAwY/
z9999Xc2677TZj/Le//
a2aU15ebowvW7ZMzQkKCjLGT5w4oeYAAIDGizt+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHBHin+QBFnmko0rSpefpNWVmZmjNv3jxjfMmSJWrO6tWrjfGf/exnas4f/
vAHY3zIkCFqjsb2s26Iz9v8KRri5+FYw5mIYw2oGzyrFwAAACJC4QcAAOAMCj8AAABHUPgBAAA4gsIPAADA
EeY2VYc1aaLXwlr3blBQkJoTGhpqjGuduzYrVqxQl913333G+PDhw9WcDz/
80Bi3fZ7S0lJ1GQAAaNi44wcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcATjXH6kaVP9K9
FGmVx99dVqzpYtW3zehujoaGM8JydHzdmwYYMxfvHFF6s52jiXEydO6BsHAAAaLe74AQAAOILCDwAAwBEUf
gAAAI6g8AMAAHAEhR8AAIAj6Or9kbKyMp9zEhIS1GULFy70+f0qKip8znn55ZeN8V//
+tc+v5fneT7nAACAho87fgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/
AAAARzg7zqVJE3PNaxulEhYWZoy3bt1azdm/f78xHh4erub4M1Jm/
fr1xnhwcLCao2334cOH1ZymTc27jD/bDAAA6hZ3/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAEc529Xqe53POlClTjHFbJ7CmvLxcXRYQEGCMa13FIiJF
RUXG+PHjx9WcCRMmGOPPP/+8muPPZwUAAA0Dd/wAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOILCDwAAwBEUfgAAAI5gnIsPLrroImN848aNPr9XYGCguqysrMwYDwoK8nk9X375pbrsggs
uMMYZ5wIAaIy0cWj+XPMbgiZNzPfnmjb1v3zjjh8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOOKM7uq1dc6Wl5f7/
H5du3Y1xlNSUnx+L63zyLbMlqPZvHmzumzixIk+v59G6zwSoRMYZsHBweqy0tLSOtmGSy65xBjfvXu3mnPZ
ZZcZ43//+999Xr8/x3RN07bB5ePWn2tH27Zt1Zy7777bGD9x4oSaU1hYaIxnZWWpOYcPHzbG09LS1Jz9+/
cb4zk5OWqOtm80hH2mvrt3bd222mSOoqIiNUf7Tn/
KOZI7fgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR5zR41z8oY13EBFp3ry5Mb5161af12Mbf6Lxp039448/
VpdFREQY4126dFFz9uzZY4w3hLEUqH3+PABd29dt4wg6depkjO/bt0/NmTZtmjG+d+9eNecvf/
mLMR4SEqLmxMXFGeP+jHOp79ET/m6DP/
tBQ6Sd65YuXarmHDx40Bjv2LGjmlNWVmaM+3PeLCgoUJdpoz+SkpLUHG1sTH5+vs85JSUlao7GnxEwtjE42
mgU2/ioZs2aGePaNVJEP68dPXpUzYmNjTXGDxw4oOZo43aeeuopNedUuOMHAADgCAo/
AAAAR1D4AQAAOILCDwAAwBEUfgAAAI44o7t6tYdp2wwYMEBdVlxcbIxnZGT4vB5b95vtAeG+snVZads9YsQ
INYeu3obJn+/fnw5Mf3K0rr02bdqoOZ07dzbGH3/8cTUnISHBGO/
fv7+as2XLFmPc1tWrdTumpaWpOVqn3yuvvKLmrF692hjPyspSc7SuwXPOOUfN6d27tzHeq1cvNeeCCy4wxh
955BE1pyGaNGmSMd63b181p23btsb44cOH1ZyvvvrKGG/
VqpWao02RsB2Dubm5xvj27dvVHK3b1Xb99KerW3s/
23r8uYZrn8c2RUDrUj527Jia07SpuXwKCgpSc7TrsdZVLCJy2WWXGePt2rVTc06FO34AAACOoPADAABwBIU
fAACAIyj8AAAAHEHhBwAA4AgKPwAAAEcEeKc5n8GVcR3Lly9Xl2kPbB41alQtbc3p86e9/t///
rcxro0REBG59NJLfduwBq4hPlQ+LCzMGNfGCTUEthFAnTp1MsaHDBmi5mhjLqKjo9WcTz75xBj/4x//
qOa88cYbxvhZZ52l5mgjM2xjmOLi4oxx24PjtQfR287F2jaUlZWpOVFRUcb4kSNH1BxtDM4LL7yg5ixbtkx
dVl+073Ly5MlqzqxZs4xx2xgP7Zi2jYDp0KGDMW4b/ZGTk2OMFxUVqTn+jFnR9id/
RsDYcpo0Md+b0kapiOijWfypYWw52rVDO25F9HEutu9AGxP1v//
7v2pOQUGBukyEO34AAADOoPADAABwBIUfAACAIyj8AAAAHEHhBwAA4Ai9NeYMp3Wy2R7kbOvW8ZXWrSSiPy
De1pnlT3eq1rXXsWNHn98rPDxcXXaqDiNUpXXv2jrMtAd2V1RUqDmhoaHGeGZmppqj7ZurV69Wc0aOHGmM9
+nTR83ROhdtx018fLwxbvs82kPY8/Pz1RztO7V917buzZqkdQfajs/
U1FRjfPbs2WqO9v1o+0dj89prr6nLJk2aZIwPHTpUzcnOzjbGu3Tpoubs27fPGNeuXSL6Pqh1oovox5Q/
UwRs1yFt27SOZxH/
rh3aebKmJ5No35t2XhXRO45jY2PVnIcfftin9zod3PEDAABwBIUfAACAIyj8AAAAHEHhBwAA4AgKPwAAAEd
Q+AEAADgiwDvNOSA13Qpdky6++GJj/
O9//7uas3nzZmPc1lq+c+dO3zZM9NZ729eutXbb2uv3799vjGsP7RYR6dWrlzHevn17NefQoUPG+OjRo9Wc
5ORkYzwvL0/
NqSv+jMGpbeedd54xnpGRoeakp6cb49qIExF99EdERISaoz0c3fbz18YO2EazaD+XPXv2qDlDhgwxxps3b6
7m3H333cb4t99+q+Zo+63twfE1yTY2RvvZvfrqq2rOAw88YIy3aNFCzdH2q7PPPlvN+de//qUuqy/
aPmg7L5x//
vnG+AsvvODzeoKCgtSc6OhoY9w2ckwbXWRbj3aNsB03MTExxnhwcLCao32nWVlZak5ZWZnPORrbcaMt086R
thxtm0X0Y6qkpETN0a4HNqe6rnHHDwAAwBEUfgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAccUZ09T700EPG+NV
XX63m7N271xi3deRoD1/+8MMP1Zzjx48b49pD6G1snUxaB5itq3PEiBHG+HfffafmaF1b3bt3V3OmT59ujK
9Zs0bNqSuNqat3wIABas6OHTuM8VtuuUXNWbdunTG+ZMkSNUfrej98+LCaU1RUZIy3bt1azdG6EENCQtScr
7/+2hjXuv5FRFq1amWMz5w5U81JSkoyxm37kq07UKN1DdrOHQcPHjTGb7rpJjVn4MCBxvh///
tfNUfbR0eNGqXmXHHFFeqy+qJd1wIDA9Uc7Wc5ePBgNWfBggXGuK2zXfv52yZP5OfnG+O2TmDt87Rr107N0
T6rrRteYztutOkbtuv0iRMnfF6P7f002ncaFxen5mjbYDu3az8f275zqvMNd/wAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOILCDwAAwBEUfgAAAI44I8a5fPTRR8Z4ZGSkmqO1fEdFRak52sPZc3Nz1Zy0tDRjXBvzIiL
SsmVLYzwhIUHN0VrIbQ+O11rYbQ+z1sZp2MZs/OMf/
zDGH3vsMTWnrjTEcS7amBPbMajlaA9Tt7GNANKOG9vD2bXtth1rsbGxxrhtf9aOw549e6o5n3zyiTHer18/
NSc5OdkY18bWiIgcPXrUGPfn+OzQoYOao41z0c6RIvq+YztHZWRkGOMFBQVqznvvvacuqy/
avmkblaGNetGODRGRoUOHGuMrV65Uc/bs2WOM+3OsFRYWqjnamLLw8HA1p3379j7naOda27Z99dVXxrht/
Ep2drYx7s+IHtvPtHnz5sb4N998o+ZMnDhRXabRrq0lJSVqzqmua9zxAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHNJquXu1h6iIiBw4cMMY3bNig5tgedK7Jysoyxm0drVqXk+371Dr9bN22WveR
rftJW4/tu/bnYdaZmZnGuPZweBH/
Hmrvj4bY1at1lCYmJqo5WgfmoUOH1BztZ+nPvmnrTtWOAX8emm7raPSn+03LOXbsmJqjLbMdn1qXqK3TUHs
/2/emdUNrkwJERIKCgoxx2zGYk5NjjNs+z/
r169Vl9UXb123HgPb9+5NzySWXqDkpKSnG+HfffafmaMeN7bytbZvW7S0i8uSTTxrj+/btU3O0Ln5bB/
U999xjjGvXFBF9v7Xtz9p3oHU8i4j85z//
McbvvfdeNUdj+w60bbOdB+jqBQAAgIhQ+AEAADiDwg8AAMARFH4AAACOoPADAABwBIUfAACAIxrNOJdf/
vKX6rJHH33UGP/ss8/UHG38ia19OywsTF2m0UYl2Gjfte1HpbWD29rEte/
ANjJD+zy2h7MnJSUZ45MnT1Zz3nnnHXVZTWqI41y0n398fLya06lTJ2PcNppH+1n6MzbI9vPXxqnY1qONny
guLlZztPezHQPa+BHbiCbt+LCNMtG2wXZ+8OfB8aWlpT7FRfTv1HbOj4yMNMZzc3PVnO3bt6vL6ktdXde0f
cM2YuT88883xhcsWKDmaNcvbfyOiH6s2c6N0dHRxrjtGNC+a9v+rJ1XioqK1BxtX7cd09qoLG2cjIjIwoUL
1WUa7dxhOz79wTgXAAAAiAiFHwAAgDMo/
AAAABxB4QcAAOAICj8AAABH6E9Wb2BsD6jXHppu6zDSOolsHYBa543tAfXaemxdN7buI43tQfS+rsfWaaZ1
WaWlpak5hw8fNsaTk5PVnLrq6m1Mjh496tcyTbNmzYxxW6dpTEyMzzla16DtuNG6E20d57ZjV6NtW2FhoZq
jda7aupS1LkRbjj+dwNr72b5rrUPT1m2ZkZFhjO/bt0/
NOdPZOoRtP2fN2rVrjfEJEyaoOVu3bjXGbdcb7RiwbbPt+NDYrisa7RrevHlzn9djOz+MHTvWGP/
8888tW2fmTw1R17jjBwAA4AgKPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOoPADAABwRKMZ5xIWFqYu08YO2B
5MrY2ysLViay3x/jwEXmuhF/HvYfPamAtbjjZ+wJYTFRVljHfo0EHN0Vr/e/
bsqeag9vkzksF2TAEus41M0c6pthzNzp071WVxcXHGuD/j0PwZG5Sfn6/
maOPDbGPXtGW2UTOxsbHG+Jdffqnm+Lp+2zbYtk275vqzH/
wU3PEDAABwBIUfAACAIyj8AAAAHEHhBwAA4AgKPwAAAEc0mq7evLw8dZnWRWN7yLjWXVNcXKzmaJ2zto4c7
YHRthx/
On+0z2rr0PXnIfDaMttD4LUOatvPBwDOFLZOz5qkddVu3769TtbfEBw8eLDG3ku7fvurrrt3NdzxAwAAcAS
FHwAAgCMo/AAAABxB4QcAAOAICj8AAABHUPgBAAA4otGMcykqKlKXRUREGONt2rRRc/
xpr7eNLKmL97KNZtHeTxsNY8uxraesrMwYj4mJUXM0hYWFPucAAAD/
cccPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABzRaLp609PT1WUnTpwwxnNzc9Wc0NBQY9zW7as9YNnWBat11dq6bbXPU9NdvdoDqG3fgfa9ZWdnqzlJ
SUnGeGpqqpoDAABqHnf8AAAAHEHhBwAA4AgKPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOaDTjXL755ht1WVB
QkDFuG0tSVlZmjEdGRqo52mgUbf3+5mhjW7TxK7b1BAYG+pyjja2x5eTl5ak52iie/fv3qzk
AAKDmcccPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABzRaLp6Dx48qC7TumCDg4PVnKZNzR9d61q1KS0tVZdp75efn+/
z+4WFhak52mfVPqeIvePXVy1atFCXFRcXG+Pbt2+vsfUDAIBT444fAACAIyj8AAAAHEHhBwAA4AgKPwAAAE
dQ+AEAADiCwg8AAMARjWacS3p6urosJibGp7iISIcOHYzxEydOqDllZWXGuG0ETEVFhTFuGwFTUlJijNvG0
2jLgoKC1BxtnIttPZmZmcZ4u3bt1Jw9e/aoywAAQN3hjh8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOKLRdPUWFhaqy2bPnm2MFxcXqzlat21ISIia06SJuU5u2lT/
GrXOWe29bMvKy8vVHG2Z7TvIzc01xrOzs9Uc7f2ioqLUnI0bN6rLAABA3eGOHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAEQGe53n1vREAAACofdzxAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPg1Munp6RIQECCzZ8+u700BGrUFCxZIQECAfPbZZ6d87fDhw2X48OG1v1FA
IxQQECB33HHHKV938phLT0+v/Y2CisLPYNu2bTJ58mRJTEyU0NBQSUhIkNGjR8vcuXPre9OAM15AQMBp/
fnwww+N+RUVFfLvf/9bBg0aJLGxsRIZGSndunWT66+/XtavX1/
r279jxw657777uLjhjFCf18OHH35Yli9fXuvrcU3T+t6Ahmbt2rUyYsQI6dChg8yYMUNat24t+/fvl/
Xr18uTTz4pd955Z31vInBGe+mll6r8/d///resWrWqWrxnz57G/
Lvuukuefvppueyyy+Saa66Rpk2byq5duyQ1NVU6deokgwcP9nmb3nvvvdN+7Y4dO+T++++X4cOHS1JSks/
rAhqKmr4eXnfddTJ16lQJCQk5rdc//
PDDMnnyZJk4caIfWw8Nhd+PPPTQQxIdHS0bN26UmJiYKssyMjLqZ6PqWGFhoTRr1qy+NwOOuvbaa6v8ff36
9bJq1apqcZMjR47IM888IzNmzJDnn3++yrI5c+bI0aNH/
dqm4ODgU76muLj4tF4HNBY1fT0MDAyUwMBA62s8z5Pi4mIJCwvz+f1xevin3h/
Zu3ev9O7du9pOLiLSsmXLyv9/8ncali9fLsnJyRISEiK9e/eWFStWVMs7cOCA3HjjjdKqVavK1/3rX/
+q8prS0lL585//LP369ZPo6GgJDw+XIUOGyOrVq0+5zZ7nycyZMyU4OFhSUlIq4wsXLpR+/
fpJWFiYxMbGytSpU2X//v1VcocPHy7JycmyadMmGTp0qDRr1kx+//
vfn3KdQEOUlpYmnufJBRdcUG1ZQEBAlWP4pJKSEvnVr34l8fHxEh4eLpMmTapWIP74d/w+/
PBDCQgIkMWLF8sf//hHSUhIkGbNmslTTz0lU6ZMERGRESNGnPKfpYGG7HSvhyed6npo+h2/
pKQkGT9+vKxcuVL69+8vYWFh8txzz0lAQIAUFBTIiy++WHkcTZ8+vYY/oZu44/
cjiYmJsm7dOvnyyy8lOTnZ+to1a9ZISkqK3HbbbRIZGSlPPfWUXHHFFfLtt99KXFyciHx/
B2Lw4MGVhWJ8fLykpqbKTTfdJLm5uXL33XeLiEhubq7885//
lKuvvlpmzJgheXl58sILL8iYMWNkw4YN0rdvX+M2lJeXy4033ihLliyRZcuWybhx40Tk+/9S+9Of/
iRXXnml3HzzzXL06FGZO3euDB06VDZv3lzlQD527JhccsklMnXqVLn22mulVatWP/
l7BOpDYmKiiIgsXbpUpkyZclp3ru+8805p3ry5zJo1S9LT02XOnDlyxx13yJIlS06Z+5e//
EWCg4PlnnvukZKSErn44ovlrrvukqeeekp+//vfV/
5ztPbP0kBDVtPXQ82uXbvk6quvlltuuUVmzJgh3bt3l5deekluvvlmGThwoMycOVNERDp37lxjn81pHqp47
733vMDAQC8wMNA777zzvN/
85jfeypUrvdLS0iqvExEvODjY27NnT2Vs69atnoh4c+fOrYzddNNNXps2bbzMzMwq+VOnTvWio6O9wsJCz/
M8r6yszCspKanymuPHj3utWrXybrzxxspYWlqaJyLeY4895p04ccK76qqrvLCwMG/lypWVr0lPT/
cCAwO9hx56qMr7bdu2zWvatGmV+LBhwzwR8ebNm+frVwXUidtvv93z5VR1/fXXeyLiNW/e3Js0aZI3e/
Zs76uvvqr2uvnz53si4o0aNcqrqKiojP/
yl7/0AgMDvezs7MrYsGHDvGHDhlX+ffXq1Z6IeJ06dao8hk9aunSpJyLe6tWrT/
9DAg1QTV8PTx5zaWlplbHExERPRLwVK1ZUW394eLg3bdq0Gv9cruOfen9k9OjRsm7dOpkwYYJs3bpVHn30U
RkzZowkJCTIm2++WeW1o0aNqvJfIH369JGoqCjZt2+fiHz/T7Cvv/
66XHrppeJ5nmRmZlb+GTNmjOTk5Mjnn38uIt//7sPJ3w+qqKiQrKwsKSsrk/
79+1e+5odKS0tlypQp8vbbb8u7774rF198ceWylJQUqaiokCuvvLLKOlu3bi1du3at9s/
HISEhcsMNN9TMFwjUs/nz58v//d//
SceOHWXZsmVyzz33SM+ePeWiiy6SAwcOVHv9zJkzJSAgoPLvQ4YMkfLycvnmm29Oua5p06bxu0g4Y9Xk9dC
mY8eOMmbMmBrffpjxT70GAwYMkJSUFCktLZWtW7fKsmXL5IknnpDJkyfLli1bpFevXiIi0qFDh2q5zZs3l+
PHj4uIyNGjRyU7O1uef/75ar9oftIPf0H2xRdflL///e+yc+dOOXHiRGW8Y8eO1fIeeeQRyc/
Pl9TU1GrzxXbv3i2e50nXrl2N6wwKCqry94SEBH4pHY1Kfn6+5OfnV/
49MDBQ4uPjRUSkSZMmcvvtt8vtt98ux44dk08//
VTmzZsnqampMnXqVPnkk0+qvNePj+PmzZuLiFQexzamYxM4k9TU9dCG46huUfhZBAcHy4ABA2TAgAHSrVs3
ueGGG2Tp0qUya9YsERG1O8nzPBH5/s6dyPdditOmTTO+tk+fPiLyfSPG9OnTZeLEifLrX/
9aWrZsKYGBgfLII4/I3r17q+WNGTNGVqxYIY8+
+qgMHz5cQkNDK5dVVFRIQECApKamGrcxIiKiyt+5Y4HGZvbs2XL//fdX/
j0xMdE4Ny8uLk4mTJggEyZMkOHDh8tHH30k33zzTeXvAoqc+ji24diBK37q9dCG46huUfidpv79+4uIyKFD
h047Jz4+XiIjI6W8vFxGjRplfe1rr70mnTp1kpSUlCr/
7HTyoPqxwYMHy6233irjx4+XKVOmyLJly6Rp0+9/nJ07dxbP86Rjx47SrVu3095eoLG4/vrr5cILL6z8+
+lcOPr37y8fffSRHDp0qErhV9N+ePwCZyJ/rof+4FiqHfyO34+sXr3a+F8o7777roiIdO/e/
bTfKzAwUK644gp5/fXX5csvv6y2/IcjI07+19IP1/3f//5X1q1bp77/
qFGjZPHixbJixQq57rrrKu8wXn755RIYGCj3339/tc/
ieZ4cO3bstD8D0BB16tRJRo0aVfnn5PiWw4cPy44dO6q9vrS0VD744ANp0qSJdOnSpVa3LTw8XEREsrOza3
U9QG2ryeuhP8LDwzmOagF3/
H7kzjvvlMLCQpk0aZL06NFDSktLZe3atbJkyRJJSkryuQnir3/9q6xevVoGDRokM2bMkF69eklWVpZ8/
vnn8v7770tWVpaIiIwfP15SUlJk0qRJMm7cOElLS5N58+ZJr169qvwu049NnDhR5s+fL9dff71ERUXJc889
J507d5YHH3xQfve730l6erpMnDhRIiMjJS0tTZYtWyYzZ86Ue+655yd9T0BD9N1338nAgQNl5MiRctFFF0n
r1q0lIyNDXnnlFdm6davcfffd0qJFi1rdhr59+0pgYKD87W9/
k5ycHAkJCZGRI0ca554BDVlNXw991a9fP3n//ffl8ccfl7Zt20rHjh1l0KBBtbpOF1D4/
cjs2bNl6dKl8u6778rzzz8vpaWl0qFDB7ntttvkj3/8o3GQpU2rVq1kw4YN8sADD0hKSoo888wzEhcXJ717
95a//e1vla+bPn26HD58WJ577jlZuXKl9OrVSxYuXChLly495fDXa6+9VvLy8uS2226TqKgoeeyxx+Tee+
+Vbt26yRNPPFH5u1Dt27eXiy++WCZMmODr1wI0Ct27d5c5c+bIu+++K88884wcOXJEQkNDJTk5Wf7xj3/
ITTfdVOvb0Lp1a5k3b5488sgjctNNN0l5ebmsXr2awg+NTk1fD331+OOPy8yZM+WPf/
yjFBUVybRp0yj8akCAdzq/eQkAAIBGj9/xAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAEac9wJln5uFM1BDHWDbGY832LOqzzjrLGB86dKiaoz3w
/YMPPlBzTj4F58f279+v5pw4ccIYP/nYNZPQ0FBjvH379mrOwIEDjfG9e/eqOevXrzfGP/
nkEzWnqKhIXVbfONZqRpMmvt+vseWUlZUZ47ZHGz755JM+ryciIsIYP3DggJqzcuVKY3z+/
PlqDk59rHHHDwAAwBEUfgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAcEeCdZqtVY+x+Ak6FTkPf/Otf/
zLGhwwZouZoXXvHjx9Xczp27GiMa52BIiJRUVHGuNaFKyJSUVGhLtNoXZAlJSVqTtOm5gEKX3/9tZoTFBRk
jMfFxak5Tz31lDE+d+5cNaeucKz5RutsLy8vr5P1p6SkqMvGjRtnjNu6yrXjJiYmRs3RvoO6+rnZ1tMQ9+e
T6OoFAACAiFD4AQAAOIPCDwAAwBEUfgAAAI6g8AMAAHAEhR8AAIAjzDMGADjrl7/8pbqsX79+xvjSpUvVnB
YtWhjjtrEU2ggW2/
iVgoICY9z24Pjw8HCf16ONWWnevLmak5uba4xv3rxZzQkJCTHGtbEYIiKzZs0yxjMyMtScJUuWqMtQu2z7p
j9jW5KTk43x6667Ts0ZO3asMW4bnbRnzx5jPDY2Vs0JCwszxjMzM9Wc4uJiY/yDDz5Qc/72t78Z4+
+9956ao7GNRdFGvTTkMS8ncccPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABwR4J1mC0pDfpg14K+G2IFVV8faDTfcYIxPnz5dzUlPTzfGbdusPWhd68IV0bsDu3fvruZoHYCHDh1S
c7TuXa2rWETv3o2MjFRzNm3aZIzn5eWpOdr72bot8/
PzjfGBAweqOePGjTPGtc5Nf7l8rNWkV199VV02aNAgY9zWPVxUVGSM2zrbg4ODjfHDhw+rOT179jTGtX1WR
O9s1s4pIvpn1c5dIiJvv/
22Mf7II4+oORrbPlVXx8Cp1sMdPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOoPADAABwBIUfAACAI5rW9wYAq
B/
Hjh0zxm2jTKKjo43x0tJSNUcbs9K0qX760UawlJWVqTnnnHOOMd6hQwc1RxsLoY2rEBFJS0szxjdv3qzmZG
dnG+O20SyFhYXGeKtWrdQc7Tu1jdno27evMV7T41zgm3//+9/
G+JAhQ9Qc7bjRxqKI6ONPbGNJgoKCjPG2bduqOV9//bUxbjs+tXFHtvOAtiwqKkrN+fWvf22Mx8fHqzm/
+tWvjPGGOLbox7jjBwAA4AgKPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOCPBOswWlMT7MGjiVhtiBVd/
H2ksvvaQu69GjhzH+1VdfqTnh4eHG+PHjx33bMNG7/EREIiMjjfHu3burOVoHs61LedeuXcb4/
v371ZzY2Fhj3NadGBcXZ4y3bNlSzenSpYsx/uqrr6o5v//
979VlNYljrTrt5yUisnr1amM8KytLzdE6dP1h2ze1bviSkhI1R+vub9Gihc/
bYNuXtM522+epqKgwxrVzl4jI2WefbYxr3fh16VTHGnf8AAAAHEHhBwAA4AgKPwAAAEdQ+AEAADiCwg8AAM
ARFH4AAACO0J+SDsBJ06dPV5cVFRUZ44cPH1ZztBET2ggFEX3MijayRUSkoKDAGN+4caPPOREREWpOTEyMM
a6NbBERyc7ONsZtD47XxrlocRH989TVyBb4ZuzYseqysLAwYzwkJETN0Y4p25gVbfyJFhfRx4XY9ufo6Ghj
XNtnRUTKy8uNcdt5QPsOTpw4oeYEBwcb49qxLiJy6aWXGuNLlixRcxoK7vgBAAA4gsIPAADAERR+AAAAjqD
wAwAAcASFHwAAgCPo6kU17dq1U5dpnV7p6em1tDWnryYfUO4yrZNOROQPf/
iDMT5p0iQ1JzMz0xj3p0PX1gnsTxdk8+bNjXHbvqR1SGrrF9G7BrX1i+gd1LaH2v/
5z39Wl2kCAgKM8VM96B0/3XnnnedzjrYviejHbllZmc/r8Wcb8vLy1Bytq9bWDZ+Tk2OM2/
ZN7bPaupTDw8N9ei8RkUsuucQYp6sXAAAADQaFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4IsA7zZ59reUfDZuthb1t27bGeG5urprTu3dvY/zTTz/
1bcPEPjLDNrajJjXEkRWN8Vjbt2+fukwb9ZOVlaXmaKNMbPtzYGCguszX9dh+BkFBQca4bfSDNrrGNv6iR4
8exvjXX3+t5lxzzTXqsvrGsVbdhg0b1GUdOnQwxrWxKDa2Y007D9vOz9qxph1PIiLFxcXGeHx8vJqjvZ8/
o7u0kS0i+n5gG51z+PBhY/
ycc87xbcNqwamONe74AQAAOILCDwAAwBEUfgAAAI6g8AMAAHAEhR8AAIAj9BY5NCrag661zl0RkR07dhjjt
o7auLg4Y7xdu3Zqznfffacuq0mtW7c2xrXuK5jZOh21brE33nhDzTn//
PON8ePHj6s5Wqef9hB6G3+6ILXOXRGR0NBQY9zW1at1I2vvJaIfU3/+85/VHI0/
P1PUvo4dO6rLSktLjXFbZ7vWhZqZmelzju06oB2Hts5ZrbPd1gmsdQ/
b9lmt49d2TGtsx3RiYqLP79dQcMcPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAIxrk0IhEREeoyre384MGDao42/uSyyy5Tc8477zxj/IsvvlBzatLgwYPVZaNHjzbG//
KXv9TW5uD/Wb16tbps5MiRxrhtNIu2zJ8RE/
6MgLHRxsPYRqZoYyls4y+0sRTbtm2zbJ1v6xexj6xA7YqKilKXZWRkGOO2n2VxcbEx3qxZMzXHtt9qtNEou
bm5ao527Nqua9rYFttoFu1418bjiOg/h+zsbDXHNiqtoeOOHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gq7eH9E66UTsHYU1Seswsm2b1n1k65jSlr3zzjtqTl5enjGenp6u
5mj8+T5tD+euq5/Pmc6fDtA1a9aoOdrDzHft2qXmaA9ut3Wg+vMQdu2Ysu1L2vdTUFDg8/
ptXb1at+XRo0d9Xg/qV4cOHYxx2z7rz/nMn/
1ZW2bL0bqEbR30Wjd8SEiIz9umvZeIfo2Ijo5Wc7RjzXa90XJs68nJyVGX1SXu+AEAADiCwg8AAMARFH4AA
ACOoPADAABwBIUfAACAIyj8AAAAHFEv41z8GZliy/
H1vWo6p2XLluoyre09Pz9fzTnrrLOMcW2UiohIZGSkukwzePBgY/
y9995Tc7Zs2WKM276D9u3bG+O2h4NrYzsuvvhiNSc4ONjnbUPNsD2cXRt/
ov28ROwPVPeVbTyNdl7x58H1tnOUtsw2nkbLsX0eNEy9evUyxm3jQrTjIywsTM3R9hnbCKCmTc1lgBYX0Uc
XaaPIRPT91jbiJCYmxhi3jY0pLi72OUf7rP7UA9r4KhGRL774wuf3qw3c8QMAAHAEhR8AAIAjKPwAAAAcQe
EHAADgCAo/AAAAR/zkrl5/
OnRtnTL+PGS6JvXu3VtdpnUypaenqzk9evQwxrXOIxGRzZs3G+O2jtaRI0ca4/369VNztK6xGTNmqDkHDhw
wxteuXavmvPXWW8Z437591Zxvv/
3WGLd9b1pHma17FDXD9h2XlJQY47YH1GvnAdvPX8uxnaO07l3btmndgf5029q6em0PoveVrXsUtU/
rTrXR9o2IiAg1Z8+ePca4rRNYO6Zsx4B23NiOAe24sZ07tGMgNDRUzdGmX9i6lLXjIzw8XM3RaFMsROjqBQ
AAQB2j8AMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/AAAAR/
zkcS41PWZFG8lha2FPSkoyxm2jErRl7dq1U3PS0tKM8RYtWqg5CQkJxnhycrKao7XRDxw4UM3RHmq/e/
duNSczM9MY19rhRfQ2eu0h5CL6dn/
88cdqjjYi580331RzZs6caYzHxcWpOajOn9EftmMtNzfXGLedO7RxEbbRLP6MWfFnBIw/
67GNxvCVbSyFRvs+UTds41Q02jFVWFio5nz22WfG+NChQ9Uc7f1s+4z2eTIyMtQcbTRLfHy8mqONUGvWrJm
a488xrY2UKSoqUnO081fbtm3VnIaCO34AAACOoPADAABwBIUfAACAIyj8AAAAHEHhBwAA4Iif3NXbsmVLdZ
nWrWPrnI2OjjbGDxw4oOYcPXrUGG/
durWao3UY2TpNtS4r2wOjte9A6xAW0buC2rRpo+a0atXKGLc91H7//v3GeO/
evdUcrUPXtp7t27cb49o2i+j7iO1707q2IiMj1RzUPm3fsHXZaR2yWve6iH5M2x4CX5NsXb3aNtg6m7Xvx5
/P40+nNmqObd/QhISEGONaV7mIyJEjR4xxW1e5tg/atlnbhpiYGJ9zbOcBbbvz8/
PVnPDwcGP88OHDak7Xrl2Ncdv5RvveYmNj1ZyGgjt+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHnPY4l379+hnj2igVEX2Mhxavabt27VKXRUREGOO2h81rrepnnXWWmpOSkmKM
n3vuuWqOtsz2ebTWctsImIMHDxrjI0eOVHNWrFhhjG/cuFHN+fnPf26Mjx8/
Xs255557jPHjx4+rObbWezQu2ugH21gKjT85tlEW2rLc3Fw1R3vgvW00h3ZM19V4GtQcbcSIjTbmxDaW5PP
PPzfGb731VjVHuw40baqXB9qIJn9GGtlGgWnbYMvRxqEtX75czWnfvr1P6xfxbzxNQ9HwtxAAAAA1gsIPAA
DAERR+AAAAjqDwAwAAcASFHwAAgCNOu6tX63oZOHCgmqN1vWRnZ6s5WufqoUOH1JyMjAxj3Nb91qxZM2M8M
zNTzdG6qc455xw1Z968ecZ4QUGBmvPdd98Z47ZuW62z+LXXXlNz0tLSjPEBAwaoOVqXU/
fu3dWcdevWGePp6elqjtZdHRUVpeZo+5U/
XZ3wjdaBKqKfB2w52gPqbTlaN53WUSsi4nmezzla56ItR9s22xQBbT3NmzdXc7QOTdu2ofbFxsYa49q1S0Q
/
BjZv3qzm7N692xgPDQ1Vc2ydqxrtnHrs2DE1Rzt2W7VqpeZo3bvacSuiT+zYunWrmqNd8xISEtQc7fj05/
usa9zxAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA44rT7jrXxI0ePHlVztNbuuLg4NWfs2LHGeOfOndWcxMREY3zDhg1q
jtZaHhMTo+akpKQY46NGjVJztPfTxtaIiDz22GPG+BdffKHmrFy50hjXxgiI6GNobA/
0vuCCC4xx2xicvXv3GuPnnXeemqO16+fn56s52v72zTffqDmozp
/RH7YxO9p4A3/GHtgegK4d07bRDxrbw+a17bbl+DMCxp9xLprG8OD4M5k2Psw2zic6OtoY379/v8/
rsdGOm7y8PDUnLCzMGLedB06cOGGM2/
ZN7fiwHTfa8amNxxHRx+BoY9JE9NonPDxczWkoOBsAAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAAgCNOu
61u1apVxninTp3UHK1baPXq1ae72kq2zh/
tocy2jlbtodUtWrRQc7TP+sorr6g5n332mbpM06VLF2P80KFDas6YMWOMcVsnk9YZtX79ejXn3XffNcZLS0
vVHK1zzZ8cG62jTNsPUXPatWunLvOne1fbN4ODg9Uc7YHutg5A7WHzNtrnsb2X1lmsbbOIyJ49e4zx9u3bW
7YODZG2z9jOc1rnbElJiZozcOBAY9zW2V5eXm6M27rUtfezXW+0a67t82jHru2crm2brR7QOqVt5w7te/
PnfFfXuOMHAADgCAo/
AAAAR1D4AQAAOILCDwAAwBEUfgAAAI6g8AMAAHDET+473rdvn7pMG71ga6tu27atMV5YWKjmaGNBbC3f2dn
ZxnhmZqaa88UXXxjjtvbtHj16GOPaCBrbNuTm5qo5+fn5xrhtXIRtmSYmJsYYj4yM9Pm9bN+btkwbcSCiPx
z7yJEjvm2Y42yjkzRxcXHqMu04tI1K0B42rz3o3cY2ZsWf0Qva96ONqxDRx1zEx8erOdo4Ddv5U6ONnkDd0
L5/2zGg0a5DIiJDhw41xm3nen/
GE2nHgHZdFdHPA7Zzh3a820bNFBQUGONdu3ZVc7QxZTbad2obndNQcMcPAADAERR+AAAAjqDwAwAAcASFHw
AAgCMo/AAAABxRq08T1rpts7Ky1BzbssboTPs8tq5nuEvr9raxdQ/bOvL9eb+afC9/
OjG1zknbQ+21TlBb97Cv74X65U8H6ObNm9Vl11xzjTFum4qh7c+2Y0DrqrV1yWsdurZjXft+tIkhIvq+bpu
k8fHHH6vLNNr348/
kgbrGHT8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAAgCNqdZwLgMbHn1Eqtgeta2MPbA9a13
L82TYbbT22URbaiAnbQ+21z1pSUuLztp177rlqDhqmoqKiGnsv28iUhIQEY1wbrSaij3OxjS0qKyszxqOjo
9UcbRtso4a09dhytPXEx8erObm5ueoyXxUXF9fYe9UW7vgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAA
gCPo6gVQhdZJZ9OiRQt12dGjR41xW2eePw+O13Js69G6hG0duhpbt6W2DbbPo3UCa52baLi0Y8DW2Z6Tk2O
Mt2vXTs0JCQkxxrOzs9Uc7RiwHTdaZ7s/x5rtO/B1/
SL6Z42IiFBzbMehrznaz7oh4Y4fAACAIyj8AAAAHEHhBwAA4AgKPwAAAEdQ+AEAADiCwg8AAMARjHMB8JN1
797d55ywsDCfc2wPm9doYyRE7GMhfH2/
mh41o33WVq1aWbbOd9q2+fPdwGzfvn3GuG10kvZz+fjjj9Uc7WdmGwGUl5enLvN1Pdo2i+ifNTg4WM2pyVE
zHTp0UHNCQ0ON8T179vic48/3Wde44wcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+AAAAjqCrF8BP1rp1a3
WZ1mVXXFys5kRGRhrjtg7d8PBwYzwoKEjN0ZbZuhO1h8rbumC1jl9bR2NBQYExHhcXp+agYcrKyjLGmzbVL
8EhISHGuNZNKiIyadIkY/yGG25Qc7Qu8Z49e6o5Wodsdna2mpOTk2OMx8TEqDknTpwwxr/
55hs1Jzk52RhPSUlRczIzM41x23etnQdsP9OGgjt+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHNPy+YwC1QhtZYhtLomnTpo26TBvJYBvNoo050caiiOjjYbSHw4voI1OioqLU
HI1t22wjZXx9v/j4eDWnc+fOxvjevXvVHG0she17g2+0n+WhQ4fUnGPHjhnjsbGxas6+ffuM8T/
96U+WrYMmPT1dXaado/
w5f9Y17vgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAAgCPo6gXwk9kezq7Run1F9I658vJyNUfrnLU9N
L2kpMSn9YuIhISEGONal/Sp3k+jvd/WrVvVHG3bUL/
WrFljjA8aNKhO1m87Buje1g0ZMqS+N6FWcMcPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOCIAK8xPFEYAAAAPxl3/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+dSwgIEDuuOOOU75uwYIFEhAQIOnp6bW/
UUADNn36dImIiDjl64YPHy7Dhw+v/Q0CgEaMwq8Gbdu2TSZPniyJiYkSGhoqCQkJMnr0aJk7d26tr/
vhhx+W5cuX1/p6gNPxzDPPSEBAgAwaNKi+N8Vv06dPl4CAgMo/TZs2lfbt28vUqVNlx44dtbruwsJCue+++
+TDDz+s1fUAvjh5Q+KHf1q2bCkjRoyQ1NTU+t48nKam9b0BZ4q1a9fKiBEjpEOHDjJjxgxp3bq17N+/
X9avXy9PPvmk3HnnnT6933XXXSdTp06VkJCQ03r9ww8/
LJMnT5aJEyf6sfVAzVq0aJEkJSXJhg0bZM+ePdKlS5f63iS/hISEyD//+U8RESkrK5O9e/
fKvHnzZMWKFbJjxw5p27Ztray3sLBQ7r//
fhER7mKiwXnggQekY8eO4nmeHDlyRBYsWCBjx46Vt956S8aPH1/
fm4dToPCrIQ899JBER0fLxo0bJSYmpsqyjIwMn98vMDBQAgMDra/xPE+Ki4slLCzM5/
cHaktaWpqsXbtWUlJS5JZbbpFFixbJrFmz6nuz/
NK0aVO59tprq8QGDx4s48ePl3feeUdmzJhRT1sG1J9LLrlE+vfvX/n3m266SVq1aiWvvPIKhV8jwD/
11pC9e/dK7969qxV9IiItW7asFlu+fLkkJydLSEiI9O7dW1asWFFluel3/JKSkmT8+PGycuVK6d+/
v4SFhclzzz0nAQEBUlBQIC++
+GLl7ffp06fX8CcETs+iRYukefPmMm7cOJk8ebIsWrSo2mvS09MlICBAZs+eLc8//7x07txZQkJCZMCAAbJ
x48ZTrmPLli0SHx8vw4cPl/z8fPV1JSUlMmvWLOnSpYuEhIRI+/bt5Te/+Y2UlJT4/
flat24tIt8XhT+0b98+mTJlisTGxkqzZs1k8ODB8s4771TLz8jIqLxQhoaGytlnny0vvvhi5fL09HSJj48X
EZH777+/8pi+7777/
N5moDbFxMRIWFhYlWNi9uzZcv7550tcXJyEhYVJv3795LXXXquWW1RUJHfddZe0aNFCIiMjZcKECXLgwAH2
+VrEHb8akpiYKOvWrZMvv/
xSkpOTra9ds2aNpKSkyG233SaRkZHy1FNPyRVXXCHffvutxMXFWXN37dolV199tdxyyy0yY8YM6d69u7z00
kty8803y8CBA2XmzJkiItK5c+ca+2yALxYtWiSXX365BAcHy9VXXy3PPvusbNy4UQYMGFDttS+//
LLk5eXJLbfcIgEBAfLoo4/K5ZdfLvv27ZOgoCDj+2/cuFHGjBkj/
fv3lzfeeEO9411RUSETJkyQNWvWyMyZM6Vnz56ybds2eeKJJ+Trr78+7d+JzczMFBGR8vJy2bdvn/
z2t7+VuLi4Knc2jhw5Iueff74UFhbKXXfdJXFxcfLiiy/
KhAkT5LXXXpNJkyaJyPcXueHDh8uePXvkjjvukI4dO8rSpUtl+vTpkp2dLf/zP/
8j8fHx8uyzz8ovfvELmTRpklx++eUiItKnT5/T2l6gtuXk5EhmZqZ4nicZGRkyd+5cyc/Pr3J3/
Mknn5QJEybINddcI6WlpbJ48WKZMmWKvP322zJu3LjK102fPl1effVVue6662Tw4MHy0UcfVVmOWuChRrz3
3nteYGCgFxgY6J133nneb37zG2/lypVeaWlpldeJiBccHOzt2bOnMrZ161ZPRLy5c+dWxubPn+
+JiJeWllYZS0xM9ETEW7FiRbX1h4eHe9OmTavxzwX44rPPPvNExFu1apXneZ5XUVHhtWvXzvuf//
mfKq9LS0vzRMSLi4vzsrKyKuNvvPGGJyLeW2+9VRmbNm2aFx4e7nme561Zs8aLioryxo0b5xUXF1d5z2HDh
nnDhg2r/PtLL73kNWnSxPvkk0+qvG7evHmeiHiffvqp9bNMmzbNE5FqfxISErxNmzZVee3dd9/
tiUiVdeXl5XkdO3b0kpKSvPLycs/zPG/
OnDmeiHgLFy6sfF1paal33nnneREREV5ubq7neZ539OhRT0S8WbNmWbcRqEsnr0s//
hMSEuItWLCgymsLCwur/L20tNRLTk72Ro4cWRnbtGmTJyLe3XffXeW106dPZ/
+vRfxTbw0ZPXq0rFu3TiZMmCBbt26VRx99VMaMGSMJCQny5ptvVnntqFGjqtyR69Onj0RFRcm+fftOuZ6OH
TvKmDFjanz7gZqwaNEiadWqlYwYMUJEvh9fdNVVV8nixYulvLy82uuvuuoqad68eeXfhwwZIiJiPBZWr14t
Y8aMkYsuukhSUlJO2fi0dOlS6dmzp/To0UMyMzMr/4wcObLy/
U4lNDRUVq1aJatWrZKVK1fKc889JxERETJ27Fj5+uuvK1/37rvvysCBA+XCCy+sjEVERMjMmTMlPT29sgv4
3XffldatW8vVV19d+bqgoCC56667JD8/Xz766KNTbhNQ355+
+unK42LhwoUyYsQIufnmmyUlJaXyNT+8E3/8+HHJycmRIUOGyOeff14ZP/krTrfddluV9/
e1GRK+4Z96a9CAAQMkJSVFSktLZevWrbJs2TJ54oknZPLkybJlyxbp1auXiIh06NChWm7z5s3l+PHjp1xHx
44da3y7gZpQXl4uixcvlhEjRkhaWlplfNCgQfL3v/
9dPvjgA7n44our5Pz4WDhZBP74WCguLpZx48ZJv3795NVXX632+3Umu3fvlq++
+qry9+V+7HSargIDA2XUqFFVYmPHjpWuXbvK7373O3n99ddFROSbb74xjq7p2bNn5fLk5GT55ptvpGvXrtK
kSRP1dUBDN3DgwCrNHVdffbWcc845cscdd8j48eMlODhY3n77bXnwwQdly5YtVX6nNiAgoPL/f/
PNN9KkSZNq17XGOgWgsaDwqwXBwcEyYMAAGTBggHTr1k1uuOEGWbp0aWVno9at63neKd+bDl40VP/5z3/
k0KFDsnjxYlm8eHG15YsWLapW+J3usRASEiJjx46VN954Q1asWHFanYMVFRVy1llnyeOPP25c3r59+1O+h0
m7du2ke/fu8vHHH/uVD5xpmjRpIiNGjJAnn3xSdu/
eLVlZWTJhwgQZOnSoPPPMM9KmTRsJCgqS+fPny8svv1zfm+s8Cr9advK/ig4dOlSr6/
nhf0UB9WHRokXSsmVLefrpp6stS0lJkWXLlsm8efP8+o+XgIAAWbRokVx22WUyZcoUSU1NPeV8u86dO8vWr
VvloosuqvHjo6ysrEo3cWJiouzatava63bu3Fm5/OT/fvHFF1JRUVHlrt+PX8fxjMamrKxMRETy8/
Pl9ddfl9DQUFm5cmWVX8mYP39+lZzExESpqKiQtLQ06dq1a2V8z549dbPRjuJ3/
GrI6tWrjXfs3n33XRER6d69e62uPzw8XLKzs2t1HYCmqKhIUlJSZPz48TJ58uRqf+644w7Jy8ur9vuuvggO
DpaUlBQZMGCAXHrppbJhwwbr66+88ko5cOCA/OMf/zBub0FBgV/
b8fXXX8uuXbvk7LPProyNHTtWNmzYIOvWrauMFRQUyPPPPy9JSUmVv+YxduxYOXz4sCxZsqTydWVlZTJ37l
yJiIiQYcOGiYhIs2bNREQ4ptEonDhxQt577z0JDg6Wnj17SmBgoAQEBFT5vd709PRqnfQnf1/9mWeeqRKvi
6dduYw7fjXkzjvvlMLCQpk0aZL06NFDSktLZe3atbJkyRJJSkqSG264oVbX369fP3n//
ffl8ccfl7Zt20rHjh0b9eOy0Li8+eabkpeXJxMmTDAuHzx4sMTHx8uiRYvkqquu8ns9YWFh8vbbb8vIkSPl
kksukY8++kgdn3TdddfJq6++KrfeequsXr1aLrjgAikvL5edO3fKq6++WjkP06asrEwWLlwoIt//
03F6errMmzdPKioqqgylvvfee+WVV16RSy65RO666y6JjY2VF198UdLS0uT111+vvLs3c+ZMee6552T69Om
yadMmSUpKktdee00+/
fRTmTNnjkRGRlZ+zl69esmSJUukW7duEhsbK8nJyaccFQXUhdTU1Mq71BkZGfLyyy/
L7t275d5775WoqCgZN26cPP744/Kzn/1Mfv7zn0tGRoY8/fTT0qVLF/
niiy8q36dfv35yxRVXyJw5c+TYsWOV41xONk5x57uW1HNX8RkjNTXVu/
HGG70ePXp4ERERXnBwsNelSxfvzjvv9I4cOVL5OhHxbr/99mr5iYmJVcaxaONcxo0bZ1z/
zp07vaFDh3phYWGeiDDaBXXq0ksv9UJDQ72CggL1NdOnT/eCgoK8zMzMynEujz32WLXXyY/
GOPxwnMtJmZmZXq9evbzWrVt7u3fv9jyv+jgXz/t+hMTf/vY3r3fv3l5ISIjXvHlzr1+/ft7999/
v5eTkWD+TaZxLVFSUd9FFF3nvv/9+tdfv3bvXmzx5shcTE+OFhoZ6AwcO9N5++
+1qrzty5Ih3ww03eC1atPCCg4O9s846y5s/
f361161du9br16+fFxwczGgLNAimcS6hoaFe3759vWeffdarqKiofO0LL7zgde3a1QsJCfF69OjhzZ8/35s
1a5b347KjoKDAu/32273Y2FgvIiLCmzhxordr1y5PRLy//
vWvdf0RnRDgeafRUQAAAFAHtmzZIuecc44sXLhQrrnmmvrenDMOv+MHAADqRVFRUbXYnDlzpEmTJjJ06NB6
2KIzH7/jBwAA6sWjjz4qmzZtkhEjRkjTpk0lNTVVUlNTZebMmX6PXIId/9QLAADqxapVq+T+++
+XHTt2SH5+vnTo0EGuu+46+cMf/nBag9rhOwo/AAAAR/
A7fgAAAI6g8AMAAHAEhR8AAIAjTvs3JxvyBO0fPvPyhyoqKmp0PX379jXGtYfAi4js3bvXGP/www/
VHO27DgoKUnMSEhKM8Q4dOqg555xzjjH+wQcfqDn33nuvuqwmad9BTf9KakP8FdeGfKwB/
uJYqz+2z+nPz+XkIwh/rGfPnmrOyWf5/tjJJ4CYmJ5/
fSp1de1oyE71WbnjBwAA4AgKPwAAAEdQ+AEAADiCwg8AAMARpz3Aub5/
CdafX04NDAxUc2bOnGmMX3HFFWqO9gut+/
fvV3OaNWtmjHfu3FnN0ba7tLRUzcnLyzPGDxw4oOYcO3bMGLc1hGgNJqmpqWrOsmXLjPHVq1erOZqa/
iXlhvgLv/
V9rAG1gWOtYdK+g+uvv17N0X6Wq1at8nk9F110kZqTmZlpjNuuN6C5AwAAAP8PhR8AAIAjKPwAAAAcQeEHA
ADgCAo/AAAAR1D4AQAAOKLBjXPRRpmUl5erOeHh4cb4m2+
+qeZEREQY4wcPHlRzgoODjfGwsDA1R2tHLykpUXNiYmKMce2ZxCIiWVlZPm9bSEiIMZ6bm6vm2Ea9+Lqe7d
u3qzk33HCDz+vx55nNjJgA6gbHWs2o6ZFWP/
vZz3xeT12NU7nsssuM8ePHj6s5H3/8sTFuu37arhGNEeNcAAAAICIUfgAAAM6g8AMAAHAEhR8AAIAjKPwAA
AAc0bS+N+DHtM4bW1fvHXfcYYzHxcWpOfv27TPGbd09WsexrYNGy2nTpo3POTk5OWqO1qUcHR2t5miftbi4
WM0pKyszxgsLC9Ucbbs7d+6s5nTp0sUY37Nnj5rjT1cvAJzp2rVrpy4LDQ01xpcvX+7zeoKCgnzOOXHihLr
sjTfeMMZvvvlmNWfDhg3GuO26pnUwN8RO9JrAHT8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcASFHw
AAgCMa3DgXW2u35sILLzTGs7Ky1Byt7bxZs2ZqTm5urk/
vJSISExNjjNseMq2xjWYpKCgwxnft2qXmaCNTmjdvruZo32lwcLCaExISYoxro2FERIYNG2aM28a52N4PZz
5tJIM2HklE32cGDRrk83ps+2ZmZqa6rCZp2+YP2yiLpk3Nlw7b2C3UDH9GjLRo0UJdtn//fp/fT/
v5287B2nZrY7hE9FFcO3bsUHP69OljjGtjXmzbdqbijh8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOKJeunr96eKx6du3rzG+e/
duNUfrHratX+tOzcvLU3O0LmGtK0rEv+4nLSc2NlbN0ToAbdumdQ/bti0iIsIYLyoqUnNGjx5tjL/
wwgtqDtymHQP+dHs/9NBD6rK4uDhj3J9Ofa0DUUTkn//8pzF+7733qjl11Z1IB33DZOtg1+Tn5/
ucU5P7mT+d6LZO5Pbt2/+UzanCtm2NuROYO34AAACOoPADAABwBIUfAACAIyj8AAAAHEHhBwAA4AgKPwAAA
EfUyziXoKAgdVlJSYkxfuGFF6o5GRkZxrittV1bjzZ6RETk6NGjxrg25kVEbwfXxsmIiGRnZxvjtjErmuLi
YnWZNpIhKytLzdG+U9sYHO27tn1vbdq0UZcBtc02CiomJsYYj4qKUnO0cUe2URraSKPVq1erOevXrzfGc3J
y1Bx/aOfjxMTEGl2Py7Rrh22MSHh4uDEeHBys5mjXGxttG/
wZceLPCDfb9bO8vNwY145bEf07YJwLAAAAGjUKPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOqJeu3tLSUp9zz
j33XHWZ1sUTGhqq5rRo0cIYz8vLU3M6dOhgjB8+fFjN0TqBbV2rWtezrdtWe79mzZqpOenp6cZ4WFiYmqN1
Pefm5qo5PXr0MMZt3Vzt2rVTlwG1rU+fPuoyrRvetj9rx43t3KF1Ar/
xxhtqjtbtaOuC3LZtmzG+cOFCNUfr6n3+
+efVHJdp3aG2rlFtioPt+ql1ltu6em2TLGqSP9+B1jlru7ZrUyS0OsHGtm3a8Wnr9m0oncDc8QMAAHAEhR8
AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOKJexrn409I8ZMgQdZk2RkFrhxfRH9h88OBBNUdre7c9nF1jG6+gtZDbPo/
Wwq49tFtEJDY21hi3tb1rLeza+kVEiouLjfHAwEA159ChQ8Z4UlKSmqONp4EbtP3Jtj+HhIQY47aRRtrIJ9
s4F21Ek+08oFm+fLm6rKioyBjv1KmTmqOdC3/3u9+pOVu3bjXGP/
30UzXHZdo1z3Yt9Gfs2XfffWeM28ZjtW/f3hj/
8ssv1Rzbvq7x5zvQxMXFqcu0US+2UW0af0bANAbc8QMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR9RqV6/
WnepPF0+3bt3UZbm5uca4rWtU65iKjIxUc7ROJltHq5ajdfmJiBQUFBjjtu5hrdtW66gV0b8DrdNRRP88zZ
s3V3O0Dqzs7Gw1R+tg7t27t5pDV6/b/
OnAmzJlijGen5+v5miduP5MEbAdnxpbd2Lr1q2NcVsXptZBr51XReznPFSnXYu0/UJEnyLhT0et7dqhXY/
79++v5mjbsH//
fjWnZ8+exrjtOv31118b41ptIaJ39UZHR6s52uexrSc4ONgYz8rK8nk9dY07fgAAAI6g8AMAAHAEhR8AAIA
jKPwAAAAcQeEHAADgCAo/AAAAR9TqOBdtxIht7EJsbKwx3qFDBzVn586dxrhtHIE2esHWbq2Ne
LCNMtFybCMZ2rRpY4xnZGSoOdrnsbXKa6MEbO3oWo5t27p3726Mb9y4Uc3RtqFfv35qzjvvvKMuw5nBNl7B
nzFR11xzjTFuO0dp45a0c5eIyOHDh41xbfSEiD6aRRvDJKKfb2wjmrRl4eHhas6XX36pLkN1F154oTHetm1
bNUcb9WM7p2tsP8u9e/ca47/85S/VnEmTJhnjCQkJas6vfvUrY1wbWyMi8pe//
MUYt+3P2vXTth5tNEtYWJiao133tXpERGTlypXqsrrEHT8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAA
cEStdvX602XXq1cvY1zrirOxdeQUFRUZ47ZuIa1z1dZlpXU5paenqzk7duwwxhMTE9Uc7cHxtm7bpKQkY9z
2QO/MzExjPDIyUs3ROpgLCwvVHK2j7fzzz1dzAF9pD463PWxe6w60dcNry2ydht9+
+60xbusE1ZbZzsXath07dkzN+cUvfqEuQ3XNmjUzxm0d2kePHjXGtQkOIvo5VZuwIaKfa23n523bthnjtqk
Yu3fvNsZ79+6t5mjbbfvecnJyjHFbp77WDW/
7PNqEAe37bEi44wcAAOAICj8AAABHUPgBAAA4gsIPAADAERR+AAAAjqDwAwAAcEStjnOxPVBdExcXV2PvZW
t7t7V2a1q2bGmM28a5lJWVGePaQ6FFRLp162aM21rLtRZy28O5tRb2oKAgNUdj27bvvvvOGLf9fLRl2v4BN
/gzImrixInqsrS0NJ/fTxuDZDumtTFItmPg888/
N8a1MS8i+tgrbdyTiEi7du2M8dzcXDUHvtH2W9uYFW1Elm1cSGxsrM852n5r259jYmKMcX8+j3ZdFRFp0aK
FMW47D2jL/Dl3aNdvEX1Ejz+1RV3jjh8AAIAjKPwAAAAcQeEHAADgCAo/
AAAAR1D4AQAAOKJWu3r96aLp16+fMW7rFtK6a6KiotQcrcstLy9PzdE6o7RuJRH9YdKhoaFqzldffWWMJyQ
kqDlFRUU+xUX070fr9hXRuxD96eaydRpqHYW2n2lISIgxXlJSouagYdK6+G3nFG3fvOWWW9Qc7YHutn0mOz
vbGLedB+Lj443xtm3bqjlaB7uta/DQoUPG+LFjx9QcrXNROxeL6OevxvCA+vqg7c/atUtEPw/
bumC1641tKoY2kcF2ftb2Z9t6wsLCjHFb97C2b9q64f3phtbOK7brpzb9onnz5mpOQ8EdPwAAAEdQ+AEAAD
iCwg8AAMARFH4AAACOoPADAABwBIUfAACAI2p1nIvWJm6jPQDd1iautW9HR0erOdpIhL1796o57du3N8a1k
RAi+lgIW0u+1naekZGh5nTp0sUY10aciIgcOXLEGNda9UX08QPfffedmqONmOjTp4+aU1BQYIwHBgaqOdrD
5m0/U9QM2/Gp/
cxsD0DXxivYjulnn33W523TRknYxgZpYxxsI2DWrVtnjNvGuXTr1s0YLywsVHO084ptfNThw4eNcW2kkkjj
eBB9Q6J9/7axJNo+aBudpe2bNtr1xjY6SRvNYqPtT7ZrlD+jWbTzje0Y0Ni2LTEx0Rg/
evSoz+upa9zxAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABH1GpXrz8SEhKMcVsHoNatY+tKmjdvnjE+dOhQNUd7aLbtocxaJ+7x48fVnA4d
OhjjmZmZao724Hjb9xYeHm6M5+XlqTlt2rQxxg8cOOBzju0B5Vqn9Lnnnqvm+NO11ZjYulO1Tj/
bMWB7P43WzWlbj20f1FxwwQXG+N13363maNtg6wDUuhNtXavaMtv+N2zYMGPc1gG4b98+Y9zW1WmbMKDRzg
PffvutmnPixAmf1+Mybd/
UuspF9P2ptLRUzdH2Ddt6tOPD1iGsvZ9t37Qt02j7mW3/05bZrjdaJ7DtO9DOHf5MM6lr3PEDAABwBIUfAA
CAIyj8AAAAHEHhBwAA4AgKPwAAAEdQ+AEAADiiXsa52FrLk5KSjHHbuBDbg641b7/9tjF+1VVXqTnaeAPbK
Auttdz2EPgdO3YY4/Hx8WqO1o5ua/3XHkBta7vXWv+jo6PVnFWrVhnjl112mZqTlpZmjPft21fN0cbgbN+
+Xc1piLQxK7b9zDZ+pL5169bNGB83bpyak5ycbIzv3LnT5/
XYxtZo35vtHKW9n+08VFRUZIwXFBSoOTExMca47VjT3s820qakpMQY1x5CLyISGxtrjGdlZak5LtNGftmOW
+1nZjs/a6OT/
BmppI0rsb2f7ViLiIjw6b1E9P25devWao52fGjj2ET0sS3aNVJEP95t546Ggjt+AAAAjqDwAwAAcASFHwAA
gCMo/
AAAABxB4QcAAOCIeunq1Tp3RUSOHTtmjNu6hRISEozxZcuWqTnp6enGeIsWLdScr776yhjXuu9E9Aeg27rf
evfubYwfOXJEzdEezq51k4nonYa2DsDIyEhj3NbRqH3X2s9aRP9O/
XnQdmNj697VnH322cb4ueeeq+Zo3W/
aviSid6PbutS1rj1bx7m2z7Rv317N0boGbdumfddap6uISGhoqDF+/
PhxNUebCNC9e3c1Rzuv7dmzR83Rttt2bGjLDh48qOb06NHDGF+7dq2a47LCwkJj3Hbt0K55ti5Y7fxsO6dr
kyds3anaMtt1QDvWKioq1Jy4uDhjXLuu2tajHbc2tu/an/
N0Q8EdPwAAAEdQ+AEAADiCwg8AAMARFH4AAACOoPADAABwBIUfAACAI+plnEvnzp3VZVqbuNYOL6K3aa9Zs
8a3DRORNm3aqMv8GZkSHx9vjGujYUT0sS1aa7st5/Dhw2pO27ZtjfHs7Gw1JzMz0xi3PdB9+/
btxrjtYfPaWIC8vDw1x7ZfnQkefvhhdVmnTp2M8QcffFDN0cb52PZnbZxOt27d1JyePXsa47ZRFtoIFtu4C
G38hfY5RfTzjfbQdhvbKCjt+7GtRzsObTnaaAzbKIuCggJjXBvzISISFhamLkN12nesjSAS0X/Otu9e+/
mHhISoOU2a+H7/R7se20Y0aeuxrV8732jfp4g+nsg2AkY7d9hGtmg5tu+goeCOHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4ol66epOSktRl5eXlPr+f1l2jdaCKiHTp0sUYz8/
PV3O0LlTbA9D96ZzUupxs3UJap5dt27SOJdtD7bXPY+u2/
Oabb4xxrUtaRO9Cs3Ua2varxkTrth44cKCao3Ws/
fa3v1VzVq1aZYzv379fzTnrrLOM8cGDB6s52nFj22dsD4jXaOcO7fwgondO2h4cr22brXt4z549xrjtfKcd
A7YOTX+6EzW2c8fRo0d9fj+XafuZrdta2zdsx432c7b9/
P3p0C4rKzPGbceaNpWiVatWao7W2W47P2idwNo2i+jnT1v3cElJiU/
xhoQ7fgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/AAAAR9TLOJfk5GR1mTayxDbCQPOf//
xHXXbeeecZ47ZxIRp/
RtDk5eWpy7R2dNsDvbX3sz2YWmtvt42N0T6rbTyNNkrA9h1oLfFHjhxRc86UcS4PPPCAMW4bR6CNLrIda4M
GDTLGtbELIiK7du0yxm0/F23Eg230g7bMnwfK22jfqW38hTaaw7Y/a+MnYmNj1RxtG2znKH/
GtmifxzbOxTaCA9VpI1hsY0kKCwuNcdv5WRtDZBtPpB1TtnO6P9c87Tv49ttv1Rxt34yJifE5x3ZsaN+PLU
cb/
dYYjg3u+AEAADiCwg8AAMARFH4AAACOoPADAABwBIUfAACAI+qlq1d7CL2I3mVn6+rVOm+OHTum5nTs2NEY
P3jwoJqjdRLZOoy0hzy3adNGzSkuLjbGc3Nz1Rytk8jWCXro0CFjvGvXrmqO1r1p6wDUtGjRQl0WHx9vjNs
6J/3paGyItm/
fbox36tRJzdE65vbt26fmaJ2zUVFRak50dLQx3r59ezVH6xq0dQZqOf48oN7WPVxUVGSM2/
Znras2MTFRzTl69KgxvmfPHjVH6/
i0dfX6813703Fq+zmgOu3nb7sWat27tukO2s9f289FRHJycoxx27VDu7bajjVtnwkLC1NztM5i7Rop4t91w
PZ+Gu27runJA7Wh4W8hAAAAagSFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHUPgBAAA4ol568lu3bq0u00aW2EYL2MYbaLTxE7axB9pDmW0Ptdfa6G3jELTR
NbZW+czMTGM8ISFBzdG+U9sDvUNDQ9VlvrKN6NFGY3z11VdqzpkyzuWJJ54wxt966y0158EHHzTGW7ZsqeZ
oP0vb6IfDhw8b47bxJ9q+bhsXoT1o3TZ2ITs72xi3jVvq0qWLMa6N0hAROXDggDG+bNkyNUcbXdSvXz81Rx
sBYvuute22Hbfa+VP7GYgwzsVX2r5uu65pbDnasWu7djRr1swYt40l0a4r/
uwXtpzCwkJj3Had1tiOgfDwcGPcdi3Uxur4U4/
UNe74AQAAOILCDwAAwBEUfgAAAI6g8AMAAHAEhR8AAIAj6qU1S+ugEREpKCgwxm3dNdpDpm3atWtnjJeUlK
g5WteorQNQewC1Pw+FtnXZ+fNwbi3Hth7t56B1htksXrxYXTZ69Gif38+23WeCPXv2qMumTp1qjNs62aZNm
2aMDxs2TM0ZMGCAMW7rNNQ68LQueRF9v7WdO/zp0P3tb39rjP/nP/
9Rc2zbrWnbtq0xrnUIi4h8+OGHxninTp3UHK2j0NYFuXfvXmPc1m2pnddgpp2bIiMj1Rzt2LV1w2vXKFsnu
LYNtikJ2jFg22e0iR22bdOuUbb9WfvebOvRup796YbWapiGhDt+AAAAjqDwAwAAcASFHwAAgCMo/
AAAABxB4QcAAOAICj8AAABHNLgnbWujFxITE9WcnTt3+rye9u3bG+O2MQVaC7lt1Iw25kJ7+LSI3iZuG82i
rcc2LkDbBn8eMm0b56G1xGdkZKg52dnZxnhUVJTP6zlT2D6fNnrBNjboueee8ylu06JFC3VZjx49jHHtGBQ
RadmypTFuG5WQl5dnjO/
atUvN2bJli7qsJh08eNAY174bEX0EiO3coZ2LbMe0lhMSEqLmpKenq8tQnbZvxsXFqTnbtm0zxrVjQ8Q+7k
hz9OhRY9w2NiYrK8sYt41M0Ual2cZwaecv25gqbb+1fR7tGNDGyYjo17z9+/erOQ0Fd/
wAAAAcQeEHAADgCAo/AAAAR1D4AQAAOILCDwAAwBH10tX73XffqcsiIiKMcVs3X0pKis/
boHUSHTt2TM3Rti0tLU3N0R4cb+uy076fnj17qjm2bdA0b97cGLd122rfW1JSkpqjdZxqD4cX0TuYbV3X//
3vf9VlZwLbQ9PrW2ZmprpszZo1dbgljYut4xhnjh07dhjj3bt3V3O0TnCtO1ZE70K1dc5qkx+06RIi9s5yT
dOm5nLD1omsXT9zc3PVHO2z2tajdQJr13wRvRu6MZzvuOMHAADgCAo/
AAAAR1D4AQAAOILCDwAAwBEUfgAAAI6g8AMAAHBEvYxz2bRpk7rsmmuuMcZtI2DmzJnj8zZoI0s6d+6s5mg
jWGwP2o6NjTXGtRZ6Eb1V3taOPnjwYGPc9mBq7SHT0dHRao623dpDrm3eeOMNddnTTz9tjO/
evVvN+eqrr3zeBgCobdo5/
Z577lFzxo0bZ4y3aNFCzdHGudjGYOXn5xvjtrFeMTExxvjZZ5+t5mhjaGzjT8rLy41xbfyKiEhgYKAxro2T
EdGvX7axMfPnz1eXNXTc8QMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/AAAARwR4p/
nk94CAgBpbaadOndRll19+uTF+/PhxNeeFF17weRu0hz/
PmjVLzdE6i20drfHx8cZ4QUGBmlNUVGSMh4aGqjnaj9Gfbtvi4mJ1mdYZtWHDBjVn2bJlPm/Dn/
70J2N88+bNas7bb7/t83pOc/
evUzV5rAENBccaUDdOdaxxxw8AAMARFH4AAACOoPADAABwBIUfAACAIyj8AAAAHEHhBwAA4IjTHucCAACAx
o07fgAAAI6g8AMAAHAEhR8AAIAjKPwAAAAcQeEHAADgCAo/AAAAR1D4AQAAOILCDwAAwBEUfgAAAI74/
wA6zffWPaZyhgAAAABJRU5ErkJggg==",
"text/plain": [
"<Figure size 800x800 with 9 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
}
],
"source": [
"labels_map = {\n",
" 0: \"T-Shirt\",\n",
" 1: \"Trouser\",\n",
" 2: \"Pullover\",\n",
" 3: \"Dress\",\n",
" 4: \"Coat\",\n",
" 5: \"Sandal\",\n",
" 6: \"Shirt\",\n",
" 7: \"Sneaker\",\n",
" 8: \"Bag\",\n",
" 9: \"Ankle Boot\",\n",
"}\n",
"figure = plt.figure(figsize=(8, 8))\n",
"cols, rows = 3, 3\n",
"for i in range(1, cols * rows + 1):\n",
" sample_idx = torch.randint(len(training_data), size=(1,)).item()\n",
" img, label = training_data[sample_idx]\n",
" figure.add_subplot(rows, cols, i)\n",
" plt.title(labels_map[label])\n",
" plt.axis(\"off\")\n",
" plt.imshow(img.squeeze(), cmap=\"gray\")\n",
"plt.show()"
]
},
{
"cell_type": "markdown",
"id": "694e123d-e609-4e20-b605-89a94e5c5c96",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"#### Creating a Custom Dataset for your files"
]
},
{
"cell_type": "markdown",
"id": "0bc133ed-a308-4f74-b6dc-3013c46ccd50",
"metadata": {},
"source": [
"A custom Dataset class must implement three functions: `__init__`, `__len__`, and `__getitem__`. Take a look at this implementation; the FashionMNIST images are stored in a directory img_dir, and their labels are stored separately in a CSV file annotations_file.\n",
"\n",
"In the next sections, we’ll break down what’s happening in each of these
functions."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "72a56fc6-288b-43c1-bda4-0610d37b237c",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import pandas as pd\n",
"from torch.utils.data import Dataset\n",
"from torchvision.io import read_image\n",
"\n",
"class CustomImageDataset(Dataset):\n",
"    def __init__(self, annotations_file, img_dir, transform=None, target_transform=None):\n",
"        self.img_labels = pd.read_csv(annotations_file)\n",
"        self.img_dir = img_dir\n",
"        self.transform = transform\n",
"        self.target_transform = target_transform\n",
"\n",
"    def __len__(self):\n",
"        return len(self.img_labels)\n",
"\n",
"    def __getitem__(self, idx):\n",
"        img_path = os.path.join(self.img_dir, self.img_labels.iloc[idx, 0])\n",
"        image = read_image(img_path)\n",
"        label = self.img_labels.iloc[idx, 1]\n",
"        if self.transform:\n",
"            image = self.transform(image)\n",
"        if self.target_transform:\n",
"            label = self.target_transform(label)\n",
"        return image, label"
]
},
{
"cell_type": "markdown",
"id": "08d253bc-cfd6-4f01-a5a3-8e802ce9c07a",
"metadata": {},
"source": [
"##### `__init__`"
]
},
{
"cell_type": "markdown",
"id": "0eed7c50-6b23-4646-90cd-1a008076ad7a",
"metadata": {},
"source": [
"The `__init__` function is run once when instantiating the Dataset object. We initialize the directory containing the images, the annotations file, and both transforms (covered in more detail in the next section).\n",
"\n",
"The labels.csv file looks like:\n",
"\n",
"```\n",
"tshirt1.jpg, 0\n",
"tshirt2.jpg, 0\n",
"......\n",
"ankleboot999.jpg, 9\n",
"```"
]
},
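{
"cell_type": "markdown",
"id": "c3a91f2e-7b45-4d6a-9e02-8f1b3c5d7a90",
"metadata": {},
"source": [
"Note that `pd.read_csv` treats the first row of a file as a header by default; for a header-less file like the one above, you would pass `header=None` so the first sample is not consumed as column names. A minimal sketch (the file below is written purely for illustration):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d4b02e3f-8c56-4e7b-a013-9f2c4d6e8ba1",
"metadata": {},
"outputs": [],
"source": [
"import pandas as pd\n",
"\n",
"# Write a hypothetical header-less annotations file, mirroring the layout above\n",
"with open(\"labels.csv\", \"w\") as f:\n",
"    f.write(\"tshirt1.jpg, 0\\ntshirt2.jpg, 0\\nankleboot999.jpg, 9\\n\")\n",
"\n",
"img_labels = pd.read_csv(\"labels.csv\", header=None)\n",
"print(len(img_labels))        # all 3 rows survive; none is consumed as a header\n",
"print(img_labels.iloc[0, 0])  # first image file name\n",
"print(img_labels.iloc[0, 1])  # its integer label"
]
},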
{
"cell_type": "code",
"execution_count": null,
"id": "b19f9476-953a-4068-b809-5c8e8242673f",
"metadata": {},
"outputs": [],
"source": [
"def __init__(self, annotations_file, img_dir, transform=None, target_transform=None):\n",
"    self.img_labels = pd.read_csv(annotations_file)\n",
"    self.img_dir = img_dir\n",
"    self.transform = transform\n",
"    self.target_transform = target_transform"
]
},
{
"cell_type": "markdown",
"id": "00fe9467-de1d-4572-be5c-3c043ed0a1f2",
"metadata": {},
"source": [
"##### `__len__`\n",
"The `__len__` function returns the number of samples in our dataset.\n",
"\n",
"Example:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "ea46759e-1c1f-4594-9579-10e0e81e0e86",
"metadata": {},
"outputs": [],
"source": [
"def __len__(self):\n",
"    return len(self.img_labels)"
]
},
{
"cell_type": "markdown",
"id": "e348ebf2-f545-4529-abac-1db96b641146",
"metadata": {},
"source": [
"##### `__getitem__`\n",
"The `__getitem__` function loads and returns a sample from the dataset at the given index idx. Based on the index, it identifies the image’s location on disk, converts that to a tensor using read_image, retrieves the corresponding label from the CSV data in self.img_labels, calls the transform functions on them (if applicable), and returns the tensor image and corresponding label in a tuple."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3921452a-3d9e-47a7-8537-ec9c9e24780f",
"metadata": {},
"outputs": [],
"source": [
"def __getitem__(self, idx):\n",
"    img_path = os.path.join(self.img_dir, self.img_labels.iloc[idx, 0])\n",
"    image = read_image(img_path)\n",
"    label = self.img_labels.iloc[idx, 1]\n",
"    if self.transform:\n",
"        image = self.transform(image)\n",
"    if self.target_transform:\n",
"        label = self.target_transform(label)\n",
"    return image, label"
]
},
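{
"cell_type": "markdown",
"id": "e5c13f4a-9d67-4f8c-b124-0a3d5e7f9cb2",
"metadata": {},
"source": [
"The same three-method protocol works for data that is already in memory; nothing about Dataset requires files on disk. A minimal sketch with a toy in-memory dataset (invented here, unrelated to FashionMNIST):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "f6d24a5b-0e78-4a9d-c235-1b4e6f8a0dc3",
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"from torch.utils.data import Dataset\n",
"\n",
"class InMemoryDataset(Dataset):\n",
"    \"\"\"Toy dataset holding (image, label) pairs in plain Python lists.\"\"\"\n",
"    def __init__(self, images, labels):\n",
"        self.images = images\n",
"        self.labels = labels\n",
"\n",
"    def __len__(self):\n",
"        return len(self.labels)\n",
"\n",
"    def __getitem__(self, idx):\n",
"        return self.images[idx], self.labels[idx]\n",
"\n",
"ds = InMemoryDataset([torch.zeros(1, 28, 28) for _ in range(5)], [0, 1, 2, 3, 4])\n",
"print(len(ds))           # 5\n",
"img, label = ds[3]\n",
"print(img.shape, label)  # torch.Size([1, 28, 28]) 3"
]
},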
{
"cell_type": "markdown",
"id": "af13d24d-fd14-44e1-81bc-e86fb3cba295",
"metadata": {},
"source": [
"#### Preparing your data for training with DataLoaders\n",
"The Dataset retrieves our dataset’s features and labels one sample at a time. While training a model, we typically want to pass samples in “minibatches”, reshuffle the data at every epoch to reduce model overfitting, and use Python’s multiprocessing to speed up data retrieval.\n",
"\n",
"DataLoader is an iterable that abstracts this complexity for us in an easy API."
]
},
{
"cell_type": "code",
"execution_count": 33,
"id": "485b9a89-5865-4137-8fe4-091116af4743",
"metadata": {},
"outputs": [],
"source": [
"from torch.utils.data import DataLoader\n",
"\n",
"train_dataloader = DataLoader(training_data, batch_size=64, shuffle=True)\n",
"test_dataloader = DataLoader(test_data, batch_size=64, shuffle=False)  # evaluation order need not be shuffled"
]
},
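{
"cell_type": "markdown",
"id": "a7e35b6c-1f89-4b0e-d346-2c5f708b1ed4",
"metadata": {},
"source": [
"To see the batching mechanics without downloading FashionMNIST, a TensorDataset built from random tensors behaves the same way (the shapes below are chosen to mimic the 28×28 grayscale images):"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b8f46c7d-2a90-4c1f-e457-3d60819c2fe5",
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"from torch.utils.data import TensorDataset, DataLoader\n",
"\n",
"features = torch.randn(100, 1, 28, 28)  # 100 fake grayscale images\n",
"labels = torch.randint(0, 10, (100,))   # 100 fake class labels\n",
"\n",
"loader = DataLoader(TensorDataset(features, labels), batch_size=64, shuffle=True)\n",
"print(len(loader))         # 2 batches: one of 64 samples, one of 36\n",
"xb, yb = next(iter(loader))\n",
"print(xb.shape, yb.shape)  # torch.Size([64, 1, 28, 28]) torch.Size([64])"
]
},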
{
"cell_type": "markdown",
"id": "91ec29a2-aafa-434a-9031-7c4c9e3664e3",
"metadata": {},
"source": [
"#### Iterate through the DataLoader\n",
"\n",
"We have loaded the dataset into the DataLoader and can iterate through it as needed. Each iteration below returns a batch of train_features and train_labels (containing batch_size=64 features and labels, respectively). Because we specified shuffle=True, the data is reshuffled after we iterate over all batches (for finer-grained control over the data loading order, take a look at Samplers)."
]
},
{
"cell_type": "code",
"execution_count": 34,
"id": "ff49b9aa-8215-49f7-907c-7df43a4eb710",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Feature batch shape: torch.Size([64, 1, 28, 28])\n",
"Labels batch shape: torch.Size([64])\n"
]
}
],
"source": [
"# Display image and label.\n",
"train_features, train_labels = next(iter(train_dataloader))\n",
"print(f\"Feature batch shape: {train_features.size()}\")\n",
"print(f\"Labels batch shape: {train_labels.size()}\")"
]
},
{
"cell_type": "code",
"execution_count": 35,
"id": "085202bf-bd15-404e-9461-0dd5e41b0458",
"metadata": {},
"outputs": [
{
"data": {
"image/png":
"iVBORw0KGgoAAAANSUhEUgAAAaAAAAGdCAYAAABU0qcqAAAAOXRFWHRTb2Z0d2FyZQBNYXRwbG90bGliIH
ZlcnNpb24zLjkuMCwgaHR0cHM6Ly9tYXRwbG90bGliLm9yZy80BEi2AAAACXBIWXMAAA9hAAAPYQGoP6dpA
AAfv0lEQVR4nO3de2zV9f3H8ddpaU9baA+WQi9SoOAFx83IoGMqPxwNFxMjyh/e/
gBjILpihsxpWFTULemGi3MzDLNkg5mIOhOB6BIWRFuiKxhQgszZAasDBi0D13N6gba0398fxG5Hrp8Pp+fd
nj4fyUnoOefV76fffuHVL+d73g0FQRAIAIAkS7NeAABgYKKAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIA
JCggAYGKQ9QK+qbu7W0ePHlVubq5CoZD1cgAAjoIgUHNzs0pKSpSWduHznD5XQEePHlVpaan1MgAAV+jw4c
MaOXLkBR/
vcwWUm5trvQT0oov9NHQh6enpzpnOzk7njOS3vptuusk5E4lEnDNNTU3Omd27dztnfPl8n7q6unphJegrLv
Xvea8V0Jo1a/TCCy+ooaFBU6ZM0csvv6zp06dfMsd/u/2Xz75I5mg/n/UlK+PLZ1uDBrn/
NfLJ+PwDn0zJ+t4yvrL/
uNT3t1cuQnjzzTe1YsUKrVq1Sp988ommTJmiuXPn6vjx472xOQBAP9QrBfTiiy9qyZIlevDBB/
Wtb31Lr7zyinJycvT73/+
+NzYHAOiHEl5AHR0d2r17tyoqKv67kbQ0VVRUqLa29pznt7e3KxaLxd0AAKkv4QV04sQJdXV1qbCwMO7+ws
JCNTQ0nPP8qqoqRSKRnhtXwAHAwGD+RtSVK1cqGo323A4fPmy9JABAEiT8KriCggKlp6ersbEx7v7GxkYVF
RWd8/xwOKxwOJzoZQAA+riEnwFlZmZq6tSp2rZtW8993d3d2rZtm2bMmJHozQEA+qleeR/
QihUrtGjRIn3729/W9OnT9dJLL6m1tVUPPvhgb2wOANAP9UoB3XPPPfr3v/
+tZ555Rg0NDbrxxhu1ZcuWcy5MAAAMXKGgj72tOBaLeY0pSUXJepe476QBn7E1fX30yq9+9SvnTE5OjnPm8
88/d87cdtttzplf/
OIXzhlJ2r59u1fOVbKmXPSxf+YGjGg0qry8vAs+bn4VHABgYKKAAAAmKCAAgAkKCABgggICAJiggAAAJigg
AIAJCggAYIICAgCYoIAAACYoIACACQoIAGCCYaQpJj093TmTzAGhWVlZzpmpU6c6Z7773e86ZyS/
waLHjh1zzkSjUeeMzzT5goIC54ykc36h5OWorq52zvz1r391zqD/
YBgpAKBPooAAACYoIACACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIICAgC
YYBp2HxYKhZwzPt/Oi02rvZiFCxc6Z8aMGeOcOXXqlHOmoaHBOSNJnZ2dzhnf/
eequbnZOePz9UhSfn6+c8Zn8vagQYOcM3/605+cMx9//
LFzBleOadgAgD6JAgIAmKCAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIICAgCYoIAAACYoIACA
CfdJgEiaZM2JXbx4sVfOZ32fffaZc6a9vd05k5bm97OVT85nfV1dXc6ZM2fOOGd8hn1KfsNcfTK5ubnOmQU
LFjhnmpqanDOS9Pe//
905k6whwqmAMyAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIICAgCYoIAAACYoIACACQoIAGCCAgIAmKCA
AAAmGEaaYrKyspwznZ2dXtv66quvnDPp6enOmezsbOdMd3e3c0byGxLqw2c/
JHM7PjmfwaenT592zuzfv985M2HCBOeM5DeMdKAOFvXBGRAAwAQFBAAwkfACevbZZxUKheJu48ePT/
RmAAD9XK+8BjRhwgS99957/92I5y/FAgCkrl5phkGDBqmoqKg3PjUAIEX0ymtA+/
fvV0lJicaOHasHHnhAhw4duuBz29vbFYvF4m4AgNSX8AIqLy/
X+vXrtWXLFq1du1b19fW69dZb1dzcfN7nV1VVKRKJ9NxKS0sTvSQAQB8UCnr5ovWmpiaNHj1aL774oh566K
FzHm9vb1d7e3vPx7FYjBK6Aj7vA3rwwQe9tuXzPiAfPodoMt8HlKz39CRzbcl6H5DPe9AGDx7snGlqanLOS
NLGjRu9cjgrGo0qLy/vgo/3+tUBQ4cO1XXXXacDBw6c9/FwOKxwONzbywAA9DG9/
j6glpYWHTx4UMXFxb29KQBAP5LwAnr88cdVU1OjL7/8Un/5y1901113KT09Xffdd1+iNwUA6McS/
l9wR44c0X333aeTJ09q+PDhuuWWW7Rjxw4NHz480ZsCAPRjCS+gN954I9GfEg4KCgqcM76vweXk5DhnOjo6
nDNtbW3OmWRdGCAl7+IAn4zvcFWfbZ05c8Y543NBgY/8/
PykbAdumAUHADBBAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMUEAAABMUEADA
RK//QjokVyQScc74DiPNyMjwyrlqaWlJynaSyXdIaLL4rM/nt/EOGTLEOePzW1R9h9OGQiHnTC//
kumUwhkQAMAEBQQAMEEBAQBMUEAAABMUEADABAUEADBBAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQ
QAMAE07BTTElJSdK25TN5+6uvvnLOpKXxc1Ky+Uyc9vk+nTlzxjmTnZ3tnPGdPp6fn+
+cOXnypNe2BiL+ZgMATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMUEAAABMUEADAB
AUEADDBMNIUM2TIkKRta8KECc6Zzz77zDnT0tLinAmHw84ZyW9opc9ATR8ZGRnOGd/
94LPPfYbTlpeXO2caGxudM19+
+aVzRpKKioqcMwwjvXycAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMUEAAABM
UEADABAUEADDBMNIU4zOwMicnx2tbp0+fds589NFHzplbb73VOXPixAnnjCSlpbn/
TBYKhZwzQRA4Z3zWlp6e7pyR/
I6JwsJC54zP8NeysjLnjM8QXEm66qqrvHK4PJwBAQBMUEAAABPOBbR9+3bdcccdKikpUSgU0qZNm+IeD4JA
zzzzjIqLi5Wdna2Kigrt378/UesFAKQI5wJqbW3VlClTtGbNmvM+vnr1av3617/
WK6+8op07d2rw4MGaO3eu1+sFAIDU5XwRwvz58zV//
vzzPhYEgV566SU99dRTuvPOOyVJr776qgoLC7Vp0ybde+
+9V7ZaAEDKSOhrQPX19WpoaFBFRUXPfZFIROXl5aqtrT1vpr29XbFYLO4GAEh9CS2ghoYGSedejllYWNjz2
DdVVVUpEon03EpLSxO5JABAH2V+FdzKlSsVjUZ7bocPH7ZeEgAgCRJaQEVFRZKkxsbGuPsbGxt7HvumcDis
vLy8uBsAIPUltIDKyspUVFSkbdu29dwXi8W0c+dOzZgxI5GbAgD0c85XwbW0tOjAgQM9H9fX12vPnj3Kz8/
XqFGjtHz5cv30pz/Vtddeq7KyMj399NMqKSnRggULErluAEA/
51xAu3bt0m233dbz8YoVKyRJixYt0vr16/
XEE0+otbVVS5cuVVNTk2655RZt2bJFWVlZiVs1AKDfcy6gWbNmXXSQYigU0vPPP6/
nn3/+ihYGPz6DMfPz8722lZmZ6ZzZtWuXc8bn/WP/e5buIjc31znjMyTUZwinj6amJq/
c6NGjnTNHjhxxzmzdutU589vf/
tY5s3nzZueMxDDS3mZ+FRwAYGCigAAAJiggAIAJCggAYIICAgCYoIAAACYoIACACQoIAGCCAgIAmKCAAAAm
KCAAgAkKCABgggICAJhwnoaN5PH5FRYdHR3OmcLCQueMJEWjUefMmTNnnDOnTp1yzqSnpztnJL/
J1snajs+kc9/94DNFe8KECc6Z1157zTnzr3/9yzlzod/
IfCltbW1eOVwezoAAACYoIACACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYI
ICAgCYYBhpH5aRkZGU7UybNs0rt2rVKufM0KFDnTNBEDhnfIeK+gz89Fmfj66uLudMTk6O17b+85//
OGeys7OdMz7Dc0+cOOGcGTVqlHNGkv7xj3945XB5OAMCAJiggAAAJiggAIAJCggAYIICAgCYoIAAACYoIAC
ACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABggmGkfVgkEnHO+AwwHTNmjHNGknbu3Omcue6665wzPoMxfQe5+g
wx7e7uds4ka8Cqz9okv/
0XjUadMz7DX2OxmHPm6quvds5IUnp6ulcOl4czIACACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABgggICAJigg
AAAJiggAIAJCggAYIICAgCYoIAAACYYRtqHZWVlOWcyMzOdM52dnc4ZSTpw4IBz5vbbb3fOtLS0OGd8hlxK
UldXl1eur/L9enwGn545c8Y543OMf/HFF86ZG2+80Tkj+e0/n6/
p9OnTzplUwBkQAMAEBQQAMOFcQNu3b9cdd9yhkpIShUIhbdq0Ke7xxYsXKxQKxd3mzZuXqPUCAFKEcwG1tr
ZqypQpWrNmzQWfM2/ePB07dqzn9vrrr1/RIgEAqcf5IoT58+dr/
vz5F31OOBxWUVGR96IAAKmvV14Dqq6u1ogRI3T99dfrkUce0cmTJy/
43Pb2dsVisbgbACD1JbyA5s2bp1dffVXbtm3Tz3/+c9XU1Gj+/
PkXvJyxqqpKkUik51ZaWproJQEA+qCEvw/
o3nvv7fnzpEmTNHnyZI0bN07V1dWaPXv2Oc9fuXKlVqxY0fNxLBajhABgAOj1y7DHjh2rgoKCC75pMRwOKy
8vL+4GAEh9vV5AR44c0cmTJ1VcXNzbmwIA9CPO/wXX0tISdzZTX1+vPXv2KD8/X/
n5+Xruuee0cOFCFRUV6eDBg3riiSd0zTXXaO7cuQldOACgf3MuoF27dum2227r+fjr128WLVqktWvXau/
evfrDH/6gpqYmlZSUaM6cOfrJT36icDicuFUDAPo95wKaNWuWgiC44ON//
vOfr2hB+K8hQ4Y4Z7Kzs50zPsM+fV111VXOmY6ODufMoEHJm7PrO/i0L/
P5mtra2pwz48aNc84cOXLEOZOenu6ckfwGi2ZkZDhnGEYKAEASUUAAABMUEADABAUEADBBAQEATFBAAAATF
BAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMJG9kMJIiMzOzT2+rr/
9aju7ubuslJJTPZGZJ6uzsdM60trY6ZyKRiHPG52vq6upyzvhuC5ePMyAAgAkKCABgggICAJiggAAAJiggA
IAJCggAYIICAgCYoIAAACYoIACACQoIAGCCAgIAmKCAAAAmGEaaYrKyspwz7e3tXtvyGSw6aJD7IdfR0eGc
8R0q6rO+UCjknElLc//
Zz2dAqC+f9Z05c8Y5k5OT45wZPHiwc8ZnbZLfceRzPAxUnAEBAExQQAAAExQQAMAEBQQAMEEBAQBMUEAAAB
MUEADABAUEADBBAQEATFBAAAATFBAAwAQFBAAwwTDSPsxnMKZPJhqNOmckaciQIc4ZnyGXPny34zNIMiMjw
znjsz6ftXV1dTlnJL8hnD7b8jleT5w44ZzxHeTqMyzV52saqDgDAgCYoIAAACYoIACACQoIAGCCAgIAmKCA
AAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIKpeX2Yz1BDnyGSPsM0JSkSiThnWltbvbblKj09PSn
bkfyGcPoOx3TlO5TV55jw+ZoGDx7snDly5Ihzpq2tzTkjSVlZWc4Zn7+DAxVnQAAAExQQAMCEUwFVVVVp2r
Rpys3N1YgRI7RgwQLV1dXFPef06dOqrKzUsGHDNGTIEC1cuFCNjY0JXTQAoP9zKqCamhpVVlZqx44d2rp1q
zo7OzVnzpy4/9d/
7LHH9M477+itt95STU2Njh49qrvvvjvhCwcA9G9Or3Jv2bIl7uP169drxIgR2r17t2bOnKloNKrf/
e532rBhg773ve9JktatW6cbbrhBO3bs0He+853ErRwA0K9d0WtAX/8q5/
z8fEnS7t271dnZqYqKip7njB8/
XqNGjVJtbe15P0d7e7tisVjcDQCQ+rwLqLu7W8uXL9fNN9+siRMnSpIaGhqUmZmpoUOHxj23sLBQDQ0N5/0
8VVVVikQiPbfS0lLfJQEA+hHvAqqsrNS+ffv0xhtvXNECVq5cqWg02nM7fPjwFX0+AED/
4PVG1GXLlundd9/
V9u3bNXLkyJ77i4qK1NHRoaamprizoMbGRhUVFZ33c4XDYYXDYZ9lAAD6MaczoCAItGzZMm3cuFHvv/+
+ysrK4h6fOnWqMjIytG3btp776urqdOjQIc2YMSMxKwYApASnM6DKykpt2LBBmzdvVm5ubs/
rOpFIRNnZ2YpEInrooYe0YsUK5efnKy8vT48+
+qhmzJjBFXAAgDhOBbR27VpJ0qxZs+LuX7dunRYvXixJ+uUvf6m0tDQtXLhQ7e3tmjt3rn7zm98kZLEAgNT
hVEBBEFzyOVlZWVqzZo3WrFnjvSiclZmZ6ZzxGcLpOxjTZ2DlhV4LvBif9fkOWPXh8xpmsgbN+g7G9Blieu
bMGedMS0uLc6a+vt4509HR4ZyR/
Ie54vKwdwEAJiggAIAJCggAYIICAgCYoIAAACYoIACACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABgggICAJig
gAAAJrx+IyqSI1mTeJuamrxyWVlZzpkxY8Y4Z3wmW/
tMZvblsz6f763PVPCuri7njOQ3RdvneDh16pRzpra21jkTjUadM5IUCoWcM74TyAcizoAAACYoIACACQoIA
GCCAgIAmKCAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIICAgCYYBhpivEZWOlrx44dzpldu3Y5
ZwYNcj9M29vbnTPJFASB9RISzuf75DM0Ni8vzzmTmZnpnPGVit/
b3sIZEADABAUEADBBAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMUEAAABMMI+
3D0tPTnTM+w0iTOcDUZ/ikTwbJ15e/Tz6DUiWpo6MjwSvB/
+IMCABgggICAJiggAAAJiggAIAJCggAYIICAgCYoIAAACYoIACACQoIAGCCAgIAmKCAAAAmKCAAgAmGkfZh
0WjUOTNs2DDnzODBg50zyZSZmemcYYjklQmFQs6ZtDT3n2e7urqcM9nZ2c6Zr776yjkjSYWFhc4Zn303UHE
GBAAwQQEBAEw4FVBVVZWmTZum3NxcjRgxQgsWLFBdXV3cc2bNmqVQKBR3e/jhhxO6aABA/
+dUQDU1NaqsrNSOHTu0detWdXZ2as6cOWptbY173pIlS3Ts2LGe2+rVqxO6aABA/+d0EcKWLVviPl6/
fr1GjBih3bt3a+bMmT335+TkqKioKDErBACkpCt6Dejrq7Ty8/Pj7n/
ttddUUFCgiRMnauXKlWpra7vg52hvb1csFou7AQBSn/dl2N3d3Vq+fLluvvlmTZw4sef++++/
X6NHj1ZJSYn27t2rJ598UnV1dXr77bfP+3mqqqr03HPP+S4DANBPeRdQZWWl9u3bpw8//DDu/qVLl/
b8edKkSSouLtbs2bN18OBBjRs37pzPs3LlSq1YsaLn41gsptLSUt9lAQD6Ca8CWrZsmd59911t375dI0eOv
Ohzy8vLJUkHDhw4bwGFw2GFw2GfZQAA+jGnAgqCQI8+
+qg2btyo6upqlZWVXTKzZ88eSVJxcbHXAgEAqcmpgCorK7VhwwZt3rxZubm5amhokCRFIhFlZ2fr4MGD2rB
hg26//XYNGzZMe/
fu1WOPPaaZM2dq8uTJvfIFAAD6J6cCWrt2raSzbzb9X+vWrdPixYuVmZmp9957Ty+99JJaW1tVWlqqhQsX6
qmnnkrYggEAqcH5v+AuprS0VDU1NVe0IADAwMA07D5s0CD3b88NN9zgnBkzZoxzJpl8Jibjylzqh83z6e7u
7oWVnMtn2vStt97qta1du3Y5Z775vsjLMVDf/
8gwUgCACQoIAGCCAgIAmKCAAAAmKCAAgAkKCABgggICAJiggAAAJiggAIAJCggAYIICAgCYoIAAACZCgc/
UwV4Ui8UUiUSsl9Fv5eXlJSUjSUeOHPHKufIZPtnHDmsYGz58uFeuvb3dOTNQB4ueTzQavei/
L5wBAQBMUEAAABMUEADABAUEADBBAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMDEIOsFfBMzvK6
Mz/7r7u7uhZUkDscErpTvMc6xd2Uutf/6XAE1NzdbL6Ff89l/
7HOkupMnT1ovYUBqbm6+6HDpPjcNu7u7W0ePHlVubu45U5BjsZhKS0t1+PBh7wnOqYD9cBb74Sz2w1nsh7P
6wn4IgkDNzc0qKSlRWtqFX+npc2dAaWlpGjly5EWfk5eXN6APsK+xH85iP5zFfjiL/
XCW9X64nF+rw0UIAAATFBAAwES/KqBwOKxVq1YpHA5bL8UU+
+Es9sNZ7Iez2A9n9af90OcuQgAADAz96gwIAJA6KCAAgAkKCABgggICAJjoNwW0Zs0ajRkzRllZWSovL9fH
H39svaSke/
bZZxUKheJu48ePt15Wr9u+fbvuuOMOlZSUKBQKadOmTXGPB0GgZ555RsXFxcrOzlZFRYX2799vs9hedKn9s
Hjx4nOOj3nz5tkstpdUVVVp2rRpys3N1YgRI7RgwQLV1dXFPef06dOqrKzUsGHDNGTIEC1cuFCNjY1GK+4d
l7MfZs2adc7x8PDDDxut+Pz6RQG9+eabWrFihVatWqVPPvlEU6ZM0dy5c3X8+HHrpSXdhAkTdOzYsZ7bhx9
+aL2kXtfa2qopU6ZozZo153189erV+vWvf61XXnlFO3fu1ODBgzV37lydPn06ySvtXZfaD5I0b968uOPj9d
dfT+IKe19NTY0qKyu1Y8cObd26VZ2dnZozZ45aW1t7nvPYY4/
pnXfe0VtvvaWamhodPXpUd999t+GqE+9y9oMkLVmyJO54WL16tdGKLyDoB6ZPnx5UVlb2fNzV1RWUlJQEVV
VVhqtKvlWrVgVTpkyxXoYpScHGjRt7Pu7u7g6KioqCF154oee+pqamIBwOB6+//rrBCpPjm/
shCIJg0aJFwZ133mmyHivHjx8PJAU1NTVBEJz93mdkZARvvfVWz3P+9re/BZKC2tpaq2X2um/uhyAIgv/
7v/8LfvCDH9gt6jL0+TOgjo4O7d69WxUVFT33paWlqaKiQrW1tYYrs7F//
36VlJRo7NixeuCBB3To0CHrJZmqr69XQ0ND3PERiURUXl4+II+P6upqjRgxQtdff70eeeSRlJ8CHY1GJUn5
+fmSpN27d6uzszPueBg/frxGjRqV0sfDN/
fD11577TUVFBRo4sSJWrlypdra2iyWd0F9bhjpN504cUJdXV0qLCyMu7+wsFBffPGF0apslJeXa/369br+
+ut17NgxPffcc7r11lu1b98+5ebmWi/PRENDgySd9/j4+rGBYt68ebr77rtVVlamgwcP6sc//
rHmz5+v2tpapaenWy8v4bq7u7V8+XLdfPPNmjhxoqSzx0NmZqaGDh0a99xUPh7Otx8k6f7779fo0aNVUlKi
vXv36sknn1RdXZ3efvttw9XG6/MFhP+aP39+z58nT56s8vJyjR49Wn/84x/
10EMPGa4MfcG9997b8+dJkyZp8uTJGjdunKqrqzV79mzDlfWOyspK7du3b0C8DnoxF9oPS5cu7fnzpEm
TVFxcrNmzZ+vgwYMaN25cspd5Xn3+v+AKCgqUnp5+zlUsjY2NKioqMlpV3zB06FBdd911OnDggPVSzHx9DH
B8nGvs2LEqKChIyeNj2bJlevfdd/XBBx/E/fqWoqIidXR0qKmpKe75qXo8XGg/
nE95ebkk9anjoc8XUGZmpqZOnapt27b13Nfd3a1t27ZpxowZhiuz19LSooMHD6q4uNh6KWbKyspUVFQUd3z
EYjHt3LlzwB8fR44c0cmTJ1Pq+AiCQMuWLdPGjRv1/
vvvq6ysLO7xqVOnKiMjI+54qKur06FDh1LqeLjUfjifPXv2SFLfOh6sr4K4HG+88UYQDoeD9evXB59//
nmwdOnSYOjQoUFDQ4P10pLqhz/8YVBdXR3U19cHH330UVBRUREUFBQEx48ft15ar2pubg4+/
fTT4NNPPw0kBS+++GLw6aefBv/85z+DIAiCn/
3sZ8HQoUODzZs3B3v37g3uvPPOoKysLDh16pTxyhPrYvuhubk5ePzxx4Pa2tqgvr4+eO+994KbbropuPbaa
4PTp09bLz1hHnnkkSASiQTV1dXBsWPHem5tbW09z3n44YeDUaNGBe+//
36wa9euYMaMGcGMGTMMV514l9oPBw4cCJ5//
vlg165dQX19fbB58+Zg7NixwcyZM41XHq9fFFAQBMHLL78cjBo1KsjMzAymT58e7Nixw3pJSXfPPfcExcXF
QWZmZnD11VcH99xzT3DgwAHrZfW6Dz74IJB0zm3RokVBEJy9FPvpp58OCgsLg3A4HMyePTuoq6uzXXQvuNh
+aGtrC+bMmRMMHz48yMjICEaPHh0sWbIk5X5IO9/XLylYt25dz3NOnToVfP/
73w+uuuqqICcnJ7jrrruCY8eO2S26F1xqPxw6dCiYOXNmkJ+fH4TD4eCaa64JfvSjHwXRaNR24d/
Ar2MAAJjo868BAQBSEwUEADBBAQEATFBAAAATFBAAwAQFBAAwQQEBAExQQAAAExQQAMAEBQQAMEEBAQBMUE
AAABP/DyQOI/aPLWmvAAAAAElFTkSuQmCC",
"text/plain": [
"<Figure size 640x480 with 1 Axes>"
]
},
"metadata": {},
"output_type": "display_data"
},
{
"name": "stdout",
"output_type": "stream",
"text": [
"Label: 2\n"
]
}
],
"source": [
"img = train_features[0].squeeze()\n",
"label = train_labels[0]\n",
"plt.imshow(img, cmap=\"gray\")\n",
"plt.show()\n",
"print(f\"Label: {label}\")"
]
},
{
"cell_type": "markdown",
"id": "15ffd11c-d824-41e4-8287-dd42610fbe59",
"metadata": {},
"source": [
"**Further Reading:** [torch.utils.data API](https://pytorch.org/docs/stable/data.html)"
]
},
{
"attachments": {},
"cell_type": "markdown",
"id": "91bb8463-21a2-4fb6-a73f-4dbd7fd6215e",
"metadata": {},
"source": [
"### Build the Neural Network\n",
"\n",
"Neural networks are composed of layers/modules that perform operations on data. The torch.nn namespace provides all the building blocks you need to build your own neural network. Every module in PyTorch subclasses nn.Module. A neural network is itself a module that consists of other modules (layers). This nested structure makes it easy to build and manage complex architectures.\n",
"\n",
"In the following sections, we’ll build a neural network to classify images in
the FashionMNIST dataset."
]
},
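{
"cell_type": "markdown",
"id": "a7f31b20-0001-4c9e-8a11-3d5e2f6b9c01",
"metadata": {},
"source": [
"To make the nested structure concrete, here is a minimal sketch (the module and variable names are illustrative, not from the tutorial): one module nested inside another, with named_modules() walking the resulting tree.\n",
"\n",
"```python\n",
"import torch\n",
"from torch import nn\n",
"\n",
"class TinyBlock(nn.Module):\n",
"    # a module that will be nested inside another module\n",
"    def __init__(self):\n",
"        super().__init__()\n",
"        self.lin = nn.Linear(4, 4)\n",
"\n",
"    def forward(self, x):\n",
"        return torch.relu(self.lin(x))\n",
"\n",
"class TinyNet(nn.Module):\n",
"    # a module composed of other modules\n",
"    def __init__(self):\n",
"        super().__init__()\n",
"        self.block = TinyBlock()\n",
"        self.head = nn.Linear(4, 2)\n",
"\n",
"    def forward(self, x):\n",
"        return self.head(self.block(x))\n",
"\n",
"net = TinyNet()\n",
"# named_modules() yields the root module and every nested submodule\n",
"print([name for name, _ in net.named_modules()])\n",
"```"
]
},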
{
"cell_type": "markdown",
"id": "1140ece3-2cbd-4a2d-8937-d3de4b68e2dc",
"metadata": {},
"source": [
"![CNN.jpg](https://d14b9ctw0m6fid.cloudfront.net/ugblog/wp-content/uploads/2020/12/2.jpg)"
]
},
{
"cell_type": "code",
"execution_count": 36,
"id": "6384ac2f-75c4-4324-b490-34a94e41b151",
"metadata": {},
"outputs": [],
"source": [
"import os\n",
"import torch\n",
"from torch import nn\n",
"from torch.utils.data import DataLoader\n",
"from torchvision import datasets, transforms"
]
},
{
"cell_type": "markdown",
"id": "2f136576-f138-4686-a0b3-1bb8a071af8c",
"metadata": {},
"source": [
"#### Get Device for Training\n",
"We want to be able to train our model on a hardware accelerator like a GPU or MPS, if available. Let’s check whether torch.cuda or torch.backends.mps is available; otherwise we use the CPU."
]
},
{
"cell_type": "code",
"execution_count": 37,
"id": "5b5d3692-0255-4c78-9faf-50bcceaeed53",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Using cpu device\n"
]
}
],
"source": [
"device = (\n",
" \"cuda\"\n",
" if torch.cuda.is_available()\n",
" else \"mps\"\n",
" if torch.backends.mps.is_available()\n",
" else \"cpu\"\n",
")\n",
"print(f\"Using {device} device\")"
]
},
{
"cell_type": "markdown",
"id": "40a81507-9a3a-415e-8ef4-f7cb0e1aaeb6",
"metadata": {},
"source": [
"#### Define the Class\n",
"We define our neural network by subclassing nn.Module, and initialize the neural network layers in `__init__`. Every nn.Module subclass implements the operations on input data in the forward method."
]
},
{
"cell_type": "code",
"execution_count": 38,
"id": "91fd82a2-e303-4c14-ae04-a85ddb811c4d",
"metadata": {},
"outputs": [],
"source": [
"class NeuralNetwork(nn.Module):\n",
" def __init__(self):\n",
" super().__init__()\n",
" self.flatten = nn.Flatten()\n",
" self.linear_relu_stack = nn.Sequential(\n",
" nn.Linear(28*28, 512),\n",
" nn.ReLU(),\n",
" nn.Linear(512, 512),\n",
" nn.ReLU(),\n",
" nn.Linear(512, 10),\n",
" )\n",
"\n",
" def forward(self, x):\n",
" x = self.flatten(x)\n",
" logits = self.linear_relu_stack(x)\n",
" return logits"
]
},
{
"cell_type": "markdown",
"id": "d8383022-eb3d-457d-98c3-52259d6df9d1",
"metadata": {},
"source": [
"We create an instance of NeuralNetwork, move it to the device, and print its structure."
]
},
{
"cell_type": "code",
"execution_count": 39,
"id": "957c0e08-64cf-479f-b4fb-f6804e8f388f",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"NeuralNetwork(\n",
" (flatten): Flatten(start_dim=1, end_dim=-1)\n",
" (linear_relu_stack): Sequential(\n",
" (0): Linear(in_features=784, out_features=512, bias=True)\n",
" (1): ReLU()\n",
" (2): Linear(in_features=512, out_features=512, bias=True)\n",
" (3): ReLU()\n",
" (4): Linear(in_features=512, out_features=10, bias=True)\n",
" )\n",
")\n"
]
}
],
"source": [
"model = NeuralNetwork().to(device)\n",
"print(model)"
]
},
{
"cell_type": "markdown",
"id": "5f6ba335-ad1d-417e-a288-1d5625845535",
"metadata": {},
"source": [
"To use the model, we pass it the input data. This executes the model’s
forward, along with some background operations. Do not call model.forward()
directly!\n",
"\n",
"Calling the model on the input returns a 2-dimensional tensor, with dim=0 corresponding to the samples in the batch and dim=1 to the 10 raw predicted values for each class. We get the prediction probabilities by passing it through an instance of the nn.Softmax module."
]
},
{
"cell_type": "code",
"execution_count": 40,
"id": "7c219e94-2483-444d-ab3d-d422a8f8c68e",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Predicted class: tensor([2])\n"
]
}
],
"source": [
"X = torch.rand(1, 28, 28, device=device)\n",
"logits = model(X)\n",
"pred_probab = nn.Softmax(dim=1)(logits)\n",
"y_pred = pred_probab.argmax(1)\n",
"print(f\"Predicted class: {y_pred}\")"
]
},
{
"cell_type": "markdown",
"id": "d76c322d-a3c9-4d75-873a-0ac381a14df9",
"metadata": {
"jp-MarkdownHeadingCollapsed": true
},
"source": [
"#### Model Layers"
]
},
{
"cell_type": "markdown",
"id": "aeec52aa-34f7-4434-b923-b4f0447b9752",
"metadata": {},
"source": [
"Let’s break down the layers in the FashionMNIST model. To illustrate it, we
will take a sample minibatch of 3 images of size 28x28 and see what happens to it
as we pass it through the network."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "d12d6a19-98f6-49b4-9751-8f3b1c250818",
"metadata": {},
"outputs": [],
"source": [
"input_image = torch.rand(3,28,28)\n",
"print(input_image.size())"
]
},
{
"cell_type": "markdown",
"id": "f82e9082-db2f-4d77-9418-2429dba9730a",
"metadata": {},
"source": [
"##### nn.Flatten\n",
"We initialize the nn.Flatten layer to convert each 2D 28x28 image into a contiguous array of 784 pixel values (the minibatch dimension, at dim=0, is maintained)."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "adef780c-7aad-40a5-92d0-b47511908c7e",
"metadata": {},
"outputs": [],
"source": [
"flatten = nn.Flatten()\n",
"flat_image = flatten(input_image)\n",
"print(flat_image.size())"
]
},
{
"cell_type": "markdown",
"id": "7b559747-4981-4fcc-8ff8-ec5cfaba7f07",
"metadata": {},
"source": [
"##### nn.Linear\n",
"The linear layer is a module that applies a linear transformation on the input
using its stored weights and biases."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "e8ea1953-2b3e-4a01-a007-21c8befd8056",
"metadata": {},
"outputs": [],
"source": [
"layer1 = nn.Linear(in_features=28*28, out_features=20)\n",
"hidden1 = layer1(flat_image)\n",
"print(hidden1.size())"
]
},
{
"cell_type": "markdown",
"id": "146eee88-270f-4739-b25e-43edce68497a",
"metadata": {},
"source": [
"##### nn.ReLU\n",
"Non-linear activations are what create the complex mappings between the
model’s inputs and outputs. They are applied after linear transformations to
introduce nonlinearity, helping neural networks learn a wide variety of phenomena.\
n",
"\n",
"In this model, we use nn.ReLU between our linear layers, but there are other activations that can introduce non-linearity in your model."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "1861cf73-cbbf-4ed2-acd5-d843be487576",
"metadata": {},
"outputs": [],
"source": [
"print(f\"Before ReLU: {hidden1}\\n\\n\")\n",
"hidden1 = nn.ReLU()(hidden1)\n",
"print(f\"After ReLU: {hidden1}\")"
]
},
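{
"cell_type": "markdown",
"id": "a7f31b20-0002-4c9e-8a11-3d5e2f6b9c02",
"metadata": {},
"source": [
"As a quick illustration (a standalone sketch, separate from the model above), a few of the other built-in activations applied to the same values:\n",
"\n",
"```python\n",
"import torch\n",
"from torch import nn\n",
"\n",
"t = torch.tensor([-1.0, 0.0, 1.0])\n",
"for act in (nn.ReLU(), nn.Tanh(), nn.Sigmoid(), nn.LeakyReLU()):\n",
"    # each module maps the same inputs through a different non-linearity\n",
"    print(act.__class__.__name__, act(t))\n",
"```"
]
},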
{
"cell_type": "markdown",
"id": "8a52620b-678f-41db-a12d-5c9ea20611f4",
"metadata": {},
"source": [
"##### nn.Sequential\n",
"nn.Sequential is an ordered container of modules. The data is passed through
all the modules in the same order as defined. You can use sequential containers to
put together a quick network like seq_modules."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "505a21c8-d83b-4ada-8793-8ca27c8ad94c",
"metadata": {},
"outputs": [],
"source": [
"seq_modules = nn.Sequential(\n",
" flatten,\n",
" layer1,\n",
" nn.ReLU(),\n",
" nn.Linear(20, 10)\n",
")\n",
"input_image = torch.rand(3,28,28)\n",
"logits = seq_modules(input_image)"
]
},
{
"cell_type": "markdown",
"id": "51f0099e-6d04-487a-a49f-2e6326944d9a",
"metadata": {},
"source": [
"##### nn.Softmax\n",
"The last linear layer of the neural network returns logits - raw values in (-∞, ∞) - which are passed to the nn.Softmax module. The logits are scaled to values in [0, 1] representing the model’s predicted probabilities for each class. The dim parameter indicates the dimension along which the values must sum to 1."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "3fe109ce-ff2e-4e03-be0d-108e42d4db5d",
"metadata": {},
"outputs": [],
"source": [
"softmax = nn.Softmax(dim=1)\n",
"pred_probab = softmax(logits)"
]
},
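{
"cell_type": "markdown",
"id": "a7f31b20-0003-4c9e-8a11-3d5e2f6b9c03",
"metadata": {},
"source": [
"A quick sanity check (a standalone sketch with random logits): with dim=1, the probabilities for each sample sum to 1.\n",
"\n",
"```python\n",
"import torch\n",
"from torch import nn\n",
"\n",
"logits = torch.randn(3, 10)              # a fake batch: 3 samples, 10 classes\n",
"pred_probab = nn.Softmax(dim=1)(logits)\n",
"print(pred_probab.sum(dim=1))            # each entry is 1.0 (up to float error)\n",
"```"
]
},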
{
"cell_type": "markdown",
"id": "19f93ef6-6770-48a8-941f-495c232b2245",
"metadata": {},
"source": [
"#### Model Parameters\n",
"Many layers inside a neural network are parameterized, i.e. have associated
weights and biases that are optimized during training. Subclassing nn.Module
automatically tracks all fields defined inside your model object, and makes all
parameters accessible using your model’s parameters() or named_parameters()
methods.\n",
"\n",
"In this example, we iterate over each parameter, and print its size and a
preview of its values."
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "b76821e7-e218-4835-bc8c-095a64352829",
"metadata": {},
"outputs": [],
"source": [
"print(f\"Model structure: {model}\\n\\n\")\n",
"\n",
"for name, param in model.named_parameters():\n",
" print(f\"Layer: {name} | Size: {param.size()} | Values : {param[:2]} \\
n\")"
]
},
{
"cell_type": "markdown",
"id": "51c61bb9-6327-49ab-be4b-c251005771ee",
"metadata": {},
"source": [
"**Further Reading:** [torch.nn API](https://pytorch.org/docs/stable/nn.html)"
]
},
{
"cell_type": "markdown",
"id": "6ce7be27-c753-4fcc-bad3-0f06d1cbfcc2",
"metadata": {},
"source": [
"### Automatic Differentiation with torch.autograd\n",
"\n",
"When training neural networks, the most frequently used algorithm is back
propagation. In this algorithm, parameters (model weights) are adjusted according
to the gradient of the loss function with respect to the given parameter.\n",
"\n",
"To compute those gradients, PyTorch has a built-in differentiation engine
called torch.autograd. It supports automatic computation of gradient for any
computational graph.\n",
"\n",
"Consider the simplest one-layer neural network, with input x, parameters w and
b, and some loss function. It can be defined in PyTorch in the following manner:"
]
},
{
"cell_type": "code",
"execution_count": 41,
"id": "bf8ac982-5c19-40a4-915c-36fa1a5555ff",
"metadata": {},
"outputs": [],
"source": [
"import torch\n",
"\n",
"x = torch.ones(5) # input tensor\n",
"y = torch.zeros(3) # expected output\n",
"w = torch.randn(5, 3, requires_grad=True)\n",
"b = torch.randn(3, requires_grad=True)\n",
"z = torch.matmul(x, w)+b\n",
"loss = torch.nn.functional.binary_cross_entropy_with_logits(z, y)"
]
},
{
"cell_type": "markdown",
"id": "3f2e5c98-0c9f-40cb-b194-cc5137fc17d9",
"metadata": {},
"source": [
"#### Tensors, Functions and Computational Graph\n",
"\n",
"This code defines the following computational graph:\n",
"![comp-graph](https://pytorch.org/tutorials/_images/comp-graph.png)\n",
"In this network, w and b are parameters, which we need to optimize. Thus, we need to be able to compute the gradients of the loss function with respect to those variables. In order to do that, we set the requires_grad property of those tensors."
]
},
{
"cell_type": "markdown",
"id": "3afbffa7-026a-46e1-9304-2eeb3a5dd6d3",
"metadata": {},
"source": [
"A function that we apply to tensors to construct the computational graph is in fact an object of class Function. This object knows how to compute the function in the forward direction, and also how to compute its derivative during the backward propagation step. A reference to the backward propagation function is stored in the grad_fn property of a tensor. You can find more information about Function in the documentation."
]
},
{
"cell_type": "code",
"execution_count": 42,
"id": "5b89a976-76fe-4a6e-ad88-eb0923b4b422",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"Gradient function for z = <AddBackward0 object at 0x000001C246423910>\n",
"Gradient function for loss = <BinaryCrossEntropyWithLogitsBackward0 object
at 0x000001C246420490>\n"
]
}
],
"source": [
"print(f\"Gradient function for z = {z.grad_fn}\")\n",
"print(f\"Gradient function for loss = {loss.grad_fn}\")"
]
},
{
"cell_type": "markdown",
"id": "073d894d-e9fb-4b52-8dd1-ec508e3832f5",
"metadata": {},
"source": [
"#### Computing Gradients\n",
"\n",
"To optimize weights of parameters in the neural network, we need to compute the derivatives of our loss function with respect to the parameters, namely, we need ∂loss/∂w and ∂loss/∂b under some fixed values of x and y. To compute those derivatives, we call loss.backward(), and then retrieve the values from w.grad and b.grad:"
]
},
{
"cell_type": "code",
"execution_count": 43,
"id": "26b19635-ba69-4b24-9676-69b2e51fdfeb",
"metadata": {},
"outputs": [
{
"name": "stdout",
"output_type": "stream",
"text": [
"tensor([[0.2759, 0.1437, 0.0955],\n",
" [0.2759, 0.1437, 0.0955],\n",
" [0.2759, 0.1437, 0.0955],\n",
" [0.2759, 0.1437, 0.0955],\n",
" [0.2759, 0.1437, 0.0955]])\n",
"tensor([0.2759, 0.1437, 0.0955])\n"
]
}
],
"source": [
"loss.backward()\n",
"print(w.grad)\n",
"print(b.grad)"
]
},
{
"cell_type": "markdown",
"id": "46b48081-3e9d-4d0b-a131-f5a93da41f04",
"metadata": {},
"source": [
"#### Disabling Gradient Tracking\n",
"\n",
"By default, all tensors with requires_grad=True track their computational history and support gradient computation. However, there are some cases when we do not need to do that, for example, when we have trained the model and just want to apply it to some input data, i.e. we only want to do forward computations through the network. We can stop tracking computations by surrounding our computation code with a torch.no_grad() block:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "a05ea47c-6330-4633-99f2-84a5a8e8310a",
"metadata": {},
"outputs": [],
"source": [
"z = torch.matmul(x, w)+b\n",
"print(z.requires_grad)\n",
"\n",
"with torch.no_grad():\n",
" z = torch.matmul(x, w)+b\n",
"print(z.requires_grad)"
]
},
{
"cell_type": "markdown",
"id": "d997218c-9a33-4ff7-9ead-63b83f9432f6",
"metadata": {},
"source": [
"Another way to achieve the same result is to use the detach() method on the
tensor:"
]
},
{
"cell_type": "code",
"execution_count": null,
"id": "5936cdfc-6bac-4ced-bd7a-e64e217e78a3",
"metadata": {},
"outputs": [],
"source": [
"z = torch.matmul(x, w)+b\n",
"z_det = z.detach()\n",
"print(z_det.requires_grad)"
]
},
{
"cell_type": "markdown",
"id": "5b4ae852-003c-4902-8114-2606eefd898b",
"metadata": {},
"source": [
"There are reasons you might want to disable gradient tracking:\n",
"\n",
"* To mark some parameters in your neural network as frozen parameters.\n",
"* To speed up computations when you are only doing forward pass, because
computations on tensors that do not track gradients would be more efficient.\n"
]
},
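{
"cell_type": "markdown",
"id": "a7f31b20-0004-4c9e-8a11-3d5e2f6b9c04",
"metadata": {},
"source": [
"For the frozen-parameters case, a minimal sketch (the layer here is illustrative): setting requires_grad to False on a module's parameters excludes them from gradient computation.\n",
"\n",
"```python\n",
"import torch\n",
"from torch import nn\n",
"\n",
"frozen = nn.Linear(28*28, 512)\n",
"for param in frozen.parameters():\n",
"    param.requires_grad_(False)   # mark the parameter as frozen\n",
"\n",
"# no parameter of this layer will receive gradients during backward\n",
"print(all(not p.requires_grad for p in frozen.parameters()))\n",
"```"
]
},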
{
"cell_type": "markdown",
"id": "deb101fd-2575-473c-917f-70b2dd4c8ce1",
"metadata": {},
"source": [
"#### More on Computational Graphs\n",
"\n",
"Conceptually, autograd keeps a record of data (tensors) and all executed
operations (along with the resulting new tensors) in a directed acyclic graph (DAG)
consisting of Function objects. In this DAG, leaves are the input tensors, roots
are the output tensors. By tracing this graph from roots to leaves, you can
automatically compute the gradients using the chain rule.\n",
"\n",
"In a forward pass, autograd does two things simultaneously:\n",
"\n",
"* run the requested operation to compute a resulting tensor\n",
"* maintain the operation’s gradient function in the DAG\n",
"\n",
"The backward pass kicks off when .backward() is called on the DAG root. autograd then:\n",
"\n",
"* computes the gradients from each .grad_fn,\n",
"* accumulates them in the respective tensor’s .grad attribute\n",
"* using the chain rule, propagates all the way to the leaf tensors"
]
},
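{
"cell_type": "markdown",
"id": "a7f31b20-0005-4c9e-8a11-3d5e2f6b9c05",
"metadata": {},
"source": [
"These mechanics can be seen in a small sketch (tensor names are illustrative): the tensor we create is a leaf, the scalar output is the root, and a second backward pass accumulates into .grad.\n",
"\n",
"```python\n",
"import torch\n",
"\n",
"w = torch.randn(3, requires_grad=True)    # a leaf of the DAG\n",
"loss = (w * w).sum()                      # the root\n",
"loss.backward()                           # d(loss)/dw = 2*w\n",
"first = w.grad.clone()\n",
"\n",
"loss2 = (w * w).sum()                     # rebuild the graph for a second pass\n",
"loss2.backward()                          # gradients accumulate in w.grad\n",
"print(w.is_leaf, torch.allclose(w.grad, 2 * first))\n",
"```\n",
"\n",
"This accumulation is why training loops call the optimizer's zero_grad() between iterations."
]
},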
{
"cell_type": "code",
"execution_count": null,
"id": "95170ee3-1907-469b-9704-641f2fc33217",
"metadata": {},
"outputs": [],
"source": []
}
],
"metadata": {
"kernelspec": {
"display_name": "Python 3 (ipykernel)",
"language": "python",
"name": "python3"
},
"language_info": {
"codemirror_mode": {
"name": "ipython",
"version": 3
},
"file_extension": ".py",
"mimetype": "text/x-python",
"name": "python",
"nbconvert_exporter": "python",
"pygments_lexer": "ipython3",
"version": "3.11.6"
}
},
"nbformat": 4,
"nbformat_minor": 5
}