| column | type / range |
| --- | --- |
| hexsha | string, length 40 |
| size | int64, 6 – 14.9M |
| ext | string, 1 class |
| lang | string, 1 class |
| max_stars_repo_path | string, length 6 – 260 |
| max_stars_repo_name | string, length 6 – 119 |
| max_stars_repo_head_hexsha | string, length 40 – 41 |
| max_stars_repo_licenses | list |
| max_stars_count | int64, 1 – 191k (nullable) |
| max_stars_repo_stars_event_min_datetime | string, length 24 (nullable) |
| max_stars_repo_stars_event_max_datetime | string, length 24 (nullable) |
| max_issues_repo_path | string, length 6 – 260 |
| max_issues_repo_name | string, length 6 – 119 |
| max_issues_repo_head_hexsha | string, length 40 – 41 |
| max_issues_repo_licenses | list |
| max_issues_count | int64, 1 – 67k (nullable) |
| max_issues_repo_issues_event_min_datetime | string, length 24 (nullable) |
| max_issues_repo_issues_event_max_datetime | string, length 24 (nullable) |
| max_forks_repo_path | string, length 6 – 260 |
| max_forks_repo_name | string, length 6 – 119 |
| max_forks_repo_head_hexsha | string, length 40 – 41 |
| max_forks_repo_licenses | list |
| max_forks_count | int64, 1 – 105k (nullable) |
| max_forks_repo_forks_event_min_datetime | string, length 24 (nullable) |
| max_forks_repo_forks_event_max_datetime | string, length 24 (nullable) |
| avg_line_length | float64, 2 – 1.04M |
| max_line_length | int64, 2 – 11.2M |
| alphanum_fraction | float64, 0 – 1 |
| cells | list |
| cell_types | list |
| cell_type_groups | list |
cbf82a23ef415bec41f82e346f24541aec9a8c1b
| 24,289 |
ipynb
|
Jupyter Notebook
|
docs/_downloads/e8d0748ca1aad4cdc05491f3344aad00/cifar10_tutorial.ipynb
|
taehui530/PyTorch-tutorials-kr
|
83d384dbc6c374128e21075e1719f60402fe0cf2
|
[
"BSD-3-Clause"
] | 221 |
2018-04-06T01:42:58.000Z
|
2021-11-28T10:12:45.000Z
|
docs/_downloads/e8d0748ca1aad4cdc05491f3344aad00/cifar10_tutorial.ipynb
|
taehui530/PyTorch-tutorials-kr
|
83d384dbc6c374128e21075e1719f60402fe0cf2
|
[
"BSD-3-Clause"
] | 280 |
2018-05-25T08:53:21.000Z
|
2021-12-02T05:37:25.000Z
|
docs/_downloads/e8d0748ca1aad4cdc05491f3344aad00/cifar10_tutorial.ipynb
|
taehui530/PyTorch-tutorials-kr
|
83d384dbc6c374128e21075e1719f60402fe0cf2
|
[
"BSD-3-Clause"
] | 181 |
2018-05-25T02:00:28.000Z
|
2021-11-19T11:56:39.000Z
| 79.375817 | 4,284 | 0.646136 |
[
[
[
"%matplotlib inline",
"_____no_output_____"
]
],
[
[
"\n๋ถ๋ฅ๊ธฐ(Classifier) ํ์ตํ๊ธฐ\n============================\n\n์ง๊ธ๊น์ง ์ด๋ป๊ฒ ์ ๊ฒฝ๋ง์ ์ ์ํ๊ณ , ์์ค์ ๊ณ์ฐํ๋ฉฐ ๋ ๊ฐ์ค์น๋ฅผ ๊ฐฑ์ ํ๋์ง์\n๋ํด์ ๋ฐฐ์ ์ต๋๋ค.\n\n์ด์ ์๋ง๋ ์ด๋ฐ ์๊ฐ์ ํ๊ณ ๊ณ์คํ
๋ฐ์,\n\n๋ฐ์ดํฐ๋ ์ด๋ป๊ฒ ํ๋์?\n------------------------\n\n์ผ๋ฐ์ ์ผ๋ก ์ด๋ฏธ์ง๋ ํ
์คํธ, ์ค๋์ค๋ ๋น๋์ค ๋ฐ์ดํฐ๋ฅผ ๋ค๋ฃฐ ๋๋ ํ์ค Python ํจํค์ง๋ฅผ\n์ด์ฉํ์ฌ NumPy ๋ฐฐ์ด๋ก ๋ถ๋ฌ์ค๋ฉด ๋ฉ๋๋ค. ๊ทธ ํ ๊ทธ ๋ฐฐ์ด์ ``torch.*Tensor`` ๋ก ๋ณํํฉ๋๋ค.\n\n- ์ด๋ฏธ์ง๋ Pillow๋ OpenCV ๊ฐ์ ํจํค์ง๊ฐ ์ ์ฉํฉ๋๋ค.\n- ์ค๋์ค๋ฅผ ์ฒ๋ฆฌํ ๋๋ SciPy์ LibROSA๊ฐ ์ ์ฉํ๊ณ ์.\n- ํ
์คํธ์ ๊ฒฝ์ฐ์๋ ๊ทธ๋ฅ Python์ด๋ Cython์ ์ฌ์ฉํด๋ ๋๊ณ , NLTK๋ SpaCy๋\n ์ ์ฉํฉ๋๋ค.\n\nํน๋ณํ ์์ ๋ถ์ผ๋ฅผ ์ํ ``torchvision`` ์ด๋ผ๋ ํจํค์ง๊ฐ ๋ง๋ค์ด์ ธ ์๋๋ฐ,\n์ฌ๊ธฐ์๋ Imagenet์ด๋ CIFAR10, MNIST ๋ฑ๊ณผ ๊ฐ์ด ์ผ๋ฐ์ ์ผ๋ก ์ฌ์ฉํ๋ ๋ฐ์ดํฐ์
์ ์ํ\n๋ฐ์ดํฐ ๋ก๋(data loader), ์ฆ ``torchvision.datasets`` ๊ณผ ์ด๋ฏธ์ง์ฉ ๋ฐ์ดํฐ ๋ณํ๊ธฐ\n(data transformer), ์ฆ ``torch.utils.data.DataLoader`` ๊ฐ ํฌํจ๋์ด ์์ต๋๋ค.\n\n์ด๋ฌํ ๊ธฐ๋ฅ์ ์์ฒญ๋๊ฒ ํธ๋ฆฌํ๋ฉฐ, ๋งค๋ฒ ์ ์ฌํ ์ฝ๋(boilerplate code)๋ฅผ ๋ฐ๋ณตํด์\n์์ฑํ๋ ๊ฒ์ ํผํ ์ ์์ต๋๋ค.\n\n์ด ํํ ๋ฆฌ์ผ์์๋ CIFAR10 ๋ฐ์ดํฐ์
์ ์ฌ์ฉํฉ๋๋ค. ์ฌ๊ธฐ์๋ ๋ค์๊ณผ ๊ฐ์ ๋ถ๋ฅ๋ค์ด\n์์ต๋๋ค: '๋นํ๊ธฐ(airplane)', '์๋์ฐจ(automobile)', '์(bird)', '๊ณ ์์ด(cat)',\n'์ฌ์ด(deer)', '๊ฐ(dog)', '๊ฐ๊ตฌ๋ฆฌ(frog)', '๋ง(horse)', '๋ฐฐ(ship)', 'ํธ๋ญ(truck)'.\n๊ทธ๋ฆฌ๊ณ CIFAR10์ ํฌํจ๋ ์ด๋ฏธ์ง์ ํฌ๊ธฐ๋ 3x32x32๋ก, ์ด๋ 32x32 ํฝ์
ํฌ๊ธฐ์ ์ด๋ฏธ์ง๊ฐ\n3๊ฐ ์ฑ๋(channel)์ ์์๋ก ์ด๋ค์ ธ ์๋ค๋ ๊ฒ์ ๋ปํฉ๋๋ค.\n\n.. figure:: /_static/img/cifar10.png\n :alt: cifar10\n\n cifar10\n\n\n์ด๋ฏธ์ง ๋ถ๋ฅ๊ธฐ ํ์ตํ๊ธฐ\n----------------------------\n\n๋ค์๊ณผ ๊ฐ์ ๋จ๊ณ๋ก ์งํํด๋ณด๊ฒ ์ต๋๋ค:\n\n1. ``torchvision`` ์ ์ฌ์ฉํ์ฌ CIFAR10์ ํ์ต์ฉ / ์ํ์ฉ ๋ฐ์ดํฐ์
์\n ๋ถ๋ฌ์ค๊ณ , ์ ๊ทํ(nomarlizing)ํฉ๋๋ค.\n2. ํฉ์ฑ๊ณฑ ์ ๊ฒฝ๋ง(Convolution Neural Network)์ ์ ์ํฉ๋๋ค.\n3. ์์ค ํจ์๋ฅผ ์ ์ํฉ๋๋ค.\n4. ํ์ต์ฉ ๋ฐ์ดํฐ๋ฅผ ์ฌ์ฉํ์ฌ ์ ๊ฒฝ๋ง์ ํ์ตํฉ๋๋ค.\n5. ์ํ์ฉ ๋ฐ์ดํฐ๋ฅผ ์ฌ์ฉํ์ฌ ์ ๊ฒฝ๋ง์ ๊ฒ์ฌํฉ๋๋ค.\n\n1. CIFAR10์ ๋ถ๋ฌ์ค๊ณ ์ ๊ทํํ๊ธฐ\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n``torchvision`` ์ ์ฌ์ฉํ์ฌ ๋งค์ฐ ์ฝ๊ฒ CIFAR10์ ๋ถ๋ฌ์ฌ ์ ์์ต๋๋ค.\n",
"_____no_output_____"
]
],
[
[
"import torch\nimport torchvision\nimport torchvision.transforms as transforms",
"_____no_output_____"
]
],
[
[
"torchvision ๋ฐ์ดํฐ์
์ ์ถ๋ ฅ(output)์ [0, 1] ๋ฒ์๋ฅผ ๊ฐ๋ PILImage ์ด๋ฏธ์ง์
๋๋ค.\n์ด๋ฅผ [-1, 1]์ ๋ฒ์๋ก ์ ๊ทํ๋ Tensor๋ก ๋ณํํฉ๋๋ค.\n\n<div class=\"alert alert-info\"><h4>Note</h4><p>๋ง์ฝ Windows ํ๊ฒฝ์์ BrokenPipeError๊ฐ ๋ฐ์ํ๋ค๋ฉด,\n torch.utils.data.DataLoader()์ num_worker๋ฅผ 0์ผ๋ก ์ค์ ํด๋ณด์ธ์.</p></div>\n\n",
"_____no_output_____"
]
],
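The normalization just described is `(x - mean) / std` per channel; with mean and std both 0.5, as in the next cell, the `[0, 1]` range maps onto `[-1, 1]`. A minimal sketch of that arithmetic, with illustrative values only:

```python
import torch
import torchvision.transforms as transforms

# a fake 3-channel "image", one pixel per channel, values in [0, 1]
x = torch.tensor([[[0.0]], [[0.5]], [[1.0]]])  # shape (3, 1, 1)

norm = transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))
print(norm(x).flatten())  # tensor([-1., 0., 1.]) -- each value is (x - 0.5) / 0.5
```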
[
[
"transform = transforms.Compose(\n [transforms.ToTensor(),\n transforms.Normalize((0.5, 0.5, 0.5), (0.5, 0.5, 0.5))])\n\nbatch_size = 4\n\ntrainset = torchvision.datasets.CIFAR10(root='./data', train=True,\n download=True, transform=transform)\ntrainloader = torch.utils.data.DataLoader(trainset, batch_size=batch_size,\n shuffle=True, num_workers=2)\n\ntestset = torchvision.datasets.CIFAR10(root='./data', train=False,\n download=True, transform=transform)\ntestloader = torch.utils.data.DataLoader(testset, batch_size=batch_size,\n shuffle=False, num_workers=2)\n\nclasses = ('plane', 'car', 'bird', 'cat',\n 'deer', 'dog', 'frog', 'horse', 'ship', 'truck')",
"_____no_output_____"
]
],
[
[
"์ฌ๋ฏธ์ผ์ ํ์ต์ฉ ์ด๋ฏธ์ง ๋ช ๊ฐ๋ฅผ ๋ณด๊ฒ ์ต๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nimport numpy as np\n\n# ์ด๋ฏธ์ง๋ฅผ ๋ณด์ฌ์ฃผ๊ธฐ ์ํ ํจ์\n\ndef imshow(img):\n img = img / 2 + 0.5 # unnormalize\n npimg = img.numpy()\n plt.imshow(np.transpose(npimg, (1, 2, 0)))\n plt.show()\n\n\n# ํ์ต์ฉ ์ด๋ฏธ์ง๋ฅผ ๋ฌด์์๋ก ๊ฐ์ ธ์ค๊ธฐ\ndataiter = iter(trainloader)\nimages, labels = dataiter.next()\n\n# ์ด๋ฏธ์ง ๋ณด์ฌ์ฃผ๊ธฐ\nimshow(torchvision.utils.make_grid(images))\n# ์ ๋ต(label) ์ถ๋ ฅ\nprint(' '.join('%5s' % classes[labels[j]] for j in range(batch_size)))",
"_____no_output_____"
]
],
[
[
"2. ํฉ์ฑ๊ณฑ ์ ๊ฒฝ๋ง(Convolution Neural Network) ์ ์ํ๊ธฐ\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n์ด์ ์ ์ ๊ฒฝ๋ง ์น์
์์ ์ ๊ฒฝ๋ง์ ๋ณต์ฌํ ํ, (๊ธฐ์กด์ 1์ฑ๋ ์ด๋ฏธ์ง๋ง ์ฒ๋ฆฌํ๋๋ก\n์ ์๋ ๊ฒ์) 3์ฑ๋ ์ด๋ฏธ์ง๋ฅผ ์ฒ๋ฆฌํ ์ ์๋๋ก ์์ ํฉ๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"import torch.nn as nn\nimport torch.nn.functional as F\n\n\nclass Net(nn.Module):\n def __init__(self):\n super().__init__()\n self.conv1 = nn.Conv2d(3, 6, 5)\n self.pool = nn.MaxPool2d(2, 2)\n self.conv2 = nn.Conv2d(6, 16, 5)\n self.fc1 = nn.Linear(16 * 5 * 5, 120)\n self.fc2 = nn.Linear(120, 84)\n self.fc3 = nn.Linear(84, 10)\n\n def forward(self, x):\n x = self.pool(F.relu(self.conv1(x)))\n x = self.pool(F.relu(self.conv2(x)))\n x = torch.flatten(x, 1) # ๋ฐฐ์น๋ฅผ ์ ์ธํ ๋ชจ๋ ์ฐจ์์ ํํํ(flatten)\n x = F.relu(self.fc1(x))\n x = F.relu(self.fc2(x))\n x = self.fc3(x)\n return x\n\n\nnet = Net()",
"_____no_output_____"
]
],
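Why `16 * 5 * 5` in `fc1`? For a 32x32 input, `conv1` (kernel 5) yields 28x28, pooling halves it to 14x14, `conv2` yields 10x10, and pooling again gives 5x5 over 16 channels. A quick sanity check with a dummy batch, assuming the `Net` instance defined above; the shape comments are the expected results:

```python
dummy = torch.randn(1, 3, 32, 32)        # one fake CIFAR10 image
h = net.pool(F.relu(net.conv1(dummy)))   # -> (1, 6, 14, 14)
h = net.pool(F.relu(net.conv2(h)))       # -> (1, 16, 5, 5), i.e. 16 * 5 * 5 features
print(h.shape)
print(net(dummy).shape)                  # (1, 10): one score per class
```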
[
[
"3. ์์ค ํจ์์ Optimizer ์ ์ํ๊ธฐ\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n๊ต์ฐจ ์ํธ๋กํผ ์์ค(Cross-Entropy loss)๊ณผ ๋ชจ๋ฉํ
(momentum) ๊ฐ์ ๊ฐ๋ SGD๋ฅผ ์ฌ์ฉํฉ๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"import torch.optim as optim\n\ncriterion = nn.CrossEntropyLoss()\noptimizer = optim.SGD(net.parameters(), lr=0.001, momentum=0.9)",
"_____no_output_____"
]
],
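`nn.CrossEntropyLoss` takes raw, unnormalized scores (logits) plus integer class labels, and applies log-softmax internally. A tiny illustration with made-up numbers, assuming the `criterion` defined above:

```python
logits = torch.tensor([[2.0, 0.5, -1.0]])  # one sample, three classes
target = torch.tensor([0])                 # the true class index
print(criterion(logits, target))           # small loss: class 0 already scores highest
```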
[
[
"4. ์ ๊ฒฝ๋ง ํ์ตํ๊ธฐ\n^^^^^^^^^^^^^^^^^^^^\n\n์ด์ ์ฌ๋ฏธ์๋ ๋ถ๋ถ์ด ์์๋ฉ๋๋ค.\n๋จ์ํ ๋ฐ์ดํฐ๋ฅผ ๋ฐ๋ณตํด์ ์ ๊ฒฝ๋ง์ ์
๋ ฅ์ผ๋ก ์ ๊ณตํ๊ณ , ์ต์ ํ(Optimize)๋ง ํ๋ฉด\n๋ฉ๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"for epoch in range(2): # ๋ฐ์ดํฐ์
์ ์์ฐจ๋ก ๋ฐ๋ณตํฉ๋๋ค.\n\n running_loss = 0.0\n for i, data in enumerate(trainloader, 0):\n # [inputs, labels]์ ๋ชฉ๋ก์ธ data๋ก๋ถํฐ ์
๋ ฅ์ ๋ฐ์ ํ;\n inputs, labels = data\n\n # ๋ณํ๋(Gradient) ๋งค๊ฐ๋ณ์๋ฅผ 0์ผ๋ก ๋ง๋ค๊ณ \n optimizer.zero_grad()\n\n # ์์ ํ + ์ญ์ ํ + ์ต์ ํ๋ฅผ ํ ํ\n outputs = net(inputs)\n loss = criterion(outputs, labels)\n loss.backward()\n optimizer.step()\n\n # ํต๊ณ๋ฅผ ์ถ๋ ฅํฉ๋๋ค.\n running_loss += loss.item()\n if i % 2000 == 1999: # print every 2000 mini-batches\n print('[%d, %5d] loss: %.3f' %\n (epoch + 1, i + 1, running_loss / 2000))\n running_loss = 0.0\n\nprint('Finished Training')",
"_____no_output_____"
]
],
[
[
"ํ์ตํ ๋ชจ๋ธ์ ์ ์ฅํด๋ณด๊ฒ ์ต๋๋ค:\n\n",
"_____no_output_____"
]
],
[
[
"PATH = './cifar_net.pth'\ntorch.save(net.state_dict(), PATH)",
"_____no_output_____"
]
],
[
[
"PyTorch ๋ชจ๋ธ์ ์ ์ฅํ๋ ์์ธํ ๋ฐฉ๋ฒ์ `์ฌ๊ธฐ <https://pytorch.org/docs/stable/notes/serialization.html>`_\n๋ฅผ ์ฐธ์กฐํด์ฃผ์ธ์.\n\n5. ์ํ์ฉ ๋ฐ์ดํฐ๋ก ์ ๊ฒฝ๋ง ๊ฒ์ฌํ๊ธฐ\n^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^^\n\n์ง๊ธ๊น์ง ํ์ต์ฉ ๋ฐ์ดํฐ์
์ 2ํ ๋ฐ๋ณตํ๋ฉฐ ์ ๊ฒฝ๋ง์ ํ์ต์์ผฐ์ต๋๋ค.\n์ ๊ฒฝ๋ง์ด ์ ํ ๋ฐฐ์ด๊ฒ ์์์ง๋ ๋ชจ๋ฅด๋ ํ์ธํด๋ด
๋๋ค.\n\n์ ๊ฒฝ๋ง์ด ์์ธกํ ์ถ๋ ฅ๊ณผ ์ง์ง ์ ๋ต(Ground-truth)์ ๋น๊ตํ๋ ๋ฐฉ์์ผ๋ก ํ์ธํฉ๋๋ค.\n๋ง์ฝ ์์ธก์ด ๋ง๋ค๋ฉด ์ํ์ '๋ง์ ์์ธก๊ฐ(correct predictions)' ๋ชฉ๋ก์ ๋ฃ๊ฒ ์ต๋๋ค.\n\n์ฒซ๋ฒ์งธ๋ก ์ํ์ฉ ๋ฐ์ดํฐ๋ฅผ ์ข ๋ณด๊ฒ ์ต๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"dataiter = iter(testloader)\nimages, labels = dataiter.next()\n\n# ์ด๋ฏธ์ง๋ฅผ ์ถ๋ ฅํฉ๋๋ค.\nimshow(torchvision.utils.make_grid(images))\nprint('GroundTruth: ', ' '.join('%5s' % classes[labels[j]] for j in range(4)))",
"_____no_output_____"
]
],
[
[
"์ด์ , ์ ์ฅํ๋ ๋ชจ๋ธ์ ๋ถ๋ฌ์ค๋๋ก ํ๊ฒ ์ต๋๋ค (์ฃผ: ๋ชจ๋ธ์ ์ ์ฅํ๊ณ ๋ค์ ๋ถ๋ฌ์ค๋\n์์
์ ์ฌ๊ธฐ์์๋ ๋ถํ์ํ์ง๋ง, ์ด๋ป๊ฒ ํ๋์ง ์ค๋ช
์ ์ํด ํด๋ณด๊ฒ ์ต๋๋ค):\n\n",
"_____no_output_____"
]
],
[
[
"net = Net()\nnet.load_state_dict(torch.load(PATH))",
"_____no_output_____"
]
],
[
[
"์ข์ต๋๋ค, ์ด์ ์ด ์์ ๋ค์ ์ ๊ฒฝ๋ง์ด ์ด๋ป๊ฒ ์์ธกํ๋์ง๋ฅผ ๋ณด๊ฒ ์ต๋๋ค:\n\n",
"_____no_output_____"
]
],
[
[
"outputs = net(images)",
"_____no_output_____"
]
],
[
[
"์ถ๋ ฅ์ 10๊ฐ ๋ถ๋ฅ ๊ฐ๊ฐ์ ๋ํ ๊ฐ์ผ๋ก ๋ํ๋ฉ๋๋ค. ์ด๋ค ๋ถ๋ฅ์ ๋ํด์ ๋ ๋์ ๊ฐ์ด\n๋ํ๋๋ค๋ ๊ฒ์, ์ ๊ฒฝ๋ง์ด ๊ทธ ์ด๋ฏธ์ง๊ฐ ํด๋น ๋ถ๋ฅ์ ๋ ๊ฐ๊น๋ค๊ณ ์๊ฐํ๋ค๋ ๊ฒ์
๋๋ค.\n๋ฐ๋ผ์, ๊ฐ์ฅ ๋์ ๊ฐ์ ๊ฐ๋ ์ธ๋ฑ์ค(index)๋ฅผ ๋ฝ์๋ณด๊ฒ ์ต๋๋ค:\n\n",
"_____no_output_____"
]
],
[
[
"_, predicted = torch.max(outputs, 1)\n\nprint('Predicted: ', ' '.join('%5s' % classes[predicted[j]]\n for j in range(4)))",
"_____no_output_____"
]
],
[
[
"๊ฒฐ๊ณผ๊ฐ ๊ด์ฐฎ์๋ณด์ด๋ค์.\n\n๊ทธ๋ผ ์ ์ฒด ๋ฐ์ดํฐ์
์ ๋ํด์๋ ์ด๋ป๊ฒ ๋์ํ๋์ง ๋ณด๊ฒ ์ต๋๋ค.\n\n",
"_____no_output_____"
]
],
[
[
"correct = 0\ntotal = 0\n# ํ์ต ์ค์ด ์๋๋ฏ๋ก, ์ถ๋ ฅ์ ๋ํ ๋ณํ๋๋ฅผ ๊ณ์ฐํ ํ์๊ฐ ์์ต๋๋ค\nwith torch.no_grad():\n for data in testloader:\n images, labels = data\n # ์ ๊ฒฝ๋ง์ ์ด๋ฏธ์ง๋ฅผ ํต๊ณผ์์ผ ์ถ๋ ฅ์ ๊ณ์ฐํฉ๋๋ค\n outputs = net(images)\n # ๊ฐ์ฅ ๋์ ๊ฐ(energy)๋ฅผ ๊ฐ๋ ๋ถ๋ฅ(class)๋ฅผ ์ ๋ต์ผ๋ก ์ ํํ๊ฒ ์ต๋๋ค\n _, predicted = torch.max(outputs.data, 1)\n total += labels.size(0)\n correct += (predicted == labels).sum().item()\n\nprint('Accuracy of the network on the 10000 test images: %d %%' % (\n 100 * correct / total))",
"_____no_output_____"
]
],
[
[
"(10๊ฐ์ง ๋ถ๋ฅ ์ค์ ํ๋๋ฅผ ๋ฌด์์๋ก) ์ฐ์์ ๋์ ์ ํ๋์ธ 10% ๋ณด๋ค๋ ๋์๋ณด์
๋๋ค.\n์ ๊ฒฝ๋ง์ด ๋ญ๊ฐ ๋ฐฐ์ฐ๊ธด ํ ๊ฒ ๊ฐ๋ค์.\n\n๊ทธ๋ผ ์ด๋ค ๊ฒ๋ค์ ๋ ์ ๋ถ๋ฅํ๊ณ , ์ด๋ค ๊ฒ๋ค์ ๋ ๋ชปํ๋์ง ์์๋ณด๊ฒ ์ต๋๋ค:\n\n",
"_____no_output_____"
]
],
[
[
"# ๊ฐ ๋ถ๋ฅ(class)์ ๋ํ ์์ธก๊ฐ ๊ณ์ฐ์ ์ํด ์ค๋น\ncorrect_pred = {classname: 0 for classname in classes}\ntotal_pred = {classname: 0 for classname in classes}\n\n# ๋ณํ๋๋ ์ฌ์ ํ ํ์ํ์ง ์์ต๋๋ค\nwith torch.no_grad():\n for data in testloader:\n images, labels = data\n outputs = net(images)\n _, predictions = torch.max(outputs, 1)\n # ๊ฐ ๋ถ๋ฅ๋ณ๋ก ์ฌ๋ฐ๋ฅธ ์์ธก ์๋ฅผ ๋ชจ์๋๋ค\n for label, prediction in zip(labels, predictions):\n if label == prediction:\n correct_pred[classes[label]] += 1\n total_pred[classes[label]] += 1\n\n\n# ๊ฐ ๋ถ๋ฅ๋ณ ์ ํ๋(accuracy)๋ฅผ ์ถ๋ ฅํฉ๋๋ค\nfor classname, correct_count in correct_pred.items():\n accuracy = 100 * float(correct_count) / total_pred[classname]\n print(\"Accuracy for class {:5s} is: {:.1f} %\".format(classname,\n accuracy))",
"_____no_output_____"
]
],
[
[
"์, ์ด์ ๋ค์์ผ๋ก ๋ฌด์์ ํด๋ณผ๊น์?\n\n์ด๋ฌํ ์ ๊ฒฝ๋ง๋ค์ GPU์์ ์คํํ๋ ค๋ฉด ์ด๋ป๊ฒ ํด์ผ ํ ๊น์?\n\nGPU์์ ํ์ตํ๊ธฐ\n----------------\nTensor๋ฅผ GPU๋ก ์ด๋ํ๋ ๊ฒ์ฒ๋ผ, ์ ๊ฒฝ๋ง ๋ํ GPU๋ก ์ฎ๊ธธ ์ ์์ต๋๋ค.\n\n๋จผ์ (CUDA๋ฅผ ์ฌ์ฉํ ์ ์๋ค๋ฉด) ์ฒซ๋ฒ์งธ CUDA ์ฅ์น๋ฅผ ์ฌ์ฉํ๋๋ก ์ค์ ํฉ๋๋ค:\n\n",
"_____no_output_____"
]
],
[
[
"device = torch.device(\"cuda:0\" if torch.cuda.is_available() else \"cpu\")\n\n# CUDA ๊ธฐ๊ธฐ๊ฐ ์กด์ฌํ๋ค๋ฉด, ์๋ ์ฝ๋๊ฐ CUDA ์ฅ์น๋ฅผ ์ถ๋ ฅํฉ๋๋ค:\n\nprint(device)",
"_____no_output_____"
]
],
[
[
"์ด ์น์
์ ๋๋จธ์ง ๋ถ๋ถ์์๋ ``device`` ๋ฅผ CUDA ์ฅ์น๋ผ๊ณ ๊ฐ์ ํ๊ฒ ์ต๋๋ค.\n\n๊ทธ๋ฆฌ๊ณ ์ด ๋ฉ์๋(Method)๋ค์ ์ฌ๊ท์ ์ผ๋ก ๋ชจ๋ ๋ชจ๋์ ๋งค๊ฐ๋ณ์์ ๋ฒํผ๋ฅผ\nCUDA tensor๋ก ๋ณ๊ฒฝํฉ๋๋ค:\n\n.. code:: python\n\n net.to(device)\n\n\n๋ํ, ๊ฐ ๋จ๊ณ์์ ์
๋ ฅ(input)๊ณผ ์ ๋ต(target)๋ GPU๋ก ๋ณด๋ด์ผ ํ๋ค๋ ๊ฒ๋ ๊ธฐ์ตํด์ผ\nํฉ๋๋ค:\n\n.. code:: python\n\n inputs, labels = data[0].to(device), data[1].to(device)\n\nCPU์ ๋น๊ตํ์ ๋ ์ด๋ง์ด๋งํ ์๋ ์ฐจ์ด๊ฐ ๋์ง ์๋ ๊ฒ์ ์ ๊ทธ๋ด๊น์?\n๊ทธ ์ด์ ๋ ๋ฐ๋ก ์ ๊ฒฝ๋ง์ด ๋๋ฌด ์๊ธฐ ๋๋ฌธ์
๋๋ค.\n\n**์ฐ์ต:** ์ ๊ฒฝ๋ง์ ํฌ๊ธฐ๋ฅผ ํค์๋ณด๊ณ , ์ผ๋ง๋ ๋นจ๋ผ์ง๋์ง ํ์ธํด๋ณด์ธ์.\n(์ฒซ๋ฒ์งธ ``nn.Conv2d`` ์ 2๋ฒ์งธ ์ธ์์ ๋๋ฒ์งธ ``nn.Conv2d`` ์ 1๋ฒ์งธ ์ธ์๋\n๊ฐ์ ์ซ์์ฌ์ผ ํฉ๋๋ค.)\n\n**๋ค์ ๋ชฉํ๋ค์ ๋ฌ์ฑํ์ต๋๋ค**:\n\n- ๋์ ์์ค์์ PyTorch์ Tensor library์ ์ ๊ฒฝ๋ง์ ์ดํดํฉ๋๋ค.\n- ์ด๋ฏธ์ง๋ฅผ ๋ถ๋ฅํ๋ ์์ ์ ๊ฒฝ๋ง์ ํ์ต์ํต๋๋ค.\n\n์ฌ๋ฌ๊ฐ์ GPU์์ ํ์ตํ๊ธฐ\n-------------------------\n๋ชจ๋ GPU๋ฅผ ํ์ฉํด์ ๋์ฑ ๋ ์๋๋ฅผ ์ฌ๋ฆฌ๊ณ ์ถ๋ค๋ฉด, :doc:`data_parallel_tutorial`\n์ ์ฐธ๊ณ ํ์ธ์.\n\n์ด์ ๋ฌด์์ ํด๋ณผ๊น์?\n-----------------------\n\n- :doc:`๋น๋์ค ๊ฒ์์ ํ ์ ์๋ ์ ๊ฒฝ๋ง ํ์ต์ํค๊ธฐ </intermediate/reinforcement_q_learning>`\n- `imagenet์ผ๋ก ์ต์ฒจ๋จ(state-of-the-art) ResNet ์ ๊ฒฝ๋ง ํ์ต์ํค๊ธฐ`_\n- `์ ๋์ ์์ฑ ์ ๊ฒฝ๋ง์ผ๋ก ์ผ๊ตด ์์ฑ๊ธฐ ํ์ต์ํค๊ธฐ`_\n- `์ํ LSTM ๋คํธ์ํฌ๋ฅผ ์ฌ์ฉํด ๋จ์ด ๋จ์ ์ธ์ด ๋ชจ๋ธ ํ์ต์ํค๊ธฐ`_\n- `๋ค๋ฅธ ์์ ๋ค ์ฐธ๊ณ ํ๊ธฐ`_\n- `๋ ๋ง์ ํํ ๋ฆฌ์ผ ๋ณด๊ธฐ`_\n- `ํฌ๋ผ์์ PyTorch์ ๋ํด ์๊ธฐํ๊ธฐ`_\n- `Slack์์ ๋ค๋ฅธ ์ฌ์ฉ์์ ๋ํํ๊ธฐ`_\n\n\n",
"_____no_output_____"
]
],
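Putting those two pieces together, one GPU training step looks like the sketch below; it is a minimal adaptation of the earlier loop, assuming `device`, `net`, `criterion`, `optimizer`, and `trainloader` from previous cells:

```python
net.to(device)  # move parameters and buffers to the GPU once

for i, data in enumerate(trainloader, 0):
    # every batch must be moved to the same device as the model
    inputs, labels = data[0].to(device), data[1].to(device)

    optimizer.zero_grad()
    loss = criterion(net(inputs), labels)
    loss.backward()
    optimizer.step()
```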
[
[
"# %%%%%%INVISIBLE_CODE_BLOCK%%%%%%\ndel dataiter\n# %%%%%%INVISIBLE_CODE_BLOCK%%%%%%",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbf837666adbd75259c66dffc5663b5a047b3c6e
| 124,105 |
ipynb
|
Jupyter Notebook
|
1_Data_Preparation.ipynb
|
noernimat/modeling_dataset_using_supervised_learning
|
d63ced44057d015256a47b556dab10f51f53412a
|
[
"CC0-1.0"
] | null | null | null |
1_Data_Preparation.ipynb
|
noernimat/modeling_dataset_using_supervised_learning
|
d63ced44057d015256a47b556dab10f51f53412a
|
[
"CC0-1.0"
] | null | null | null |
1_Data_Preparation.ipynb
|
noernimat/modeling_dataset_using_supervised_learning
|
d63ced44057d015256a47b556dab10f51f53412a
|
[
"CC0-1.0"
] | null | null | null | 39.039006 | 6,937 | 0.308876 |
[
[
[
"<a href=\"https://colab.research.google.com/github/NoerNikmat/machine_learning_models_for_absenteeism_at_work_dataset/blob/main/1_Data_Preparation.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"# DATA PREPARATION FOR MACHINE LEARNING MODELS\r\nUsing Absenteeism at work An UCI dataset",
"_____no_output_____"
],
[
"## Programming with Python",
"_____no_output_____"
],
[
"### Import Dataset from Kaggle",
"_____no_output_____"
],
[
"Install Kaggle for upload dataset into google colab",
"_____no_output_____"
]
],
[
[
"!pip install -q kaggle",
"_____no_output_____"
]
],
[
[
"Upload Kaggle API key",
"_____no_output_____"
]
],
[
[
"from google.colab import files\r\nfiles.upload()",
"_____no_output_____"
],
[
"! mkdir ~/.kaggle\r\n! cp kaggle.json ~/.kaggle/\r\n! chmod 600 ~/.kaggle/kaggle.json",
"_____no_output_____"
]
],
[
[
"Download dataset from Kaggle",
"_____no_output_____"
]
],
[
[
"! kaggle datasets download -d 'loganalive/absenteeism-at-work-an-uci-dataset/download' ",
"Downloading absenteeism-at-work-an-uci-dataset.zip to /content\n\r 0% 0.00/7.35k [00:00<?, ?B/s]\n\r100% 7.35k/7.35k [00:00<00:00, 5.45MB/s]\n"
],
[
"!ls",
"absenteeism-at-work-an-uci-dataset.zip\tkaggle.json sample_data\n"
],
[
"!unzip -q absenteeism-at-work-an-uci-dataset.zip",
"_____no_output_____"
],
[
"!ls",
"absenteeism-at-work-an-uci-dataset.zip\tkaggle.json\nAbsenteeism_at_work.csv\t\t\tsample_data\n"
]
],
[
[
"### Import Library",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport numpy as np",
"_____no_output_____"
],
[
"absent = pd.read_csv('Absenteeism_at_work.csv')\r\nabsent.head(20)",
"_____no_output_____"
],
[
"absent['Work load Average/day ']",
"_____no_output_____"
]
],
[
[
"Dimensions of data",
"_____no_output_____"
]
],
[
[
"shape = absent.shape\nprint (shape)",
"(740, 21)\n"
]
],
[
[
"Data type for each attribute",
"_____no_output_____"
]
],
[
[
"types = absent.dtypes\nprint(types)",
"ID int64\nReason for absence int64\nMonth of absence int64\nDay of the week int64\nSeasons int64\nTransportation expense int64\nDistance from Residence to Work int64\nService time int64\nAge int64\nWork load Average/day float64\nHit target int64\nDisciplinary failure int64\nEducation int64\nSon int64\nSocial drinker int64\nSocial smoker int64\nPet int64\nWeight int64\nHeight int64\nBody mass index int64\nAbsenteeism time in hours int64\ndtype: object\n"
],
[
"absent",
"_____no_output_____"
],
[
"# Null values in dataset\nabsent_data = pd.DataFrame(absent.isnull().sum())\nabsent_data = absent_data.rename(columns={0:\"Absent_sum\"})\nabsent_data[\"Absent Percent\"] = (absent_data[\"Absent_sum\"]/len(absent))*100\nabsent_data",
"_____no_output_____"
],
[
"absent.describe()",
"_____no_output_____"
],
[
"pd.set_option('display.width', 100)\npd.set_option('precision',3)\ndescription = absent.describe()\nprint(description)",
" ID Reason for absence ... Body mass index Absenteeism time in hours\ncount 740.000 740.000 ... 740.000 740.000\nmean 18.018 19.216 ... 26.677 6.924\nstd 11.021 8.433 ... 4.285 13.331\nmin 1.000 0.000 ... 19.000 0.000\n25% 9.000 13.000 ... 24.000 2.000\n50% 18.000 23.000 ... 25.000 3.000\n75% 28.000 26.000 ... 31.000 8.000\nmax 36.000 28.000 ... 38.000 120.000\n\n[8 rows x 21 columns]\n"
],
[
"class_counts = absent.groupby('Absenteeism time in hours').size()\nprint(class_counts)",
"Absenteeism time in hours\n0 44\n1 88\n2 157\n3 112\n4 60\n5 7\n7 1\n8 208\n16 19\n24 16\n32 6\n40 7\n48 1\n56 2\n64 3\n80 3\n104 1\n112 2\n120 3\ndtype: int64\n"
],
[
"correlations = absent.corr(method = 'pearson')\nprint(correlations)",
" ID ... Absenteeism time in hours\nID 1.000e+00 ... -0.018\nReason for absence -6.424e-02 ... -0.173\nMonth of absence -4.346e-05 ... 0.024\nDay of the week 3.447e-02 ... -0.124\nSeasons 9.849e-02 ... -0.006\nTransportation expense -2.242e-01 ... 0.028\nDistance from Residence to Work -4.862e-01 ... -0.088\nService time -2.727e-01 ... 0.019\nAge 4.090e-02 ... 0.066\nWork load Average/day 9.246e-02 ... 0.025\nHit target 1.879e-02 ... 0.027\nDisciplinary failure 4.502e-03 ... -0.124\nEducation -3.625e-02 ... -0.046\nSon 2.767e-03 ... 0.114\nSocial drinker -4.513e-01 ... 0.065\nSocial smoker -1.083e-02 ... -0.009\nPet -4.142e-02 ... -0.028\nWeight -2.542e-01 ... 0.016\nHeight 7.636e-02 ... 0.144\nBody mass index -3.069e-01 ... -0.050\nAbsenteeism time in hours -1.800e-02 ... 1.000\n\n[21 rows x 21 columns]\n"
]
],
[
[
"The most common method for calculating correlation is Pearsonโs Correlation Coefficient, that assumes a normal distribution of the attributes involved. A correlation of -1 or 1 shows a full negative or positive correlation respectively. Whereas a value of 0 shows no correlation at all.\n\nThe matrix lists all attributes across the top and down the side, to give correlation between all pairs of attributes (twice, because the matrix is symmetrical). There can show the diagonal line through the matrix from the top left to bottom right corners of the matrix shows perfect correlation of each attribute with itself.",
"_____no_output_____"
]
],
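For a concrete reference, Pearson's r divides the covariance of two attributes by the product of their standard deviations. A hand-rolled check against pandas, using toy numbers rather than values from the dataset:

```python
import numpy as np
import pandas as pd

x = np.array([1.0, 2.0, 3.0, 4.0, 5.0])
y = np.array([2.0, 4.0, 5.0, 4.0, 6.0])

# covariance term divided by the product of the standard-deviation terms
r_manual = ((x - x.mean()) * (y - y.mean())).sum() / (
    np.sqrt(((x - x.mean()) ** 2).sum()) * np.sqrt(((y - y.mean()) ** 2).sum()))
r_pandas = pd.Series(x).corr(pd.Series(y), method='pearson')
print(r_manual, r_pandas)  # both ~0.853
```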
[
[
"skew = absent.skew()\nprint(skew)",
"ID 0.017\nReason for absence -0.915\nMonth of absence 0.069\nDay of the week 0.102\nSeasons -0.039\nTransportation expense 0.396\nDistance from Residence to Work 0.312\nService time -0.005\nAge 0.698\nWork load Average/day 0.961\nHit target -1.262\nDisciplinary failure 3.952\nEducation 2.109\nSon 1.086\nSocial drinker -0.273\nSocial smoker 3.290\nPet 2.736\nWeight 0.017\nHeight 2.566\nBody mass index 0.305\nAbsenteeism time in hours 5.721\ndtype: float64\n"
]
],
[
[
"The skew result show a positive (right) or negative (left) skew. \nValues closer to zeroshow less skew.",
"_____no_output_____"
]
],
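To make the sign convention concrete: a long right tail gives positive skew, a long left tail gives negative skew. A toy illustration only, not data from the notebook:

```python
import pandas as pd

right_tailed = pd.Series([1, 1, 1, 2, 2, 3, 10])        # one large outlier on the right
left_tailed = pd.Series([-10, -3, -2, -2, -1, -1, -1])  # mirror image

print(right_tailed.skew())  # positive
print(left_tailed.skew())   # negative
```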
[
[
"def unique(list1): \n list_set = set(list1) \n unique_list = (list(list_set)) \n for x in unique_list: \n print (x)",
"_____no_output_____"
],
[
"abtag=absent['Absenteeism time in hours']\nunique(abtag)",
"0\n1\n2\n3\n4\n32\n5\n7\n8\n40\n64\n104\n16\n80\n112\n48\n24\n56\n120\n"
],
[
"abtag2=abtag[absent['Absenteeism time in hours']]\nunique(abtag2)",
"0\n1\n2\n3\n4\n40\n8\n56\n"
],
[
"# add categorical target column as per project requirement\nabsent['Absenteeism categories'] = np.where((absent['Absenteeism time in hours'] >= 0)&(absent['Absenteeism time in hours'] <= 20), \"Group 0\", \n np.where((absent['Absenteeism time in hours'] >= 21)&(absent['Absenteeism time in hours'] <= 40), \"Group 1\",\n np.where((absent['Absenteeism time in hours'] >= 41)&(absent['Absenteeism time in hours'] <= 60), \"Group 2\",\n np.where((absent['Absenteeism time in hours'] >= 61)&(absent['Absenteeism time in hours'] <= 80), \"Group 3\",\n np.where((absent['Absenteeism time in hours'] >= 81)&(absent['Absenteeism time in hours'] <= 100), \"Group 4\",\n np.where((absent['Absenteeism time in hours'] >= 101)&(absent['Absenteeism time in hours'] <= 120),\"Group 5\",0))\n ))))",
"_____no_output_____"
],
[
"absent.head(20)",
"_____no_output_____"
],
[
"absent['Absenteeism categories'].tail()",
"_____no_output_____"
]
],
[
[
"Formatting to proper data type",
"_____no_output_____"
]
],
[
[
"absent['followUp_req'] = np.where(absent['Reason for absence']<= 21,1, 0)\n\nabsent['Reason for absence'] = absent['Reason for absence'].astype('category')\nabsent['Month of absence'] = absent['Month of absence'].astype('category')\nabsent['Day of the week'] = absent['Day of the week'].astype('category')\nabsent['Seasons'] = absent['Seasons'].astype('category')\nabsent['Disciplinary failure'] = absent['Disciplinary failure'].astype('category')\nabsent['Education'] = absent['Education'].astype('category')\nabsent['Social drinker'] = absent['Social drinker'].astype('category')\nabsent['Social smoker'] = absent['Social smoker'].astype('category')\nabsent['Pet'] = absent['Pet'].astype('category')\nabsent['followUp_req'] = absent['followUp_req'].astype('category')\nabsent['Absenteeism categories'] = absent['Absenteeism categories'].astype('category')\nabsent.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 740 entries, 0 to 739\nData columns (total 23 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 ID 740 non-null int64 \n 1 Reason for absence 740 non-null category\n 2 Month of absence 740 non-null category\n 3 Day of the week 740 non-null category\n 4 Seasons 740 non-null category\n 5 Transportation expense 740 non-null int64 \n 6 Distance from Residence to Work 740 non-null int64 \n 7 Service time 740 non-null int64 \n 8 Age 740 non-null int64 \n 9 Work load Average/day 740 non-null float64 \n 10 Hit target 740 non-null int64 \n 11 Disciplinary failure 740 non-null category\n 12 Education 740 non-null category\n 13 Son 740 non-null int64 \n 14 Social drinker 740 non-null category\n 15 Social smoker 740 non-null category\n 16 Pet 740 non-null category\n 17 Weight 740 non-null int64 \n 18 Height 740 non-null int64 \n 19 Body mass index 740 non-null int64 \n 20 Absenteeism time in hours 740 non-null int64 \n 21 Absenteeism categories 740 non-null category\n 22 followUp_req 740 non-null category\ndtypes: category(11), float64(1), int64(11)\nmemory usage: 81.0 KB\n"
],
[
"# store two datasets, one for continous and other categorical\ndataset_continuous = absent.drop('Absenteeism categories', axis=1)\ndataset_categorical = absent.drop('Absenteeism time in hours',axis=1)\n\nprint(dataset_continuous.shape)\nprint(dataset_categorical.shape)",
"(740, 22)\n(740, 22)\n"
],
[
"# write the taining data to file\n\ndataset_continuous.to_csv('cleanDataset_continuousTarget.csv',index=False)\ndataset_categorical.to_csv('cleanDataset_categoricalTarget.csv',index=False)",
"_____no_output_____"
],
[
"# get the test dataset\ntest_path = 'Absenteeism_at_work.csv'\ndata_test = pd.read_csv(test_path, decimal=\",\")",
"_____no_output_____"
],
[
"# preprocess the test dataset\n# adding new column named 'followUp_req' based on whether reason for absence required follow up or not\ndata_test['followUp_req'] = np.where(data_test['Reason for absence'] <= 21, 1, 0)\n\n# add categorical target column as per project requirement\n\ndata_test['Absenteeism categories'] = np.where(data_test['Absenteeism time in hours'] == 0, \"Group 0\", \n np.where(data_test['Absenteeism time in hours'] == 1, \"Group 1\",\n np.where(data_test['Absenteeism time in hours'] == 2, \"Group 2\",\n np.where(data_test['Absenteeism time in hours'] == 3, \"Group 3\",\n np.where((data_test['Absenteeism time in hours'] >= 4)&(data_test['Absenteeism time in hours'] <= 7), \"Group 4\",\n np.where(data_test['Absenteeism time in hours'] == 8, \"Group 5\",\n np.where(data_test['Absenteeism time in hours'] >= 9, \"Group 6\",0))\n )))))\n\ndata_test['Reason for absence'] = data_test['Reason for absence'].astype('category').cat.codes\ndata_test['Month of absence'] = data_test['Month of absence'].astype('category').cat.codes\ndata_test['Day of the week'] = data_test['Day of the week'].astype('category').cat.codes\ndata_test['Seasons'] = data_test['Seasons'].astype('category').cat.codes\ndata_test['Disciplinary failure'] = data_test['Disciplinary failure'].astype('category').cat.codes\ndata_test['Education'] = data_test['Education'].astype('category').cat.codes\ndata_test['Social drinker'] = data_test['Social drinker'].astype('category').cat.codes\ndata_test['Social smoker'] = data_test['Social smoker'].astype('category').cat.codes\ndata_test['Pet'] = data_test['Pet'].astype('category').cat.codes\ndata_test['followUp_req'] = data_test['followUp_req'].astype('category').cat.codes\ndata_test['Absenteeism categories'] = data_test['Absenteeism categories'].astype('category').cat.codes",
"_____no_output_____"
],
[
"dataset_test = data_test\n\nprint(dataset_test.shape)",
"(740, 23)\n"
],
[
"data_test.head()",
"_____no_output_____"
],
[
"dataset_test.to_csv('cleanDataset_categoricalTarget_test.csv',index=False)",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf83be92a69af9e9cecbd386b56a35866c97231
| 8,714 |
ipynb
|
Jupyter Notebook
|
notebooks/demos/imdb_bidirectional_lstm.ipynb
|
laturose/keras-js
|
953b4d79943419969449697cf154a6ccd9d3f7e2
|
[
"MIT"
] | null | null | null |
notebooks/demos/imdb_bidirectional_lstm.ipynb
|
laturose/keras-js
|
953b4d79943419969449697cf154a6ccd9d3f7e2
|
[
"MIT"
] | null | null | null |
notebooks/demos/imdb_bidirectional_lstm.ipynb
|
laturose/keras-js
|
953b4d79943419969449697cf154a6ccd9d3f7e2
|
[
"MIT"
] | 1 |
2019-04-22T05:23:34.000Z
|
2019-04-22T05:23:34.000Z
| 30.900709 | 756 | 0.585495 |
[
[
[
"# Bidirection LSTM - IMDB sentiment classification\n\nsee **https://github.com/fchollet/keras/blob/master/examples/imdb_bidirectional_lstm.py**",
"_____no_output_____"
]
],
[
[
"KERAS_MODEL_FILEPATH = '../../demos/data/imdb_bidirectional_lstm/imdb_bidirectional_lstm.h5'",
"_____no_output_____"
],
[
"import numpy as np\nnp.random.seed(1337) # for reproducibility\n\nfrom keras.preprocessing import sequence\nfrom keras.models import Sequential\nfrom keras.layers import Dense, Dropout, Embedding, LSTM, Input, Bidirectional\nfrom keras.datasets import imdb\nfrom keras.callbacks import EarlyStopping, ModelCheckpoint\n\nimport json",
"Using TensorFlow backend.\n/home/leon/miniconda3/lib/python3.6/importlib/_bootstrap.py:219: RuntimeWarning: compiletime version 3.5 of module 'tensorflow.python.framework.fast_tensor_util' does not match runtime version 3.6\n return f(*args, **kwds)\n"
],
[
"max_features = 20000\nmaxlen = 200 # cut texts after this number of words (among top max_features most common words)\n\nprint('Loading data...')\n(X_train, y_train), (X_test, y_test) = imdb.load_data(num_words=max_features)\nprint(len(X_train), 'train sequences')\nprint(len(X_test), 'test sequences')\n\nprint(\"Pad sequences (samples x time)\")\nX_train = sequence.pad_sequences(X_train, maxlen=maxlen)\nX_test = sequence.pad_sequences(X_test, maxlen=maxlen)\nprint('X_train shape:', X_train.shape)\nprint('X_test shape:', X_test.shape)\ny_train = np.array(y_train)\ny_test = np.array(y_test)",
"Loading data...\nDownloading data from https://s3.amazonaws.com/text-datasets/imdb.npz\n17465344/17464789 [==============================] - 9s 1us/step\n25000 train sequences\n25000 test sequences\nPad sequences (samples x time)\nX_train shape: (25000, 200)\nX_test shape: (25000, 200)\n"
],
[
"model = Sequential()\nmodel.add(Embedding(max_features, 64, input_length=maxlen))\nmodel.add(Bidirectional(LSTM(32)))\nmodel.add(Dropout(0.5))\nmodel.add(Dense(1, activation='sigmoid'))\n\n# try using different optimizers and different optimizer configs\nmodel.compile('adam', 'binary_crossentropy', metrics=['accuracy'])",
"_____no_output_____"
],
[
"# Model saving callback\ncheckpointer = ModelCheckpoint(filepath=KERAS_MODEL_FILEPATH, monitor='val_acc', verbose=1, save_best_only=True)\n\n# Early stopping\nearly_stopping = EarlyStopping(monitor='val_acc', verbose=1, patience=2)\n\n# train\nbatch_size = 128\nepochs = 10\nmodel.fit(X_train, y_train, \n validation_data=[X_test, y_test],\n batch_size=batch_size, epochs=epochs, verbose=2,\n callbacks=[checkpointer, early_stopping])",
"Train on 25000 samples, validate on 25000 samples\nEpoch 1/10\nEpoch 00001: val_acc improved from -inf to 0.86916, saving model to ../../demos/data/imdb_bidirectional_lstm/imdb_bidirectional_lstm.h5\n - 77s - loss: 0.4602 - acc: 0.7830 - val_loss: 0.3180 - val_acc: 0.8692\nEpoch 2/10\nEpoch 00002: val_acc improved from 0.86916 to 0.87272, saving model to ../../demos/data/imdb_bidirectional_lstm/imdb_bidirectional_lstm.h5\n - 74s - loss: 0.2304 - acc: 0.9169 - val_loss: 0.3164 - val_acc: 0.8727\nEpoch 3/10\nEpoch 00003: val_acc did not improve\n - 74s - loss: 0.1490 - acc: 0.9520 - val_loss: 0.3412 - val_acc: 0.8637\nEpoch 4/10\nEpoch 00004: val_acc did not improve\n - 75s - loss: 0.1026 - acc: 0.9694 - val_loss: 0.4146 - val_acc: 0.8626\nEpoch 00004: early stopping\n"
]
],
[
[
"**sample data**",
"_____no_output_____"
]
],
[
[
"word_index = imdb.get_word_index()",
"Downloading data from https://s3.amazonaws.com/text-datasets/imdb_word_index.json\n1646592/1641221 [==============================] - 1s 0us/step\n"
],
[
"word_dict = {idx: word for word, idx in word_index.items()}",
"_____no_output_____"
],
[
"sample = []\nfor idx in X_train[0]:\n if idx >= 3:\n sample.append(word_dict[idx-3])\n elif idx == 2:\n sample.append('-')\n' '.join(sample)",
"_____no_output_____"
],
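The `idx - 3` shift in the decoding loop above exists because `keras.datasets.imdb` reserves the lowest indices for special tokens by default (`index_from=3`): 0 is padding, 1 marks the start of a sequence, and 2 is the out-of-vocabulary marker. A condensed restatement of that mapping, assuming the `word_dict` built in the previous cell:

```python
def decode(indices):
    # 0/1/2 are reserved (padding / start / unknown); real word ids begin at 3
    return ' '.join(word_dict[i - 3] if i >= 3 else '-' for i in indices)
```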
[
"with open('../../demos/data/imdb_bidirectional_lstm/imdb_dataset_word_index_top20000.json', 'w') as f:\n f.write(json.dumps({word: idx for word, idx in word_index.items() if idx < max_features}))",
"_____no_output_____"
],
[
"with open('../../demos/data/imdb_bidirectional_lstm/imdb_dataset_word_dict_top20000.json', 'w') as f:\n f.write(json.dumps({idx: word for word, idx in word_index.items() if idx < max_features}))",
"_____no_output_____"
],
[
"sample_test_data = []\nfor i in np.random.choice(range(X_test.shape[0]), size=1000, replace=False):\n sample_test_data.append({'values': X_test[i].tolist(), 'label': y_test[i].tolist()})\n \nwith open('../../demos/data/imdb_bidirectional_lstm/imdb_dataset_test.json', 'w') as f:\n f.write(json.dumps(sample_test_data))",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8413d4126b0465d1d0b52d9ed5fff2fe3b632
| 5,576 |
ipynb
|
Jupyter Notebook
|
Grokking-Algorithms/08.greedy-algorithm.ipynb
|
zzsza/Algorithm-Training
|
5d2a428bbe56536b23e370b0ef121ecd3cfc7793
|
[
"MIT"
] | 10 |
2017-11-28T05:12:48.000Z
|
2021-08-30T13:46:38.000Z
|
Grokking-Algorithms/08.greedy-algorithm.ipynb
|
zzsza/Algorithm-Training
|
5d2a428bbe56536b23e370b0ef121ecd3cfc7793
|
[
"MIT"
] | null | null | null |
Grokking-Algorithms/08.greedy-algorithm.ipynb
|
zzsza/Algorithm-Training
|
5d2a428bbe56536b23e370b0ef121ecd3cfc7793
|
[
"MIT"
] | 3 |
2019-05-24T05:43:54.000Z
|
2021-08-24T11:32:17.000Z
| 20.651852 | 91 | 0.47023 |
[
[
[
"## ์ด๋ฒ ํํธ์์ \n- ๋น ๋ฅธ ์๊ณ ๋ฆฌ์ฆ ํด๋ฒ์ด ์กด์ฌํ์ง ์๋ NP_์์ ๋ฌธ์ ๋ฅผ ๋ค๋ฃจ๋ ๋ฒ\n- ์๊ฐ์ ๋ญ๋นํ์ง ์๋๋ก ๋ฌธ์ ํด๊ฒฐ์ด ๋ถ๊ฐ๋ฅํ์ง ์๋์ง ํ์
ํ๋ ๋ฐฉ๋ฒ ๊ณต๋ถ\n- NP-์์ ๋ฌธ์ ์ ๋ํ ๊ฐ๋ตํ ํด๋ฒ์ ๋นจ๋ฆฌ ๊ตฌํ ์ ์๋ ๊ทผ์ฌ ์๊ณ ๋ฆฌ์ฆ\n- ํ์ ์๊ณ ๋ฆฌ์ฆ ๊ณต๋ถ\n\n### ํ์ ์๊ณ ๋ฆฌ์ฆ\n- ๊ฐ๊ฐ์ ๋จ๊ณ์์ ์ต์ ์ ์๋ฅผ ์ฐพ์\n- ๊ตญ์ ์ต์ ํด๋ฅผ ์ฐพ์ ์ต์ข
์ ์ผ๋ก ์ ์ญ ์ต์ ํด๋ฅผ ๊ตฌํจ",
"_____no_output_____"
],
[
"### ๊ทผ์ฌ ์๊ณ ๋ฆฌ์ฆ\n- ์ ํํ ๋ต์ ๊ณ์ฐํ ๋ ์๊ฐ์ด ๋ง์ด ๊ฑธ๋ฆฌ๋ฉด ๊ทผ์ฌ ์๊ณ ๋ฆฌ์ฆ์ ์ฌ์ฉํ ์ ์์\n- ์ฑ๋ฅ์ ๋ค์ ๋ ๊ฐ์ง๋ก ํ๋จ\n - ์ผ๋ง๋ ๋น ๋ฅธ๊ฐ\n - ์ต์ ํด์ ์ผ๋ง๋ ๊ฐ๊น์ด๊ฐ",
"_____no_output_____"
]
],
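For context before the set-covering code below: testing every subset of stations exactly is O(2^n), which is precisely the blow-up the greedy approximation sidesteps. A quick illustration of the count, with purely illustrative sizes:

```python
for n in (5, 10, 32, 100):
    print(n, 'stations ->', 2 ** n, 'subsets to check exhaustively')
# at n=100 that is ~1.27e30 subsets, so the greedy O(n^2) pass is the practical choice
```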
[
[
"states_needed = set([\"mt\", \"wa\", \"or\", \"id\", \"nv\", \"ut\", \"ca\", \"az\"])",
"_____no_output_____"
],
[
"stations = {}",
"_____no_output_____"
],
[
"stations[\"kone\"] = set([\"id\", \"nv\", \"ut\"])\nstations[\"ktwo\"] = set([\"wa\", \"id\", \"mt\"])\nstations[\"kthree\"] = set([\"or\", \"nv\", \"ca\"])\nstations[\"kfour\"] = set([\"nv\", \"ut\"])\nstations[\"kfive\"] = set([\"ca\", \"az\"])",
"_____no_output_____"
],
[
"final_stations = set()",
"_____no_output_____"
],
[
"while states_needed:\n best_station = None\n states_covered = set()\n for station, states in stations.items():\n covered = states_needed & states\n if len(covered) > len(states_covered):\n best_station = station\n states_covered = covered\n\n states_needed -= states_covered\n final_stations.add(best_station)\n \nprint(final_stations)",
"{'kthree', 'kfive', 'ktwo', 'kone'}\n"
]
],
[
[
"### set ๊ด๋ จ ์ฐ์ฐ",
"_____no_output_____"
]
],
[
[
"fruits = set([\"avocado\", \"tomato\", \"banana\"])\nvegatables = set([\"beets\", \"carrots\", \"tomato\"])",
"_____no_output_____"
],
[
"fruits | vegatables",
"_____no_output_____"
],
[
"fruits & vegatables",
"_____no_output_____"
],
[
"fruits - vegatables",
"_____no_output_____"
],
[
"vegatables - fruits",
"_____no_output_____"
]
],
[
[
"### NP-์์ ๋ฌธ์ \n- ๋ชจ๋ ๊ฐ๋ฅํ ๊ฒฝ๋ก ํ์\n- [์ํค](https://ko.wikipedia.org/wiki/NP-%EC%99%84%EC%A0%84)\n- NP-์์ ๋ฌธ์ ์ธ์ง ํ์ธํ๋ ๋ฐฉ๋ฒ\n - ํญ๋ชฉ์ด ์ ์ ๋ ๋น ๋ฅด์ง๋ง ํญ๋ชฉ์ด ๋์ด๋๋ฉฐ ๋๋ ค์ง๋ ๊ฒฝ์ฐ\n - X์ ๋ชจ๋ ์กฐํฉ์ด๋ผ๊ณ ํ ๊ฒฝ์ฐ\n - ๋ ์์ ํ์ ๋ฌธ์ ๋ก ๋ณํํ ์ ์์ด์ X์ ๊ฐ๋ฅํ ๋ชจ๋ ๋ฒ์ ์ ๊ณ์ฐํด์ผ ํ๋ฉด ์๋ง NP-์์ ๋ฌธ์ ์ผ ์ ์์\n - ์์ด์ ํฌํจํ๊ณ ํ๊ธฐ ์ด๋ ค์ฐ๋ฉด NP-์์ ๋ฌธ์ ์ผ ์ ์์\n - ์งํฉ์ ํฌํจํ๊ณ ํ๊ธฐ ์ด๋ ค์ฐ๋ฉด NP-์์ ๋ฌธ์ ์ผ ์ ์์\n - ๋ฌธ์ ๋ฅผ ์งํฉ ์ปค๋ฒ๋ง ๋ฌธ์ ๋ ์ธํ์ ๋ฌธ์ ๋ก ์ฌ์ ์ํ ์ ์์ผ๋ฉด ๋ช
๋ฐฑํ๊ฒ NP-์์ ๋ฌธ์ ",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
]
] |
cbf846be5c6583ce4786763df7b5d34dba34b7b8
| 4,937 |
ipynb
|
Jupyter Notebook
|
quantum-matchup-generator.ipynb
|
Tomaszbrauntsch/quant-matchup-generator
|
30f496027bbd254b52c4c9f3eded0102ae775075
|
[
"CC0-1.0"
] | null | null | null |
quantum-matchup-generator.ipynb
|
Tomaszbrauntsch/quant-matchup-generator
|
30f496027bbd254b52c4c9f3eded0102ae775075
|
[
"CC0-1.0"
] | null | null | null |
quantum-matchup-generator.ipynb
|
Tomaszbrauntsch/quant-matchup-generator
|
30f496027bbd254b52c4c9f3eded0102ae775075
|
[
"CC0-1.0"
] | null | null | null | 29.921212 | 153 | 0.563095 |
[
[
[
"\"\"\"\nRemoval of players?\nCounter when a player has been choose too many times\n\"\"\"\n\n#Eventually implement a virtual environment\n\n# Importing standard Qiskit libraries\nfrom qiskit import QuantumCircuit, transpile, Aer, IBMQ, execute\nfrom qiskit.tools.jupyter import *\nfrom ibm_quantum_widgets import *\n\nfrom math import sqrt, ceil, floor\n\nprovider = IBMQ.load_account() #Loading IBMQ Creds\n\n#implement user input for list of names and how many names to get generated\nlist_of_players = []\n\nnum_to_generate = int(input(\"How many players need to be generate: \"))\ncurrent_name = input(\"Enter a name to add to the list: \") #import file???\n\nwhile(current_name != \"\"):\n list_of_players.append(current_name)\n current_name = input(\"Enter a name to add to the list: \")\n\namount_of_qubits = ceil(sqrt(len(list_of_players))) #finds amount of qubits needed for the amount of states needed\n\nname_qubits = QuantumCircuit(amount_of_qubits)\n\n#for having the h-gates use for loops\nfor x in range(amount_of_qubits):\n name_qubits.h(x)\n\nname_qubits.measure_all()\n",
"ibmqfactory.load_account:WARNING:2021-07-30 00:42:52,347: Credentials are already in use. The existing account in the session will be replaced.\n"
],
[
"# Use Aer's qasm_simulator\nsimulator = Aer.get_backend('qasm_simulator')\n# Execute the circuit on the qasm simulator\n\ncurrent_players = [] #used for formatting the print statement\n\nfor x in range(num_to_generate): #used for user input on amount of players to choose\n rng_index = rng(simulator) \n for player in current_players: #error handling; for each player in \n while(list_of_players[rng_index] == player):\n rng_index = rng(simulator)\n current_players.append(list_of_players[rng_index])\n print(list_of_players[rng_index], end = \" \")\n if (x != num_to_generate-1):\n print(\"and\", end = \" \")\n\n \n# if(x != num_to_generate-1):\n# print(\"and\", end=\" \")\nprint(\"are versing right now!\")",
"Tom and Cris are versing right now!\n"
],
[
"def rng(simulator):\n job = execute(name_qubits, simulator, shots=1)\n \n result = job.result()\n counts = result.get_counts() #obtain results from the job\n \n measured_bit = max(counts, key=counts.get) # the key is used to prevent errors\n measured_bit_val = int(max(counts),2)\n rng_to_index = floor(measured_bit_val * (len(list_of_players)/(2**(amount_of_qubits)))) #converting random number to work with list \n return rng_to_index",
"_____no_output_____"
]
]
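The scaling inside `rng` maps a measured bitstring onto a valid list index: `n` qubits give `2**n` roughly equally likely outcomes, and multiplying by `len(list_of_players) / 2**n` squeezes them into range. A worked example with hypothetical numbers (3 players, 2 qubits, since `ceil(sqrt(3)) == 2`):

```python
from math import floor

players = 3
qubits = 2  # 2**2 = 4 possible measurement outcomes: 0..3
for measured in range(2 ** qubits):
    print(measured, '->', floor(measured * (players / 2 ** qubits)))
# 0 -> 0, 1 -> 0, 2 -> 1, 3 -> 2: always in range, though index 0 is picked twice as often
```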
] |
[
"code"
] |
[
[
"code",
"code",
"code"
]
] |
cbf847975b8b98b007bd3bc3ef3727d05963e8f5
| 20,153 |
ipynb
|
Jupyter Notebook
|
tcc/notebooks/TCC PT2.ipynb
|
rvitorgomes/textCrawler
|
4d2cb0fe43f785293c625ed6c5822055f0d86f63
|
[
"CNRI-Python"
] | 3 |
2020-11-17T10:27:55.000Z
|
2021-11-15T08:06:19.000Z
|
tcc/notebooks/TCC PT2.ipynb
|
rvitorgomes/textCrawler
|
4d2cb0fe43f785293c625ed6c5822055f0d86f63
|
[
"CNRI-Python"
] | 2 |
2021-03-14T22:14:04.000Z
|
2021-05-07T03:19:57.000Z
|
tcc/notebooks/TCC PT2.ipynb
|
rashamalek/tripadvisor-crawler
|
81c9434a7a7409616b53341e043ab75139879f4f
|
[
"MIT"
] | 3 |
2020-08-13T06:48:57.000Z
|
2021-11-15T07:56:06.000Z
| 30.305263 | 370 | 0.514514 |
[
[
[
"## TODO:\n<ul>\n <li>Usar o libreoffice e encontrar 2000 palavras erradas (80h)</li>\n <li>Classificar as palavras por tipo (80h)</li>\n</ul>",
"_____no_output_____"
],
[
"## <b>Italian Pipeline</b>",
"_____no_output_____"
]
],
[
[
"# load hunspell",
"_____no_output_____"
],
[
"import urllib\nimport json\nimport numpy as np\nimport pandas as pd\nimport itertools\nfrom matplotlib import pyplot as plt\nimport re",
"_____no_output_____"
],
[
"suggestions = pd.DataFrame(data)",
"_____no_output_____"
],
[
"suggestions",
"_____no_output_____"
],
[
"suggestions.to_csv('suggestions.auto.csv')",
"_____no_output_____"
],
[
"import hunspell\nit_spellchecker = hunspell.HunSpell('/home/rgomes/dictionaries/dictionaries/it/index.dic', '/home/rgomes/dictionaries/dictionaries/it/index.aff')",
"_____no_output_____"
],
[
"with open('../auto.spellchecker.results.filtered.json', encoding='utf-8') as data_file:\n data = json.loads(data_file.read())\n data = list(filter(lambda x: x,data))",
"_____no_output_____"
],
[
"a = map(lambda x: x['word'], data)\nb = map(lambda x : (x,it_spellchecker.spell(x)), a)\nasd = filter(lambda x: x[1] ,b)\nerrors_hunspell = list(filter(lambda x: x[1] == False , b))",
"_____no_output_____"
],
[
"ac_errors = filter(lambda x: re.search(r'[ร-ลพ\\'\\`]', x[0]) ,errors_hunspell)",
"_____no_output_____"
],
[
"# for item in list(ac_errors):\n #print(item[0] + '\\n')",
"_____no_output_____"
],
[
"corrected_ac_errors = []\nwith open('../italian_accented_erros.txt', encoding='utf-8') as data_file2:\n lines = data_file2.readlines()\n corrected_ac_errors = list(filter(lambda y: y != '',map(lambda x: x.rstrip('\\n'), lines)))",
"_____no_output_____"
],
[
"corrected_words = []\nfor index,x in enumerate(ac_errors):\n if x[0] != corrected_ac_errors[index]:\n corrected_words.append((x[0], corrected_ac_errors[index]))",
"_____no_output_____"
],
[
"all_words = []\nwith open('../italian_words_all.txt', encoding='utf-8') as data_file_all:\n lines = data_file_all.readlines()\n all_words = list(map(lambda x: x.rstrip('\\n').lower(), lines))\n all_words = list(map(lambda x: x.replace('!#$%&()*+,./:;<=>?@[\\\\]_{|}', ''), all_words))",
"_____no_output_____"
],
[
"def histogram(list):\n d={}\n for i in list:\n if i in d:\n d[i] += 1\n else:\n d[i] = 1\n return d",
"_____no_output_____"
],
[
"def plotHistogram(data):\n h = histogram(data)\n h = sorted(h.items(), key=lambda x: x[1], reverse=True)\n h = map(lambda x: x[1], h)\n # remove the words that appears only once\n h = filter(lambda x: x > 1, h)\n\n plt.plot(list(h))\n plt.show()",
"_____no_output_____"
],
[
"suggestions_csv = pd.read_csv('/home/rgomes/Downloads/suggestions filtered - suggestions.auto.csv')\nsuggestions_csv = suggestions_csv.replace(np.nan, '', regex=True)",
"_____no_output_____"
],
[
"suggestions_csv.drop(['is_italian_word', 'suggestions', 'HELPFUL LINK', 'Already removed words'], axis=1)\n\nsuggestions_corrected = []\nfor _, row in suggestions_csv.iterrows():\n if row['spelling_correction']:\n suggestions_corrected.append((row['word'], row['spelling_correction']))",
"_____no_output_____"
],
[
"suggestions_corrected",
"_____no_output_____"
],
[
"print(len(suggestions_corrected))",
"142\n"
],
[
"h = histogram(all_words)\nh = sorted(h.items(), key=lambda x: x[1], reverse=True)\n\n#######\n# filtra apenas aquelas corrigidas com repeticao\ncombined_corrections_map = list(set(corrected_words + suggestions_corrected))\nprint('Total corrections {}'.format(len(combined_corrections_map)))\n\ncombined_words_list = list(map(lambda x : x[0].lower(), combined_corrections_map))\n#print(combined_words_list)\nmapped_combined_words = filter(lambda x : x[0].lower() in combined_words_list, h)\n\ntotal_words = list(mapped_combined_words)\nprint(total_words[0])\n\ncount = 0\nfor w in total_words:\n count = count + w[1]\nprint(count)",
"Total corrections 252\n('perchรจ', 1237)\n1981\n"
],
[
"combined_corrections_map\nprint(len(corrected_words), len(suggestions_corrected))",
"114 142\n"
],
[
"a_ordered = filter(lambda x: re.search(r'[ร-ลพ\\'\\`]', x[0]),h)\nb_ordered = filter(lambda x: not it_spellchecker.spell(x[0]),a_ordered)\nc_ordered = filter(lambda x: not(x[0] in combined_words_list),b_ordered)\nd = list(c_ordered)\ncount2 = 0\nfor w in d:\n count2 = count2 + w[1]\nprint(count2)",
"21459\n"
],
[
"with open('../ordered_last_errors.txt', 'w') as ordered_last_errors:\n for item in d:\n ordered_last_errors.write(item[0] + '\\n')",
"_____no_output_____"
],
[
"last_corrections = []\nwith open('../ordered_last_errors_corrected.txt') as ordered_last_corrections:\n lines = list(map(lambda x: x.rstrip('\\n').lower(), ordered_last_corrections))\n for index, item in enumerate(d):\n if item[0] != lines[index]:\n last_corrections.append((item[0],lines[index]))\nprint(len(last_corrections))",
"92\n"
],
[
"h = histogram(all_words)\nh = sorted(h.items(), key=lambda x: x[1], reverse=True)\n\n# filtra apenas aquelas corrigidas com repeticao\ncombined_corrections_map = list(set(corrected_words + suggestions_corrected + last_corrections))\n#combined_corrections_map = list(map(lambda x : (x[0].replace('!\"#$%&\\'()*+,-./:;<=>?@[\\\\]^_`{|}~', ''), combined_corrections_map)))\nprint('Total corrections {}'.format(len(combined_corrections_map)))\n\ncombined_words_list = list(map(lambda x : x[0].lower(), combined_corrections_map))\n#print(combined_words_list)\nmapped_combined_words = list(filter(lambda x : x[0].lower() in combined_words_list, h))\n\n#remove rare cases and outliers\n# todo: remove nonsense words verified by norton\ntotal_words = list(filter(lambda x: x[1] > 1 and x[1] < 2200,mapped_combined_words))\n\nprint(total_words[0])\n\ncount = 0\nfor w in total_words:\n count = count + w[1]\nprint(count)",
"Total corrections 344\n('perchรจ', 1237)\n4537\n"
],
[
"all_count_dict = dict((a[0], a) for a in total_words)",
"_____no_output_____"
],
[
"all_corrections_dict = dict((a[0], a) for a in combined_corrections_map)",
"_____no_output_____"
],
[
"all_data = []\nfor item in all_count_dict:\n if all_corrections_dict.get(item):\n all_data.append((item, all_count_dict[item][1], all_corrections_dict[item][1]))",
"_____no_output_____"
],
[
"print(len(all_data))\ndf = pd.DataFrame(all_data)",
"133\n"
],
[
"df.to_csv('../final_corrections.csv')",
"_____no_output_____"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf84e33e365ae84877636f6fae3c436e00e57aa
| 12,527 |
ipynb
|
Jupyter Notebook
|
scratch.ipynb
|
silberman/snecko
|
0e34e4c530fd74eb3c93bf5ea2060f36d85a69a4
|
[
"MIT"
] | null | null | null |
scratch.ipynb
|
silberman/snecko
|
0e34e4c530fd74eb3c93bf5ea2060f36d85a69a4
|
[
"MIT"
] | 3 |
2020-08-11T05:26:24.000Z
|
2020-08-14T23:05:26.000Z
|
scratch.ipynb
|
silberman/snecko
|
0e34e4c530fd74eb3c93bf5ea2060f36d85a69a4
|
[
"MIT"
] | 1 |
2020-08-11T05:18:53.000Z
|
2020-08-11T05:18:53.000Z
| 28.213964 | 147 | 0.37024 |
[
[
[
"import fastai\nimport random\nimport torch\n\nfrom fastai.tabular import *",
"_____no_output_____"
],
[
"%load_ext autoreload\n%autoreload 1\n%aimport logs",
"_____no_output_____"
],
[
"# Check if we're using the GPU.\ntorch.cuda.get_device_name(0)",
"_____no_output_____"
]
],
[
[
"This next section is training the model. You don't have to do this if the model is already trained.",
"_____no_output_____"
]
],
[
[
"fname = logs.character_filename(\"ironclad\")\ndf = pd.read_csv(fname)\ndf.head()",
"_____no_output_____"
],
[
"size = len(df)\nprocs = [Categorify, Normalize]\ndep_var = \"Picked\"\ncat_names = [\"Character\", \"Choice1\", \"Choice2\", \"Choice3\"]\nvalid_idx = sorted(random.sample(list(range(size)), int(size / 5)))",
"_____no_output_____"
],
[
"data = TabularDataBunch.from_df(fname, df, dep_var, valid_idx=valid_idx, procs=procs, cat_names=cat_names)",
"_____no_output_____"
],
[
"learn = tabular_learner(data, layers=[200,100], metrics=accuracy)",
"_____no_output_____"
],
[
"learn.fit_one_cycle(1, 0.01)",
"epoch train_loss valid_loss accuracy time \n0 0.998445 115610.2734380.576721 00:08 \n"
],
[
"learn.fit_one_cycle(1, 0.001)",
"epoch train_loss valid_loss accuracy time \n0 0.943560 64.770348 0.585118 00:07 \n"
],
[
"learn.export(logs.cachefile(\"ironclad.learn\"))",
"_____no_output_____"
]
],
[
[
"This next section is testing out the model on sample data. The `load_learner` call is loading a cached learner instead of training a new one.",
"_____no_output_____"
]
],
[
[
"learn = load_learner(logs.CACHE, \"ironclad.learn\")",
"_____no_output_____"
],
[
"deck = [\"Strike_R\"] * 3 + [\"Defend_R\"] * 4 + [\n \"Bash+1\", \"Anger\", \"AscendersBane\", \"Disarm+1\", \"Headbutt\", \"Evolve\", \"Impervious\", \"Whirlwind+1\", \"Shockwave\",\n \"Demon Form+1\", \"Heavy Blade+1\", \"Dark Embrace+1\", \"Headbutt+1\", \"Sever Soul+1\", \"Clothesline+1\", \"Corruption\"]\nrelics = [\"Burning Blood\", \"Cursed Key\"]\nfloor = 36\nchoices = [\"Perfected Strike+1\", \"Sword Boomerang+1\", \"Armaments\"]\ntestcsv = logs.mini_csv(\"IRONCLAD\", floor, deck, relics, choices)\ntestf = pd.read_csv(testcsv)\np = learn.predict(testf.iloc[0])\np",
"_____no_output_____"
],
[
"list(p[2].numpy())",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbf8525e96e5ce3d5c76b816f266a4f21443f359
| 40,208 |
ipynb
|
Jupyter Notebook
|
3-recurrent-neural-networks/tensorboard/Anna KaRNNa.ipynb
|
vanyaland/deep-learning-foundation
|
05a0df56c8223547bd7e8b62653a67f265c8e5ca
|
[
"MIT"
] | 6 |
2017-04-18T13:48:30.000Z
|
2018-01-02T13:32:16.000Z
|
3-recurrent-neural-networks/tensorboard/Anna KaRNNa.ipynb
|
ivan-magda/deep-learning-foundation
|
05a0df56c8223547bd7e8b62653a67f265c8e5ca
|
[
"MIT"
] | null | null | null |
3-recurrent-neural-networks/tensorboard/Anna KaRNNa.ipynb
|
ivan-magda/deep-learning-foundation
|
05a0df56c8223547bd7e8b62653a67f265c8e5ca
|
[
"MIT"
] | null | null | null | 53.468085 | 3,712 | 0.575458 |
[
[
[
"# Anna KaRNNa\n\nIn this notebook, I'll build a character-wise RNN trained on Anna Karenina, one of my all-time favorite books. It'll be able to generate new text based on the text from the book.\n\nThis network is based off of Andrej Karpathy's [post on RNNs](http://karpathy.github.io/2015/05/21/rnn-effectiveness/) and [implementation in Torch](https://github.com/karpathy/char-rnn). Also, some information [here at r2rt](http://r2rt.com/recurrent-neural-networks-in-tensorflow-ii.html) and from [Sherjil Ozair](https://github.com/sherjilozair/char-rnn-tensorflow) on GitHub. Below is the general architecture of the character-wise RNN.\n\n<img src=\"assets/charseq.jpeg\" width=\"500\">",
"_____no_output_____"
]
],
[
[
"import time\nfrom collections import namedtuple\n\nimport numpy as np\nimport tensorflow as tf",
"_____no_output_____"
]
],
[
[
"First we'll load the text file and convert it into integers for our network to use.",
"_____no_output_____"
]
],
[
[
"with open('anna.txt', 'r') as f:\n text=f.read()\nvocab = set(text)\nvocab_to_int = {c: i for i, c in enumerate(vocab)}\nint_to_vocab = dict(enumerate(vocab))\nchars = np.array([vocab_to_int[c] for c in text], dtype=np.int32)",
"_____no_output_____"
],
[
"text[:100]",
"_____no_output_____"
],
[
"chars[:100]",
"_____no_output_____"
]
],
[
[
"Now I need to split up the data into batches, and into training and validation sets. I should be making a test set here, but I'm not going to worry about that. My test will be if the network can generate new text.\n\nHere I'll make both input and target arrays. The targets are the same as the inputs, except shifted one character over. I'll also drop the last bit of data so that I'll only have completely full batches.\n\nThe idea here is to make a 2D matrix where the number of rows is equal to the number of batches. Each row will be one long concatenated string from the character data. We'll split this data into a training set and validation set using the `split_frac` keyword. This will keep 90% of the batches in the training set, the other 10% in the validation set.",
"_____no_output_____"
]
],
[
[
"def split_data(chars, batch_size, num_steps, split_frac=0.9):\n \"\"\" \n Split character data into training and validation sets, inputs and targets for each set.\n \n Arguments\n ---------\n chars: character array\n batch_size: Size of examples in each of batch\n num_steps: Number of sequence steps to keep in the input and pass to the network\n split_frac: Fraction of batches to keep in the training set\n \n \n Returns train_x, train_y, val_x, val_y\n \"\"\"\n \n \n slice_size = batch_size * num_steps\n n_batches = int(len(chars) / slice_size)\n \n # Drop the last few characters to make only full batches\n x = chars[: n_batches*slice_size]\n y = chars[1: n_batches*slice_size + 1]\n \n # Split the data into batch_size slices, then stack them into a 2D matrix \n x = np.stack(np.split(x, batch_size))\n y = np.stack(np.split(y, batch_size))\n \n # Now x and y are arrays with dimensions batch_size x n_batches*num_steps\n \n # Split into training and validation sets, keep the virst split_frac batches for training\n split_idx = int(n_batches*split_frac)\n train_x, train_y= x[:, :split_idx*num_steps], y[:, :split_idx*num_steps]\n val_x, val_y = x[:, split_idx*num_steps:], y[:, split_idx*num_steps:]\n \n return train_x, train_y, val_x, val_y",
"_____no_output_____"
],
[
"train_x, train_y, val_x, val_y = split_data(chars, 10, 200)",
"_____no_output_____"
],
[
"train_x.shape",
"_____no_output_____"
],
[
"train_x[:,:10]",
"_____no_output_____"
]
],
[
[
"I'll write another function to grab batches out of the arrays made by split data. Here each batch will be a sliding window on these arrays with size `batch_size X num_steps`. For example, if we want our network to train on a sequence of 100 characters, `num_steps = 100`. For the next batch, we'll shift this window the next sequence of `num_steps` characters. In this way we can feed batches to the network and the cell states will continue through on each batch.",
"_____no_output_____"
]
],
[
[
"def get_batch(arrs, num_steps):\n batch_size, slice_size = arrs[0].shape\n \n n_batches = int(slice_size/num_steps)\n for b in range(n_batches):\n yield [x[:, b*num_steps: (b+1)*num_steps] for x in arrs]",
"_____no_output_____"
],
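A toy run of the sliding window just described, on a hypothetical 1x12 "dataset" with `num_steps=4`, showing how each yielded batch is simply the next slice of the same rows:

```python
import numpy as np

arr = np.arange(12).reshape(1, 12)  # batch_size=1, slice_size=12
for (window,) in get_batch([arr], num_steps=4):
    print(window)
# [[0 1 2 3]], then [[4 5 6 7]], then [[ 8  9 10 11]]
```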
[
"def build_rnn(num_classes, batch_size=50, num_steps=50, lstm_size=128, num_layers=2,\n learning_rate=0.001, grad_clip=5, sampling=False):\n \n if sampling == True:\n batch_size, num_steps = 1, 1\n\n tf.reset_default_graph()\n \n # Declare placeholders we'll feed into the graph\n \n inputs = tf.placeholder(tf.int32, [batch_size, num_steps], name='inputs')\n x_one_hot = tf.one_hot(inputs, num_classes, name='x_one_hot')\n\n\n targets = tf.placeholder(tf.int32, [batch_size, num_steps], name='targets')\n y_one_hot = tf.one_hot(targets, num_classes, name='y_one_hot')\n y_reshaped = tf.reshape(y_one_hot, [-1, num_classes])\n \n keep_prob = tf.placeholder(tf.float32, name='keep_prob')\n \n # Build the RNN layers\n \n lstm = tf.contrib.rnn.BasicLSTMCell(lstm_size)\n drop = tf.contrib.rnn.DropoutWrapper(lstm, output_keep_prob=keep_prob)\n cell = tf.contrib.rnn.MultiRNNCell([drop] * num_layers)\n\n initial_state = cell.zero_state(batch_size, tf.float32)\n\n # Run the data through the RNN layers\n rnn_inputs = [tf.squeeze(i, squeeze_dims=[1]) for i in tf.split(x_one_hot, num_steps, 1)]\n outputs, state = tf.contrib.rnn.static_rnn(cell, rnn_inputs, initial_state=initial_state)\n \n final_state = tf.identity(state, name='final_state')\n \n # Reshape output so it's a bunch of rows, one row for each cell output\n \n seq_output = tf.concat(outputs, axis=1,name='seq_output')\n output = tf.reshape(seq_output, [-1, lstm_size], name='graph_output')\n \n # Now connect the RNN putputs to a softmax layer and calculate the cost\n softmax_w = tf.Variable(tf.truncated_normal((lstm_size, num_classes), stddev=0.1),\n name='softmax_w')\n softmax_b = tf.Variable(tf.zeros(num_classes), name='softmax_b')\n logits = tf.matmul(output, softmax_w) + softmax_b\n\n preds = tf.nn.softmax(logits, name='predictions')\n \n loss = tf.nn.softmax_cross_entropy_with_logits(logits=logits, labels=y_reshaped, name='loss')\n cost = tf.reduce_mean(loss, name='cost')\n\n # Optimizer for training, using gradient clipping to control exploding gradients\n tvars = tf.trainable_variables()\n grads, _ = tf.clip_by_global_norm(tf.gradients(cost, tvars), grad_clip)\n train_op = tf.train.AdamOptimizer(learning_rate)\n optimizer = train_op.apply_gradients(zip(grads, tvars))\n\n # Export the nodes \n export_nodes = ['inputs', 'targets', 'initial_state', 'final_state',\n 'keep_prob', 'cost', 'preds', 'optimizer']\n Graph = namedtuple('Graph', export_nodes)\n local_dict = locals()\n graph = Graph(*[local_dict[each] for each in export_nodes])\n \n return graph",
"_____no_output_____"
]
],
[
[
"## Hyperparameters\n\nHere I'm defining the hyperparameters for the network. The two you probably haven't seen before are `lstm_size` and `num_layers`. These set the number of hidden units in the LSTM layers and the number of LSTM layers, respectively. Of course, making these bigger will improve the network's performance but you'll have to watch out for overfitting. If your validation loss is much larger than the training loss, you're probably overfitting. Decrease the size of the network or decrease the dropout keep probability.",
"_____no_output_____"
]
],
[
[
"batch_size = 100\nnum_steps = 100\nlstm_size = 512\nnum_layers = 2\nlearning_rate = 0.001",
"_____no_output_____"
]
],
[
[
"## Write out the graph for TensorBoard",
"_____no_output_____"
]
],
[
[
"model = build_rnn(len(vocab),\n batch_size=batch_size,\n num_steps=num_steps,\n learning_rate=learning_rate,\n lstm_size=lstm_size,\n num_layers=num_layers)\n\nwith tf.Session() as sess:\n \n sess.run(tf.global_variables_initializer())\n file_writer = tf.summary.FileWriter('./logs/1', sess.graph)",
"_____no_output_____"
]
],
[
[
"## Training\n\nTime for training which is is pretty straightforward. Here I pass in some data, and get an LSTM state back. Then I pass that state back in to the network so the next batch can continue the state from the previous batch. And every so often (set by `save_every_n`) I calculate the validation loss and save a checkpoint.",
"_____no_output_____"
]
],
[
[
"!mkdir -p checkpoints/anna",
"_____no_output_____"
],
[
"epochs = 1\nsave_every_n = 200\ntrain_x, train_y, val_x, val_y = split_data(chars, batch_size, num_steps)\n\nmodel = build_rnn(len(vocab), \n batch_size=batch_size,\n num_steps=num_steps,\n learning_rate=learning_rate,\n lstm_size=lstm_size,\n num_layers=num_layers)\n\nsaver = tf.train.Saver(max_to_keep=100)\n\nwith tf.Session() as sess:\n sess.run(tf.global_variables_initializer())\n \n # Use the line below to load a checkpoint and resume training\n #saver.restore(sess, 'checkpoints/anna20.ckpt')\n \n n_batches = int(train_x.shape[1]/num_steps)\n iterations = n_batches * epochs\n for e in range(epochs):\n \n # Train network\n new_state = sess.run(model.initial_state)\n loss = 0\n for b, (x, y) in enumerate(get_batch([train_x, train_y], num_steps), 1):\n iteration = e*n_batches + b\n start = time.time()\n feed = {model.inputs: x,\n model.targets: y,\n model.keep_prob: 0.5,\n model.initial_state: new_state}\n batch_loss, new_state, _ = sess.run([model.cost, model.final_state, model.optimizer], \n feed_dict=feed)\n loss += batch_loss\n end = time.time()\n print('Epoch {}/{} '.format(e+1, epochs),\n 'Iteration {}/{}'.format(iteration, iterations),\n 'Training loss: {:.4f}'.format(loss/b),\n '{:.4f} sec/batch'.format((end-start)))\n \n \n if (iteration%save_every_n == 0) or (iteration == iterations):\n # Check performance, notice dropout has been set to 1\n val_loss = []\n new_state = sess.run(model.initial_state)\n for x, y in get_batch([val_x, val_y], num_steps):\n feed = {model.inputs: x,\n model.targets: y,\n model.keep_prob: 1.,\n model.initial_state: new_state}\n batch_loss, new_state = sess.run([model.cost, model.final_state], feed_dict=feed)\n val_loss.append(batch_loss)\n\n print('Validation loss:', np.mean(val_loss),\n 'Saving checkpoint!')\n saver.save(sess, \"checkpoints/anna/i{}_l{}_{:.3f}.ckpt\".format(iteration, lstm_size, np.mean(val_loss)))",
"Epoch 1/1 Iteration 1/178 Training loss: 4.4204 10.7077 sec/batch\n"
],
[
"tf.train.get_checkpoint_state('checkpoints/anna')",
"_____no_output_____"
]
],
[
[
"## Sampling\n\nNow that the network is trained, we'll can use it to generate new text. The idea is that we pass in a character, then the network will predict the next character. We can use the new one, to predict the next one. And we keep doing this to generate all new text. I also included some functionality to prime the network with some text by passing in a string and building up a state from that.\n\nThe network gives us predictions for each character. To reduce noise and make things a little less random, I'm going to only choose a new character from the top N most likely characters.\n\n",
"_____no_output_____"
]
],
[
[
"def pick_top_n(preds, vocab_size, top_n=5):\n p = np.squeeze(preds)\n p[np.argsort(p)[:-top_n]] = 0\n p = p / np.sum(p)\n c = np.random.choice(vocab_size, 1, p=p)[0]\n return c",
"_____no_output_____"
],
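[
"# A small illustration (added; not in the original notebook): pick_top_n zeroes out\n# everything except the top_n largest probabilities, renormalizes, and samples an index.\n# With top_n=2, only the two most likely indices (2 and 3 here) can come back.\nfake_preds = np.array([[0.1, 0.2, 0.3, 0.4]])\npick_top_n(fake_preds, vocab_size=4, top_n=2)",
"_____no_output_____"
],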
[
"def sample(checkpoint, n_samples, lstm_size, vocab_size, prime=\"The \"):\n prime = \"Far\"\n samples = [c for c in prime]\n model = build_rnn(vocab_size, lstm_size=lstm_size, sampling=True)\n saver = tf.train.Saver()\n with tf.Session() as sess:\n saver.restore(sess, checkpoint)\n new_state = sess.run(model.initial_state)\n for c in prime:\n x = np.zeros((1, 1))\n x[0,0] = vocab_to_int[c]\n feed = {model.inputs: x,\n model.keep_prob: 1.,\n model.initial_state: new_state}\n preds, new_state = sess.run([model.preds, model.final_state], \n feed_dict=feed)\n\n c = pick_top_n(preds, len(vocab))\n samples.append(int_to_vocab[c])\n\n for i in range(n_samples):\n x[0,0] = c\n feed = {model.inputs: x,\n model.keep_prob: 1.,\n model.initial_state: new_state}\n preds, new_state = sess.run([model.preds, model.final_state], \n feed_dict=feed)\n\n c = pick_top_n(preds, len(vocab))\n samples.append(int_to_vocab[c])\n \n return ''.join(samples)",
"_____no_output_____"
],
[
"checkpoint = \"checkpoints/anna/i3560_l512_1.122.ckpt\"\nsamp = sample(checkpoint, 2000, lstm_size, len(vocab), prime=\"Far\")\nprint(samp)",
"Farlathit that if had so\nlike it that it were. He could not trouble to his wife, and there was\nanything in them of the side of his weaky in the creature at his forteren\nto him.\n\n\"What is it? I can't bread to those,\" said Stepan Arkadyevitch. \"It's not\nmy children, and there is an almost this arm, true it mays already,\nand tell you what I have say to you, and was not looking at the peasant,\nwhy is, I don't know him out, and she doesn't speak to me immediately, as\nyou would say the countess and the more frest an angelembre, and time and\nthings's silent, but I was not in my stand that is in my head. But if he\nsay, and was so feeling with his soul. A child--in his soul of his\nsoul of his soul. He should not see that any of that sense of. Here he\nhad not been so composed and to speak for as in a whole picture, but\nall the setting and her excellent and society, who had been delighted\nand see to anywing had been being troed to thousand words on them,\nwe liked him.\n\nThat set in her money at the table, he came into the party. The capable\nof his she could not be as an old composure.\n\n\"That's all something there will be down becime by throe is\nsuch a silent, as in a countess, I should state it out and divorct.\nThe discussion is not for me. I was that something was simply they are\nall three manshess of a sensitions of mind it all.\"\n\n\"No,\" he thought, shouted and lifting his soul. \"While it might see your\nhonser and she, I could burst. And I had been a midelity. And I had a\nmarnief are through the countess,\" he said, looking at him, a chosing\nwhich they had been carried out and still solied, and there was a sen that\nwas to be completely, and that this matter of all the seconds of it, and\na concipation were to her husband, who came up and conscaously, that he\nwas not the station. All his fourse she was always at the country,,\nto speak oft, and though they were to hear the delightful throom and\nwhether they came towards the morning, and his living and a coller and\nhold--the children. \n"
],
[
"checkpoint = \"checkpoints/anna/i200_l512_2.432.ckpt\"\nsamp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\nprint(samp)",
"Farnt him oste wha sorind thans tout thint asd an sesand an hires on thime sind thit aled, ban thand and out hore as the ter hos ton ho te that, was tis tart al the hand sostint him sore an tit an son thes, win he se ther san ther hher tas tarereng,.\n\nAnl at an ades in ond hesiln, ad hhe torers teans, wast tar arering tho this sos alten sorer has hhas an siton ther him he had sin he ard ate te anling the sosin her ans and\narins asd and ther ale te tot an tand tanginge wath and ho ald, so sot th asend sat hare sother horesinnd, he hesense wing ante her so tith tir sherinn, anded and to the toul anderin he sorit he torsith she se atere an ting ot hand and thit hhe so the te wile har\nens ont in the sersise, and we he seres tar aterer, to ato tat or has he he wan ton here won and sen heren he sosering, to to theer oo adent har herere the wosh oute, was serild ward tous hed astend..\n\nI's sint on alt in har tor tit her asd hade shithans ored he talereng an soredendere tim tot hees. Tise sor and \n"
],
[
"checkpoint = \"checkpoints/anna/i600_l512_1.750.ckpt\"\nsamp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\nprint(samp)",
"Fard as astice her said he celatice of to seress in the raice, and to be the some and sere allats to that said to that the sark and a cast a the wither ald the pacinesse of her had astition, he said to the sount as she west at hissele. Af the cond it he was a fact onthis astisarianing.\n\n\n\"Or a ton to to be that's a more at aspestale as the sont of anstiring as\nthours and trey.\n\nThe same wo dangring the\nraterst, who sore and somethy had ast out an of his book. \"We had's beane were that, and a morted a thay he had to tere. Then to\nher homent andertersed his his ancouted to the pirsted, the soution for of the pirsice inthirgest and stenciol, with the hard and and\na colrice of to be oneres,\nthe song to this anderssad.\nThe could ounterss the said to serom of\nsoment a carsed of sheres of she\ntorded\nhar and want in their of hould, but\nher told in that in he tad a the same to her. Serghing an her has and with the seed, and the camt ont his about of the\nsail, the her then all houg ant or to hus to \n"
],
[
"checkpoint = \"checkpoints/anna/i1000_l512_1.484.ckpt\"\nsamp = sample(checkpoint, 1000, lstm_size, len(vocab), prime=\"Far\")\nprint(samp)",
"Farrat, his felt has at it.\n\n\"When the pose ther hor exceed\nto his sheant was,\" weat a sime of his sounsed. The coment and the facily that which had began terede a marilicaly whice whether the pose of his hand, at she was alligated herself the same on she had to\ntaiking to his forthing and streath how to hand\nbegan in a lang at some at it, this he cholded not set all her. \"Wo love that is setthing. Him anstering as seen that.\"\n\n\"Yes in the man that say the mare a crances is it?\" said Sergazy Ivancatching. \"You doon think were somether is ifficult of a mone of\nthough the most at the countes that the\nmean on the come to say the most, to\nhis feesing of\na man she, whilo he\nsained and well, that he would still at to said. He wind at his for the sore in the most\nof hoss and almoved to see him. They have betine the sumper into at he his stire, and what he was that at the so steate of the\nsound, and shin should have a geest of shall feet on the conderation to she had been at that imporsing the dre\n"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf855fde73e9f9997275ad89fb72802c9fc2203
| 144,599 |
ipynb
|
Jupyter Notebook
|
docs/source/notebooks/BEST.ipynb
|
rmill040/pymc3
|
ffcd50f7de3c9205f529c4ce563f12c89c05bbb5
|
[
"Apache-2.0"
] | 1 |
2019-01-16T05:04:05.000Z
|
2019-01-16T05:04:05.000Z
|
docs/source/notebooks/BEST.ipynb
|
HerrZYZ/pymc3
|
b64f810fdc1f5945149ff78b9edcef10e142478a
|
[
"Apache-2.0"
] | null | null | null |
docs/source/notebooks/BEST.ipynb
|
HerrZYZ/pymc3
|
b64f810fdc1f5945149ff78b9edcef10e142478a
|
[
"Apache-2.0"
] | null | null | null | 278.61079 | 41,876 | 0.903298 |
[
[
[
"# Bayesian Estimation Supersedes the T-Test",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nimport numpy as np\nimport pymc3 as pm\nimport pandas as pd\nimport matplotlib.pyplot as plt\nplt.style.use('seaborn-darkgrid')\nprint('Running on PyMC3 v{}'.format(pm.__version__))",
"Running on PyMC3 v3.3\n"
]
],
[
[
"This model replicates the example used in:\nKruschke, John. (2012) **Bayesian estimation supersedes the t-test**. *Journal of Experimental Psychology*: General.",
"_____no_output_____"
],
[
"### The Problem\n\nSeveral statistical inference procedures involve the comparison of two groups. We may be interested in whether one group is larger than another, or simply different from the other. We require a statistical model for this because true differences are usually accompanied by measurement or stochastic noise that prevent us from drawing conclusions simply from differences calculated from the observed data. \n\nThe *de facto* standard for statistically comparing two (or more) samples is to use a statistical test. This involves expressing a null hypothesis, which typically claims that there is no difference between the groups, and using a chosen test statistic to determine whether the distribution of the observed data is plausible under the hypothesis. This rejection occurs when the calculated test statistic is higher than some pre-specified threshold value.\n\nUnfortunately, it is not easy to conduct hypothesis tests correctly, and their results are very easy to misinterpret. Setting up a statistical test involves several subjective choices (*e.g.* statistical test to use, null hypothesis to test, significance level) by the user that are rarely justified based on the problem or decision at hand, but rather, are usually based on traditional choices that are entirely arbitrary (Johnson 1999). The evidence that it provides to the user is indirect, incomplete, and typically overstates the evidence against the null hypothesis (Goodman 1999). \n\nA more informative and effective approach for comparing groups is one based on **estimation** rather than **testing**, and is driven by Bayesian probability rather than frequentist. That is, rather than testing whether two groups are different, we instead pursue an estimate of how different they are, which is fundamentally more informative. Moreover, we include an estimate of uncertainty associated with that difference which includes uncertainty due to our lack of knowledge of the model parameters (epistemic uncertainty) and uncertainty due to the inherent stochasticity of the system (aleatory uncertainty).",
"_____no_output_____"
],
[
"## Example: Drug trial evaluation\n\nTo illustrate how this Bayesian estimation approach works in practice, we will use a fictitious example from Kruschke (2012) concerning the evaluation of a clinical trial for drug evaluation. The trial aims to evaluate the efficacy of a \"smart drug\" that is supposed to increase intelligence by comparing IQ scores of individuals in a treatment arm (those receiving the drug) to those in a control arm (those recieving a placebo). There are 47 individuals and 42 individuals in the treatment and control arms, respectively.",
"_____no_output_____"
]
],
[
[
"drug = (101,100,102,104,102,97,105,105,98,101,100,123,105,103,100,95,102,106,\n 109,102,82,102,100,102,102,101,102,102,103,103,97,97,103,101,97,104,\n 96,103,124,101,101,100,101,101,104,100,101)\nplacebo = (99,101,100,101,102,100,97,101,104,101,102,102,100,105,88,101,100,\n 104,100,100,100,101,102,103,97,101,101,100,101,99,101,100,100,\n 101,100,99,101,100,102,99,100,99)\n\ny1 = np.array(drug)\ny2 = np.array(placebo)\ny = pd.DataFrame(dict(value=np.r_[y1, y2], group=np.r_[['drug']*len(drug), ['placebo']*len(placebo)]))\n\ny.hist('value', by='group');",
"_____no_output_____"
]
],
[
[
"The first step in a Bayesian approach to inference is to specify the full probability model that corresponds to the problem. For this example, Kruschke chooses a Student-t distribution to describe the distributions of the scores in each group. This choice adds robustness to the analysis, as a T distribution is less sensitive to outlier observations, relative to a normal distribution. The three-parameter Student-t distribution allows for the specification of a mean $\\mu$, a precision (inverse-variance) $\\lambda$ and a degrees-of-freedom parameter $\\nu$:\n\n$$f(x|\\mu,\\lambda,\\nu) = \\frac{\\Gamma(\\frac{\\nu + 1}{2})}{\\Gamma(\\frac{\\nu}{2})} \\left(\\frac{\\lambda}{\\pi\\nu}\\right)^{\\frac{1}{2}} \\left[1+\\frac{\\lambda(x-\\mu)^2}{\\nu}\\right]^{-\\frac{\\nu+1}{2}}$$\n \nthe degrees-of-freedom parameter essentially specifies the \"normality\" of the data, since larger values of $\\nu$ make the distribution converge to a normal distribution, while small values (close to zero) result in heavier tails.\n\nThus, the likelihood functions of our model are specified as follows:\n\n$$y^{(treat)}_i \\sim T(\\nu, \\mu_1, \\sigma_1)$$\n\n$$y^{(placebo)}_i \\sim T(\\nu, \\mu_2, \\sigma_2)$$\n\nAs a simplifying assumption, we will assume that the degree of normality $\\nu$ is the same for both groups. We will, of course, have separate parameters for the means $\\mu_k, k=1,2$ and standard deviations $\\sigma_k$.\n\nSince the means are real-valued, we will apply normal priors on them, and arbitrarily set the hyperparameters to the pooled empirical mean of the data and twice the pooled empirical standard deviation, which applies very diffuse information to these quantities (and importantly, does not favor one or the other *a priori*).\n\n$$\\mu_k \\sim N(\\bar{x}, 2s)$$",
"_____no_output_____"
]
],
[
[
"ฮผ_m = y.value.mean()\nฮผ_s = y.value.std() * 2\n\nwith pm.Model() as model:\n group1_mean = pm.Normal('group1_mean', ฮผ_m, sigma=ฮผ_s)\n group2_mean = pm.Normal('group2_mean', ฮผ_m, sigma=ฮผ_s)",
"_____no_output_____"
]
],
[
[
"The group standard deviations will be given a uniform prior over a plausible range of values for the variability of the outcome variable, IQ.\n\nIn Kruschke's original model, he uses a very wide uniform prior for the group standard deviations, from the pooled empirical standard deviation divided by 1000 to the pooled standard deviation multiplied by 1000. This is a poor choice of prior, because very basic prior knowledge about measures of human coginition dictate that the variation cannot ever be as high as this upper bound. IQ is a standardized measure, and hence this constrains how variable a given population's IQ values can be. When you place such a wide uniform prior on these values, you are essentially giving a lot of prior weight on inadmissable values. In this example, there is little practical difference, but in general it is best to apply as much prior information that you have available to the parameterization of prior distributions. \n\nWe will instead set the group standard deviations to have a $\\text{Uniform}(1,10)$ prior:",
"_____no_output_____"
]
],
[
[
"ฯ_low = 1\nฯ_high = 10\n\nwith model:\n group1_std = pm.Uniform('group1_std', lower=ฯ_low, upper=ฯ_high)\n group2_std = pm.Uniform('group2_std', lower=ฯ_low, upper=ฯ_high)",
"_____no_output_____"
]
],
[
[
"We follow Kruschke by making the prior for $\\nu$ exponentially distributed with a mean of 30; this allocates high prior probability over the regions of the parameter that describe the range from normal to heavy-tailed data under the Student-T distribution.",
"_____no_output_____"
]
],
[
[
"with model:\n ฮฝ = pm.Exponential('ฮฝ_minus_one', 1/29.) + 1\n\npm.kdeplot(np.random.exponential(30, size=10000), shade=0.5);",
"_____no_output_____"
],
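[
"# A hedged aside (added; not in Kruschke's notebook): overlay Student-t densities for\n# a few values of nu on the standard normal, to show how small nu yields heavier tails.\n# Assumes scipy is available (it is a PyMC3 dependency).\nfrom scipy import stats\nxs = np.linspace(-6, 6, 500)\nfor nu in [1, 2, 5, 30]:\n    plt.plot(xs, stats.t.pdf(xs, df=nu), label='nu = {}'.format(nu))\nplt.plot(xs, stats.norm.pdf(xs), 'k--', label='normal')\nplt.legend();",
"_____no_output_____"
]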
],
[
[
"Since PyMC3 parameterizes the Student-T in terms of precision, rather than standard deviation, we must transform the standard deviations before specifying our likelihoods.",
"_____no_output_____"
]
],
[
[
"with model:\n ฮป1 = group1_std**-2\n ฮป2 = group2_std**-2\n\n group1 = pm.StudentT('drug', nu=ฮฝ, mu=group1_mean, lam=ฮป1, observed=y1)\n group2 = pm.StudentT('placebo', nu=ฮฝ, mu=group2_mean, lam=ฮป2, observed=y2)",
"_____no_output_____"
]
],
[
[
"Having fully specified our probabilistic model, we can turn our attention to calculating the comparisons of interest in order to evaluate the effect of the drug. To this end, we can specify deterministic nodes in our model for the difference between the group means and the difference between the group standard deviations. Wrapping them in named `Deterministic` objects signals to PyMC that we wish to record the sampled values as part of the output.\n\nAs a joint measure of the groups, we will also estimate the \"effect size\", which is the difference in means scaled by the pooled estimates of standard deviation. This quantity can be harder to interpret, since it is no longer in the same units as our data, but the quantity is a function of all four estimated parameters.",
"_____no_output_____"
]
],
[
[
"with model:\n diff_of_means = pm.Deterministic('difference of means', group1_mean - group2_mean)\n diff_of_stds = pm.Deterministic('difference of stds', group1_std - group2_std)\n effect_size = pm.Deterministic('effect size', \n diff_of_means / np.sqrt((group1_std**2 + group2_std**2) / 2))\n",
"_____no_output_____"
]
],
[
[
"Now, we can fit the model and evaluate its output.",
"_____no_output_____"
]
],
[
[
"with model:\n trace = pm.sample(2000, cores=2)",
"Auto-assigning NUTS sampler...\nInitializing NUTS using jitter+adapt_diag...\nMultiprocess sampling (2 chains in 2 jobs)\nNUTS: [ฮฝ_minus_one_log__, group2_std_interval__, group1_std_interval__, group2_mean, group1_mean]\n100%|โโโโโโโโโโ| 2500/2500 [00:03<00:00, 727.76it/s]\n"
]
],
[
[
"We can plot the stochastic parameters of the model. PyMC's `plot_posterior` function replicates the informative histograms portrayed in Kruschke (2012). These summarize the posterior distributions of the parameters, and present a 95% credible interval and the posterior mean. The plots below are constructed with the final 1000 samples from each of the 2 chains, pooled together.",
"_____no_output_____"
]
],
[
[
"pm.plot_posterior(trace, varnames=['group1_mean','group2_mean', 'group1_std', 'group2_std', 'ฮฝ_minus_one'],\n color='#87ceeb');",
"_____no_output_____"
]
],
[
[
"Looking at the group differences, we can conclude that there are meaningful differences between the two groups for all three measures. For these comparisons, it is useful to use zero as a reference value (`ref_val`); providing this reference value yields cumulative probabilities for the posterior distribution on either side of the value. Thus, for the difference in means, 99.4% of the posterior probability is greater than zero, which suggests the group means are credibly different. The effect size and differences in standard deviation are similarly positive.\n\nThese estimates suggest that the \"smart drug\" increased both the expected scores, but also the variability in scores across the sample. So, this does not rule out the possibility that some recipients may be adversely affected by the drug at the same time others benefit.",
"_____no_output_____"
]
],
[
[
"pm.plot_posterior(trace, varnames=['difference of means','difference of stds', 'effect size'],\n ref_val=0,\n color='#87ceeb');",
"_____no_output_____"
]
],
[
[
"When `forestplot` is called on a trace with more than one chain, it also plots the potential scale reduction parameter, which is used to reveal evidence for lack of convergence; values near one, as we have here, suggest that the model has converged.",
"_____no_output_____"
]
],
[
[
"pm.forestplot(trace, varnames=['group1_mean',\n 'group2_mean']);",
"_____no_output_____"
],
[
"pm.forestplot(trace, varnames=['group1_std',\n 'group2_std',\n 'ฮฝ_minus_one']);",
"_____no_output_____"
],
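[
"# A hedged numerical check (added; not in the original notebook): pm.gelman_rubin was\n# the PyMC3 3.x helper for the potential scale reduction statistic plotted above;\n# values close to 1 are consistent with convergence.\npm.gelman_rubin(trace)",
"_____no_output_____"
],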
[
"pm.summary(trace,varnames=['difference of means', 'difference of stds', 'effect size'])",
"_____no_output_____"
]
],
[
[
"## References\n\n1.\tGoodman SN. Toward evidence-based medical statistics. 1: The P value fallacy. Annals of Internal Medicine. 1999;130(12):995-1004. doi:10.7326/0003-4819-130-12-199906150-00008.\n2.\tJohnson D. The insignificance of statistical significance testing. Journal of Wildlife Management. 1999;63(3):763-772.\n3.\tKruschke JK. Bayesian estimation supersedes the t test. J Exp Psychol Gen. 2013;142(2):573-603. doi:10.1037/a0029146.",
"_____no_output_____"
],
[
"The original pymc2 implementation was written by Andrew Straw and can be found here: https://github.com/strawlab/best\n\nPorted to PyMC3 by [Thomas Wiecki](https://twitter.com/twiecki) (c) 2015, updated by Chris Fonnesbeck.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
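"code",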
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
]
] |
cbf8704e7d2dfdc85bc3d0c877bf4833a0e99bc1
| 100,101 |
ipynb
|
Jupyter Notebook
|
DeepLearning/PyTorch/book_repo/p1ch7/2_birds_airplanes.ipynb
|
dSalazar10/Course-Exploring_Deep_Learning
|
e79cbc7c4802c9b2d62d7fc419eb77b4d2fed355
|
[
"MIT"
] | null | null | null |
DeepLearning/PyTorch/book_repo/p1ch7/2_birds_airplanes.ipynb
|
dSalazar10/Course-Exploring_Deep_Learning
|
e79cbc7c4802c9b2d62d7fc419eb77b4d2fed355
|
[
"MIT"
] | null | null | null |
DeepLearning/PyTorch/book_repo/p1ch7/2_birds_airplanes.ipynb
|
dSalazar10/Course-Exploring_Deep_Learning
|
e79cbc7c4802c9b2d62d7fc419eb77b4d2fed355
|
[
"MIT"
] | null | null | null | 47.039944 | 8,928 | 0.689683 |
[
[
[
"%matplotlib inline\nfrom matplotlib import pyplot as plt\nimport numpy as np\nimport torch\n\ntorch.set_printoptions(edgeitems=2)\ntorch.manual_seed(123)",
"_____no_output_____"
],
[
"class_names = ['airplane','automobile','bird','cat','deer',\n 'dog','frog','horse','ship','truck']",
"_____no_output_____"
],
[
"from torchvision import datasets, transforms\ndata_path = '../data-unversioned/p1ch7/'\ncifar10 = datasets.CIFAR10(data_path, train=True, download=False,\n transform=transforms.Compose([\n transforms.ToTensor(),\n transforms.Normalize((0.4915, 0.4823, 0.4468),\n (0.2470, 0.2435, 0.2616))\n ]))",
"_____no_output_____"
],
[
"cifar10_val = datasets.CIFAR10(data_path, train=False, download=False,\n transform=transforms.Compose([\n transforms.ToTensor(),\n transforms.Normalize((0.4915, 0.4823, 0.4468),\n (0.2470, 0.2435, 0.2616))\n ]))",
"_____no_output_____"
],
[
"label_map = {0: 0, 2: 1}\nclass_names = ['airplane', 'bird']\ncifar2 = [(img, label_map[label]) for img, label in cifar10 if label in [0, 2]]\ncifar2_val = [(img, label_map[label]) for img, label in cifar10_val if label in [0, 2]]",
"_____no_output_____"
],
[
"import torch.nn as nn\n\nn_out = 2\n\nmodel = nn.Sequential(\n nn.Linear(\n 3072, # <1>\n 512, # <2>\n ),\n nn.Tanh(),\n nn.Linear(\n 512, # <2>\n n_out, # <3>\n )\n )",
"_____no_output_____"
],
[
"def softmax(x):\n return torch.exp(x) / torch.exp(x).sum()",
"_____no_output_____"
],
[
"x = torch.tensor([1.0, 2.0, 3.0])\n\nsoftmax(x)",
"_____no_output_____"
],
[
"softmax(x).sum()",
"_____no_output_____"
],
[
"softmax = nn.Softmax(dim=1)\n\nx = torch.tensor([[1.0, 2.0, 3.0],\n [1.0, 2.0, 3.0]])\n\nsoftmax(x)",
"_____no_output_____"
],
[
"model = nn.Sequential(\n nn.Linear(3072, 512),\n nn.Tanh(),\n nn.Linear(512, 2),\n nn.Softmax(dim=1))",
"_____no_output_____"
],
[
"img, _ = cifar2[0]\n\nplt.imshow(img.permute(1, 2, 0))\nplt.show()",
"Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n"
],
[
"img_batch = img.view(-1).unsqueeze(0)",
"_____no_output_____"
],
[
"out = model(img_batch)\nout",
"_____no_output_____"
],
[
"_, index = torch.max(out, dim=1)\n\nindex",
"_____no_output_____"
],
[
"out = torch.tensor([\n [0.6, 0.4],\n [0.9, 0.1],\n [0.3, 0.7],\n [0.2, 0.8],\n])\nclass_index = torch.tensor([0, 0, 1, 1]).unsqueeze(1)\n\ntruth = torch.zeros((4,2))\ntruth.scatter_(dim=1, index=class_index, value=1.0)\ntruth",
"_____no_output_____"
],
[
"def mse(out):\n return ((out - truth) ** 2).sum(dim=1).mean()\nmse(out)",
"_____no_output_____"
],
[
"out.gather(dim=1, index=class_index)",
"_____no_output_____"
],
[
"def likelihood(out):\n prod = 1.0\n for x in out.gather(dim=1, index=class_index):\n prod *= x\n return prod\n\nlikelihood(out)",
"_____no_output_____"
],
[
"def neg_log_likelihood(out):\n return -likelihood(out).log()\n\nneg_log_likelihood(out)",
"_____no_output_____"
],
[
"out0 = out.clone().detach()\nout0[0] = torch.tensor([0.9, 0.1]) # more right\n\nout2 = out.clone().detach()\nout2[0] = torch.tensor([0.4, 0.6]) # slightly wrong\n\nout3 = out.clone().detach()\nout3[0] = torch.tensor([0.1, 0.9]) # very wrong\n\nmse_comparison = torch.tensor([mse(o) for o in [out0, out, out2, out3]])\nmse_comparison",
"_____no_output_____"
],
[
"((mse_comparison / mse_comparison[1]) - 1) * 100",
"_____no_output_____"
],
[
"nll_comparison = torch.tensor([neg_log_likelihood(o) for o in [out0, out, out2, out3]])\nnll_comparison",
"_____no_output_____"
],
[
"((nll_comparison / nll_comparison[1]) - 1) * 100",
"_____no_output_____"
],
[
"softmax = nn.Softmax(dim=1)\n\nlog_softmax = nn.LogSoftmax(dim=1)\n\nx = torch.tensor([[0.0, 104.0]])\n\nsoftmax(x)",
"_____no_output_____"
],
[
"torch.log(softmax(x))",
"_____no_output_____"
],
[
"log_softmax(x)",
"_____no_output_____"
],
[
"torch.exp(log_softmax(x))",
"_____no_output_____"
],
[
"model = nn.Sequential(\n nn.Linear(3072, 512),\n nn.Tanh(),\n nn.Linear(512, 2),\n nn.LogSoftmax(dim=1))",
"_____no_output_____"
],
[
"loss = nn.NLLLoss()",
"_____no_output_____"
],
[
"img, label = cifar2[0]\n\nout = model(img.view(-1).unsqueeze(0))\n\nloss(out, torch.tensor([label]))",
"_____no_output_____"
],
[
"import torch\nimport torch.nn as nn\nimport torch.optim as optim\n\nmodel = nn.Sequential(\n nn.Linear(3072, 512),\n nn.Tanh(),\n nn.Linear(512, 2),\n nn.LogSoftmax(dim=1))\n\nlearning_rate = 1e-2\n\noptimizer = optim.SGD(model.parameters(), lr=learning_rate)\n\nloss_fn = nn.NLLLoss()\n\nn_epochs = 100\n\nfor epoch in range(n_epochs):\n for img, label in cifar2:\n out = model(img.view(-1).unsqueeze(0))\n loss = loss_fn(out, torch.tensor([label]))\n \n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n print(\"Epoch: %d, Loss: %f\" % (epoch, float(loss)))",
"Epoch: 0, Loss: 5.347057\nEpoch: 1, Loss: 7.705317\nEpoch: 2, Loss: 6.510838\nEpoch: 3, Loss: 9.557189\nEpoch: 4, Loss: 4.151933\nEpoch: 5, Loss: 5.636873\nEpoch: 6, Loss: 6.531207\nEpoch: 7, Loss: 20.450516\nEpoch: 8, Loss: 5.072948\nEpoch: 9, Loss: 4.941860\nEpoch: 10, Loss: 6.445535\nEpoch: 11, Loss: 4.580799\nEpoch: 12, Loss: 6.660308\nEpoch: 13, Loss: 9.436373\nEpoch: 14, Loss: 16.786476\nEpoch: 15, Loss: 8.349138\nEpoch: 16, Loss: 8.176860\nEpoch: 17, Loss: 5.862664\nEpoch: 18, Loss: 8.218906\nEpoch: 19, Loss: 13.296558\nEpoch: 20, Loss: 7.313433\nEpoch: 21, Loss: 4.585245\nEpoch: 22, Loss: 11.706884\nEpoch: 23, Loss: 18.208710\nEpoch: 24, Loss: 0.343157\nEpoch: 25, Loss: 9.255491\nEpoch: 26, Loss: 10.466807\nEpoch: 27, Loss: 12.226366\nEpoch: 28, Loss: 12.728527\nEpoch: 29, Loss: 9.777843\nEpoch: 30, Loss: 6.128856\nEpoch: 31, Loss: 13.284330\nEpoch: 32, Loss: 10.321814\nEpoch: 33, Loss: 2.928349\nEpoch: 34, Loss: 8.623670\nEpoch: 35, Loss: 12.719531\nEpoch: 36, Loss: 4.030444\nEpoch: 37, Loss: 4.621825\nEpoch: 38, Loss: 13.210777\nEpoch: 39, Loss: 14.217413\nEpoch: 40, Loss: 3.880259\nEpoch: 41, Loss: 13.189833\nEpoch: 42, Loss: 17.787762\nEpoch: 43, Loss: 3.953930\nEpoch: 44, Loss: 0.640078\nEpoch: 45, Loss: 9.262226\nEpoch: 46, Loss: 7.383645\nEpoch: 47, Loss: 5.352252\nEpoch: 48, Loss: 11.515299\nEpoch: 49, Loss: 12.266010\nEpoch: 50, Loss: 12.210896\nEpoch: 51, Loss: 3.987965\nEpoch: 52, Loss: 12.570765\nEpoch: 53, Loss: 13.025002\nEpoch: 54, Loss: 13.747946\nEpoch: 55, Loss: 6.783926\nEpoch: 56, Loss: 11.822943\nEpoch: 57, Loss: 8.200066\nEpoch: 58, Loss: 9.206728\nEpoch: 59, Loss: 7.715425\nEpoch: 60, Loss: 5.571069\nEpoch: 61, Loss: 13.017315\nEpoch: 62, Loss: 10.307802\nEpoch: 63, Loss: 2.660404\nEpoch: 64, Loss: 11.096642\nEpoch: 65, Loss: 5.284830\nEpoch: 66, Loss: 8.374750\nEpoch: 67, Loss: 1.418676\nEpoch: 68, Loss: 9.891462\nEpoch: 69, Loss: 9.079073\nEpoch: 70, Loss: 6.453581\nEpoch: 71, Loss: 8.293860\nEpoch: 72, Loss: 4.585221\nEpoch: 73, Loss: 14.174129\nEpoch: 74, Loss: 6.072280\nEpoch: 75, Loss: 5.925417\nEpoch: 76, Loss: 0.260600\nEpoch: 77, Loss: 3.055498\nEpoch: 78, Loss: 0.347163\nEpoch: 79, Loss: 3.497080\nEpoch: 80, Loss: 6.615281\nEpoch: 81, Loss: 8.944511\nEpoch: 82, Loss: 10.230938\nEpoch: 83, Loss: 6.776264\nEpoch: 84, Loss: 10.169885\nEpoch: 85, Loss: 7.014330\nEpoch: 86, Loss: 3.467798\nEpoch: 87, Loss: 3.772486\nEpoch: 88, Loss: 13.495383\nEpoch: 89, Loss: 11.781836\nEpoch: 90, Loss: 6.853724\nEpoch: 91, Loss: 3.313806\nEpoch: 92, Loss: 7.867707\nEpoch: 93, Loss: 16.117371\nEpoch: 94, Loss: 15.077475\nEpoch: 95, Loss: 17.807060\nEpoch: 96, Loss: 16.376089\nEpoch: 97, Loss: 9.348265\nEpoch: 98, Loss: 18.044790\nEpoch: 99, Loss: 15.565783\n"
],
[
"train_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=True)",
"_____no_output_____"
],
[
"import torch\nimport torch.nn as nn\nimport torch.optim as optim\n\ntrain_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=True)\n\nmodel = nn.Sequential(\n nn.Linear(3072, 128),\n nn.Tanh(),\n nn.Linear(128, 2),\n nn.LogSoftmax(dim=1))\n\nlearning_rate = 1e-2\n\noptimizer = optim.SGD(model.parameters(), lr=learning_rate)\n\nloss_fn = nn.NLLLoss()\n\nn_epochs = 100\n\nfor epoch in range(n_epochs):\n for imgs, labels in train_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n loss = loss_fn(outputs, labels)\n\n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n print(\"Epoch: %d, Loss: %f\" % (epoch, float(loss)))",
"Epoch: 0, Loss: 0.604063\nEpoch: 1, Loss: 0.597974\nEpoch: 2, Loss: 0.271415\nEpoch: 3, Loss: 0.451056\nEpoch: 4, Loss: 0.629758\nEpoch: 5, Loss: 0.458762\nEpoch: 6, Loss: 0.277813\nEpoch: 7, Loss: 0.406921\nEpoch: 8, Loss: 0.951961\nEpoch: 9, Loss: 0.433738\nEpoch: 10, Loss: 0.351960\nEpoch: 11, Loss: 0.355687\nEpoch: 12, Loss: 0.518611\nEpoch: 13, Loss: 0.262623\nEpoch: 14, Loss: 0.221969\nEpoch: 15, Loss: 0.774132\nEpoch: 16, Loss: 0.324406\nEpoch: 17, Loss: 0.447701\nEpoch: 18, Loss: 0.299780\nEpoch: 19, Loss: 0.267090\nEpoch: 20, Loss: 0.279828\nEpoch: 21, Loss: 0.197123\nEpoch: 22, Loss: 0.196783\nEpoch: 23, Loss: 0.328715\nEpoch: 24, Loss: 0.334952\nEpoch: 25, Loss: 0.500689\nEpoch: 26, Loss: 0.186956\nEpoch: 27, Loss: 0.138649\nEpoch: 28, Loss: 0.239988\nEpoch: 29, Loss: 0.495020\nEpoch: 30, Loss: 0.251347\nEpoch: 31, Loss: 0.088298\nEpoch: 32, Loss: 0.175127\nEpoch: 33, Loss: 0.208338\nEpoch: 34, Loss: 0.145656\nEpoch: 35, Loss: 0.129570\nEpoch: 36, Loss: 0.200110\nEpoch: 37, Loss: 0.133076\nEpoch: 38, Loss: 0.230561\nEpoch: 39, Loss: 0.241688\nEpoch: 40, Loss: 0.106870\nEpoch: 41, Loss: 0.281168\nEpoch: 42, Loss: 0.175034\nEpoch: 43, Loss: 0.073779\nEpoch: 44, Loss: 0.171294\nEpoch: 45, Loss: 0.112456\nEpoch: 46, Loss: 0.132553\nEpoch: 47, Loss: 0.048826\nEpoch: 48, Loss: 0.076014\nEpoch: 49, Loss: 0.122317\nEpoch: 50, Loss: 0.103442\nEpoch: 51, Loss: 0.201585\nEpoch: 52, Loss: 0.145637\nEpoch: 53, Loss: 0.055844\nEpoch: 54, Loss: 0.046278\nEpoch: 55, Loss: 0.081562\nEpoch: 56, Loss: 0.058857\nEpoch: 57, Loss: 0.197200\nEpoch: 58, Loss: 0.044184\nEpoch: 59, Loss: 0.043374\nEpoch: 60, Loss: 0.032936\nEpoch: 61, Loss: 0.072488\nEpoch: 62, Loss: 0.060811\nEpoch: 63, Loss: 0.029262\nEpoch: 64, Loss: 0.036435\nEpoch: 65, Loss: 0.058120\nEpoch: 66, Loss: 0.063329\nEpoch: 67, Loss: 0.020670\nEpoch: 68, Loss: 0.077189\nEpoch: 69, Loss: 0.060933\nEpoch: 70, Loss: 0.070848\nEpoch: 71, Loss: 0.036434\nEpoch: 72, Loss: 0.084855\nEpoch: 73, Loss: 0.044776\nEpoch: 74, Loss: 0.037828\nEpoch: 75, Loss: 0.024554\nEpoch: 76, Loss: 0.018965\nEpoch: 77, Loss: 0.033381\nEpoch: 78, Loss: 0.016183\nEpoch: 79, Loss: 0.020083\nEpoch: 80, Loss: 0.041192\nEpoch: 81, Loss: 0.015122\nEpoch: 82, Loss: 0.014245\nEpoch: 83, Loss: 0.018538\nEpoch: 84, Loss: 0.044791\nEpoch: 85, Loss: 0.034532\nEpoch: 86, Loss: 0.010175\nEpoch: 87, Loss: 0.021837\nEpoch: 88, Loss: 0.005545\nEpoch: 89, Loss: 0.012682\nEpoch: 90, Loss: 0.026414\nEpoch: 91, Loss: 0.021372\nEpoch: 92, Loss: 0.025901\nEpoch: 93, Loss: 0.025262\nEpoch: 94, Loss: 0.047044\nEpoch: 95, Loss: 0.016064\nEpoch: 96, Loss: 0.059213\nEpoch: 97, Loss: 0.017386\nEpoch: 98, Loss: 0.016215\nEpoch: 99, Loss: 0.016987\n"
],
[
"import torch\nimport torch.nn as nn\nimport torch.optim as optim\n\ntrain_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=True)\n\nmodel = nn.Sequential(\n nn.Linear(3072, 512),\n nn.Tanh(),\n nn.Linear(512, 2),\n nn.LogSoftmax(dim=1))\n\nlearning_rate = 1e-2\n\noptimizer = optim.SGD(model.parameters(), lr=learning_rate)\n\nloss_fn = nn.NLLLoss()\n\nn_epochs = 100\n\nfor epoch in range(n_epochs):\n for imgs, labels in train_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n loss = loss_fn(outputs, labels)\n\n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n print(\"Epoch: %d, Loss: %f\" % (epoch, float(loss)))",
"Epoch: 0, Loss: 0.732168\nEpoch: 1, Loss: 0.348352\nEpoch: 2, Loss: 0.318960\nEpoch: 3, Loss: 0.313264\nEpoch: 4, Loss: 0.378358\nEpoch: 5, Loss: 0.276529\nEpoch: 6, Loss: 0.443889\nEpoch: 7, Loss: 0.436946\nEpoch: 8, Loss: 0.324288\nEpoch: 9, Loss: 0.274647\nEpoch: 10, Loss: 0.291681\nEpoch: 11, Loss: 0.242894\nEpoch: 12, Loss: 0.301849\nEpoch: 13, Loss: 0.202063\nEpoch: 14, Loss: 0.389276\nEpoch: 15, Loss: 0.167129\nEpoch: 16, Loss: 0.135282\nEpoch: 17, Loss: 0.385485\nEpoch: 18, Loss: 0.453852\nEpoch: 19, Loss: 0.641304\nEpoch: 20, Loss: 0.287667\nEpoch: 21, Loss: 0.337029\nEpoch: 22, Loss: 0.393282\nEpoch: 23, Loss: 0.409480\nEpoch: 24, Loss: 0.138473\nEpoch: 25, Loss: 0.690729\nEpoch: 26, Loss: 0.572156\nEpoch: 27, Loss: 0.078534\nEpoch: 28, Loss: 0.324833\nEpoch: 29, Loss: 0.262829\nEpoch: 30, Loss: 0.430449\nEpoch: 31, Loss: 0.071872\nEpoch: 32, Loss: 0.058039\nEpoch: 33, Loss: 0.052903\nEpoch: 34, Loss: 0.065879\nEpoch: 35, Loss: 0.107696\nEpoch: 36, Loss: 0.305224\nEpoch: 37, Loss: 0.098637\nEpoch: 38, Loss: 0.139823\nEpoch: 39, Loss: 0.226455\nEpoch: 40, Loss: 0.117763\nEpoch: 41, Loss: 0.106498\nEpoch: 42, Loss: 0.086254\nEpoch: 43, Loss: 0.135652\nEpoch: 44, Loss: 0.070890\nEpoch: 45, Loss: 0.304346\nEpoch: 46, Loss: 0.016917\nEpoch: 47, Loss: 0.057929\nEpoch: 48, Loss: 0.131021\nEpoch: 49, Loss: 0.136299\nEpoch: 50, Loss: 0.048885\nEpoch: 51, Loss: 0.241048\nEpoch: 52, Loss: 0.092595\nEpoch: 53, Loss: 0.059137\nEpoch: 54, Loss: 0.047421\nEpoch: 55, Loss: 0.102036\nEpoch: 56, Loss: 0.023338\nEpoch: 57, Loss: 0.054306\nEpoch: 58, Loss: 0.073878\nEpoch: 59, Loss: 0.031387\nEpoch: 60, Loss: 0.039865\nEpoch: 61, Loss: 0.022344\nEpoch: 62, Loss: 0.052310\nEpoch: 63, Loss: 0.059688\nEpoch: 64, Loss: 0.023977\nEpoch: 65, Loss: 0.010632\nEpoch: 66, Loss: 0.039090\nEpoch: 67, Loss: 0.080844\nEpoch: 68, Loss: 0.029650\nEpoch: 69, Loss: 0.027038\nEpoch: 70, Loss: 0.028515\nEpoch: 71, Loss: 0.021998\nEpoch: 72, Loss: 0.014992\nEpoch: 73, Loss: 0.019659\nEpoch: 74, Loss: 0.025150\nEpoch: 75, Loss: 0.017384\nEpoch: 76, Loss: 0.013249\nEpoch: 77, Loss: 0.009451\nEpoch: 78, Loss: 0.034637\nEpoch: 79, Loss: 0.114242\nEpoch: 80, Loss: 0.019007\nEpoch: 81, Loss: 0.016319\nEpoch: 82, Loss: 0.027428\nEpoch: 83, Loss: 0.022366\nEpoch: 84, Loss: 0.022583\nEpoch: 85, Loss: 0.006275\nEpoch: 86, Loss: 0.011964\nEpoch: 87, Loss: 0.018711\nEpoch: 88, Loss: 0.019636\nEpoch: 89, Loss: 0.018975\nEpoch: 90, Loss: 0.023520\nEpoch: 91, Loss: 0.016398\nEpoch: 92, Loss: 0.006638\nEpoch: 93, Loss: 0.013305\nEpoch: 94, Loss: 0.017126\nEpoch: 95, Loss: 0.021641\nEpoch: 96, Loss: 0.036945\nEpoch: 97, Loss: 0.004735\nEpoch: 98, Loss: 0.016781\nEpoch: 99, Loss: 0.012039\n"
],
[
"train_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in train_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"Accuracy: 0.997700\n"
],
[
"val_loader = torch.utils.data.DataLoader(cifar2_val, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in val_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"Accuracy: 0.821000\n"
],
[
"model = nn.Sequential(\n nn.Linear(3072, 1024),\n nn.Tanh(),\n nn.Linear(1024, 512),\n nn.Tanh(),\n nn.Linear(512, 128),\n nn.Tanh(),\n nn.Linear(128, 2),\n nn.LogSoftmax(dim=1))",
"_____no_output_____"
],
[
"model = nn.Sequential(\n nn.Linear(3072, 1024),\n nn.Tanh(),\n nn.Linear(1024, 512),\n nn.Tanh(),\n nn.Linear(512, 128),\n nn.Tanh(),\n nn.Linear(128, 2))\n\nloss_fn = nn.CrossEntropyLoss()",
"_____no_output_____"
],
[
"import torch\nimport torch.nn as nn\nimport torch.optim as optim\n\ntrain_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=True)\n\nmodel = nn.Sequential(\n nn.Linear(3072, 1024),\n nn.Tanh(),\n nn.Linear(1024, 512),\n nn.Tanh(),\n nn.Linear(512, 128),\n nn.Tanh(),\n nn.Linear(128, 2))\n\nlearning_rate = 1e-2\n\noptimizer = optim.SGD(model.parameters(), lr=learning_rate)\n\nloss_fn = nn.CrossEntropyLoss()\n\nn_epochs = 100\n\nfor epoch in range(n_epochs):\n for imgs, labels in train_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n loss = loss_fn(outputs, labels)\n\n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n print(\"Epoch: %d, Loss: %f\" % (epoch, float(loss)))",
"Epoch: 0, Loss: 0.641261\nEpoch: 1, Loss: 0.525149\nEpoch: 2, Loss: 0.466143\nEpoch: 3, Loss: 0.451913\nEpoch: 4, Loss: 0.343860\nEpoch: 5, Loss: 0.309738\nEpoch: 6, Loss: 0.485261\nEpoch: 7, Loss: 0.283789\nEpoch: 8, Loss: 0.301561\nEpoch: 9, Loss: 0.408200\nEpoch: 10, Loss: 0.346715\nEpoch: 11, Loss: 0.358134\nEpoch: 12, Loss: 0.388485\nEpoch: 13, Loss: 0.378096\nEpoch: 14, Loss: 0.518019\nEpoch: 15, Loss: 0.359279\nEpoch: 16, Loss: 0.420371\nEpoch: 17, Loss: 0.366249\nEpoch: 18, Loss: 0.282639\nEpoch: 19, Loss: 0.468854\nEpoch: 20, Loss: 0.467920\nEpoch: 21, Loss: 0.237441\nEpoch: 22, Loss: 0.243472\nEpoch: 23, Loss: 0.566929\nEpoch: 24, Loss: 0.316143\nEpoch: 25, Loss: 0.336322\nEpoch: 26, Loss: 0.473064\nEpoch: 27, Loss: 0.407040\nEpoch: 28, Loss: 0.252989\nEpoch: 29, Loss: 0.195740\nEpoch: 30, Loss: 0.663084\nEpoch: 31, Loss: 0.659899\nEpoch: 32, Loss: 0.285113\nEpoch: 33, Loss: 0.212042\nEpoch: 34, Loss: 0.324017\nEpoch: 35, Loss: 0.097063\nEpoch: 36, Loss: 0.181754\nEpoch: 37, Loss: 0.091362\nEpoch: 38, Loss: 0.069348\nEpoch: 39, Loss: 0.085656\nEpoch: 40, Loss: 0.163399\nEpoch: 41, Loss: 0.064912\nEpoch: 42, Loss: 0.046740\nEpoch: 43, Loss: 0.029891\nEpoch: 44, Loss: 0.018157\nEpoch: 45, Loss: 0.103532\nEpoch: 46, Loss: 0.161911\nEpoch: 47, Loss: 0.238185\nEpoch: 48, Loss: 0.081116\nEpoch: 49, Loss: 0.040988\nEpoch: 50, Loss: 0.008668\nEpoch: 51, Loss: 0.012557\nEpoch: 52, Loss: 0.015967\nEpoch: 53, Loss: 0.020964\nEpoch: 54, Loss: 0.023478\nEpoch: 55, Loss: 0.012850\nEpoch: 56, Loss: 0.054703\nEpoch: 57, Loss: 0.014922\nEpoch: 58, Loss: 0.045488\nEpoch: 59, Loss: 0.122221\nEpoch: 60, Loss: 0.028012\nEpoch: 61, Loss: 0.029533\nEpoch: 62, Loss: 0.004758\nEpoch: 63, Loss: 0.080409\nEpoch: 64, Loss: 0.005409\nEpoch: 65, Loss: 0.020399\nEpoch: 66, Loss: 0.008184\nEpoch: 67, Loss: 0.013888\nEpoch: 68, Loss: 0.002199\nEpoch: 69, Loss: 0.001918\nEpoch: 70, Loss: 0.018765\nEpoch: 71, Loss: 0.004223\nEpoch: 72, Loss: 0.001795\nEpoch: 73, Loss: 0.102238\nEpoch: 74, Loss: 0.002482\nEpoch: 75, Loss: 0.005807\nEpoch: 76, Loss: 0.001742\nEpoch: 77, Loss: 0.012760\nEpoch: 78, Loss: 0.017469\nEpoch: 79, Loss: 0.002849\nEpoch: 80, Loss: 0.001452\nEpoch: 81, Loss: 0.002740\nEpoch: 82, Loss: 0.003317\nEpoch: 83, Loss: 0.002066\nEpoch: 84, Loss: 0.001952\nEpoch: 85, Loss: 0.010757\nEpoch: 86, Loss: 0.004866\nEpoch: 87, Loss: 0.003957\nEpoch: 88, Loss: 0.001295\nEpoch: 89, Loss: 0.004410\nEpoch: 90, Loss: 0.002952\nEpoch: 91, Loss: 0.000676\nEpoch: 92, Loss: 0.001835\nEpoch: 93, Loss: 0.000739\nEpoch: 94, Loss: 0.001102\nEpoch: 95, Loss: 0.000792\nEpoch: 96, Loss: 0.000515\nEpoch: 97, Loss: 0.001548\nEpoch: 98, Loss: 0.026913\nEpoch: 99, Loss: 0.000140\n"
],
[
"train_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in train_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"Accuracy: 0.999700\n"
],
[
"val_loader = torch.utils.data.DataLoader(cifar2_val, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in val_loader:\n outputs = model(imgs.view(imgs.shape[0], -1))\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"Accuracy: 0.801000\n"
],
[
"sum([p.numel() for p in model.parameters()])",
"_____no_output_____"
],
[
"sum([p.numel() for p in model.parameters() if p.requires_grad == True])",
"_____no_output_____"
],
[
"first_model = nn.Sequential(\n nn.Linear(3072, 512),\n nn.Tanh(),\n nn.Linear(512, 2),\n nn.LogSoftmax(dim=1))\n\nsum([p.numel() for p in first_model.parameters()])",
"_____no_output_____"
],
[
"sum([p.numel() for p in nn.Linear(3072, 512).parameters()])",
"_____no_output_____"
],
[
"sum([p.numel() for p in nn.Linear(3072, 1024).parameters()])",
"_____no_output_____"
],
[
"linear = nn.Linear(3072, 1024)\n\nlinear.weight.shape, linear.bias.shape",
"_____no_output_____"
],
[
"conv = nn.Conv2d(3, 16, kernel_size=3)",
"_____no_output_____"
],
[
"conv.weight.shape",
"_____no_output_____"
],
[
"conv.bias.shape",
"_____no_output_____"
],
[
"img, _ = cifar2[0]\n\noutput = conv(img.unsqueeze(0))",
"_____no_output_____"
],
[
"img.unsqueeze(0).shape, output.shape",
"_____no_output_____"
],
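[
"# A hedged aside (added; not from the book): why 32x32 becomes 30x30 above -- an\n# unpadded 3x3 convolution loses kernel_size - 1 = 2 pixels per spatial dimension:\n# out = (in - kernel_size) // stride + 1\n(32 - 3) // 1 + 1",
"_____no_output_____"
],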
[
"plt.imshow(img.permute(1, 2, 0), cmap='gray')\nplt.show()",
"Clipping input data to the valid range for imshow with RGB data ([0..1] for floats or [0..255] for integers).\n"
],
[
"plt.imshow(output[0, 0].detach(), cmap='gray')\nplt.show()",
"_____no_output_____"
],
[
"output.shape",
"_____no_output_____"
],
[
"conv = nn.Conv2d(3, 1, kernel_size=3, padding=1)",
"_____no_output_____"
],
[
"output = conv(img.unsqueeze(0))\n\noutput.shape",
"_____no_output_____"
],
[
"with torch.no_grad():\n conv.bias.zero_()",
"_____no_output_____"
],
[
"with torch.no_grad():\n conv.weight.fill_(1.0 / 9.0)",
"_____no_output_____"
],
[
"output = conv(img.unsqueeze(0))\nplt.imshow(output[0, 0].detach(), cmap='gray')\nplt.show()",
"_____no_output_____"
],
[
"conv = nn.Conv2d(3, 1, kernel_size=3, padding=1)\n\nwith torch.no_grad():\n conv.weight[:] = torch.tensor([[-1.0, 0.0, 1.0],\n [-1.0, 0.0, 1.0],\n [-1.0, 0.0, 1.0]])\n conv.bias.zero_()",
"_____no_output_____"
],
[
"output = conv(img.unsqueeze(0))\nplt.imshow(output[0, 0].detach(), cmap='gray')\nplt.show()",
"_____no_output_____"
],
[
"pool = nn.MaxPool2d(2)",
"_____no_output_____"
],
[
"output = pool(img.unsqueeze(0))\n\noutput.shape",
"_____no_output_____"
],
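[
"# A hedged aside (added; not from the book): MaxPool2d(2) takes the maximum over\n# non-overlapping 2x2 windows, so each spatial dimension is halved: 32 // 2 = 16.\n32 // 2",
"_____no_output_____"
],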
[
"model = nn.Sequential(\n nn.Conv2d(3, 16, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n nn.Conv2d(16, 8, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n ...)",
"_____no_output_____"
],
[
"model = nn.Sequential(\n nn.Conv2d(3, 16, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n nn.Conv2d(16, 8, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n # WARNING: something missing here\n nn.Linear(512, 32),\n nn.Tanh(),\n nn.Linear(32, 2))",
"_____no_output_____"
],
[
"sum([p.numel() for p in model.parameters()])",
"_____no_output_____"
],
[
"model(img.unsqueeze(0))",
"_____no_output_____"
],
[
"class Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)\n self.act1 = nn.Tanh()\n self.pool1 = nn.MaxPool2d(2)\n self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)\n self.act2 = nn.Tanh()\n self.pool2 = nn.MaxPool2d(2)\n self.fc1 = nn.Linear(8 * 8 * 8, 32)\n self.act4 = nn.Tanh()\n self.fc2 = nn.Linear(32, 2)\n\n def forward(self, x):\n out = self.pool1(self.act1(self.conv1(x)))\n out = self.pool2(self.act2(self.conv2(out)))\n out = out.view(-1, 8 * 8 * 8)\n out = self.act4(self.fc1(out))\n out = self.fc2(out)\n return out",
"_____no_output_____"
],
[
"model = Net()\n\nsum([p.numel() for p in model.parameters()])",
"_____no_output_____"
],
[
"import torch.nn.functional as F\n\nclass Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)\n self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)\n self.fc1 = nn.Linear(8 * 8 * 8, 32)\n self.fc2 = nn.Linear(32, 2)\n \n def forward(self, x):\n out = F.max_pool2d(torch.tanh(self.conv1(x)), 2)\n out = F.max_pool2d(torch.tanh(self.conv2(out)), 2)\n out = out.view(-1, 8 * 8 * 8)\n out = torch.tanh(self.fc1(out))\n out = self.fc2(out)\n return out",
"_____no_output_____"
],
[
"model = Net()\nmodel(img.unsqueeze(0))",
"_____no_output_____"
],
[
"import torch\nimport torch.nn as nn\nimport torch.nn.functional as F\n\ntrain_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=True)\n\nclass Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)\n self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)\n self.fc1 = nn.Linear(8 * 8 * 8, 32)\n self.fc2 = nn.Linear(32, 2)\n \n def forward(self, x):\n out = F.max_pool2d(torch.relu(self.conv1(x)), 2)\n out = F.max_pool2d(torch.relu(self.conv2(out)), 2)\n out = out.view(-1, 8 * 8 * 8)\n out = torch.tanh(self.fc1(out))\n out = self.fc2(out)\n return out\n \nmodel = Net()\n\nlearning_rate = 1e-2\n\noptimizer = optim.SGD(model.parameters(), lr=learning_rate)\n\nloss_fn = nn.CrossEntropyLoss()\n\nn_epochs = 100\n\nfor epoch in range(n_epochs):\n for imgs, labels in train_loader:\n outputs = model(imgs)\n loss = loss_fn(outputs, labels)\n \n optimizer.zero_grad()\n loss.backward()\n optimizer.step()\n\n print(\"Epoch: %d, Loss: %f\" % (epoch, float(loss)))",
"_____no_output_____"
],
[
"train_loader = torch.utils.data.DataLoader(cifar2, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in train_loader:\n outputs = model(imgs)\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"_____no_output_____"
],
[
"val_loader = torch.utils.data.DataLoader(cifar2_val, batch_size=64, shuffle=False)\n\ncorrect = 0\ntotal = 0\n\nwith torch.no_grad():\n for imgs, labels in val_loader:\n outputs = model(imgs)\n _, predicted = torch.max(outputs, dim=1)\n total += labels.shape[0]\n correct += int((predicted == labels).sum())\n \nprint(\"Accuracy: %f\" % (correct / total))",
"_____no_output_____"
],
[
"import torch\nimport torch.nn as nn\nimport torch.nn.functional as F\n\nclass Net(nn.Module):\n def __init__(self):\n super(Net, self).__init__()\n self.conv1 = nn.Conv2d(3, 16, kernel_size=3, padding=1)\n self.conv2 = nn.Conv2d(16, 8, kernel_size=3, padding=1)\n self.fc1 = nn.Linear(8 * 8 * 8, 32)\n self.fc2 = nn.Linear(32, 2)\n \n def forward(self, x):\n out = F.max_pool2d(torch.relu(self.conv1(x)), 2)\n out = F.max_pool2d(torch.relu(self.conv2(out)), 2)\n out = out.view(-1, 8 * 8 * 8)\n out = torch.tanh(self.fc1(out))\n out = self.fc2(out)\n return out\n \nmodel = Net()\nsum([p.numel() for p in model.parameters()])",
"_____no_output_____"
],
[
"model = nn.Sequential(\n nn.Conv2d(3, 16, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n nn.Conv2d(16, 8, kernel_size=3, padding=1),\n nn.Tanh(),\n nn.MaxPool2d(2),\n nn.Linear(8*8*8, 32),\n nn.Tanh(),\n nn.Linear(32, 2))\n\nmodel(img.unsqueeze(0))",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf876b388e2e2a97eff178bad43981a8aa9c280
| 16,488 |
ipynb
|
Jupyter Notebook
|
Titanic/titanic.ipynb
|
llichengtong/yx4
|
17de7a6257a9f0c38e12089b2d1947927ec54c90
|
[
"Apache-2.0"
] | 128 |
2017-03-04T08:53:44.000Z
|
2020-06-05T11:19:16.000Z
|
Titanic/titanic.ipynb
|
GloriaGUOGUO/TensorFlowBook
|
17de7a6257a9f0c38e12089b2d1947927ec54c90
|
[
"Apache-2.0"
] | null | null | null |
Titanic/titanic.ipynb
|
GloriaGUOGUO/TensorFlowBook
|
17de7a6257a9f0c38e12089b2d1947927ec54c90
|
[
"Apache-2.0"
] | 120 |
2017-02-07T09:41:25.000Z
|
2022-03-17T00:57:59.000Z
| 29.601436 | 231 | 0.547246 |
[
[
[
"# TensorFlowๅฎๆTitanic่งฃๆ",
"_____no_output_____"
],
[
"## ไธใๆฐๆฎ่ฏปๅ
ฅๅ้ขๅค็",
"_____no_output_____"
],
[
"### 1. ไฝฟ็จpandas่ฏปๅ
ฅcsvๆไปถ๏ผ่ฏปๅ
ฅไธบpands.DataFrameๅฏน่ฑก",
"_____no_output_____"
]
],
[
[
"import os\nimport numpy as np\nimport pandas as pd\nimport tensorflow as tf\n\n# read data from file\ndata = pd.read_csv('data/train.csv')\nprint(data.info())",
"_____no_output_____"
]
],
[
[
"### 2. ้ขๅค็\n\n1. ๅ้ค็ฉบๆฐๆฎ\n2. ๅฐ'Sex'ๅญๆฎต่ฝฌๆขไธบint็ฑปๅ\n3. ้ๅๆฐๅผ็ฑปๅ็ๅญๆฎต๏ผๆๅผๅญ็ฌฆไธฒ็ฑปๅๅญๆฎต",
"_____no_output_____"
]
],
[
[
"# fill nan values with 0\ndata = data.fillna(0)\n# convert ['male', 'female'] values of Sex to [1, 0]\ndata['Sex'] = data['Sex'].apply(lambda s: 1 if s == 'male' else 0)\n# 'Survived' is the label of one class,\n# add 'Deceased' as the other class\ndata['Deceased'] = data['Survived'].apply(lambda s: 1 - s)\n\n# select features and labels for training\ndataset_X = data[['Sex', 'Age', 'Pclass', 'SibSp', 'Parch', 'Fare']]\ndataset_Y = data[['Deceased', 'Survived']]\n\nprint(dataset_X)\nprint(dataset_Y)",
"_____no_output_____"
]
],
[
[
"### 3. ๅฐ่ฎญ็ปๆฐๆฎๅๅไธบ่ฎญ็ป้(training set)ๅ้ช่ฏ้(validation set)",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import train_test_split\n\n# split training data and validation set data\nX_train, X_val, y_train, y_val = train_test_split(dataset_X.as_matrix(), dataset_Y.as_matrix(),\n test_size=0.2,\n random_state=42)",
"_____no_output_____"
]
],
[
[
"# ไบใๆๅปบ่ฎก็ฎๅพ",
"_____no_output_____"
],
[
"### ้ป่พๅๅฝ\n\n้ป่พๅๅฝๆฏๅฝขๅผๆ็ฎๅ๏ผๅนถไธๆๅฎนๆ็่งฃ็ๅ็ฑปๅจไนไธใไปๆฐๅญฆไธ๏ผ้ป่พๅๅฝ็้ขๆตๅฝๆฐๅฏไปฅ่กจ็คบไธบๅฆไธๅ
ฌๅผ๏ผ\n\n *y = softmax(xW + b)*\n\nๅ
ถไธญ๏ผ*x*ไธบ่พๅ
ฅๅ้๏ผๆฏๅคงๅฐไธบ*dร1*็ๅๅ้๏ผ*d*ๆฏ็นๅพๆฐใ*W*ๆฏๅคงๅฐไธบ็*cรd*ๆ้็ฉ้ต๏ผ*c*ๆฏๅ็ฑป็ฑปๅซๆฐ็ฎใ*b*ๆฏๅ็ฝฎๅ้๏ผไธบ*cร1*ๅๅ้ใ*softmax*ๅจๆฐๅญฆๅฎไน้๏ผๆฏๆไธ็งๅฝไธๅๆๆฐๅฝๆฐใๅฎๅฐไธไธช*k*็ปด็ๅ้*x*ๆ็
งๅ
ฌๅผ\n\n\n\n็ๅฝขๅผๅฐๅ้ไธญ็ๅ
็ด ่ฝฌๆขไธบ*(0, 1)*็ๅบ้ดใๆบๅจๅญฆไน ้ขๅๅธธไฝฟ็จ่ฟ็งๆนๆณๅฐ็ฑปไผผๅคๅซๅฝๆฐ็็ฝฎไฟกๅบฆๅผ่ฝฌๆขไธบๆฆ็ๅฝขๅผ๏ผๅฆๅคๅซ่ถ
ๅนณ้ข็่ท็ฆป็ญ๏ผใ*softmax*ๅฝๆฐๅธธ็จไบ่พๅบๅฑ๏ผ็จไบๆๅฎๅฏไธ็ๅ็ฑป่พๅบใ\n",
"_____no_output_____"
],
[
"### 1.\tไฝฟ็จplaceholderๅฃฐๆ่พๅ
ฅๅ ไฝ็ฌฆ\nTensorFlow่ฎพ่ฎกไบๆฐๆฎFeedๆบๅถใไนๅฐฑๆฏ่ฏด่ฎก็ฎ็จๅบๅนถไธไผ็ดๆฅไบคไบๆง่ก๏ผ่ๆฏๅจๅฃฐๆ่ฟ็จๅชๅ่ฎก็ฎๅพ็ๆๅปบใๆไปฅ๏ผๆญคๆถๅนถไธไผ่งฆ็ขฐ็ๅฎ็ๆฐๆฎ๏ผ่ๅชๆฏ้่ฟplaceholder็ฎๅญๅฃฐๆไธไธช่พๅ
ฅๆฐๆฎ็ๅ ไฝ็ฌฆ๏ผๅจๅ้ข็ๆญฃ่ฟ่ก่ฎก็ฎๆถ๏ผๆ็จๆฐๆฎๆฟๆขๅ ไฝ็ฌฆใ\n\nๅฃฐๆๅ ไฝ็ฌฆplaceholder้่ฆ็ปๅฎไธไธชๅๆฐ๏ผๅๅซๆฏ่พๅ
ฅๆฐๆฎ็ๅ
็ด ็ฑปๅdtypeใ็ปดๅบฆๅฝข็ถshapeๅๅ ไฝ็ฌฆๅ็งฐๆ ่ฏnameใ",
"_____no_output_____"
]
],
[
[
"# ๅฃฐๆ่พๅ
ฅๆฐๆฎๅ ไฝ็ฌฆ\n# shapeๅๆฐ็็ฌฌไธไธชๅ
็ด ไธบNone๏ผ่กจ็คบๅฏไปฅๅๆถๆพๅ
ฅไปปๆๆก่ฎฐๅฝ\nX = tf.placeholder(tf.float32, shape=[None, 6], name='input')\ny = tf.placeholder(tf.float32, shape=[None, 2], name='label')",
"_____no_output_____"
]
],
[
[
"### 2.\tๅฃฐๆๅๆฐๅ้\nๅ้็ๅฃฐๆๆนๅผๆฏ็ดๆฅๅฎไนtf.Variable()ๅฏน่ฑกใ\n\nๅๅงๅๅ้ๅฏน่ฑกๆไธค็งๆนๅผ๏ผไธ็งๆฏไปprotocol buffer็ปๆVariableDefไธญๅๅบๅๅ๏ผๅฆไธ็งๆฏ้่ฟๅๆฐๆๅฎๅๅงๅผใๆ็ฎๅ็ๆนๅผๅฐฑๆฏๅไธ้ข็จๅบ่ฟๆ ท๏ผไธบๅ้ไผ ๅ
ฅๅๅงๅผใๅๅงๅผๅฟ
้กปๆฏไธไธชtensorๅฏน่ฑก๏ผๆๆฏๅฏไปฅ้่ฟconvert_to_tensor()ๆนๆณ่ฝฌๆขๆtensor็Pythonๅฏน่ฑกใTensorFlowๆไพไบๅค็งๆ้ ้ๆบtensor็ๆนๆณ๏ผๅฏไปฅๆ้ ๅ
จ้ถtensorใ้ๆบๆญฃๆๅๅธtensor็ญใๅฎไนๅ้ไผไฟ็ๅๅงๅผ็็ปดๅบฆๅฝข็ถใ",
"_____no_output_____"
]
],
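A small sketch of the initialization options just described, in the same TF1-style API as the rest of this notebook (the variable names here are illustrative only):

# initial value passed directly as a tensor
w_normal = tf.Variable(tf.random_normal([6, 2]), name='w_normal')
# all-zero initial value
b_zero = tf.Variable(tf.zeros([2]), name='b_zero')
# a plain Python list also works: it is converted via convert_to_tensor()
w_list = tf.Variable([[0.1, 0.2], [0.3, 0.4]], name='w_list')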
[
[
"# ๅฃฐๆๅ้\nweights = tf.Variable(tf.random_normal([6, 2]), name='weights')\nbias = tf.Variable(tf.zeros([2]), name='bias')",
"_____no_output_____"
]
],
[
[
"### 3.\tๆ้ ๅๅไผ ๆญ่ฎก็ฎๅพ\n\nไฝฟ็จ็ฎๅญๆๅปบ็ฑ่พๅ
ฅ่ฎก็ฎๅบๆ ็ญพ็่ฎก็ฎ่ฟ็จใ\n\nๅจ่ฎก็ฎๅพ็ๆๅปบ่ฟ็จไธญ๏ผTensorFlowไผ่ชๅจๆจ็ฎๆฏไธไธช่็น็่พๅ
ฅ่พๅบๅฝข็ถใ่ฅๆ ๆณ่ฟ็ฎ๏ผๆฏๅฆไธคไธช่กๅๆฐไธๅ็็ฉ้ต็ธๅ ๏ผๅไผ็ดๆฅๆฅ้ใ",
"_____no_output_____"
]
],
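A quick sketch of the shape inference just described (the shapes here are hypothetical): the incompatible matmul is rejected while the graph is being built, before any data is fed:

a = tf.placeholder(tf.float32, shape=[None, 6])
w_bad = tf.Variable(tf.zeros([5, 2]))   # 6 and 5 do not match
# tf.matmul(a, w_bad)                   # raises ValueError at construction time
w_ok = tf.Variable(tf.zeros([6, 2]))
out = tf.matmul(a, w_ok)                # inferred output shape: [None, 2]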
[
[
"y_pred = tf.nn.softmax(tf.matmul(X, weights) + bias)",
"_____no_output_____"
]
],
[
[
"### 4.\tๅฃฐๆไปฃไปทๅฝๆฐ\n\nไฝฟ็จไบคๅ็ต(cross entropy)ไฝไธบไปฃไปทๅฝๆฐใ",
"_____no_output_____"
]
],
[
[
"# ไฝฟ็จไบคๅ็ตไฝไธบไปฃไปทๅฝๆฐ\ncross_entropy = - tf.reduce_sum(y * tf.log(y_pred + 1e-10),\n reduction_indices=1)\n# ๆน้ๆ ทๆฌ็ไปฃไปทๅผไธบๆๆๆ ทๆฌไบคๅ็ต็ๅนณๅๅผ\ncost = tf.reduce_mean(cross_entropy)",
"_____no_output_____"
]
],
[
[
"#### NOTE\nๅจ่ฎก็ฎไบคๅ็ต็ๆถๅ๏ผๅฏนๆจกๅ่พๅบๅผ y_pred ๅ ไธไบไธไธชๅพๅฐ็่ฏฏๅทฎๅผ๏ผๅจไธ้ข็จๅบไธญๆฏ 1e-10๏ผ๏ผ่ฟๆฏๅ ไธบๅฝ y_pred ๅๅๆฅ่ฟ็ๅผ y_true ็ๆถๅ๏ผไนๅฐฑๆฏ y_pred ็ๅผ้ๅธธๆฅ่ฟ 0 ๆ 1 ็ๅๅผๆถ๏ผ่ฎก็ฎไผๅพๅฐ่ดๆ ็ฉท -inf๏ผไป่ๅฏผ่ด่พๅบ้ๆณ๏ผๅนถ่ฟไธๆญฅๅฏผ่ดๆ ๆณ่ฎก็ฎๆขฏๅบฆ๏ผ่ฟญไปฃ้ทๅ
ฅๅดฉๆบใ่ฆ่งฃๅณ่ฟไธช้ฎ้ขๆไธ็งๅๆณ๏ผ\n\n1. ๅจ่ฎก็ฎๆถ๏ผ็ดๆฅๅ ๅ
ฅไธไธชๆๅฐ็่ฏฏๅทฎๅผ๏ผไฝฟ่ฎก็ฎๅๆณใ่ฟๆ ทๅฏไปฅ้ฟๅ
่ฎก็ฎ๏ผไฝๅญๅจ็้ฎ้ขๆฏๅ ๅ
ฅ่ฏฏๅทฎๅ็ธๅฝไบy_pred็ๅผไผ็ช็ ด1ใๅจ็คบไพไปฃ็ ไธญไฝฟ็จไบ่ฟ็งๆนๆก๏ผ\n2. ไฝฟ็จ clip() ๅฝๆฐ๏ผๅฝ y_pred ๆฅ่ฟ 0 ๆถ๏ผๅฐๅ
ถ่ตๅผๆไธบๆๅฐ่ฏฏๅทฎๅผใไนๅฐฑๆฏๅฐ y_pred ็ๅๅผ่ๅด้ๅฎๅจ็่ๅดๅ
๏ผ\n3. ๅฝ่ฎก็ฎไบคๅ็ต็่ฎก็ฎๅบ็ฐ nan ๅผๆถ๏ผๆพๅผๅฐๅฐcost่ฎพ็ฝฎไธบ0ใ่ฟ็งๆนๅผๅ้ฟไบ ๅฝๆฐ่ฎก็ฎ็้ฎ้ข๏ผ่ๆฏๅจๆ็ป็ไปฃไปทๅฝๆฐไธ่ฟ่กๅฎน้ๅค็ใ\n",
"_____no_output_____"
],
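A minimal sketch of the second approach from the NOTE (clipping instead of adding an epsilon); the clipping bounds here are illustrative:

y_clipped = tf.clip_by_value(y_pred, 1e-10, 1.0)
cross_entropy_clip = - tf.reduce_sum(y * tf.log(y_clipped), reduction_indices=1)
cost_clip = tf.reduce_mean(cross_entropy_clip)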
[
"### 5. ๅ ๅ
ฅไผๅ็ฎๆณ\n\nTensorFlowๅ
็ฝฎไบๅค็ง็ปๅ
ธ็ไผๅ็ฎๆณ๏ผๅฆ้ๆบๆขฏๅบฆไธ้็ฎๆณ๏ผSGD๏ผStochastic Gradient Descent๏ผใๅจ้็ฎๆณ๏ผMomentum๏ผใAdagrad็ฎๆณใADAM็ฎๆณใRMSProp็ฎๆณ็ญใไผๅๅจๅ
้จไผ่ชๅจๆๅปบๆขฏๅบฆ่ฎก็ฎๅๅๅไผ ๆญ้จๅ็่ฎก็ฎๅพใ\n\nไธ่ฌๅฏนไบไผๅ็ฎๆณ๏ผๆๅ
ณ้ฎ็ๅๆฐๆฏๅญฆไน ็๏ผlearning rate๏ผ๏ผๅฏนไบๅญฆไน ็็่ฎพ็ฝฎๆฏไธ้จๆๆฏใๅๆถ๏ผไธๅไผๅ็ฎๆณๅจไธๅ้ฎ้ขไธๅฏ่ฝไผๆไธๅ็ๆถๆ้ๅบฆ๏ผๅจ่งฃๅณๅฎ้
้ฎ้ขๆถๅฏไปฅๅๅค็งๅฐ่ฏใ",
"_____no_output_____"
]
],
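For example, swapping in a different built-in optimizer is a one-line change (a sketch; the learning rate here is illustrative, not tuned):

train_op_adam = tf.train.AdamOptimizer(learning_rate=0.001).minimize(cost)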
[
[
"# ไฝฟ็จ้ๆบๆขฏๅบฆไธ้็ฎๆณไผๅๅจๆฅๆๅฐๅไปฃไปท๏ผ็ณป็ป่ชๅจๆๅปบๅๅไผ ๆญ้จๅ็่ฎก็ฎๅพ\ntrain_op = tf.train.GradientDescentOptimizer(0.001).minimize(cost)",
"_____no_output_____"
]
],
[
[
"### 6. (optional) ่ฎก็ฎๅ็กฎ็",
"_____no_output_____"
]
],
[
[
"# ่ฎก็ฎๅ็กฎ็\ncorrect_pred = tf.equal(tf.argmax(y, 1), tf.argmax(y_pred, 1))\nacc_op = tf.reduce_mean(tf.cast(correct_pred, tf.float32))",
"_____no_output_____"
]
],
[
[
"# ไธใๆๅปบ่ฎญ็ป่ฟญไปฃ & ๆง่ก่ฎญ็ป",
"_____no_output_____"
],
[
"### ๅฏๅจSession๏ผไปฃๅ
ฅๆฐๆฎ่ฟ่ก่ฎก็ฎใ่ฎญ็ป็ปๆๅไฝฟ็จ้ช่ฏ้่ฏไผฐ่ฎญ็ปๆๆ",
"_____no_output_____"
]
],
[
[
"with tf.Session() as sess:\n # variables have to be initialized at the first place\n tf.global_variables_initializer().run()\n\n # training loop\n for epoch in range(10):\n total_loss = 0.\n for i in range(len(X_train)):\n # prepare feed data and run\n feed_dict = {X: [X_train[i]], y: [y_train[i]]}\n _, loss = sess.run([train_op, cost], feed_dict=feed_dict)\n total_loss += loss\n # display loss per epoch\n print('Epoch: %04d, total loss=%.9f' % (epoch + 1, total_loss))\n\n print 'Training complete!'\n \n # Accuracy calculated by TensorFlow\n accuracy = sess.run(acc_op, feed_dict={X: X_val, y: y_val})\n print(\"Accuracy on validation set: %.9f\" % accuracy)\n\n # Accuracy calculated by NumPy\n pred = sess.run(y_pred, feed_dict={X: X_val})\n correct = np.equal(np.argmax(pred, 1), np.argmax(y_val, 1))\n numpy_accuracy = np.mean(correct.astype(np.float32))\n print(\"Accuracy on validation set (numpy): %.9f\" % numpy_accuracy)",
"_____no_output_____"
]
],
[
[
"# ๅใๅญๅจๅๅ ่ฝฝๆจกๅๅๆฐ",
"_____no_output_____"
],
[
"ๅ้็ๅญๅจๅ่ฏปๅๆฏ้่ฟtf.train.Saver็ฑปๆฅๅฎๆ็ใSaverๅฏน่ฑกๅจๅๅงๅๆถ๏ผไธบ่ฎก็ฎๅพๅ ๅ
ฅไบ็จไบๅญๅจๅๅ ่ฝฝๅ้็็ฎๅญ๏ผๅนถๅฏไปฅ้่ฟๅๆฐๆๅฎๆฏ่ฆๅญๅจๅชไบๅ้ใSaverๅฏน่ฑก็save()ๅrestore()ๆนๆณๆฏ่งฆๅๅพไธญ็ฎๅญ็ๅ
ฅๅฃใ\n",
"_____no_output_____"
]
],
[
[
"# ่ฎญ็ปๆญฅๆฐ่ฎฐๅฝ\nglobal_step = tf.Variable(0, name='global_step', trainable=False)\n# ๅญๆกฃๅ
ฅๅฃ\nsaver = tf.train.Saver()\n\n# ๅจSaverๅฃฐๆไนๅๅฎไน็ๅ้ๅฐไธไผ่ขซๅญๅจ\n# non_storable_variable = tf.Variable(777)\n\nckpt_dir = './ckpt_dir'\nif not os.path.exists(ckpt_dir):\n os.makedirs(ckpt_dir)\n\nwith tf.Session() as sess:\n tf.global_variables_initializer().run()\n\n # ๅ ่ฝฝๆจกๅๅญๆกฃ\n ckpt = tf.train.get_checkpoint_state(ckpt_dir)\n if ckpt and ckpt.model_checkpoint_path:\n print('Restoring from checkpoint: %s' % ckpt.model_checkpoint_path)\n saver.restore(sess, ckpt.model_checkpoint_path)\n\n start = global_step.eval()\n for epoch in range(start, start + 10):\n total_loss = 0.\n for i in range(0, len(X_train)):\n feed_dict = {\n X: [X_train[i]],\n y: [y_train[i]]\n }\n _, loss = sess.run([train_op, cost], feed_dict=feed_dict)\n total_loss += loss\n print('Epoch: %04d, loss=%.9f' % (epoch + 1, total_loss))\n\n\n # ๆจกๅๅญๆกฃ\n global_step.assign(epoch).eval()\n saver.save(sess, ckpt_dir + '/logistic.ckpt',\n global_step=global_step)\n print('Training complete!')",
"_____no_output_____"
]
],
[
[
"# TensorBoard\n\nTensorBoardๆฏTensorFlow้
ๅฅ็ๅฏ่งๅๅทฅๅ
ท๏ผๅฏไปฅ็จๆฅๅธฎๅฉ็่งฃๅคๆ็ๆจกๅๅๆฃๆฅๅฎ็ฐไธญ็้่ฏฏใ\n\nTensorBoard็ๅทฅไฝๆนๅผๆฏๅฏๅจไธไธชWEBๆๅก๏ผ่ฏฅๆๅก่ฟ็จไปTensorFlow็จๅบๆง่กๆๅพ็ไบไปถๆฅๅฟๆไปถ๏ผevent files๏ผไธญ่ฏปๅๆฆ่ฆ๏ผsummary๏ผๆฐๆฎ๏ผ็ถๅๅฐๆฐๆฎๅจ็ฝ้กตไธญ็ปๅถๆๅฏ่งๅ็ๅพ่กจใๆฆ่ฆๆฐๆฎไธป่ฆๅ
ๆฌไปฅไธๅ ็ง็ฑปๅซ๏ผ\n1.\tๆ ้ๆฐๆฎ๏ผๅฆๅ็กฎ็ใไปฃไปทๆๅคฑๅผ๏ผไฝฟ็จtf.summary.scalarๅ ๅ
ฅ่ฎฐๅฝ็ฎๅญ๏ผ\n2.\tๅๆฐๆฐๆฎ๏ผๅฆๅๆฐ็ฉ้ตweightsใๅ็ฝฎ็ฉ้ตbias๏ผไธ่ฌไฝฟ็จtf.summary.histogram่ฎฐๅฝ๏ผ\n3.\tๅพๅๆฐๆฎ๏ผ็จtf.summary.imageๅ ๅ
ฅ่ฎฐๅฝ็ฎๅญ๏ผ\n4.\t้ณ้ขๆฐๆฎ๏ผ็จtf.summary.audioๅ ๅ
ฅ่ฎฐๅฝ็ฎๅญ๏ผ\n5.\t่ฎก็ฎๅพ็ปๆ๏ผๅจๅฎไนtf.summary.FileWriterๅฏน่ฑกๆถ่ชๅจ่ฎฐๅฝใ",
"_____no_output_____"
],
[
"ๅฏไปฅ้่ฟTensorBoardๅฑ็คบ็ๅฎๆด็จๅบ๏ผ",
"_____no_output_____"
]
],
[
[
"################################\n# Constructing Dataflow Graph\n################################\n\n# arguments that can be set in command line\ntf.app.flags.DEFINE_integer('epochs', 10, 'Training epochs')\ntf.app.flags.DEFINE_integer('batch_size', 10, 'size of mini-batch')\nFLAGS = tf.app.flags.FLAGS\n\nwith tf.name_scope('input'):\n # create symbolic variables\n X = tf.placeholder(tf.float32, shape=[None, 6])\n y_true = tf.placeholder(tf.float32, shape=[None, 2])\n\nwith tf.name_scope('classifier'):\n # weights and bias are the variables to be trained\n weights = tf.Variable(tf.random_normal([6, 2]))\n bias = tf.Variable(tf.zeros([2]))\n y_pred = tf.nn.softmax(tf.matmul(X, weights) + bias)\n\n # add histogram summaries for weights, view on tensorboard\n tf.summary.histogram('weights', weights)\n tf.summary.histogram('bias', bias)\n\n# Minimise cost using cross entropy\n# NOTE: add a epsilon(1e-10) when calculate log(y_pred),\n# otherwise the result will be -inf\nwith tf.name_scope('cost'):\n cross_entropy = - tf.reduce_sum(y_true * tf.log(y_pred + 1e-10),\n reduction_indices=1)\n cost = tf.reduce_mean(cross_entropy)\n tf.summary.scalar('loss', cost)\n\n# use gradient descent optimizer to minimize cost\ntrain_op = tf.train.GradientDescentOptimizer(0.001).minimize(cost)\n\nwith tf.name_scope('accuracy'):\n correct_pred = tf.equal(tf.argmax(y_true, 1), tf.argmax(y_pred, 1))\n acc_op = tf.reduce_mean(tf.cast(correct_pred, tf.float32))\n # Add scalar summary for accuracy\n tf.summary.scalar('accuracy', acc_op)\n\nglobal_step = tf.Variable(0, name='global_step', trainable=False)\n# use saver to save and restore model\nsaver = tf.train.Saver()\n\n# this variable won't be stored, since it is declared after tf.train.Saver()\nnon_storable_variable = tf.Variable(777)\n\nckpt_dir = './ckpt_dir'\nif not os.path.exists(ckpt_dir):\n os.makedirs(ckpt_dir)\n\n################################\n# Training the model\n################################\n\n# use session to run the calculation\nwith tf.Session() as sess:\n # create a log writer. run 'tensorboard --logdir=./logs'\n writer = tf.summary.FileWriter('./logs', sess.graph)\n merged = tf.summary.merge_all()\n\n # variables have to be initialized at the first place\n tf.global_variables_initializer().run()\n\n # restore variables from checkpoint if exists\n ckpt = tf.train.get_checkpoint_state(ckpt_dir)\n if ckpt and ckpt.model_checkpoint_path:\n print('Restoring from checkpoint: %s' % ckpt.model_checkpoint_path)\n saver.restore(sess, ckpt.model_checkpoint_path)\n\n start = global_step.eval()\n # training loop\n for epoch in range(start, start + FLAGS.epochs):\n total_loss = 0.\n for i in range(0, len(X_train), FLAGS.batch_size):\n # train with mini-batch\n feed_dict = {\n X: X_train[i: i + FLAGS.batch_size],\n y_true: y_train[i: i + FLAGS.batch_size]\n }\n _, loss = sess.run([train_op, cost], feed_dict=feed_dict)\n total_loss += loss\n # display loss per epoch\n print('Epoch: %04d, loss=%.9f' % (epoch + 1, total_loss))\n\n summary, accuracy = sess.run([merged, acc_op],\n feed_dict={X: X_val, y_true: y_val})\n writer.add_summary(summary, epoch) # Write summary\n print('Accuracy on validation set: %.9f' % accuracy)\n\n # set and update(eval) global_step with epoch\n global_step.assign(epoch).eval()\n saver.save(sess, ckpt_dir + '/logistic.ckpt',\n global_step=global_step)\n print('Training complete!')",
"_____no_output_____"
]
],
[
[
"\n",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbf89280d955e287a864ae47f06ec57284efdaae
| 83,800 |
ipynb
|
Jupyter Notebook
|
University of Liverpool - Ion Switching/model/lgbm-kfold-no-out.ipynb
|
DavideStenner/Kaggle
|
c3e6eae84413611a0859358767319f9604a07d4d
|
[
"MIT"
] | null | null | null |
University of Liverpool - Ion Switching/model/lgbm-kfold-no-out.ipynb
|
DavideStenner/Kaggle
|
c3e6eae84413611a0859358767319f9604a07d4d
|
[
"MIT"
] | null | null | null |
University of Liverpool - Ion Switching/model/lgbm-kfold-no-out.ipynb
|
DavideStenner/Kaggle
|
c3e6eae84413611a0859358767319f9604a07d4d
|
[
"MIT"
] | null | null | null | 113.090418 | 61,964 | 0.857792 |
[
[
[
"!pip install pandas==1.0.3",
"Collecting pandas==1.0.3\r\n Downloading pandas-1.0.3-cp37-cp37m-manylinux1_x86_64.whl (10.0 MB)\r\n\u001b[K |โโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโโ| 10.0 MB 8.9 MB/s \r\n\u001b[?25hRequirement already satisfied: pytz>=2017.2 in /opt/conda/lib/python3.7/site-packages (from pandas==1.0.3) (2019.3)\r\nRequirement already satisfied: python-dateutil>=2.6.1 in /opt/conda/lib/python3.7/site-packages (from pandas==1.0.3) (2.8.1)\r\nRequirement already satisfied: numpy>=1.13.3 in /opt/conda/lib/python3.7/site-packages (from pandas==1.0.3) (1.18.1)\r\nRequirement already satisfied: six>=1.5 in /opt/conda/lib/python3.7/site-packages (from python-dateutil>=2.6.1->pandas==1.0.3) (1.14.0)\r\n\u001b[31mERROR: hypertools 0.6.2 has requirement scikit-learn<0.22,>=0.19.1, but you'll have scikit-learn 0.22.2.post1 which is incompatible.\u001b[0m\r\n\u001b[31mERROR: datalab 1.1.5 has requirement pandas-profiling==1.4.0, but you'll have pandas-profiling 2.4.0 which is incompatible.\u001b[0m\r\nInstalling collected packages: pandas\r\n Attempting uninstall: pandas\r\n Found existing installation: pandas 1.0.1\r\n Uninstalling pandas-1.0.1:\r\n Successfully uninstalled pandas-1.0.1\r\nSuccessfully installed pandas-1.0.3\r\n"
],
[
"from sklearn.model_selection import StratifiedKFold, GroupKFold\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport numpy as np\nfrom tqdm import tqdm_notebook\nfrom math import log\nimport lightgbm as lgb\nimport gc\nimport shap\nfrom sklearn.metrics import f1_score\nfrom tqdm.notebook import tqdm\nfrom sklearn.preprocessing import LabelEncoder, MinMaxScaler\nimport seaborn as sns\n\nn_classes = 11\nfolds = 5\nSEED_ = 987654321",
"_____no_output_____"
],
[
"path = '../input/lgbm-dataset/'\n\ngroup = np.load(path + 'group.npy', allow_pickle = True)\ntrain_target = pd.read_csv('../input/liverpool-ion-switching/train.csv', usecols = ['open_channels'])\ntime = pd.read_csv('../input/liverpool-ion-switching/train.csv', usecols = ['open_channels', 'time'])\n\ntrain = pd.read_csv(path + 'train_clean.csv')",
"_____no_output_____"
],
[
"out_ = np.arange(3640000, 3824000)\nidx = train.index\nbool_mask = idx.isin(out_)\ntrain = train[~idx.isin(out_)].reset_index(drop = True)\ngroup = group[~idx.isin(out_)]\ntrain_target = train_target[~idx.isin(out_)].reset_index(drop = True)\n",
"_____no_output_____"
],
[
"def evaluate_macroF1_lgb(predictions, truth): \n # this follows the discussion in https://github.com/Microsoft/LightGBM/issues/1483\n labels = truth.get_label()\n pred_labels = predictions.reshape(n_classes,-1).argmax(axis=0) #np.unique(\n\n f1 = f1_score(labels, pred_labels, average='macro')\n return ('macroF1', f1, True) \n\nparams = {\n \"objective\" : \"multiclass\",\n \"num_class\" : n_classes,\n 'metric' : \"None\",\n 'boosting_type':'gbdt',\n 'learning_rate':0.05,\n 'colsample_bytree': 0.8,\n 'lambda_l1': 1,\n 'lambda_l2': 1,\n 'max_depth': -1,\n 'num_leaves': 2**8,\n 'subsample': .75,\n 'seed': SEED_,\n 'importance_type':'gain',\n 'n_jobs':-1,\n}\n\ngc.collect()",
"_____no_output_____"
],
[
"time = pd.read_csv('../input/liverpool-ion-switching/train.csv', usecols = ['open_channels', 'time']).loc[~idx.isin(out_)].reset_index(drop = True)\ntime['group'] = (time['time'].transform(lambda x: np.ceil(x*10000/500000)))\ntime['segment'] = train['segment']",
"_____no_output_____"
],
[
"strat =time['segment'].astype(str).copy() + time['open_channels'].astype(str).copy()\nle = LabelEncoder()\nstrat = le.fit_transform(strat).astype(np.int16)\n",
"_____no_output_____"
],
[
"del time, le\ngc.collect()\n",
"_____no_output_____"
],
[
"gc.collect()\nkf = GroupKFold(n_splits = folds)\nmodel_list = []\nscore = 0\npred_oof = np.zeros((train.shape[0], n_classes))\n\nfor fold_n, (train_index, valid_index) in enumerate(kf.split(train, strat, group)):\n print(f'BEGIN FOLD: {fold_n} -------\\n\\n\\n')\n\n X_train, X_valid = train.iloc[train_index,:], train.iloc[valid_index,:]\n y_train, y_valid = train_target.iloc[train_index,:], train_target.iloc[valid_index,:]\n\n model = lgb.train(params,lgb.Dataset(X_train, label=y_train, categorical_feature = ['segment']), \n 5000, valid_sets = lgb.Dataset(X_valid, label=y_valid,categorical_feature = ['segment']), valid_names ='validation',\n verbose_eval = 50, feval = evaluate_macroF1_lgb, early_stopping_rounds = 50)\n gc.collect()\n \n model_list += [model]\n \n valid_ = model.predict(X_valid)\n pred = valid_.argmax(axis = 1).reshape((-1))\n pred_oof[valid_index, :] = valid_\n\n score_temp = f1_score(y_valid, pred, average = 'macro')\n score += score_temp/folds\n\n del model, X_train, X_valid, y_train, y_valid\n gc.collect()\n print(f'\\n\\nF1_SCORE: {score_temp}\\n\\n\\nENDED FOLD: {fold_n} -------\\n\\n\\n')\n\nnp.save('pred_oof.npy', pred_oof, allow_pickle = True)\n\nprint(f'FINAL F1 SCORE: {score}')",
"BEGIN FOLD: 0 -------\n\n\n\n"
],
[
"feature_importances = pd.DataFrame()\nfeature_importances['feature'] = train.columns\n\n\nfor fold_, mod in tqdm(enumerate(model_list)):\n feature_importances['fold_{}'.format(fold_ + 1)] = mod.feature_importance(importance_type='gain')\n mod.save_model(f'model{fold_}')",
"_____no_output_____"
],
[
"scaler = MinMaxScaler(feature_range=(0, 100))\n\n\nfeature_importances['average'] = scaler.fit_transform(X=pd.DataFrame(feature_importances[['fold_{}'.format(fold + 1) for fold in range(kf.n_splits)]].mean(axis=1)))\n\nfig = plt.figure(figsize=(20, 16))\nsns.barplot(data=feature_importances.sort_values(by='average', ascending=False).head(50), x='average', y='feature');\nplt.title('50 TOP feature importance over {} average'.format(fold_+1))\n",
"_____no_output_____"
],
[
"del train, train_target\ngc.collect()",
"_____no_output_____"
],
[
"test = pd.read_csv(path + 'test_clean.csv')\n\nfor i, mod in enumerate(model_list):\n if i == 0:\n pred_test = mod.predict(test)/folds\n else:\n pred_test += mod.predict(test)/folds\n",
"_____no_output_____"
],
[
"np.save('pred_test.npy', pred_test, allow_pickle = True)\n",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8937fc449f211f5979f8084210243c99d3eec
| 144,745 |
ipynb
|
Jupyter Notebook
|
03_RNN_mean_squared_error/33_RNN_Tpdfnoise_NHKRadio(mean_squared_error).ipynb
|
wajimax/Python-SignalProcessing
|
59e7436c0854849aa8b39b944525513f0345ebb9
|
[
"MIT"
] | null | null | null |
03_RNN_mean_squared_error/33_RNN_Tpdfnoise_NHKRadio(mean_squared_error).ipynb
|
wajimax/Python-SignalProcessing
|
59e7436c0854849aa8b39b944525513f0345ebb9
|
[
"MIT"
] | null | null | null |
03_RNN_mean_squared_error/33_RNN_Tpdfnoise_NHKRadio(mean_squared_error).ipynb
|
wajimax/Python-SignalProcessing
|
59e7436c0854849aa8b39b944525513f0345ebb9
|
[
"MIT"
] | null | null | null | 134.2718 | 101,414 | 0.813023 |
[
[
[
"#!/usr/bin/env python\n# vim:fileencoding=utf-8\nimport sys\nimport numpy as np\nimport matplotlib.pyplot as plt\nimport soundfile as sf\nimport matplotlib\nimport pandas as pd\n\n#ใใผใฟใปใใใฎๅๅฒ\nfrom sklearn.model_selection import train_test_split",
"_____no_output_____"
],
[
"#ๆทฑๅฑคๅญฆ็ฟใฉใคใใฉใช\nfrom keras.models import Sequential\nfrom keras.layers.core import Dense, Activation\nfrom keras.layers.recurrent import SimpleRNN\nfrom keras.optimizers import Adam\nfrom keras.callbacks import EarlyStopping",
"C:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\h5py\\__init__.py:34: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\nUsing Theano backend.\n"
],
[
"#้ณๆฅฝใใกใคใซ\nMusic_file = './Input/01_Radio/NHKRadio.wav'\nMusic_noise_file = './Input/01_Radio/NHKRadio_Tpdfnoise.wav'\n\nMusicType = 'NHK Radio'\nMusicFileName = 'NHKRadio'\nNoiseType = 'Tpdfnoise'",
"_____no_output_____"
],
[
"# wavใใกใคใซ่ชญใฟ่พผใฟ\nMusic_wav, Music_fs = sf.read(Music_file)\n\n# ในใใฌใช2chใฎๅ ดๅใใขใใฉใซ้ณๆบใซๅคๆ(ๅทฆๅณใฎๅ้ณใ2ใงๅฒใฃใ้ณใ่ถณใใฆไฝๆ๏ผ)\nif(Music_wav.shape[1] == 1):\n Music_wavdata = Music_wav\n print(Music_wav.shape[1])\nelse:\n Music_wavdata = (0.5 * Music_wav[:, 1]) + (0.5 * Music_wav[:, 0])",
"_____no_output_____"
],
[
"# wavใใกใคใซ่ชญใฟ่พผใฟ\nMusic_whitenoise_wav, Music_whitenoise_fs = sf.read(Music_noise_file)\n\n# ในใใฌใช2chใฎๅ ดๅใใขใใฉใซ้ณๆบใซๅคๆ(ๅทฆๅณใฎๅ้ณใ2ใงๅฒใฃใ้ณใ่ถณใใฆไฝๆ๏ผ)\nif(Music_whitenoise_wav.shape[1] == 1):\n Music_whitenoise_wavdata = Music_whitenoise_wav\n print(Music_whitenoise_wav.shape[1])\nelse:\n Music_whitenoise_wavdata = (0.5 * Music_whitenoise_wav[:, 1]) + (0.5 * Music_whitenoise_wav[:, 0])",
"_____no_output_____"
],
[
"#ๆ้่ปธ(ไฟกๅทใฎๅ ดๅใฏLength)\n#x = range(300)\nx = range(500)\n\n#Y่ปธใฏไฟกๅทใใผใฟ\n#y = Music_whitenoise_wavdata[:300]\ny = Music_whitenoise_wavdata[:500]\n\n#ๅญฆ็ฟใฎใใฌใผใ \nl = 150\n\n#้ขๆฐ\n#ไฟกๅทใใผใฟใจๅญฆ็ฟใฎใใฌใผใ ใไฝฟ็จ\ndef make_dataset(y, l):\n data = []\n target = []\n for i in range(len(y)-l):\n data.append(y[i:i+l])\n target.append(y[i + l])\n return(data, target)\n\n#้ขๆฐๅผใณๅบใใงใใผใฟใปใใใไฝๆ\n(data, target) = make_dataset(y, l)",
"_____no_output_____"
],
[
"#1ใใฌใผใ ใฎใใผใฟ\n#data[0]\n#len(data[0])\n#len(data)\n#target\n#len(target)",
"_____no_output_____"
],
[
"#RNN็จใฎใใผใฟใปใใใไฝๆ\n#ใใใง3ๆฌกๅ
ใใผใฟใปใใใไฝๆใใชใใฆใฏใชใใชใ\ndata = np.array(data).reshape(-1, l, 1)",
"_____no_output_____"
],
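A small sanity check of the shapes produced above, assuming the 500-sample signal and the l = 150 window used in this notebook:

print(data.shape)   # (350, 150, 1): 500 - 150 windows, each 150 steps long, 1 feature
print(len(target))  # 350: one target value per window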
[
"num_neurons = 1\nn_hidden = 200\n\nmodel = Sequential()\nmodel.add(SimpleRNN(n_hidden, batch_input_shape=(None, l, num_neurons), return_sequences=False))\nmodel.add(Dense(num_neurons))\nmodel.add(Activation('linear'))\noptimizer = Adam(lr = 0.001)\nmodel.compile(loss=\"mean_squared_error\", optimizer=optimizer)\n#model.compile(loss=\"mean_squared_logarithmic_error\", optimizer=optimizer)\n#model.compile(loss=\"mean_absolute_percentage_error\", optimizer=optimizer)\n#model.compile(loss=\"cosine_similarity\", optimizer=optimizer)\n#model.compile(loss=\"mean_absolute_error\", optimizer=optimizer)\n#early_stopping = EarlyStopping(monitor='val_loss', mode='auto', patience=20)\nearly_stopping = EarlyStopping(monitor='val_loss', mode='min', patience=20)",
"INFO (theano.gof.compilelock): Waiting for existing lock by process '10816' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '10980' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '10816' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '10980' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7036' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7020' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7020' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '10980' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7020' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7020' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8232' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by 
process '8232' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8232' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '6764' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8956' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '10212' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '6764' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7796' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '7796' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8956' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8956' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\nINFO (theano.gof.compilelock): Waiting for existing lock by process '8956' (I am process '8724')\nINFO (theano.gof.compilelock): To manually release the lock, delete C:\\Users\\AdminUser\\AppData\\Local\\Theano\\compiledir_Windows-7-6.1.7601-SP1-Intel64_Family_6_Model_58_Stepping_9_GenuineIntel-2.7.13-64\\lock_dir\n"
],
[
"#model.fit(data, target, batch_size=300, epochs=100, validation_split=0.1, callbacks=[early_stopping])\nmodel.fit(data, target, batch_size=300, epochs=100, validation_split=0.1, callbacks=[early_stopping])",
"C:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\theano\\scan_module\\scan_perform_ext.py:76: UserWarning: The file scan_perform.c is not available. This donot happen normally. You are probably in a strangesetup. This mean Theano can not use the cython code for scan. If youwant to remove this warning, use the Theano flag'cxx=' (set to an empty string) to disable all ccode generation.\n \"The file scan_perform.c is not available. This do\"\nC:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\theano\\scan_module\\scan_perform_ext.py:76: UserWarning: The file scan_perform.c is not available. This donot happen normally. You are probably in a strangesetup. This mean Theano can not use the cython code for scan. If youwant to remove this warning, use the Theano flag'cxx=' (set to an empty string) to disable all ccode generation.\n \"The file scan_perform.c is not available. This do\"\nC:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\theano\\scan_module\\scan_perform_ext.py:76: UserWarning: The file scan_perform.c is not available. This donot happen normally. You are probably in a strangesetup. This mean Theano can not use the cython code for scan. If youwant to remove this warning, use the Theano flag'cxx=' (set to an empty string) to disable all ccode generation.\n \"The file scan_perform.c is not available. This do\"\n"
],
[
"fig = plt.figure(1)\n\n#ๆค่จผใงใฏใใคใบใใชใใใผใฟใไฝฟ็จใใใใจ\npred = model.predict(data)\n\n#Y่ปธใฎใฉใใซ\nSignal_Ylabel_str = MusicType + 'Signal' + NoiseType\nRNN_Ylabel_str = 'RNN Pred' + 'Signal'\n\nplt.figure(figsize=(15, 4))\nplt.subplot(1, 2, 1)\nplt.xlim(0, 500)\nplt.plot(x, y, color='blue')\nplt.xlabel('Original Signal ' + MusicType)\nplt.ylabel(RNN_Ylabel_str)\n \nplt.subplot(1, 2, 2)\nplt.xlim(0, 500)\nplt.plot(x[:l], y[:l], color='blue', label=Signal_Ylabel_str)\nplt.plot(x[l:], pred, color='red', label=RNN_Ylabel_str, linestyle=\"dotted\")\nplt.xlabel('RNN Prediction of Acoustic Signal(' + MusicType + ')')\nplt.legend(loc='lower left')\n\n#plt.savefig\nfig.set_tight_layout(True)\nplt.savefig('./Output/RNN_Pred_Signal/RNN_' + NoiseType + '_' + MusicFileName + '.png')\nplt.show()",
"C:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\theano\\scan_module\\scan_perform_ext.py:76: UserWarning: The file scan_perform.c is not available. This donot happen normally. You are probably in a strangesetup. This mean Theano can not use the cython code for scan. If youwant to remove this warning, use the Theano flag'cxx=' (set to an empty string) to disable all ccode generation.\n \"The file scan_perform.c is not available. This do\"\nC:\\Users\\AdminUser\\Anaconda2\\lib\\site-packages\\matplotlib\\font_manager.py:1333: UserWarning: findfont: Font family [u'IPAexGothic'] not found. Falling back to DejaVu Sans\n (prop.get_family(), self.defaultFamily[fontext]))\n"
],
[
"#ใทใฐใใซๅค\npd_y = pd.DataFrame(y[:l],columns=[\"Signal\"])\npd_pred = pd.DataFrame(pred,columns=[\"Signal\"])\npd_concat_y = pd.concat([pd_y,pd_pred], axis=0)\npd_concat_y = pd_concat_y.reset_index(drop=True)\n\n#ๆ้่ปธ\npd_pandas_x = pd.DataFrame(range(500),columns=[\"Time\"])\n\n#ไฟกๅท้
ๅ\npd_concat_Signal = pd.concat([pd_pandas_x,pd_concat_y], axis=1)\n\n#ไฟๅญ\npd_concat_Signal.to_csv('./Output/RNN_Pred_Signal/RNN_' + NoiseType + '_' + MusicFileName + '.csv')\npd_concat_Signal",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf894276a762482e17869ac431b67f27f044a2b
| 653,302 |
ipynb
|
Jupyter Notebook
|
Webscraping.ipynb
|
sarim-Sikander/WebScraping-Basic
|
d50e72179ffe070a6683473fe61de06695ab3d2f
|
[
"Apache-2.0"
] | null | null | null |
Webscraping.ipynb
|
sarim-Sikander/WebScraping-Basic
|
d50e72179ffe070a6683473fe61de06695ab3d2f
|
[
"Apache-2.0"
] | null | null | null |
Webscraping.ipynb
|
sarim-Sikander/WebScraping-Basic
|
d50e72179ffe070a6683473fe61de06695ab3d2f
|
[
"Apache-2.0"
] | null | null | null | 64.093201 | 58,423 | 0.550672 |
[
[
[
"from bs4 import BeautifulSoup\nimport requests\nimport pandas as pd\nfrom pandas import Series, DataFrame\nfrom ipywidgets import FloatProgress\nfrom time import sleep\nfrom IPython.display import display\nimport re\nimport pickle",
"_____no_output_____"
],
[
"url = 'http://www.imdb.com/chart/top?ref_=nv_mv_250_6'",
"_____no_output_____"
],
[
"result = requests.get(url)\nc = result.content\nsoup = BeautifulSoup(c,\"lxml\")",
"_____no_output_____"
],
[
"soup",
"_____no_output_____"
],
[
"moviename = []\ncast = []\ndescription = []\nrating = []\nratingoutof = []\nyear = []\ngenre = []\nmovielength = []\nrot_audscore = []\nrot_avgrating = []\nrot_users = []",
"_____no_output_____"
],
[
"summary = soup.find('div',{'class':'article'})",
"_____no_output_____"
],
[
"rgx = re.compile('[%s]' % '()')\nf = FloatProgress(min=0, max=250)\ndisplay(f)\nfor row,i in zip(summary.find('table').findAll('tr'),range(len(summary.find('table').findAll('tr')))):\n for sitem in row.findAll('span',{'class':'secondaryInfo'}):\n s = sitem.find(text=True)\n year.append(rgx.sub(\"\", s))\n for ritem in row.findAll('td',{'class':'ratingColumnimdbRating'}):\n for iget in ritem.findAll('strong',{'title':'9.2basedon2,364,168userratings'}):\n rat = iget.find(text=True)\n rating.append(rat)\n ratingoutof.append(iget.get('title').split(' ', 4)[3])\n for item in row.findAll('td',{'class':'titleColumn'}):\n for href in item.findAll('a',href=True):\n moviename.append(href.find(text=True))\n rurl = 'https://www.rottentomatoes.com/m/'+ href.find(text=True)\n try:\n rresult = requests.get(rurl)\n except requests.exceptions.ConnectionError:\n status_code = \"Connection refused\"\n rc = rresult.content\n rsoup = BeautifulSoup(rc)\n try:\n rot_audscore.append(rsoup.find('div',{'class':'meter-value'}).find('span',{'class':'superPageFontColor'}).text)\n rot_avgrating.append(rsoup.find('div',{'class':'audience-info hidden-xssuperPageFontColor'}).find('div').contents[2].strip())\n rot_users.append(rsoup.find('div',{'class':'audience-info hidden-xssuperPageFontColor'}).contents[3].contents[2].strip())\n except AttributeError:\n rot_audscore.append(\"\")\n rot_avgrating.append(\"\")\n rot_users.append(\"\")\n cast.append(href.get('title'))\n imdb = \"http://www.imdb.com\" + href.get('href')\n try:\n iresult = requests.get(imdb)\n ic = iresult.content\n isoup = BeautifulSoup(ic)\n \n for mov in isoup.findAll('time',text=True):\n movielength.append(mov.find(text=True))\n rating.append(isoup.find('span',{'itemprop':'ratingValue'}).find(text=True))\n description.append(isoup.find('div',{'class':'summary_text'}).find(text=True).strip())\n genre.append(isoup.find('span',{'class':'itemprop'}).find(text=True))\n \n except requests.exceptions.ConnectionError:\n description.append(\"\")\n genre.append(\"\")\n movielength.append(\"\")\n sleep(.1)\n f.value = i",
"_____no_output_____"
],
[
"href.find(text=True)",
"_____no_output_____"
],
[
"moviename = Series(moviename)\ncast = Series(cast)\ndescription = Series(description)\nrating = Series(rating)\nratingoutof = Series(ratingoutof)\nyear = Series(year)\ngenre = Series(genre)\nmovielength = Series(movielength)\nrot_audscore = Series(rot_audscore)\nrot_avgrating = Series(rot_avgrating)\nrot_users = Series(rot_users)",
"<ipython-input-16-7a46a21d2685>:5: DeprecationWarning: The default dtype for empty Series will be 'object' instead of 'float64' in a future version. Specify a dtype explicitly to silence this warning.\n ratingoutof = Series(ratingoutof)\n"
],
[
"imdb_df = pd.concat([moviename,year,description,genre,movielength,cast,rating,ratingoutof,rot_audscore,rot_avgrating,rot_users],axis=1)\nimdb_df.columns = [ 'moviename','year','description','genre','movielength','cast','imdb_rating','imdb_ratingbasedon','tomatoes_audscore','tomatoes_rating','tomatoes_ratingbasedon']\nimdb_df['rank'] = imdb_df.index + 1\nimdb_df.head(10)",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8970713abb784880914f33728e9243255dcab
| 45,863 |
ipynb
|
Jupyter Notebook
|
sandbox.ipynb
|
wd15/fastai-nix-intstall
|
c4624d8e61ca9bad2d46cebcf648672d5daf8eac
|
[
"MIT"
] | 2 |
2020-12-10T01:35:18.000Z
|
2021-07-01T14:47:08.000Z
|
sandbox.ipynb
|
wd15/fastai-nix-intstall
|
c4624d8e61ca9bad2d46cebcf648672d5daf8eac
|
[
"MIT"
] | null | null | null |
sandbox.ipynb
|
wd15/fastai-nix-intstall
|
c4624d8e61ca9bad2d46cebcf648672d5daf8eac
|
[
"MIT"
] | null | null | null | 179.854902 | 40,436 | 0.906962 |
[
[
[
"import numpy as np\nfrom toolz.curried import pipe, curry\nimport matplotlib.pyplot as plt\n\n%matplotlib inline",
"_____no_output_____"
],
[
"def softmax(x):\n return np.exp(x) / np.sum(np.exp(x), axis=1, keepdims=True)",
"_____no_output_____"
],
[
"@curry\ndef xentropy(y, x):\n return -np.log(x[range(len(y)), y])",
"_____no_output_____"
],
[
"@curry\ndef nn(w, b, x):\n return np.dot(x, w) + b",
"_____no_output_____"
],
[
"def relu(x):\n return np.maximum(0, x)",
"_____no_output_____"
],
[
"@curry\ndef loss(reg, weights, x):\n w0 = weights['w0']\n w1 = weights['w1']\n return np.sum(x) / len(x) + 0.5 * reg * (np.sum(w0 * w0) + np.sum(w1 * w1)) ",
"_____no_output_____"
],
[
"def calc_layers(x, y, weights, reg):\n return pipe(\n x,\n nn(weights['w0'], weights['b0']),\n relu,\n nn(weights['w1'], weights['b1']),\n softmax,\n xentropy(y),\n loss(reg, weights)\n )",
"_____no_output_____"
],
[
"def init(shape):\n return 0.01 * (2 * np.random.random(shape) - 1)",
"_____no_output_____"
],
[
"def init_weights(shapes):\n return dict(\n w0=init((shapes[0], shapes[1])),\n w1=init((shapes[1], shapes[2])),\n b0=np.zeros(shapes[1]),\n b1=np.zeros(shapes[2])\n )",
"_____no_output_____"
],
[
"Nsample = 10\nNfeature = 20\nN0 = 15\nNclass = 3\nreg = 1e-3\n\nexpected = np.random.choice(np.arange(Nclass), size=(Nsample,))\ninputs = 2 * np.random.random((Nsample, Nfeature)) - 1\nweights = init_weights((Nfeature, N0, Nclass))\n\nloss_ = calc_layers(inputs, expected, weights, reg)\nprint(loss_)",
"1.098526677544226\n"
],
[
"def get_data():\n N = 100 # number of points per class\n D = 2 # dimensionality\n K = 3 # number of classes\n X = np.zeros((N*K,D)) # data matrix (each row = single example)\n y = np.zeros(N*K, dtype='uint8') # class labels\n for j in range(K):\n ix = range(N*j,N*(j+1))\n r = np.linspace(0.0,1,N) # radius\n t = np.linspace(j*4,(j+1)*4,N) + np.random.randn(N)*0.2 # theta\n X[ix] = np.c_[r*np.sin(t), r*np.cos(t)]\n y[ix] = j\n return X, y\n \nx, y = get_data()\nprint(x.shape)\nprint(y.shape)\nplt.scatter(x[:, 0], x[:, 1], c=y, s=40, cmap=plt.cm.Spectral)\nplt.show()\n",
"(300, 2)\n(300,)\n"
],
[
"Nsample = 300\nNfeature = 2\nN0 = 10\nNclass = 3\nreg = 1e-3\n\nexpected = y\ninputs = x\nweights = init_weights((Nfeature, N0, Nclass))\n\nloss_ = calc_layers(inputs, expected, weights, reg)\nprint(loss_)\n",
"1.0986264118045523\n"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8984e1f6b43c93496a7cbebb270f8095d93c7
| 41,254 |
ipynb
|
Jupyter Notebook
|
notebook/03-Models/05_SST2_Huggingface_model_ai_platform_training_python.ipynb
|
tarrade/proj_multilingual_text_classification
|
3b94b0be5a9ee013ff7bc093e23abe85e4d697e3
|
[
"Apache-2.0"
] | 3 |
2020-05-19T16:17:26.000Z
|
2021-04-26T18:58:00.000Z
|
notebook/03-Models/05_SST2_Huggingface_model_ai_platform_training_python.ipynb
|
tarrade/proj_multilingual_text_classification
|
3b94b0be5a9ee013ff7bc093e23abe85e4d697e3
|
[
"Apache-2.0"
] | 80 |
2020-03-12T13:43:11.000Z
|
2021-11-10T19:49:44.000Z
|
notebook/03-Models/05_SST2_Huggingface_model_ai_platform_training_python.ipynb
|
tarrade/proj_multilingual_text_classification
|
3b94b0be5a9ee013ff7bc093e23abe85e4d697e3
|
[
"Apache-2.0"
] | null | null | null | 37.232852 | 561 | 0.569181 |
[
[
[
"# The Stanford Sentiment Treebank \nThe Stanford Sentiment Treebank consists of sentences from movie reviews and human annotations of their sentiment. The task is to predict the sentiment of a given sentence. We use the two-way (positive/negative) class split, and use only sentence-level labels.",
"_____no_output_____"
]
],
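As a quick illustration, the dataset can be pulled through tensorflow_datasets the same way the training setup below does (a sketch; data_dir is assumed to point at a local dataset cache):

import tensorflow_datasets

_, info = tensorflow_datasets.load(name='glue/sst2', data_dir=data_dir, with_info=True)
print(info.splits['train'].num_examples)       # size of the training split
print(info.splits['validation'].num_examples)  # size of the validation split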
[
[
"from IPython.display import display, Markdown\nwith open('../../doc/env_variables_setup.md', 'r') as fh:\n content = fh.read()\ndisplay(Markdown(content))",
"_____no_output_____"
]
],
[
[
"## Import Packages",
"_____no_output_____"
]
],
[
[
"import tensorflow as tf\nfrom transformers import (\n BertConfig,\n BertTokenizer,\n XLMRobertaTokenizer,\n TFBertModel,\n TFXLMRobertaModel,\n)\nimport os\nfrom datetime import datetime\nimport tensorflow_datasets\nfrom tensorboard import notebook\nimport math\n#from google.cloud import storage\nfrom googleapiclient import discovery\nfrom googleapiclient import errors\nimport logging\nimport json",
"_____no_output_____"
]
],
[
[
"## Check configuration",
"_____no_output_____"
]
],
[
[
"print(tf.version.GIT_VERSION, tf.version.VERSION)",
"v2.3.0-rc2-23-gb36436b087 2.3.0\n"
],
[
"print(tf.keras.__version__)",
"2.4.0\n"
],
[
"gpus = tf.config.list_physical_devices('GPU')\nif len(gpus)>0:\n for gpu in gpus:\n print('Name:', gpu.name, ' Type:', gpu.device_type)\nelse:\n print('No GPU available !!!!')",
"No GPU available !!!!\n"
]
],
[
[
"## Define Paths",
"_____no_output_____"
]
],
[
[
"try:\n data_dir=os.environ['PATH_DATASETS']\nexcept KeyError:\n print('missing PATH_DATASETS')\ntry: \n tensorboard_dir=os.environ['PATH_TENSORBOARD']\nexcept KeyError:\n print('missing PATH_TENSORBOARD')\ntry: \n savemodel_dir=os.environ['PATH_SAVE_MODEL']\nexcept KeyError:\n print('missing PATH_SAVE_MODEL')",
"_____no_output_____"
]
],
[
[
"# Import local packages",
"_____no_output_____"
]
],
[
[
"import utils.model_utils as mu",
"_____no_output_____"
],
[
"import importlib\nimportlib.reload(mu);",
"_____no_output_____"
]
],
[
[
"## Train the model on AI Platform Training (for production)",
"_____no_output_____"
]
],
[
[
"project_name = os.environ['PROJECT_ID']\nproject_id = 'projects/{}'.format(project_name)\nai_platform_training = discovery.build('ml', 'v1', cache_discovery=False)",
"/Users/tarrade/anaconda-release/conda-env/env_multilingual_class/lib/python3.7/site-packages/google/auth/_default.py:69: UserWarning: Your application has authenticated using end user credentials from Google Cloud SDK without a quota project. You might receive a \"quota exceeded\" or \"API not enabled\" error. We recommend you rerun `gcloud auth application-default login` and make sure a quota project is added. Or you can use service accounts instead. For more information about service accounts, see https://cloud.google.com/docs/authentication/\n warnings.warn(_CLOUD_SDK_CREDENTIALS_WARNING)\n"
],
[
"# choose the model\nmodel_name = 'tf_bert_classification'\n#model_name = 'test_log_bert'\n\n# variable used to build some variable's name\ntype_production = 'test' #'test', 'production'\nhardware = 'cpu' #'cpu', 'gpu', 'tpu'\nowner = os.environ['OWNER']\ntier = 'basic' #'basic', 'custom'\npython_version = '3.7'\nruntime_version = '2.2'\n\n\nhp_tuning= False\nverbosity = 'INFO'\nprofiling = False\n# use custom container\nuse_custom_container = False\ntag='/test:v0.0.0'\n# overwrite parameter for testing logging\ntest_logging = False\n\n\nprint(' modifying Tensorflow env variable')\n# 0 = all messages are logged (default behavior)\n# 1 = INFO messages are not printed\n# 2 = INFO and WARNING messages are not printed\n# 3 = INFO, WARNING, and ERROR messages are not printed\nwith open(os.environ['DIR_PROJ']+'/utils/env_variables.json', 'r') as outfile:\n env_var = json.load(outfile)\nif verbosity == 'DEBUG' or verbosity == 'VERBOSE' or verbosity == 'INFO':\n env_var['TF_CPP_MIN_LOG_LEVEL'] = 0\n env_var['TF_CPP_MIN_VLOG_LEVEL'] = 0\nelif verbosity == 'WARNING':\n env_var['TF_CPP_MIN_LOG_LEVEL'] = 1\n env_var['TF_CPP_MIN_VLOG_LEVEL'] = 1\nelif verbosity == 'ERROR':\n env_var['TF_CPP_MIN_LOG_LEVEL'] = 2\n env_var['TF_CPP_MIN_VLOG_LEVEL'] = 2\nelse:\n env_var['TF_CPP_MIN_LOG_LEVEL'] = 3\n env_var['TF_CPP_MIN_VLOG_LEVEL'] = 3\nprint(\"env_var['TF_CPP_MIN_LOG_LEVEL']=\", env_var['TF_CPP_MIN_LOG_LEVEL'])\nprint(\"env_var['TF_CPP_MIN_VLOG_LEVEL']=\", env_var['TF_CPP_MIN_VLOG_LEVEL'])\ndata={}\ndata['TF_CPP_MIN_LOG_LEVEL'] = env_var['TF_CPP_MIN_LOG_LEVEL']\ndata['TF_CPP_MIN_VLOG_LEVEL'] = env_var['TF_CPP_MIN_VLOG_LEVEL']\nwith open(os.environ['DIR_PROJ']+'/utils/env_variables.json', 'w') as outfile:\n json.dump(data, outfile)\n\n# define parameters for ai platform training\nif not use_custom_container:\n # delete old package version\n for root, dirs, files in os.walk(os.environ['DIR_PROJ'] + '/dist/'):\n for filename in files:\n package_dist=os.environ['DIR_PROJ'] + '/dist/'+filename\n if package_dist[-7:]=='.tar.gz':\n print('removing package\"', package_dist)\n os.remove(package_dist)\n package_gcs = mu.create_module_tar_archive(model_name)\nelse:\n package_gcs = None\n\ntimestamp = datetime.now().strftime(\"%Y_%m_%d_%H%M%S\")\nif hp_tuning:\n job_name = model_name+'_hp_tuning_'+hardware+'_'+timestamp\nelse:\n job_name = model_name+'_'+hardware+'_'+timestamp\nmodule_name = 'model.'+model_name+'.task'\n\nif tier=='basic' and hardware=='cpu':\n # CPU\n region = 'europe-west1'\n \nelif tier=='basic' and hardware=='gpu':\n # GPU\n region = 'europe-west1'\n \nelif tier=='custom' and hardware=='gpu':\n # Custom GPU\n region = 'europe-west4'\n \nelif tier=='basic' and hardware=='tpu':\n # TPU\n #region = 'us-central1'\n region = 'europe-west4' # No zone in region europe-west4 has accelerators of all requested types\n #region = 'europe-west6' # The request for 8 TPU_V2 accelerators exceeds the allowed maximum of 0 K80, 0 P100, 0 P4, 0 T4, 0 TPU_V2, 0 TPU_V2_POD, 0 TPU_V3, 0 TPU_V3_POD, 0 V100\n #region = 'europe-west2' # No zone in region europe-west2 has accelerators of all requested types\n\nelif tier=='custom' and hardware=='tpu':\n # TPU\n #region = 'us-central1'\n region = 'europe-west4'\n #region = 'europe-west6'\n #region = 'europe-west2'\n\nelse:\n # Default\n region = 'europe-west1'\n\n# define parameters for training of the model\nif type_production=='production':\n # reading metadata\n _, info = tensorflow_datasets.load(name='glue/sst2',\n data_dir=data_dir,\n with_info=True)\n # define 
parameters\n epochs = 2 \n batch_size_train = 32\n #batch_size_test = 32\n batch_size_eval = 64 \n \n # Maxium length, becarefull BERT max length is 512!\n max_length = 128\n\n # extract parameters\n size_train_dataset=info.splits['train'].num_examples\n #size_test_dataset=info.splits['test'].num_examples\n size_valid_dataset=info.splits['validation'].num_examples\n\n # computer parameter\n steps_per_epoch_train = math.ceil(size_train_dataset/batch_size_train)\n #steps_per_epoch_test = math.ceil(size_test_dataset/batch_size_test)\n steps_per_epoch_eval = math.ceil(size_valid_dataset/batch_size_eval)\n\n #print('Dataset size: {:6}/{:6}/{:6}'.format(size_train_dataset, size_test_dataset, size_valid_dataset))\n #print('Batch size: {:6}/{:6}/{:6}'.format(batch_size_train, batch_size_test, batch_size_eval))\n #print('Step per epoch: {:6}/{:6}/{:6}'.format(steps_per_epoch_train, steps_per_epoch_test, steps_per_epoch_eval))\n #print('Total number of batch: {:6}/{:6}/{:6}'.format(steps_per_epoch_train*(epochs+1), steps_per_epoch_test*(epochs+1), steps_per_epoch_eval*1))\n print('Number of epoch: {:6}'.format(epochs))\n print('Batch size: {:6}/{:6}'.format(batch_size_train, batch_size_eval))\n print('Step per epoch: {:6}/{:6}'.format(steps_per_epoch_train, steps_per_epoch_eval))\n\nelse:\n if hardware=='tpu':\n epochs = 1 \n steps_per_epoch_train = 6 #5 \n batch_size_train = 32 \n steps_per_epoch_eval = 1 \n batch_size_eval = 64\n else:\n epochs = 1 \n steps_per_epoch_train = 6 #5 \n batch_size_train = 32 \n steps_per_epoch_eval = 1 \n batch_size_eval = 64\n \nsteps=epochs*steps_per_epoch_train\nif steps<=5:\n n_steps_history=4\nelif steps>=5 and steps<1000:\n n_steps_history=10\n print('be carefull with profiling between step: 10-20')\nelse:\n n_steps_history=int(steps/100)\n print('be carefull with profiling between step: 10-20')\nprint('will compute accuracy on the test set every {} step so {} time'.format(n_steps_history, int(steps/n_steps_history)))\n\nif profiling:\n print(' profiling ...')\n steps_per_epoch_train = 100 \n n_steps_history=25\n\ninput_eval_tfrecords = 'gs://'+os.environ['BUCKET_NAME']+'/tfrecord/sst2/bert-base-multilingual-uncased/valid' #'gs://public-test-data-gs/valid'\ninput_train_tfrecords = 'gs://'+os.environ['BUCKET_NAME']+'/tfrecord/sst2/bert-base-multilingual-uncased/train' #'gs://public-test-data-gs/train'\nif hp_tuning:\n output_dir = 'gs://'+os.environ['BUCKET_NAME']+'/training_model_gcp/'+model_name+'_hp_tuning_'+hardware+'_'+timestamp\nelse:\n output_dir = 'gs://'+os.environ['BUCKET_NAME']+'/training_model_gcp/'+model_name+'_'+hardware+'_'+timestamp\npretrained_model_dir = 'gs://'+os.environ['BUCKET_NAME']+'/pretrained_model/bert-base-multilingual-uncased'\n#epsilon = 1.7788921050163616e-06\n#learning_rate= 0.0007763625134788308\nepsilon = 1e-8\nlearning_rate= 5e-5\n\n# bulding training_inputs\nparameters = ['--epochs', str(epochs),\n '--steps_per_epoch_train', str(steps_per_epoch_train),\n '--batch_size_train', str(batch_size_train),\n '--steps_per_epoch_eval', str(steps_per_epoch_eval),\n '--n_steps_history', str(n_steps_history),\n '--batch_size_eval', str(batch_size_eval),\n '--input_eval_tfrecords', input_eval_tfrecords ,\n '--input_train_tfrecords', input_train_tfrecords,\n '--output_dir', output_dir,\n '--pretrained_model_dir', pretrained_model_dir,\n '--verbosity_level', verbosity,\n '--epsilon', str(epsilon),\n '--learning_rate', str(learning_rate)]\nif hardware=='tpu':\n parameters.append('--use_tpu')\n parameters.append('True')\n\ntraining_inputs = {\n 
'args': parameters,\n 'region': region,\n}\n \nif not use_custom_container:\n training_inputs['packageUris'] = [package_gcs]\n training_inputs['pythonModule'] = module_name\n training_inputs['runtimeVersion'] = runtime_version\n training_inputs['pythonVersion'] = python_version\n \nelse:\n accelerator_master = {'imageUri': image_uri}\n training_inputs['masterConfig'] = accelerator_master\n\n\nif tier=='basic' and hardware=='cpu':\n # CPU\n training_inputs['scaleTier'] = 'BASIC'\n #training_inputs['scaleTier'] = 'STANDARD_1'\n \nelif tier=='custom' and hardware=='cpu':\n # CPU\n training_inputs['scaleTier'] = 'CUSTOM'\n training_inputs['masterType'] = 'n1-standard-16'\n \nelif tier=='basic' and hardware=='gpu':\n # GPU\n training_inputs['scaleTier'] = 'BASIC_GPU'\n \nelif tier=='custom' and hardware=='gpu':\n # Custom GPU\n training_inputs['scaleTier'] = 'CUSTOM'\n training_inputs['masterType'] = 'n1-standard-8'\n accelerator_master = {'acceleratorConfig': {\n 'count': '1',\n 'type': 'NVIDIA_TESLA_V100'}\n }\n training_inputs['masterConfig'] = accelerator_master\n\n \nelif tier=='basic' and hardware=='tpu':\n # TPU\n training_inputs['scaleTier'] = 'BASIC_TPU'\n \nelif tier=='custom' and hardware=='tpu':\n # Custom TPU\n training_inputs['scaleTier'] = 'CUSTOM'\n training_inputs['masterType'] = 'n1-highcpu-16'\n training_inputs['workerType'] = 'cloud_tpu'\n training_inputs['workerCount'] = '1'\n accelerator_master = {'acceleratorConfig': {\n 'count': '8',\n 'type': 'TPU_V3'}\n }\n training_inputs['workerConfig'] = accelerator_master\n\nelse:\n # Default\n training_inputs['scaleTier'] = 'BASIC'\n print('======')\n\n# add hyperparameter tuning to the job config.\nif hp_tuning:\n hyperparams = {\n 'algorithm': 'ALGORITHM_UNSPECIFIED',\n 'goal': 'MAXIMIZE',\n 'maxTrials': 3,\n 'maxParallelTrials': 2,\n 'maxFailedTrials': 1,\n 'enableTrialEarlyStopping': True,\n 'hyperparameterMetricTag': 'metric_accuracy_train_epoch',\n 'params': []}\n\n hyperparams['params'].append({\n 'parameterName':'learning_rate',\n 'type':'DOUBLE',\n 'minValue': 1.0e-8,\n 'maxValue': 1.0,\n 'scaleType': 'UNIT_LOG_SCALE'})\n \n hyperparams['params'].append({\n 'parameterName':'epsilon',\n 'type':'DOUBLE',\n 'minValue': 1.0e-9,\n 'maxValue': 1.0,\n 'scaleType': 'UNIT_LOG_SCALE'})\n\n # Add hyperparameter specification to the training inputs dictionary.\n training_inputs['hyperparameters'] = hyperparams\n \n# building job_spec\nlabels = {'accelerator': hardware,\n 'prod_type': type_production,\n 'owner': owner}\n\nif use_custom_container:\n labels['type'] = 'custom_container'\nelse:\n labels['type'] = 'gcp_runtime'\n\njob_spec = {'jobId': job_name, \n 'labels': labels, \n 'trainingInput': training_inputs}\n\n\nif test_logging:\n # test\n # variable used to build some variable's name\n owner = os.environ['OWNER']\n tier = 'basic' \n verbosity = 'INFO'\n\n # define parameters for ai platform training\n if not use_custom_container:\n package_gcs = package_gcs\n else:\n image_uri='gcr.io/'+os.environ['PROJECT_ID']+tag\n \n job_name = 'debug_test_'+datetime.now().strftime(\"%Y_%m_%d_%H%M%S\")\n\n module_name = 'model.test-log.task'\n #module_name = 'model.test.task'\n\n region = 'europe-west1'\n\n # bulding training_inputs\n parameters = ['--verbosity_level', verbosity]\n\n \n training_inputs = {\n 'args': parameters,\n 'region': region,\n }\n \n if not use_custom_container:\n training_inputs['packageUris'] = [package_gcs]\n training_inputs['pythonModule'] = module_name \n training_inputs['runtimeVersion'] = runtime_version\n 
training_inputs['pythonVersion'] = python_version\n \n else:\n accelerator_master = {'imageUri': image_uri}\n #training_inputs['pythonModule'] = module_name # not working to overwrite the entrypoint\n training_inputs['masterConfig'] = accelerator_master\n\n training_inputs['scaleTier'] = 'BASIC'\n\n # building job_spec\n labels = {'accelerator': 'cpu',\n 'prod_type': 'debug',\n 'owner': owner}\n \n if use_custom_container:\n labels['type'] = 'custom_container'\n else:\n labels['type'] = 'gcp_runtime'\n\n job_spec = {'jobId': job_name, \n 'labels': labels, \n 'trainingInput': training_inputs}",
" modifying Tensorflow env variable\nenv_var['TF_CPP_MIN_LOG_LEVEL']= 0\nenv_var['TF_CPP_MIN_VLOG_LEVEL']= 0\nremoving package\" /Users/tarrade/Desktop/Work/Data_Science/Tutorials_Codes/Python/proj_multilingual_text_classification/src/dist/bert_model-0.2.tar.gz\nwarning: sdist: standard file not found: should have one of README, README.rst, README.txt, README.md\n\nwarning: check: missing required meta-data: url\n\nrunning sdist\nrunning egg_info\nwriting bert_model.egg-info/PKG-INFO\nwriting dependency_links to bert_model.egg-info/dependency_links.txt\nwriting requirements to bert_model.egg-info/requires.txt\nwriting top-level names to bert_model.egg-info/top_level.txt\nreading manifest file 'bert_model.egg-info/SOURCES.txt'\nreading manifest template 'MANIFEST.in'\nwriting manifest file 'bert_model.egg-info/SOURCES.txt'\nrunning check\ncreating bert_model-0.2\ncreating bert_model-0.2/analysis\ncreating bert_model-0.2/bert_model.egg-info\ncreating bert_model-0.2/data\ncreating bert_model-0.2/model\ncreating bert_model-0.2/model/sklearn_naive_bayes\ncreating bert_model-0.2/model/test\ncreating bert_model-0.2/model/test_hp\ncreating bert_model-0.2/model/test_hp_bert\ncreating bert_model-0.2/model/test_log\ncreating bert_model-0.2/model/tf_bert_classification\ncreating bert_model-0.2/model/tf_custom_bert_classification\ncreating bert_model-0.2/preprocessing\ncreating bert_model-0.2/utils\ncopying files to bert_model-0.2...\ncopying MANIFEST.in -> bert_model-0.2\ncopying setup.py -> bert_model-0.2\ncopying analysis/__init__.py -> bert_model-0.2/analysis\ncopying analysis/get_data.py -> bert_model-0.2/analysis\ncopying bert_model.egg-info/PKG-INFO -> bert_model-0.2/bert_model.egg-info\ncopying bert_model.egg-info/SOURCES.txt -> bert_model-0.2/bert_model.egg-info\ncopying bert_model.egg-info/dependency_links.txt -> bert_model-0.2/bert_model.egg-info\ncopying bert_model.egg-info/requires.txt -> bert_model-0.2/bert_model.egg-info\ncopying bert_model.egg-info/top_level.txt -> bert_model-0.2/bert_model.egg-info\ncopying data/__init__.py -> bert_model-0.2/data\ncopying data/load_imdb.py -> bert_model-0.2/data\ncopying model/__init__.py -> bert_model-0.2/model\ncopying model/sklearn_naive_bayes/__init__.py -> bert_model-0.2/model/sklearn_naive_bayes\ncopying model/sklearn_naive_bayes/model.py -> bert_model-0.2/model/sklearn_naive_bayes\ncopying model/sklearn_naive_bayes/task.py -> bert_model-0.2/model/sklearn_naive_bayes\ncopying model/test/__init__.py -> bert_model-0.2/model/test\ncopying model/test/task.py -> bert_model-0.2/model/test\ncopying model/test_hp/__init__.py -> bert_model-0.2/model/test_hp\ncopying model/test_hp/model.py -> bert_model-0.2/model/test_hp\ncopying model/test_hp/task.py -> bert_model-0.2/model/test_hp\ncopying model/test_hp/util.py -> bert_model-0.2/model/test_hp\ncopying model/test_hp_bert/__init__.py -> bert_model-0.2/model/test_hp_bert\ncopying model/test_hp_bert/model.py -> bert_model-0.2/model/test_hp_bert\ncopying model/test_hp_bert/task.py -> bert_model-0.2/model/test_hp_bert\ncopying model/test_log/__init__.py -> bert_model-0.2/model/test_log\ncopying model/test_log/task.py -> bert_model-0.2/model/test_log\ncopying model/tf_bert_classification/__init__.py -> bert_model-0.2/model/tf_bert_classification\ncopying model/tf_bert_classification/model.py -> bert_model-0.2/model/tf_bert_classification\ncopying model/tf_bert_classification/task.py -> bert_model-0.2/model/tf_bert_classification\ncopying model/tf_custom_bert_classification/__init__.py -> 
bert_model-0.2/model/tf_custom_bert_classification\ncopying model/tf_custom_bert_classification/model.py -> bert_model-0.2/model/tf_custom_bert_classification\ncopying model/tf_custom_bert_classification/task.py -> bert_model-0.2/model/tf_custom_bert_classification\ncopying preprocessing/__init__.py -> bert_model-0.2/preprocessing\ncopying preprocessing/preprocessing.py -> bert_model-0.2/preprocessing\ncopying utils/__init__.py -> bert_model-0.2/utils\ncopying utils/env_variables.json -> bert_model-0.2/utils\ncopying utils/model_metrics.py -> bert_model-0.2/utils\ncopying utils/model_tests.py -> bert_model-0.2/utils\ncopying utils/model_utils.py -> bert_model-0.2/utils\ncopying utils/ressources_utils.py -> bert_model-0.2/utils\nWriting bert_model-0.2/setup.cfg\nCreating tar archive\nremoving 'bert_model-0.2' (and everything under it)\nproj_multilingual_text_classification/bert_model-0.2.tar.gz\nLast modified: Sun Aug 30 10:50:15 2020\nCreated: Sun Aug 30 10:50:15 2020\n"
],
[
"training_inputs, job_name",
"_____no_output_____"
],
[
"# submit the training job\nrequest = ai_platform_training.projects().jobs().create(body=job_spec,\n parent=project_id)\ntry:\n response = request.execute()\n print('Job status for {}:'.format(response['jobId']))\n print(' state : {}'.format(response['state']))\n print(' createTime: {}'.format(response['createTime']))\n\nexcept errors.HttpError as err:\n # For this example, just send some text to the logs.\n # You need to import logging for this to work.\n logging.error('There was an e0rror creating the training job.'\n ' Check the details:')\n logging.error(err._get_reason())",
"Job status for tf_bert_classification_cpu_2020_08_30_105018:\n state : QUEUED\n createTime: 2020-08-30T08:50:24Z\n"
],
[
"# if you wnat to specify a specif job ID\n#job_name = 'tf_bert_classification_2020_05_16_193551'\njobId = 'projects/{}/jobs/{}'.format(project_name, job_name)\nrequest = ai_platform_training.projects().jobs().get(name=jobId)\nresponse = None\n\ntry:\n response = request.execute()\n print('Job status for {}:'.format(response['jobId']))\n print(' state : {}'.format(response['state']))\n if 'trainingOutput' in response.keys():\n if 'trials' in response['trainingOutput'].keys():\n for sub_job in response['trainingOutput']['trials']:\n print(' trials : {}'.format(sub_job))\n if 'consumedMLUnits' in response.keys():\n print(' consumedMLUnits : {}'.format(response['trainingOutput']['consumedMLUnits']))\n if 'errorMessage' in response.keys():\n print(' errorMessage : {}'.format(response['errorMessage']))\n \nexcept errors.HttpError as err:\n logging.error('There was an error getting the logs.'\n ' Check the details:')\n logging.error(err._get_reason())",
"Job status for tf_bert_classification_cpu_2020_08_20_105700:\n state : PREPARING\n"
],
[
"# how to stream logs\n# --stream-logs",
"_____no_output_____"
]
],
[
[
"# TensorBoard for job running on GCP",
"_____no_output_____"
]
],
[
[
"# View open TensorBoard instance\n#notebook.list() ",
"_____no_output_____"
],
[
"# View pid\n#!ps -ef|grep tensorboard",
"_____no_output_____"
],
[
"# Killed Tensorboard process by using pid\n#!kill -9 pid",
"_____no_output_____"
],
[
"%load_ext tensorboard\n#%reload_ext tensorboard\n%tensorboard --logdir {output_dir+'/tensorboard'} \\\n #--host 0.0.0.0 \\\n #--port 6006 \\\n #--debugger_port 6006",
"The tensorboard extension is already loaded. To reload it, use:\n %reload_ext tensorboard\n"
],
[
"%load_ext tensorboard\n#%reload_ext tensorboard\n%tensorboard --logdir {output_dir+'/hparams_tuning'} \\\n #--host 0.0.0.0 \\\n #--port 6006 \\\n #--debugger_port 6006",
"The tensorboard extension is already loaded. To reload it, use:\n %reload_ext tensorboard\n"
],
[
"!tensorboard dev upload --logdir \\\n 'gs://multilingual_text_classification/training_model_gcp/tf_bert_classification_cpu_2020_08_20_093837/tensorboard' --one_shot --yes",
"usage: tensorboard [-h] [--helpfull] {serve,dev} ...\ntensorboard: error: unrecognized arguments: --yes\n"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8b98c2393425646b20b9b96b9a7df742ac299
| 505,754 |
ipynb
|
Jupyter Notebook
|
docker/work/answer/ans_preprocess_knock_Python.ipynb
|
nknytk/100knocks-preprocess
|
994c940b4ac7395f370c3cf2790b8b47b7ce46a0
|
[
"MIT"
] | 1,589 |
2020-04-10T02:33:28.000Z
|
2022-03-30T22:03:11.000Z
|
docker/work/answer/ans_preprocess_knock_Python.ipynb
|
nknytk/100knocks-preprocess
|
994c940b4ac7395f370c3cf2790b8b47b7ce46a0
|
[
"MIT"
] | 92 |
2020-06-17T16:43:56.000Z
|
2022-03-30T05:54:18.000Z
|
docker/work/answer/ans_preprocess_knock_Python.ipynb
|
nknytk/100knocks-preprocess
|
994c940b4ac7395f370c3cf2790b8b47b7ce46a0
|
[
"MIT"
] | 260 |
2020-06-15T14:37:07.000Z
|
2022-03-29T13:16:40.000Z
| 32.199274 | 293 | 0.370186 |
[
[
[
"# ใใผใฟใตใคใจใณใน100ๆฌใใใฏ๏ผๆง้ ๅใใผใฟๅ ๅทฅ็ทจ๏ผ - Python",
"_____no_output_____"
],
[
"## ใฏใใใซ\n- ๅใใซไปฅไธใฎใปใซใๅฎ่กใใฆใใ ใใ\n- ๅฟ
่ฆใชใฉใคใใฉใชใฎใคใณใใผใใจใใผใฟใใผใน๏ผPostgreSQL๏ผใใใฎใใผใฟ่ชญใฟ่พผใฟใ่กใใพใ\n- pandas็ญใๅฉ็จใๆณๅฎใใใใฉใคใใฉใชใฏไปฅไธใปใซใงใคใณใใผใใใฆใใพใ\n- ใใฎไปๅฉ็จใใใใฉใคใใฉใชใใใใฐ้ฉๅฎใคใณในใใผใซใใฆใใ ใใ๏ผ\"!pip install ใฉใคใใฉใชๅ\"ใงใคใณในใใผใซใๅฏ่ฝ๏ผ\n- ๅฆ็ใฏ่คๆฐๅใซๅใใฆใๆงใใพใใ\n- ๅๅใไฝๆ็ญใฏใใใผใใผใฟใงใใใๅฎๅจใใใใฎใงใฏใใใพใใ",
"_____no_output_____"
]
],
[
[
"import os\nimport pandas as pd\nimport numpy as np\nfrom datetime import datetime, date\nfrom dateutil.relativedelta import relativedelta\nimport math\nimport psycopg2\nfrom sqlalchemy import create_engine\nfrom sklearn import preprocessing\nfrom sklearn.model_selection import train_test_split\nfrom imblearn.under_sampling import RandomUnderSampler\n\npgconfig = {\n 'host': 'db',\n 'port': os.environ['PG_PORT'],\n 'database': os.environ['PG_DATABASE'],\n 'user': os.environ['PG_USER'],\n 'password': os.environ['PG_PASSWORD'],\n}\n\n# pd.read_sql็จใฎใณใใฏใฟ\nconn = psycopg2.connect(**pgconfig)\n\ndf_customer = pd.read_sql(sql='select * from customer', con=conn)\ndf_category = pd.read_sql(sql='select * from category', con=conn)\ndf_product = pd.read_sql(sql='select * from product', con=conn)\ndf_receipt = pd.read_sql(sql='select * from receipt', con=conn)\ndf_store = pd.read_sql(sql='select * from store', con=conn)\ndf_geocode = pd.read_sql(sql='select * from geocode', con=conn)",
"_____no_output_____"
]
],
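    [
        [
            "# The cell above imports create_engine but connects via psycopg2 directly.\n# A minimal SQLAlchemy alternative (a sketch, assuming the same pgconfig dict) would be:\nengine = create_engine('postgresql+psycopg2://{user}:{password}@{host}:{port}/{database}'.format(**pgconfig))\ndf_store_alt = pd.read_sql(sql='select * from store', con=engine)",
            "_____no_output_____"
        ]
    ],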
[
[
"# ๆผ็ฟๅ้ก",
"_____no_output_____"
],
[
"---\n> P-001: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅ
จ้
็ฎใฎๅ
้ ญ10ไปถใ่กจ็คบใใใฉใฎใใใชใใผใฟใไฟๆใใฆใใใ็ฎ่ฆใง็ขบ่ชใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-002: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใ10ไปถ่กจ็คบใใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']].head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-003: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใ10ไปถ่กจ็คบใใใใใใ ใใsales_ymdใฏsales_dateใซ้
็ฎๅใๅคๆดใใชใใๆฝๅบใใใใจใ",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']]. \\\n rename(columns={'sales_ymd': 'sales_date'}).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-004: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใไปฅไธใฎๆกไปถใๆบใใใใผใฟใๆฝๅบใใใ\n> - ้กงๅฎขID๏ผcustomer_id๏ผใ\"CS018205000001\"",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']]. \\\n query('customer_id == \"CS018205000001\"')",
"_____no_output_____"
]
],
[
[
"---\n> P-005: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใไปฅไธใฎๆกไปถใๆบใใใใผใฟใๆฝๅบใใใ\n> - ้กงๅฎขID๏ผcustomer_id๏ผใ\"CS018205000001\"\n> - ๅฃฒไธ้้ก๏ผamount๏ผใ1,000ไปฅไธ",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']] \\\n .query('customer_id == \"CS018205000001\" & amount >= 1000')",
"_____no_output_____"
]
],
[
[
"---\n> P-006: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ใdf_receiptใใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธๆฐ้๏ผquantity๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใไปฅไธใฎๆกไปถใๆบใใใใผใฟใๆฝๅบใใใ\n> - ้กงๅฎขID๏ผcustomer_id๏ผใ\"CS018205000001\"\n> - ๅฃฒไธ้้ก๏ผamount๏ผใ1,000ไปฅไธใพใใฏๅฃฒไธๆฐ้๏ผquantity๏ผใ5ไปฅไธ",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'quantity', 'amount']].\\\n query('customer_id == \"CS018205000001\" & (amount >= 1000 | quantity >=5)')",
"_____no_output_____"
]
],
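    [
        [
            "# An equivalent sketch for P-006 using plain boolean indexing instead of query():\ndf_receipt.loc[(df_receipt['customer_id'] == 'CS018205000001')\n               & ((df_receipt['amount'] >= 1000) | (df_receipt['quantity'] >= 5)),\n               ['sales_ymd', 'customer_id', 'product_cd', 'quantity', 'amount']]",
            "_____no_output_____"
        ]
    ],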
[
[
"---\n> P-007: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใไปฅไธใฎๆกไปถใๆบใใใใผใฟใๆฝๅบใใใ\n> - ้กงๅฎขID๏ผcustomer_id๏ผใ\"CS018205000001\"\n> - ๅฃฒไธ้้ก๏ผamount๏ผใ1,000ไปฅไธ2,000ไปฅไธ",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']] \\\n .query('customer_id == \"CS018205000001\" & 1000 <= amount <= 2000')",
"_____no_output_____"
]
],
[
[
"---\n> P-008: ใฌใทใผใๆ็ดฐใฎใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใๅๅใณใผใ๏ผproduct_cd๏ผใๅฃฒไธ้้ก๏ผamount๏ผใฎ้ ใซๅใๆๅฎใใไปฅไธใฎๆกไปถใๆบใใใใผใฟใๆฝๅบใใใ\n> - ้กงๅฎขID๏ผcustomer_id๏ผใ\"CS018205000001\"\n> - ๅๅใณใผใ๏ผproduct_cd๏ผใ\"P071401019\"ไปฅๅค",
"_____no_output_____"
]
],
[
[
"df_receipt[['sales_ymd', 'customer_id', 'product_cd', 'amount']] \\\n .query('customer_id == \"CS018205000001\" & product_cd != \"P071401019\"')",
"_____no_output_____"
]
],
[
[
"---\n> P-009: ไปฅไธใฎๅฆ็ใซใใใฆใๅบๅ็ตๆใๅคใใใซORใANDใซๆธใๆใใใ\n\n`df_store.query('not(prefecture_cd == \"13\" | floor_area > 900)')`",
"_____no_output_____"
]
],
[
[
"df_store.query('prefecture_cd != \"13\" & floor_area <= 900')",
"_____no_output_____"
]
],
[
[
"---\n> P-010: ๅบ่ใใผใฟใใฌใผใ ๏ผdf_store๏ผใใใๅบ่ใณใผใ๏ผstore_cd๏ผใ\"S14\"ใงๅงใพใใใฎใ ใๅ
จ้
็ฎๆฝๅบใใ10ไปถใ ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_store.query(\"store_cd.str.startswith('S14')\", engine='python').head(10)",
"_____no_output_____"
]
],
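    [
        [
            "# The same extraction as above without query()/engine='python', as a plain boolean-indexing sketch:\ndf_store[df_store['store_cd'].str.startswith('S14')].head(10)",
            "_____no_output_____"
        ]
    ],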
[
[
"---\n> P-011: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใ้กงๅฎขID๏ผcustomer_id๏ผใฎๆซๅฐพใ1ใฎใใฎใ ใๅ
จ้
็ฎๆฝๅบใใ10ไปถใ ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.query(\"customer_id.str.endswith('1')\", engine='python').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-012: ๅบ่ใใผใฟใใฌใผใ ๏ผdf_store๏ผใใๆจชๆตๅธใฎๅบ่ใ ใๅ
จ้
็ฎ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_store.query(\"address.str.contains('ๆจชๆตๅธ')\", engine='python')",
"_____no_output_____"
]
],
[
[
"---\n> P-013: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใใในใใผใฟในใณใผใ๏ผstatus_cd๏ผใฎๅ
้ ญใใขใซใใกใใใใฎAใFใงๅงใพใใใผใฟใๅ
จ้
็ฎๆฝๅบใใ10ไปถใ ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.query(\"status_cd.str.contains('^[A-F]', regex=True)\", \n engine='python').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-014: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใใในใใผใฟในใณใผใ๏ผstatus_cd๏ผใฎๆซๅฐพใๆฐๅญใฎ1ใ9ใง็ตใใใใผใฟใๅ
จ้
็ฎๆฝๅบใใ10ไปถใ ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.query(\"status_cd.str.contains('[1-9]$', regex=True)\", engine='python').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-015: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใใในใใผใฟในใณใผใ๏ผstatus_cd๏ผใฎๅ
้ ญใใขใซใใกใใใใฎAใFใงๅงใพใใๆซๅฐพใๆฐๅญใฎ1ใ9ใง็ตใใใใผใฟใๅ
จ้
็ฎๆฝๅบใใ10ไปถใ ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.query(\"status_cd.str.contains('^[A-F].*[1-9]$', regex=True)\", \n engine='python').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-016: ๅบ่ใใผใฟใใฌใผใ ๏ผdf_store๏ผใใใ้ป่ฉฑ็ชๅท๏ผtel_no๏ผใ3ๆก-3ๆก-4ๆกใฎใใผใฟใๅ
จ้
็ฎ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_store.query(\"tel_no.str.contains('^[0-9]{3}-[0-9]{3}-[0-9]{4}$',regex=True)\", \n engine='python')",
"_____no_output_____"
]
],
[
[
"---\n> P-17: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใ็ๅนดๆๆฅ๏ผbirth_day๏ผใง้ซ้ฝข้ ใซใฝใผใใใๅ
้ ญ10ไปถใๅ
จ้
็ฎ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.sort_values('birth_day', ascending=True).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-18: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใ็ๅนดๆๆฅ๏ผbirth_day๏ผใง่ฅใ้ ใซใฝใผใใใๅ
้ ญ10ไปถใๅ
จ้
็ฎ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.sort_values('birth_day', ascending=False).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-19: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ1ไปถใใใใฎๅฃฒไธ้้ก๏ผamount๏ผใ้ซใ้ ใซใฉใณใฏใไปไธใใๅ
้ ญ10ไปถใๆฝๅบใใใ้
็ฎใฏ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธ้้ก๏ผamount๏ผใไปไธใใใฉใณใฏใ่กจ็คบใใใใใจใใชใใๅฃฒไธ้้ก๏ผamount๏ผใ็ญใใๅ ดๅใฏๅไธ้ ไฝใไปไธใใใใฎใจใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.concat([df_receipt[['customer_id', 'amount']] \n ,df_receipt['amount'].rank(method='min', \n ascending=False)], axis=1)\ndf_tmp.columns = ['customer_id', 'amount', 'ranking']\ndf_tmp.sort_values('ranking', ascending=True).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-020: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ1ไปถใใใใฎๅฃฒไธ้้ก๏ผamount๏ผใ้ซใ้ ใซใฉใณใฏใไปไธใใๅ
้ ญ10ไปถใๆฝๅบใใใ้
็ฎใฏ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธ้้ก๏ผamount๏ผใไปไธใใใฉใณใฏใ่กจ็คบใใใใใจใใชใใๅฃฒไธ้้ก๏ผamount๏ผใ็ญใใๅ ดๅใงใๅฅ้ ไฝใไปไธใใใใจใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.concat([df_receipt[['customer_id', 'amount']] \n ,df_receipt['amount'].rank(method='first', \n ascending=False)], axis=1)\ndf_tmp.columns = ['customer_id', 'amount', 'ranking']\ndf_tmp.sort_values('ranking', ascending=True).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-021: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใไปถๆฐใใซใฆใณใใใใ",
"_____no_output_____"
]
],
[
[
"len(df_receipt)",
"_____no_output_____"
]
],
[
[
"---\n> P-022: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎ้กงๅฎขID๏ผcustomer_id๏ผใซๅฏพใใใฆใใผใฏไปถๆฐใใซใฆใณใใใใ",
"_____no_output_____"
]
],
[
[
"len(df_receipt['customer_id'].unique())",
"_____no_output_____"
]
],
[
[
"---\n> P-023: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใจๅฃฒไธๆฐ้๏ผquantity๏ผใๅ่จใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').agg({'amount':'sum', \n 'quantity':'sum'}).reset_index()",
"_____no_output_____"
]
],
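    [
        [
            "# A sketch of the same aggregation with named aggregation (available since pandas 0.25), which makes the output columns explicit:\ndf_receipt.groupby('store_cd').agg(amount=('amount', 'sum'),\n                                   quantity=('quantity', 'sum')).reset_index()",
            "_____no_output_____"
        ]
    ],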
[
[
"---\n> P-024: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๆใๆฐใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใๆฑใใ10ไปถ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('customer_id').sales_ymd.max().reset_index().head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-025: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๆใๅคใๅฃฒไธๆฅ๏ผsales_ymd๏ผใๆฑใใ10ไปถ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('customer_id').agg({'sales_ymd':'min'}).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-026: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๆใๆฐใใๅฃฒไธๆฅ๏ผsales_ymd๏ผใจๅคใๅฃฒไธๆฅใๆฑใใไธก่
ใ็ฐใชใใใผใฟใ10ไปถ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_receipt.groupby('customer_id'). \\\n agg({'sales_ymd':['max','min']}).reset_index()\ndf_tmp.columns = [\"_\".join(pair) for pair in df_tmp.columns]\ndf_tmp.query('sales_ymd_max != sales_ymd_min').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-027: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใฎๅนณๅใ่จ็ฎใใ้้ ใงTOP5ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').agg({'amount':'mean'}).reset_index(). \\\n sort_values('amount', ascending=False).head(5)",
"_____no_output_____"
]
],
[
[
"---\n> P-028: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใฎไธญๅคฎๅคใ่จ็ฎใใ้้ ใงTOP5ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').agg({'amount':'median'}).reset_index(). \\\n sort_values('amount', ascending=False).head(5)",
"_____no_output_____"
]
],
[
[
"---\n> P-029: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅๅใณใผใ๏ผproduct_cd๏ผใฎๆ้ ปๅคใๆฑใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').product_cd. \\\n apply(lambda x: x.mode()).reset_index()",
"_____no_output_____"
]
],
[
[
"---\n> P-030: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใฎๆจๆฌๅๆฃใ่จ็ฎใใ้้ ใงTOP5ใ่กจ็คบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').amount.var(ddof=0).reset_index(). \\\n sort_values('amount', ascending=False).head(5)",
"_____no_output_____"
]
],
[
[
"---\n> P-031: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใฎๆจๆฌๆจๆบๅๅทฎใ่จ็ฎใใ้้ ใงTOP5ใ่กจ็คบใใใ",
"_____no_output_____"
],
[
"TIPS:\n\nPandasใจNumpyใงddofใฎใใใฉใซใๅคใ็ฐใชใใใจใซๆณจๆใใพใใใ\n```\nPandas๏ผ\nDataFrame.std(self, axis=None, skipna=None, level=None, ddof=1, numeric_only=None, **kwargs)\nNumpy:\nnumpy.std(a, axis=None, dtype=None, out=None, ddof=0, keepdims=)\n```",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').amount.std(ddof=0).reset_index(). \\\n sort_values('amount', ascending=False).head(5)",
"_____no_output_____"
]
],
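    [
        [
            "# A quick check of the ddof defaults mentioned in the TIPS above:\n# pandas std() defaults to ddof=1 (sample), np.std to ddof=0 (population).\ns = df_receipt['amount']\nprint(s.std())              # pandas: ddof=1\nprint(np.std(s))            # numpy: ddof=0\nprint(np.isclose(s.std(ddof=0), np.std(s)))  # True once ddof matches",
            "_____no_output_____"
        ]
    ],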
[
[
"---\n> P-032: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใซใคใใฆใ25๏ผ
ๅปใฟใงใใผใปใณใฟใคใซๅคใๆฑใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\nnp.percentile(df_receipt['amount'], q=[25, 50, 75,100])",
"_____no_output_____"
],
[
"# ใณใผใไพ2\ndf_receipt.amount.quantile(q=np.arange(5)/4)",
"_____no_output_____"
]
],
[
[
"---\n> P-033: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใๅบ่ใณใผใ๏ผstore_cd๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใฎๅนณๅใ่จ็ฎใใ330ไปฅไธใฎใใฎใๆฝๅบใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt.groupby('store_cd').amount.mean(). \\\n reset_index().query('amount >= 330')",
"_____no_output_____"
]
],
[
[
"---\n> P-034: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใๅ่จใใฆๅ
จ้กงๅฎขใฎๅนณๅใๆฑใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ\n",
"_____no_output_____"
]
],
[
[
"# queryใไฝฟใใชใๆธใๆน\ndf_receipt[~df_receipt['customer_id'].str.startswith(\"Z\")]. \\\n groupby('customer_id').amount.sum().mean()",
"_____no_output_____"
],
[
"# queryใไฝฟใๆธใๆน\ndf_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python').groupby('customer_id').amount.sum().mean()",
"_____no_output_____"
]
],
[
[
"---\n> P-035: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใซๅฏพใใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใๅ่จใใฆๅ
จ้กงๅฎขใฎๅนณๅใๆฑใใๅนณๅไปฅไธใซ่ฒทใ็ฉใใใฆใใ้กงๅฎขใๆฝๅบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใใชใใใใผใฟใฏ10ไปถใ ใ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"df_receipt_tmp = df_receipt[~df_receipt['customer_id'].str.startswith(\"Z\")]\namount_mean = df_receipt_tmp.groupby('customer_id').amount.sum().mean()\ndf_amount_sum = df_receipt_tmp.groupby('customer_id').amount.sum().reset_index()\ndf_amount_sum[df_amount_sum['amount'] >= amount_mean].head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-036: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใจๅบ่ใใผใฟใใฌใผใ ๏ผdf_store๏ผใๅ
้จ็ตๅใใใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ใฎๅ
จ้
็ฎใจๅบ่ใใผใฟใใฌใผใ ใฎๅบ่ๅ๏ผstore_name๏ผใ10ไปถ่กจ็คบใใใใ",
"_____no_output_____"
]
],
[
[
"pd.merge(df_receipt, df_store[['store_cd','store_name']], \n how='inner', on='store_cd').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-037: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใจใซใใดใชใใผใฟใใฌใผใ ๏ผdf_category๏ผใๅ
้จ็ตๅใใๅๅใใผใฟใใฌใผใ ใฎๅ
จ้
็ฎใจใซใใดใชใใผใฟใใฌใผใ ใฎๅฐๅบๅๅ๏ผcategory_small_name๏ผใ10ไปถ่กจ็คบใใใใ",
"_____no_output_____"
]
],
[
[
"pd.merge(df_product\n , df_category[['category_small_cd','category_small_name']]\n , how='inner', on='category_small_cd').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-038: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใจใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใใๅ้กงๅฎขใใจใฎๅฃฒไธ้้กๅ่จใๆฑใใใใใ ใใๅฃฒไธๅฎ็ธพใใชใ้กงๅฎขใซใคใใฆใฏๅฃฒไธ้้กใ0ใจใใฆ่กจ็คบใใใใใจใใพใใ้กงๅฎขใฏๆงๅฅใณใผใ๏ผgender_cd๏ผใๅฅณๆง๏ผ1๏ผใงใใใใฎใๅฏพ่ฑกใจใใ้ไผๅก๏ผ้กงๅฎขIDใ\"Z\"ใใๅงใพใใใฎ๏ผใฏ้คๅคใใใใจใใชใใ็ตๆใฏ10ไปถใ ใ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"df_amount_sum = df_receipt.groupby('customer_id').amount.sum().reset_index()\ndf_tmp = df_customer. \\\n query('gender_cd == \"1\" and not customer_id.str.startswith(\"Z\")', \n engine='python')\npd.merge(df_tmp['customer_id'], df_amount_sum, \n how='left', on='customer_id').fillna(0).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-039: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใใๅฃฒไธๆฅๆฐใฎๅคใ้กงๅฎขใฎไธไฝ20ไปถใจใๅฃฒไธ้้กๅ่จใฎๅคใ้กงๅฎขใฎไธไฝ20ไปถใๆฝๅบใใๅฎๅ
จๅค้จ็ตๅใใใใใ ใใ้ไผๅก๏ผ้กงๅฎขIDใ\"Z\"ใใๅงใพใใใฎ๏ผใฏ้คๅคใใใใจใ",
"_____no_output_____"
]
],
[
[
"df_sum = df_receipt.groupby('customer_id').amount.sum().reset_index()\ndf_sum = df_sum.query('not customer_id.str.startswith(\"Z\")', engine='python')\ndf_sum = df_sum.sort_values('amount', ascending=False).head(20)\n\ndf_cnt = df_receipt[~df_receipt.duplicated(subset=['customer_id', 'sales_ymd'])]\ndf_cnt = df_cnt.query('not customer_id.str.startswith(\"Z\")', engine='python')\ndf_cnt = df_cnt.groupby('customer_id').sales_ymd.count().reset_index()\ndf_cnt = df_cnt.sort_values('sales_ymd', ascending=False).head(20)\n\npd.merge(df_sum, df_cnt, how='outer', on='customer_id')",
"_____no_output_____"
]
],
[
[
"---\n> P-040: ๅ
จใฆใฎๅบ่ใจๅ
จใฆใฎๅๅใ็ตใฟๅใใใใจไฝไปถใฎใใผใฟใจใชใใ่ชฟๆปใใใใๅบ่๏ผdf_store๏ผใจๅๅ๏ผdf_product๏ผใ็ด็ฉใใไปถๆฐใ่จ็ฎใใใ",
"_____no_output_____"
]
],
[
[
"df_store_tmp = df_store.copy()\ndf_product_tmp = df_product.copy()\n\ndf_store_tmp['key'] = 0\ndf_product_tmp['key'] = 0\nlen(pd.merge(df_store_tmp, df_product_tmp, how='outer', on='key'))",
"_____no_output_____"
]
],
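    [
        [
            "# With pandas >= 1.2 the dummy-key trick above is unnecessary; merge supports a cross join directly (a sketch):\nlen(pd.merge(df_store, df_product, how='cross'))",
            "_____no_output_____"
        ]
    ],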
[
[
"---\n> P-041: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใๆฅไป๏ผsales_ymd๏ผใใจใซ้่จใใๅๆฅใใใฎๅฃฒไธ้้กๅขๆธใ่จ็ฎใใใใชใใ่จ็ฎ็ตๆใฏ10ไปถ่กจ็คบใใใฐใใใ",
"_____no_output_____"
]
],
[
[
"df_sales_amount_by_date = df_receipt[['sales_ymd', 'amount']].\\\n groupby('sales_ymd').sum().reset_index()\ndf_sales_amount_by_date = pd.concat([df_sales_amount_by_date, \n df_sales_amount_by_date.shift()], axis=1)\ndf_sales_amount_by_date.columns = ['sales_ymd','amount','lag_ymd','lag_amount']\ndf_sales_amount_by_date['diff_amount'] = \\\n df_sales_amount_by_date['amount'] - df_sales_amount_by_date['lag_amount']\ndf_sales_amount_by_date.head(10)",
"_____no_output_____"
]
],
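    [
        [
            "# A shorter sketch of the same day-over-day difference using Series.diff():\ndf_diff = df_receipt.groupby('sales_ymd')['amount'].sum().reset_index()\ndf_diff['diff_amount'] = df_diff['amount'].diff()\ndf_diff.head(10)",
            "_____no_output_____"
        ]
    ],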
[
[
"---\n> P-042: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใๆฅไป๏ผsales_ymd๏ผใใจใซ้่จใใๅๆฅไปใฎใใผใฟใซๅฏพใใ๏ผๆฅๅใ๏ผๆฅๅใ๏ผๆฅๅใฎใใผใฟใ็ตๅใใใ็ตๆใฏ10ไปถ่กจ็คบใใใฐใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1:็ธฆๆใกใฑใผใน\ndf_sales_amount_by_date = df_receipt[['sales_ymd', 'amount']]. \\\n groupby('sales_ymd').sum().reset_index()\nfor i in range(1, 4):\n if i == 1:\n df_lag = pd.concat([df_sales_amount_by_date, \n df_sales_amount_by_date.shift(i)],axis=1)\n else:\n df_lag = df_lag.append(pd.concat([df_sales_amount_by_date, \n df_sales_amount_by_date.shift(i)],\n axis=1))\ndf_lag.columns = ['sales_ymd', 'amount', 'lag_ymd', 'lag_amount']\ndf_lag.dropna().sort_values(['sales_ymd','lag_ymd']).head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2:ๆจชๆใกใฑใผใน\ndf_sales_amount_by_date = df_receipt[['sales_ymd', 'amount']].\\\n groupby('sales_ymd').sum().reset_index()\nfor i in range(1, 4):\n if i == 1:\n df_lag = pd.concat([df_sales_amount_by_date, \n df_sales_amount_by_date.shift(i)],axis=1)\n else:\n df_lag = pd.concat([df_lag, df_sales_amount_by_date.shift(i)],axis=1)\ndf_lag.columns = ['sales_ymd', 'amount', 'lag_ymd_1', 'lag_amount_1', \n 'lag_ymd_2', 'lag_amount_2', 'lag_ymd_3', 'lag_amount_3']\ndf_lag.dropna().sort_values(['sales_ymd']).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-043๏ผ ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใจ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใ็ตๅใใๆงๅฅ๏ผgender๏ผใจๅนดไปฃ๏ผageใใ่จ็ฎ๏ผใใจใซๅฃฒไธ้้ก๏ผamount๏ผใๅ่จใใๅฃฒไธใตใใชใใผใฟใใฌใผใ ๏ผdf_sales_summary๏ผใไฝๆใใใๆงๅฅใฏ0ใ็ทๆงใ1ใๅฅณๆงใ9ใไธๆใ่กจใใใฎใจใใใ\n>\n> ใใ ใใ้
็ฎๆงๆใฏๅนดไปฃใๅฅณๆงใฎๅฃฒไธ้้กใ็ทๆงใฎๅฃฒไธ้้กใๆงๅฅไธๆใฎๅฃฒไธ้้กใฎ4้
็ฎใจใใใใจ๏ผ็ธฆใซๅนดไปฃใๆจชใซๆงๅฅใฎใฏใญใน้่จ๏ผใใพใใๅนดไปฃใฏ10ๆญณใใจใฎ้็ดใจใใใใจใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_tmp = pd.merge(df_receipt, df_customer, how ='inner', on=\"customer_id\")\ndf_tmp['era'] = df_tmp['age'].apply(lambda x: math.floor(x / 10) * 10)\ndf_sales_summary = pd.pivot_table(df_tmp, index='era', columns='gender_cd', \n values='amount', aggfunc='sum').reset_index()\ndf_sales_summary.columns = ['era', 'male', 'female', 'unknown']\ndf_sales_summary",
"_____no_output_____"
],
[
"# ใณใผใไพ2\ndf_tmp = pd.merge(df_receipt, df_customer, how ='inner', on=\"customer_id\")\ndf_tmp['era'] = np.floor(df_tmp['age'] / 10).astype(int) * 10\ndf_sales_summary = pd.pivot_table(df_tmp, index='era', columns='gender_cd', \n values='amount', aggfunc='sum').reset_index()\ndf_sales_summary.columns = ['era', 'male', 'female', 'unknown']\ndf_sales_summary",
"_____no_output_____"
]
],
[
[
"---\n> P-044๏ผ ๅ่จญๅใงไฝๆใใๅฃฒไธใตใใชใใผใฟใใฌใผใ ๏ผdf_sales_summary๏ผใฏๆงๅฅใฎๅฃฒไธใๆจชๆใกใใใใใฎใงใใฃใใใใฎใใผใฟใใฌใผใ ใใๆงๅฅใ็ธฆๆใกใใใๅนดไปฃใๆงๅฅใณใผใใๅฃฒไธ้้กใฎ3้
็ฎใซๅคๆใใใใใ ใใๆงๅฅใณใผใใฏ็ทๆงใ\"00\"ใๅฅณๆงใ\"01\"ใไธๆใ\"99\"ใจใใใ",
"_____no_output_____"
]
],
[
[
"df_sales_summary = df_sales_summary.set_index('era'). \\\n stack().reset_index().replace({'female':'01','male':'00','unknown':'99'}). \\\n rename(columns={'level_1':'gender_cd', 0: 'amount'})",
"_____no_output_____"
],
[
"df_sales_summary",
"_____no_output_____"
]
],
[
[
"---\n> P-045: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎ็ๅนดๆๆฅ๏ผbirth_day๏ผใฏๆฅไปๅใงใใผใฟใไฟๆใใฆใใใใใใYYYYMMDDๅฝขๅผใฎๆๅญๅใซๅคๆใใ้กงๅฎขID๏ผcustomer_id๏ผใจใจใใซๆฝๅบใใใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_customer['customer_id'],\n pd.to_datetime(df_customer['birth_day']).dt.strftime('%Y%m%d')],\n axis = 1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-046: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎ็ณใ่พผใฟๆฅ๏ผapplication_date๏ผใฏYYYYMMDDๅฝขๅผใฎๆๅญๅๅใงใใผใฟใไฟๆใใฆใใใใใใๆฅไปๅใซๅคๆใใ้กงๅฎขID๏ผcustomer_id๏ผใจใจใใซๆฝๅบใใใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_customer['customer_id'],\n pd.to_datetime(df_customer['application_date'])], axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-047: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใฏYYYYMMDDๅฝขๅผใฎๆฐๅคๅใงใใผใฟใไฟๆใใฆใใใใใใๆฅไปๅใซๅคๆใใใฌใทใผใ็ชๅท(receipt_no)ใใฌใทใผใใตใ็ชๅท๏ผreceipt_sub_no๏ผใจใจใใซๆฝๅบใใใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_receipt[['receipt_no', 'receipt_sub_no']],\n pd.to_datetime(df_receipt['sales_ymd'].astype('str'))],\n axis=1).head(10)",
"_____no_output_____"
]
],
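    [
        [
            "# Passing an explicit format to pd.to_datetime is a stricter variant of the conversion above (a sketch):\npd.to_datetime(df_receipt['sales_ymd'].astype('str'), format='%Y%m%d').head(10)",
            "_____no_output_____"
        ]
    ],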
[
[
"---\n> P-048: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธใจใใใฏ็ง๏ผsales_epoch๏ผใฏๆฐๅคๅใฎUNIX็งใงใใผใฟใไฟๆใใฆใใใใใใๆฅไปๅใซๅคๆใใใฌใทใผใ็ชๅท(receipt_no)ใใฌใทใผใใตใ็ชๅท๏ผreceipt_sub_no๏ผใจใจใใซๆฝๅบใใใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_receipt[['receipt_no', 'receipt_sub_no']],\n pd.to_datetime(df_receipt['sales_epoch'], unit='s')], \n axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-049: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธใจใใใฏ็ง๏ผsales_epoch๏ผใๆฅไปๅใซๅคๆใใใๅนดใใ ใๅใๅบใใฆใฌใทใผใ็ชๅท(receipt_no)ใใฌใทใผใใตใ็ชๅท๏ผreceipt_sub_no๏ผใจใจใใซๆฝๅบใใใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_receipt[['receipt_no', 'receipt_sub_no']],\n pd.to_datetime(df_receipt['sales_epoch'], unit='s').dt.year],\n axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-050: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธใจใใใฏ็ง๏ผsales_epoch๏ผใๆฅไปๅใซๅคๆใใใๆใใ ใๅใๅบใใฆใฌใทใผใ็ชๅท(receipt_no)ใใฌใทใผใใตใ็ชๅท๏ผreceipt_sub_no๏ผใจใจใใซๆฝๅบใใใใชใใใๆใใฏ0ๅใ2ๆกใงๅใๅบใใใจใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"# dt.monthใงใๆใๅๅพใงใใใใใใใงใฏ0ๅใ๏ผๆกใงๅใๅบใใใstrftimeใๅฉ็จใใฆใใ\npd.concat([df_receipt[['receipt_no', 'receipt_sub_no']],\n pd.to_datetime(df_receipt['sales_epoch'], unit='s'). \\\n dt.strftime('%m')],axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-051: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธใจใใใฏ็งใๆฅไปๅใซๅคๆใใใๆฅใใ ใๅใๅบใใฆใฌใทใผใ็ชๅท(receipt_no)ใใฌใทใผใใตใ็ชๅท๏ผreceipt_sub_no๏ผใจใจใใซๆฝๅบใใใใชใใใๆฅใใฏ0ๅใ2ๆกใงๅใๅบใใใจใใใผใฟใฏ10ไปถใๆฝๅบใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"# dt.dayใงใๆฅใๅๅพใงใใใใใใใงใฏ0ๅใ๏ผๆกใงๅใๅบใใใstrftimeใๅฉ็จใใฆใใ\npd.concat([df_receipt[['receipt_no', 'receipt_sub_no']],\n pd.to_datetime(df_receipt['sales_epoch'], unit='s'). \\\n dt.strftime('%d')],axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-052: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใฎไธใๅฃฒไธ้้กๅ่จใซๅฏพใใฆ2,000ๅไปฅไธใ0ใ2,000ๅใใๅคงใใ้้กใ1ใซ2ๅคๅใใ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ10ไปถ่กจ็คบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python')\ndf_sales_amount = df_sales_amount[['customer_id', 'amount']]. \\\n groupby('customer_id').sum().reset_index()\ndf_sales_amount['sales_flg'] = df_sales_amount['amount']. \\\n apply(lambda x: 1 if x > 2000 else 0)\ndf_sales_amount.head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผnp.whereใฎๆดป็จ๏ผ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python')\ndf_sales_amount = df_sales_amount[['customer_id', 'amount']]. \\\n groupby('customer_id').sum().reset_index()\ndf_sales_amount['sales_flg'] = np.where(df_sales_amount['amount'] > 2000, 1, 0)\ndf_sales_amount.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-053: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎ้ตไพฟ็ชๅท๏ผpostal_cd๏ผใซๅฏพใใๆฑไบฌ๏ผๅ
้ ญ3ๆกใ100ใ209ใฎใใฎ๏ผใ1ใใใไปฅๅคใฎใใฎใ0ใซ๏ผๅคๅใใใใใใซใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใจ็ตๅใใๅ
จๆ้ใซใใใฆๅฃฒไธๅฎ็ธพใใใ้กงๅฎขๆฐใใไฝๆใใ2ๅคใใจใซใซใฆใณใใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_tmp = df_customer[['customer_id', 'postal_cd']].copy()\ndf_tmp['postal_flg'] = df_tmp['postal_cd']. \\\n apply(lambda x: 1 if 100 <= int(x[0:3]) <= 209 else 0)\n\npd.merge(df_tmp, df_receipt, how='inner', on='customer_id'). \\\n groupby('postal_flg').agg({'customer_id':'nunique'})",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผnp.whereใbetweenใฎๆดป็จ๏ผ\ndf_tmp = df_customer[['customer_id', 'postal_cd']].copy()\ndf_tmp['postal_flg'] = np.where(df_tmp['postal_cd'].str[0:3].astype(int)\n .between(100, 209), 1, 0)\npd.merge(df_tmp, df_receipt, how='inner', on='customer_id'). \\\n groupby('postal_flg').agg({'customer_id':'nunique'})",
"_____no_output_____"
]
],
[
[
"---\n> P-054: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎไฝๆ๏ผaddress๏ผใฏใๅผ็็ใๅ่็ใๆฑไบฌ้ฝใ็ฅๅฅๅท็ใฎใใใใใจใชใฃใฆใใใ้ฝ้ๅบ็ๆฏใซใณใผใๅคใไฝๆใใ้กงๅฎขIDใไฝๆใจใจใใซๆฝๅบใใใๅคใฏๅผ็็ใ11ใๅ่็ใ12ใๆฑไบฌ้ฝใ13ใ็ฅๅฅๅท็ใ14ใจใใใใจใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.concat([df_customer[['customer_id', 'address']], \n df_customer['address'].str[0:3].map({'ๅผ็็': '11',\n 'ๅ่็':'12', \n 'ๆฑไบฌ้ฝ':'13', \n '็ฅๅฅๅท':'14'})],axis=1).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-055: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใใใใฎๅ่จ้้กใฎๅๅไฝ็นใๆฑใใใใใฎไธใงใ้กงๅฎขใใจใฎๅฃฒไธ้้กๅ่จใซๅฏพใใฆไปฅไธใฎๅบๆบใงใซใใดใชๅคใไฝๆใใ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ่กจ็คบใใใใซใใดใชๅคใฏไธใใ้ ใซ1ใ4ใจใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ\n>\n> - ๆๅฐๅคไปฅไธ็ฌฌไธๅๅไฝๆชๆบ\n> - ็ฌฌไธๅๅไฝไปฅไธ็ฌฌไบๅๅไฝๆชๆบ\n> - ็ฌฌไบๅๅไฝไปฅไธ็ฌฌไธๅๅไฝๆชๆบ\n> - ็ฌฌไธๅๅไฝไปฅไธ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_sales_amount = df_receipt[['customer_id', 'amount']]. \\\n groupby('customer_id').sum().reset_index()\npct25 = np.quantile(df_sales_amount['amount'], 0.25)\npct50 = np.quantile(df_sales_amount['amount'], 0.5)\npct75 = np.quantile(df_sales_amount['amount'], 0.75)\n\ndef pct_group(x):\n if x < pct25:\n return 1\n elif pct25 <= x < pct50:\n return 2\n elif pct50 <= x < pct75:\n return 3\n elif pct75 <= x:\n return 4\n\ndf_sales_amount['pct_group'] = df_sales_amount['amount'].apply(lambda x: pct_group(x))\ndf_sales_amount.head(10)",
"_____no_output_____"
],
[
"# ็ขบ่ช็จ\nprint('pct25:', pct25)\nprint('pct50:', pct50)\nprint('pct75:', pct75)",
"pct25: 548.5\npct50: 1478.0\npct75: 3651.0\n"
],
[
"# ใณใผใไพ2\ndf_temp = df_receipt.groupby('customer_id')[['amount']].sum()\ndf_temp['quantile'], bins = \\\n pd.qcut(df_receipt.groupby('customer_id')['amount'].sum(), 4, retbins=True)\ndisplay(df_temp.head())\nprint('quantiles:', bins)",
"_____no_output_____"
]
],
[
[
"---\n> P-056: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎๅนด้ฝข๏ผage๏ผใใใจใซ10ๆญณๅปใฟใงๅนดไปฃใ็ฎๅบใใ้กงๅฎขID๏ผcustomer_id๏ผใ็ๅนดๆๆฅ๏ผbirth_day๏ผใจใจใใซๆฝๅบใใใใใ ใใ60ๆญณไปฅไธใฏๅ
จใฆ60ๆญณไปฃใจใใใใจใๅนดไปฃใ่กจใใซใใดใชๅใฏไปปๆใจใใใๅ
้ ญ10ไปถใ่กจ็คบใใใใฐใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_customer_era = pd.concat([df_customer[['customer_id', 'birth_day']],\n df_customer['age']. \\\n apply(lambda x: min(math.floor(x / 10) * 10, 60))],\n axis=1)\n\ndf_customer_era.head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2\ndf_customer['age_group'] = pd.cut(df_customer['age'], \n bins=[0, 10, 20, 30, 40, 50, 60, np.inf], \n right=False)\ndf_customer[['customer_id', 'birth_day', 'age_group']].head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-057: ๅๅ้กใฎๆฝๅบ็ตๆใจๆงๅฅ๏ผgender๏ผใ็ตใฟๅใใใๆฐใใซๆงๅฅรๅนดไปฃใฎ็ตใฟๅใใใ่กจใใซใใดใชใใผใฟใไฝๆใใใ็ตใฟๅใใใ่กจใใซใใดใชใฎๅคใฏไปปๆใจใใใๅ
้ ญ10ไปถใ่กจ็คบใใใใฐใใใ",
"_____no_output_____"
]
],
[
[
"df_customer_era['era_gender'] = \\\n df_customer['gender_cd'] + df_customer_era['age'].astype('str')\ndf_customer_era.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-058: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎๆงๅฅใณใผใ๏ผgender_cd๏ผใใใใผๅคๆฐๅใใ้กงๅฎขID๏ผcustomer_id๏ผใจใจใใซๆฝๅบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"pd.get_dummies(df_customer[['customer_id', 'gender_cd']], \n columns=['gender_cd']).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-059: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใใๅฃฒไธ้้กๅ่จใๅนณๅ0ใๆจๆบๅๅทฎ1ใซๆจๆบๅใใฆ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ่กจ็คบใใใๆจๆบๅใซไฝฟ็จใใๆจๆบๅๅทฎใฏใไธๅๆจๆบๅๅทฎใจๆจๆฌๆจๆบๅๅทฎใฎใฉใกใใงใ่ฏใใใฎใจใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
],
[
"TIPS:\n- query()ใฎๅผๆฐengineใง'python'ใ'numexpr'ใใ้ธๆใงใใใใใฉใซใใฏใคใณในใใผใซใใใฆใใใฐnumexprใใ็กใใใฐpythonใไฝฟใใใพใใใใใซใๆๅญๅใกใฝใใใฏengine='python'ใงใชใใจquery()ใกใฝใใใงไฝฟใใพใใใ\n",
"_____no_output_____"
]
],
[
[
"# skleanใฎpreprocessing.scaleใๅฉ็จใใใใใๆจๆฌๆจๆบๅๅทฎใง่จ็ฎใใใฆใใ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\ndf_sales_amount['amount_ss'] = preprocessing.scale(df_sales_amount['amount'])\ndf_sales_amount.head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผfitใ่กใใใจใงใๅฅใฎใใผใฟใงใๅใใฎๅนณๅใปๆจๆบๅๅทฎใงๆจๆบๅใ่กใใ๏ผ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\nscaler = preprocessing.StandardScaler()\nscaler.fit(df_sales_amount[['amount']])\ndf_sales_amount['amount_ss'] = scaler.transform(df_sales_amount[['amount']])\ndf_sales_amount.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-060: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใใๅฃฒไธ้้กๅ่จใๆๅฐๅค0ใๆๅคงๅค1ใซๆญฃ่ฆๅใใฆ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ่กจ็คบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\ndf_sales_amount['amount_mm'] = \\\n preprocessing.minmax_scale(df_sales_amount['amount'])\ndf_sales_amount.head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผfitใ่กใใใจใงใๅฅใฎใใผใฟใงใๅใใฎๅนณๅใปๆจๆบๅๅทฎใงๆจๆบๅใ่กใใ๏ผ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\nscaler = preprocessing.MinMaxScaler()\nscaler.fit(df_sales_amount[['amount']])\ndf_sales_amount['amount_mm'] = scaler.transform(df_sales_amount[['amount']])\ndf_sales_amount.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-061: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใใๅฃฒไธ้้กๅ่จใๅธธ็จๅฏพๆฐๅ๏ผๅบ=10๏ผใใฆ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ่กจ็คบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"# skleanใฎpreprocessing.scaleใๅฉ็จใใใใใๆจๆฌๆจๆบๅๅทฎใง่จ็ฎใใใฆใใ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\ndf_sales_amount['amount_log10'] = np.log10(df_sales_amount['amount'] + 0.5)\ndf_sales_amount.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-062: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขID๏ผcustomer_id๏ผใใจใซๅ่จใใๅฃฒไธ้้กๅ่จใ่ช็ถๅฏพๆฐๅ(ๅบ=e๏ผใใฆ้กงๅฎขIDใๅฃฒไธ้้กๅ่จใจใจใใซ่กจ็คบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"df_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\ndf_sales_amount['amount_loge'] = np.log(df_sales_amount['amount'] + 0.5)\ndf_sales_amount.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-063: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใใใๅๅๅใฎๅฉ็้กใ็ฎๅบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['unit_profit'] = df_tmp['unit_price'] - df_tmp['unit_cost']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-064: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใใใๅๅๅใฎๅฉ็็ใฎๅ
จไฝๅนณๅใ็ฎๅบใใใ\nใใ ใใๅไพกใจๅไพกใซใฏNULLใๅญๅจใใใใจใซๆณจๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['unit_profit_rate'] = \\\n (df_tmp['unit_price'] - df_tmp['unit_cost']) / df_tmp['unit_price']\ndf_tmp['unit_profit_rate'].mean(skipna=True)",
"_____no_output_____"
]
],
[
[
"---\n> P-065: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅๅๅใซใคใใฆใๅฉ็็ใ30%ใจใชใๆฐใใชๅไพกใๆฑใใใใใ ใใ1ๅๆชๆบใฏๅใๆจใฆใใใจใใใใฆ็ตๆใ10ไปถ่กจ็คบใใใๅฉ็็ใใใใ30๏ผ
ไป่ฟใงใใใใจใ็ขบ่ชใใใใใ ใใๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใซใฏNULLใๅญๅจใใใใจใซๆณจๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['new_price'] = np.floor(df_tmp['unit_cost'] / 0.7)\ndf_tmp['new_profit_rate'] = \\\n (df_tmp['new_price'] - df_tmp['unit_cost']) / df_tmp['new_price']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-066: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅๅๅใซใคใใฆใๅฉ็็ใ30%ใจใชใๆฐใใชๅไพกใๆฑใใใไปๅใฏใ1ๅๆชๆบใไธธใใใใจ๏ผๅๆจไบๅ
ฅใพใใฏๅถๆฐใธใฎไธธใใง่ฏใ๏ผใใใใฆ็ตๆใ10ไปถ่กจ็คบใใใๅฉ็็ใใใใ30๏ผ
ไป่ฟใงใใใใจใ็ขบ่ชใใใใใ ใใๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใซใฏNULLใๅญๅจใใใใจใซๆณจๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['new_price'] = np.round(df_tmp['unit_cost'] / 0.7)\ndf_tmp['new_profit_rate'] = \\\n (df_tmp['new_price'] - df_tmp['unit_cost']) / df_tmp['new_price']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-067: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅๅๅใซใคใใฆใๅฉ็็ใ30%ใจใชใๆฐใใชๅไพกใๆฑใใใไปๅใฏใ1ๅๆชๆบใๅใไธใใใใจใใใใฆ็ตๆใ10ไปถ่กจ็คบใใใๅฉ็็ใใใใ30๏ผ
ไป่ฟใงใใใใจใ็ขบ่ชใใใใใ ใใๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใซใฏNULLใๅญๅจใใใใจใซๆณจๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['new_price'] = np.ceil(df_tmp['unit_cost'] / 0.7)\ndf_tmp['new_profit_rate'] = \\\n (df_tmp['new_price'] - df_tmp['unit_cost']) / df_tmp['new_price']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-068: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅๅๅใซใคใใฆใๆถ่ฒป็จ็10%ใฎ็จ่พผใฟ้้กใๆฑใใใ 1ๅๆชๆบใฎ็ซฏๆฐใฏๅใๆจใฆใจใใ็ตๆใฏ10ไปถ่กจ็คบใใใฐ่ฏใใใใ ใใๅไพก๏ผunit_price๏ผใซใฏNULLใๅญๅจใใใใจใซๆณจๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_product.copy()\ndf_tmp['price_tax'] = np.floor(df_tmp['unit_price'] * 1.1)\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-069: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใจๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใ็ตๅใใ้กงๅฎขๆฏใซๅ
จๅๅใฎๅฃฒไธ้้กๅ่จใจใใซใใดใชๅคงๅบๅ๏ผcategory_major_cd๏ผใ\"07\"๏ผ็ถ่ฉฐ็ผถ่ฉฐ๏ผใฎๅฃฒไธ้้กๅ่จใ่จ็ฎใฎไธใไธก่
ใฎๆฏ็ใๆฑใใใๆฝๅบๅฏพ่ฑกใฏใซใใดใชๅคงๅบๅ\"07\"๏ผ็ถ่ฉฐ็ผถ่ฉฐ๏ผใฎๅฃฒไธๅฎ็ธพใใใ้กงๅฎขใฎใฟใจใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_tmp_1 = pd.merge(df_receipt, df_product, \n how='inner', on='product_cd').groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\n\ndf_tmp_2 = pd.merge(df_receipt, df_product.query('category_major_cd == \"07\"'), \n how='inner', on='product_cd').groupby('customer_id').\\\n agg({'amount':'sum'}).reset_index()\n\ndf_tmp_3 = pd.merge(df_tmp_1, df_tmp_2, how='inner', on='customer_id')\ndf_tmp_3['rate_07'] = df_tmp_3['amount_y'] / df_tmp_3['amount_x']\ndf_tmp_3.head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2\ndf_temp = df_receipt.merge(df_product, how='left', on='product_cd'). \\\n groupby(['customer_id', 'category_major_cd'])['amount'].sum().unstack()\ndf_temp = df_temp[df_temp['07'] > 0]\ndf_temp['sum'] = df_temp.sum(axis=1)\ndf_temp['07_rate'] = df_temp['07'] / df_temp['sum']\ndf_temp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-070: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใซๅฏพใใ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎไผๅก็ณ่พผๆฅ๏ผapplication_date๏ผใใใฎ็ต้ๆฅๆฐใ่จ็ฎใใ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธๆฅใไผๅก็ณ่พผๆฅใจใจใใซ่กจ็คบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใ๏ผใชใใsales_ymdใฏๆฐๅคใapplication_dateใฏๆๅญๅใงใใผใฟใไฟๆใใฆใใ็นใซๆณจๆ๏ผใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.merge(df_receipt[['customer_id', 'sales_ymd']], \n df_customer[['customer_id', 'application_date']],\n how='inner', on='customer_id')\n\ndf_tmp = df_tmp.drop_duplicates()\n\ndf_tmp['sales_ymd'] = pd.to_datetime(df_tmp['sales_ymd'].astype('str'))\ndf_tmp['application_date'] = pd.to_datetime(df_tmp['application_date'])\ndf_tmp['elapsed_date'] = df_tmp['sales_ymd'] - df_tmp['application_date']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
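    [
        [
            "# The answer above leaves a Timedelta column; a sketch for getting the elapsed days as plain integers:\ndf_tmp['elapsed_days'] = df_tmp['elapsed_date'].dt.days\ndf_tmp.head(10)",
            "_____no_output_____"
        ]
    ],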
[
[
"---\n> P-071: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใซๅฏพใใ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎไผๅก็ณ่พผๆฅ๏ผapplication_date๏ผใใใฎ็ต้ๆๆฐใ่จ็ฎใใ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธๆฅใไผๅก็ณ่พผๆฅใจใจใใซ่กจ็คบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใ๏ผใชใใsales_ymdใฏๆฐๅคใapplication_dateใฏๆๅญๅใงใใผใฟใไฟๆใใฆใใ็นใซๆณจๆ๏ผใ1ใถๆๆชๆบใฏๅใๆจใฆใใใจใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.merge(df_receipt[['customer_id', 'sales_ymd']], \n df_customer[['customer_id', 'application_date']],\n how='inner', on='customer_id')\n\ndf_tmp = df_tmp.drop_duplicates()\n\ndf_tmp['sales_ymd'] = pd.to_datetime(df_tmp['sales_ymd'].astype('str'))\ndf_tmp['application_date'] = pd.to_datetime(df_tmp['application_date'])\n\ndf_tmp['elapsed_date'] = df_tmp[['sales_ymd', 'application_date']]. \\\n apply(lambda x: relativedelta(x[0], x[1]).years * 12 + \\\n relativedelta(x[0], x[1]).months, axis=1)\n\ndf_tmp.sort_values('customer_id').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-072: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใซๅฏพใใ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎไผๅก็ณ่พผๆฅ๏ผapplication_date๏ผใใใฎ็ต้ๅนดๆฐใ่จ็ฎใใ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธๆฅใไผๅก็ณ่พผๆฅใจใจใใซ่กจ็คบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใ๏ผใชใใsales_ymdใฏๆฐๅคใapplication_dateใฏๆๅญๅใงใใผใฟใไฟๆใใฆใใ็นใซๆณจๆ๏ผใ1ๅนดๆชๆบใฏๅใๆจใฆใใใจใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.merge(df_receipt[['customer_id', 'sales_ymd']], \n df_customer[['customer_id', 'application_date']],\n how='inner', on='customer_id')\n\ndf_tmp = df_tmp.drop_duplicates()\n\ndf_tmp['sales_ymd'] = pd.to_datetime(df_tmp['sales_ymd'].astype('str'))\ndf_tmp['application_date'] = pd.to_datetime(df_tmp['application_date'])\n\ndf_tmp['elapsed_date'] = df_tmp[['sales_ymd', 'application_date']]. \\\n apply(lambda x: relativedelta(x[0], x[1]).years, axis=1)\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-073: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใซๅฏพใใ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎไผๅก็ณ่พผๆฅ๏ผapplication_date๏ผใใใฎใจใใใฏ็งใซใใ็ต้ๆ้ใ่จ็ฎใใ้กงๅฎขID๏ผcustomer_id๏ผใๅฃฒไธๆฅใไผๅก็ณ่พผๆฅใจใจใใซ่กจ็คบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใ๏ผใชใใsales_ymdใฏๆฐๅคใapplication_dateใฏๆๅญๅใงใใผใฟใไฟๆใใฆใใ็นใซๆณจๆ๏ผใใชใใๆ้ๆ
ๅ ฑใฏไฟๆใใฆใใชใใใๅๆฅไปใฏ0ๆ0ๅ0็งใ่กจใใใฎใจใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.merge(df_receipt[['customer_id', 'sales_ymd']], \n df_customer[['customer_id', 'application_date']],\n how='inner', on='customer_id')\n\ndf_tmp = df_tmp.drop_duplicates()\n\ndf_tmp['sales_ymd'] = pd.to_datetime(df_tmp['sales_ymd'].astype('str'))\ndf_tmp['application_date'] = pd.to_datetime(df_tmp['application_date'])\n\ndf_tmp['elapsed_date'] = \\\n (df_tmp['sales_ymd'].view(np.int64) / 10**9) - (df_tmp['application_date'].\\\n view(np.int64) / 10**9)\ndf_tmp.head(10)",
"_____no_output_____"
]
],
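    [
        [
            "# An equivalent sketch that avoids the int64 view trick by using Timedelta.total_seconds():\ndf_tmp['elapsed_sec'] = (df_tmp['sales_ymd'] - df_tmp['application_date']).dt.total_seconds()\ndf_tmp.head(10)",
            "_____no_output_____"
        ]
    ],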
[
[
"---\n> P-074: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธๆฅ๏ผsales_ymd๏ผใซๅฏพใใๅฝ่ฉฒ้ฑใฎๆๆๆฅใใใฎ็ต้ๆฅๆฐใ่จ็ฎใใๅฃฒไธๆฅใๅฝ่ฉฒ้ฑใฎๆๆๆฅไปใจใจใใซ่กจ็คบใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใ๏ผใชใใsales_ymdใฏๆฐๅคใงใใผใฟใไฟๆใใฆใใ็นใซๆณจๆ๏ผใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_receipt[['customer_id', 'sales_ymd']]\ndf_tmp = df_tmp.drop_duplicates()\ndf_tmp['sales_ymd'] = pd.to_datetime(df_tmp['sales_ymd'].astype('str'))\ndf_tmp['monday'] = df_tmp['sales_ymd']. \\\n apply(lambda x: x - relativedelta(days=x.weekday()))\ndf_tmp['elapsed_weekday'] = df_tmp['sales_ymd'] - df_tmp['monday']\ndf_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-075: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใใฉใณใใ ใซ1%ใฎใใผใฟใๆฝๅบใใๅ
้ ญใใ10ไปถใใผใฟใๆฝๅบใใใ",
"_____no_output_____"
]
],
[
[
"df_customer.sample(frac=0.01).head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-076: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใใๆงๅฅ๏ผgender_cd๏ผใฎๅฒๅใซๅบใฅใใฉใณใใ ใซ10%ใฎใใผใฟใๅฑคๅๆฝๅบใใๆงๅฅใใจใซไปถๆฐใ้่จใใใ",
"_____no_output_____"
]
],
[
[
"# sklearn.model_selection.train_test_splitใไฝฟ็จใใไพ\n_, df_tmp = train_test_split(df_customer, test_size=0.1, \n stratify=df_customer['gender_cd'])\ndf_tmp.groupby('gender_cd').agg({'customer_id' : 'count'})",
"_____no_output_____"
],
[
"df_tmp.head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-077: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขๅไฝใซๅ่จใใๅ่จใใๅฃฒไธ้้กใฎๅคใๅคใๆฝๅบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใใชใใใใใงใฏๅคใๅคใๅนณๅใใ3ฯไปฅไธ้ขใใใใฎใจใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"# skleanใฎpreprocessing.scaleใๅฉ็จใใใใใๆจๆฌๆจๆบๅๅทฎใง่จ็ฎใใใฆใใ\ndf_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\ndf_sales_amount['amount_ss'] = preprocessing.scale(df_sales_amount['amount'])\ndf_sales_amount.query('abs(amount_ss) >= 3').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-078: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฎๅฃฒไธ้้ก๏ผamount๏ผใ้กงๅฎขๅไฝใซๅ่จใใๅ่จใใๅฃฒไธ้้กใฎๅคใๅคใๆฝๅบใใใใใ ใใ้กงๅฎขIDใ\"Z\"ใใๅงใพใใฎใใฎใฏ้ไผๅกใ่กจใใใใ้คๅคใใฆ่จ็ฎใใใใจใใชใใใใใงใฏๅคใๅคใ็ฌฌไธๅๅไฝใจ็ฌฌไธๅๅไฝใฎๅทฎใงใใIQRใ็จใใฆใใ็ฌฌไธๅๅไฝๆฐ-1.5รIQRใใใใไธๅใใใฎใใพใใฏใ็ฌฌไธๅๅไฝๆฐ+1.5รIQRใใ่ถ
ใใใใฎใจใใใ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใ",
"_____no_output_____"
]
],
[
[
"df_sales_amount = df_receipt.query('not customer_id.str.startswith(\"Z\")', \n engine='python'). \\\n groupby('customer_id'). \\\n agg({'amount':'sum'}).reset_index()\n\npct75 = np.percentile(df_sales_amount['amount'], q=75)\npct25 = np.percentile(df_sales_amount['amount'], q=25)\niqr = pct75 - pct25\namount_low = pct25 - (iqr * 1.5)\namount_hight = pct75 + (iqr * 1.5)\ndf_sales_amount.query('amount < @amount_low or @amount_hight < amount').head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-079: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎๅ้
็ฎใซๅฏพใใๆฌ ๆๆฐใ็ขบ่ชใใใ",
"_____no_output_____"
]
],
[
[
"df_product.isnull().sum()",
"_____no_output_____"
]
],
[
[
"---\n> P-080: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใฎใใใใใฎ้
็ฎใซๆฌ ๆใ็บ็ใใฆใใใฌใณใผใใๅ
จใฆๅ้คใใๆฐใใชdf_product_1ใไฝๆใใใใชใใๅ้คๅๅพใฎไปถๆฐใ่กจ็คบใใใๅ่จญๅใง็ขบ่ชใใไปถๆฐใ ใๆธๅฐใใฆใใใใจใ็ขบ่ชใใใใจใ",
"_____no_output_____"
]
],
[
[
"df_product_1 = df_product.copy()\nprint('ๅ้คๅ:', len(df_product_1))\ndf_product_1.dropna(inplace=True)\nprint('ๅ้คๅพ:', len(df_product_1))",
"ๅ้คๅ: 10030\nๅ้คๅพ: 10023\n"
]
],
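    [
        [
            "# dropna() above drops rows with NaN in any column; to restrict the check to specific columns, pass subset (a sketch):\ndf_product.dropna(subset=['unit_price', 'unit_cost'])",
            "_____no_output_____"
        ]
    ],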
[
[
"---\n> P-081: ๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใฎๆฌ ๆๅคใซใคใใฆใใใใใใฎๅนณๅๅคใง่ฃๅฎใใๆฐใใชdf_product_2ใไฝๆใใใใชใใๅนณๅๅคใซใคใใฆใฏ1ๅๆชๆบใไธธใใใใจ๏ผๅๆจไบๅ
ฅใพใใฏๅถๆฐใธใฎไธธใใง่ฏใ๏ผใ่ฃๅฎๅฎๆฝๅพใๅ้
็ฎใซใคใใฆๆฌ ๆใ็ใใฆใใชใใใจใ็ขบ่ชใใใใจใ",
"_____no_output_____"
]
],
[
[
"\ndf_product_2 = df_product.fillna({\n 'unit_price':np.round(np.nanmean(df_product['unit_price'])), \n 'unit_cost':np.round(np.nanmean(df_product['unit_cost']))})\ndf_product_2.isnull().sum()",
"_____no_output_____"
]
],
[
[
"---\n> P-082: ๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใฎๆฌ ๆๅคใซใคใใฆใใใใใใฎไธญๅคฎๅคใง่ฃๅฎใใๆฐใใชdf_product_3ใไฝๆใใใใชใใไธญๅคฎๅคใซใคใใฆใฏ1ๅๆชๆบใไธธใใใใจ๏ผๅๆจไบๅ
ฅใพใใฏๅถๆฐใธใฎไธธใใง่ฏใ๏ผใ่ฃๅฎๅฎๆฝๅพใๅ้
็ฎใซใคใใฆๆฌ ๆใ็ใใฆใใชใใใจใ็ขบ่ชใใใใจใ",
"_____no_output_____"
]
],
[
[
"df_product_3 = df_product.fillna({\n 'unit_price':np.round(np.nanmedian(df_product['unit_price'])), \n 'unit_cost':np.round(np.nanmedian(df_product['unit_cost']))})\ndf_product_3.isnull().sum()",
"_____no_output_____"
]
],
[
[
"---\n> P-083: ๅไพก๏ผunit_price๏ผใจๅไพก๏ผunit_cost๏ผใฎๆฌ ๆๅคใซใคใใฆใๅๅๅใฎๅฐๅบๅ๏ผcategory_small_cd๏ผใใจใซ็ฎๅบใใไธญๅคฎๅคใง่ฃๅฎใใๆฐใใชdf_product_4ใไฝๆใใใใชใใไธญๅคฎๅคใซใคใใฆใฏ1ๅๆชๆบใไธธใใใใจ๏ผๅๆจไบๅ
ฅใพใใฏๅถๆฐใธใฎไธธใใง่ฏใ๏ผใ่ฃๅฎๅฎๆฝๅพใๅ้
็ฎใซใคใใฆๆฌ ๆใ็ใใฆใใชใใใจใ็ขบ่ชใใใใจใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_tmp = df_product.groupby('category_small_cd'). \\\n agg({'unit_price':'median', 'unit_cost':'median'}).reset_index()\ndf_tmp.columns = ['category_small_cd', 'median_price', 'median_cost']\n\ndf_product_4 = pd.merge(df_product, df_tmp, how='inner', on='category_small_cd')\n\ndf_product_4['unit_price'] = df_product_4[['unit_price', 'median_price']]. \\\n apply(lambda x: np.round(x[1]) if np.isnan(x[0]) else x[0], axis=1)\ndf_product_4['unit_cost'] = df_product_4[['unit_cost', 'median_cost']]. \\\n apply(lambda x: np.round(x[1]) if np.isnan(x[0]) else x[0], axis=1)\n\ndf_product_4.isnull().sum()",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผmaskใฎๆดป็จ๏ผ\ndf_tmp = (df_product\n .groupby('category_small_cd')\n .agg(median_price=('unit_price', 'median'), \n median_cost=('unit_cost', 'median'))\n .reset_index())\n\ndf_product_4 = df_product.merge(df_tmp, \n how='inner', \n on='category_small_cd')\n\ndf_product_4['unit_price'] = (df_product_4['unit_price']\n .mask(df_product_4['unit_price'].isnull(), \n df_product_4['median_price'].round()))\ndf_product_4['unit_cost'] = (df_product_4['unit_cost']\n .mask(df_product_4['unit_cost'].isnull(), \n df_product_4['median_cost'].round()))\n\ndf_product_4.isnull().sum()",
"_____no_output_____"
],
[
"# ใณใผใไพ3๏ผfillnaใtransformใฎๆดป็จ๏ผ\ndf_product_4 = df_product.copy()\n\nfor x in ['unit_price', 'unit_cost']: \n df_product_4[x] = (df_product_4[x]\n .fillna(df_product_4.groupby('category_small_cd')[x]\n .transform('median')\n .round()))\n\ndf_product_4.isnull().sum()",
"_____no_output_____"
]
],
[
[
"---\n> P-084: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎๅ
จ้กงๅฎขใซๅฏพใใๅ
จๆ้ใฎๅฃฒไธ้้กใซๅ ใใ2019ๅนดๅฃฒไธ้้กใฎๅฒๅใ่จ็ฎใใใใใ ใใๅฃฒไธๅฎ็ธพใใชใๅ ดๅใฏ0ใจใใฆๆฑใใใจใใใใฆ่จ็ฎใใๅฒๅใ0่ถ
ใฎใใฎใๆฝๅบใใใ ็ตๆใฏ10ไปถ่กจ็คบใใใใฐ่ฏใใใพใใไฝๆใใใใผใฟใซNAใNANใๅญๅจใใชใใใจใ็ขบ่ชใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp_1 = df_receipt.query('20190101 <= sales_ymd <= 20191231')\ndf_tmp_1 = pd.merge(df_customer['customer_id'], \n df_tmp_1[['customer_id', 'amount']], \n how='left', on='customer_id'). \\\n groupby('customer_id').sum().reset_index(). \\\n rename(columns={'amount':'amount_2019'})\n\ndf_tmp_2 = pd.merge(df_customer['customer_id'], \n df_receipt[['customer_id', 'amount']], \n how='left', on='customer_id'). \\\n groupby('customer_id').sum().reset_index()\n\ndf_tmp = pd.merge(df_tmp_1, df_tmp_2, how='inner', on='customer_id')\ndf_tmp['amount_2019'] = df_tmp['amount_2019'].fillna(0)\ndf_tmp['amount'] = df_tmp['amount'].fillna(0)\n\ndf_tmp['amount_rate'] = df_tmp['amount_2019'] / df_tmp['amount']\ndf_tmp['amount_rate'] = df_tmp['amount_rate'].fillna(0)",
"_____no_output_____"
],
[
"df_tmp.query('amount_rate > 0').head(10)",
"_____no_output_____"
],
[
"df_tmp.isnull().sum()",
"_____no_output_____"
]
],
[
[
"---\n> P-085: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎๅ
จ้กงๅฎขใซๅฏพใใ้ตไพฟ็ชๅท๏ผpostal_cd๏ผใ็จใใฆ็ตๅบฆ็ทฏๅบฆๅคๆ็จใใผใฟใใฌใผใ ๏ผdf_geocode๏ผใ็ดไปใใๆฐใใชdf_customer_1ใไฝๆใใใใใ ใใ่คๆฐ็ดใฅใๅ ดๅใฏ็ตๅบฆ๏ผlongitude๏ผใ็ทฏๅบฆ๏ผlatitude๏ผใใใใๅนณๅใ็ฎๅบใใใใจใ\n",
"_____no_output_____"
]
],
[
[
"df_customer_1 = pd.merge(df_customer[['customer_id', 'postal_cd']],\n df_geocode[['postal_cd', 'longitude' ,'latitude']],\n how='inner', on='postal_cd')\ndf_customer_1 = df_customer_1.groupby('customer_id'). \\\n agg({'longitude':'mean', 'latitude':'mean'}).reset_index(). \\\n rename(columns={'longitude':'m_longitude', 'latitude':'m_latitude'})\n\ndf_customer_1 = pd.merge(df_customer, df_customer_1, \n how='inner', on='customer_id')\ndf_customer_1.head(3)",
"_____no_output_____"
]
],
[
[
"---\n> P-086: ๅ่จญๅใงไฝๆใใ็ทฏๅบฆ็ตๅบฆใคใ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer_1๏ผใซๅฏพใใ็ณ่พผใฟๅบ่ใณใผใ๏ผapplication_store_cd๏ผใใญใผใซๅบ่ใใผใฟใใฌใผใ ๏ผdf_store๏ผใจ็ตๅใใใใใใฆ็ณ่พผใฟๅบ่ใฎ็ทฏๅบฆ๏ผlatitude๏ผใป็ตๅบฆๆ
ๅ ฑ๏ผlongitude)ใจ้กงๅฎขใฎ็ทฏๅบฆใป็ตๅบฆใ็จใใฆ่ท้ข๏ผkm๏ผใๆฑใใ้กงๅฎขID๏ผcustomer_id๏ผใ้กงๅฎขไฝๆ๏ผaddress๏ผใๅบ่ไฝๆ๏ผaddress๏ผใจใจใใซ่กจ็คบใใใ่จ็ฎๅผใฏ็ฐกๆๅผใง่ฏใใใฎใจใใใใใใฎไป็ฒพๅบฆใฎ้ซใๆนๅผใๅฉ็จใใใฉใคใใฉใชใๅฉ็จใใฆใใใพใใชใใ็ตๆใฏ10ไปถ่กจ็คบใใใฐ่ฏใใ",
"_____no_output_____"
],
[
"$$\n็ทฏๅบฆ๏ผใฉใธใขใณ๏ผ๏ผ\\phi \\\\\n็ตๅบฆ๏ผใฉใธใขใณ๏ผ๏ผ\\lambda \\\\\n่ท้ขL = 6371 * arccos(sin \\phi_1 * sin \\phi_2\n+ cos \\phi_1 * cos \\phi_2 * cos(\\lambda_1 โ \\lambda_2))\n$$",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndef calc_distance(x1, y1, x2, y2):\n distance = 6371 * math.acos(math.sin(math.radians(y1)) \n * math.sin(math.radians(y2)) \n + math.cos(math.radians(y1)) \n * math.cos(math.radians(y2)) \n * math.cos(math.radians(x1) - math.radians(x2)))\n return distance\n\ndf_tmp = pd.merge(df_customer_1, df_store, how='inner', left_on='application_store_cd', right_on='store_cd') \n\ndf_tmp['distance'] = df_tmp[['m_longitude', 'm_latitude','longitude', 'latitude']]. \\\n apply(lambda x: calc_distance(x[0], x[1], x[2], x[3]), axis=1)\n\ndf_tmp[['customer_id', 'address_x', 'address_y', 'distance']].head(10)",
"_____no_output_____"
],
[
"# ใณใผใไพ2\ndef calc_distance_numpy(x1, y1, x2, y2):\n x1_r = np.radians(x1)\n x2_r = np.radians(x2)\n y1_r = np.radians(y1)\n y2_r = np.radians(y2)\n return 6371 * np.arccos(np.sin(y1_r) * np.sin(y2_r) \n + np.cos(y1_r) * np.cos(y2_r) \n * np.cos(x1_r - x2_r))\n\ndf_tmp = df_customer_1.merge(df_store, \n how='inner', \n left_on='application_store_cd', \n right_on='store_cd') \n\ndf_tmp['distance'] = calc_distance_numpy(df_tmp['m_longitude'], \n df_tmp['m_latitude'],\n df_tmp['longitude'], \n df_tmp['latitude'])\n\ndf_tmp[['customer_id', 'address_x', 'address_y', 'distance']].head(10)",
"_____no_output_____"
]
],
[
[
"---\n> P-087: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใงใฏใ็ฐใชใๅบ่ใงใฎ็ณ่พผใฟใชใฉใซใใๅไธ้กงๅฎขใ่คๆฐ็ป้ฒใใใฆใใใๅๅ๏ผcustomer_name๏ผใจ้ตไพฟ็ชๅท๏ผpostal_cd๏ผใๅใ้กงๅฎขใฏๅไธ้กงๅฎขใจใฟใชใใ1้กงๅฎข1ใฌใณใผใใจใชใใใใซๅๅฏใใใๅๅฏ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer_u๏ผใไฝๆใใใใใ ใใๅไธ้กงๅฎขใซๅฏพใใฆใฏๅฃฒไธ้้กๅ่จใๆใ้ซใใใฎใๆฎใใใฎใจใใๅฃฒไธ้้กๅ่จใๅไธใใใใฏๅฃฒไธๅฎ็ธพใใชใ้กงๅฎขใซใคใใฆใฏ้กงๅฎขID๏ผcustomer_id๏ผใฎ็ชๅทใๅฐใใใใฎใๆฎใใใจใจใใใ",
"_____no_output_____"
]
],
[
[
"df_receipt_tmp = df_receipt.groupby('customer_id') \\\n .agg(sum_amount=('amount','sum')).reset_index()\n\ndf_customer_u = pd.merge(df_customer, df_receipt_tmp, \n how='left', \n on='customer_id')\n\ndf_customer_u['sum_amount'] = df_customer_u['sum_amount'].fillna(0)\n\ndf_customer_u = df_customer_u.sort_values(['sum_amount', 'customer_id'], \n ascending=[False, True])\n\ndf_customer_u.drop_duplicates(subset=['customer_name', 'postal_cd'], \n keep='first', inplace=True)\n\nprint('df_customer:', len(df_customer),\n 'df_customer_u:', len(df_customer_u),\n 'diff:', len(df_customer) - len(df_customer_u))\n\n",
"df_customer: 21971 df_customer_u: 21941 diff: 30\n"
]
],
[
[
"---\n> P-088: ๅ่จญๅใงไฝๆใใใใผใฟใๅ
ใซใ้กงๅฎขใใผใฟใใฌใผใ ใซ็ตฑๅๅๅฏIDใไปไธใใใใผใฟใใฌใผใ ๏ผdf_customer_n๏ผใไฝๆใใใใใ ใใ็ตฑๅๅๅฏIDใฏไปฅไธใฎไปๆงใงไปไธใใใใฎใจใใใ\n>\n> - ้่คใใฆใใชใ้กงๅฎข๏ผ้กงๅฎขID๏ผcustomer_id๏ผใ่จญๅฎ\n> - ้่คใใฆใใ้กงๅฎข๏ผๅ่จญๅใงๆฝๅบใใใฌใณใผใใฎ้กงๅฎขIDใ่จญๅฎ",
"_____no_output_____"
]
],
[
[
"df_customer_n = pd.merge(df_customer, \n df_customer_u[['customer_name', \n 'postal_cd', 'customer_id']],\n how='inner', on =['customer_name', 'postal_cd'])\ndf_customer_n.rename(columns={'customer_id_x':'customer_id', \n 'customer_id_y':'integration_id'}, inplace=True)\n\nprint('IDๆฐใฎๅทฎ', len(df_customer_n['customer_id'].unique()) \n - len(df_customer_n['integration_id'].unique()))",
"IDๆฐใฎๅทฎ 30\n"
]
],
[
[
"---\n> P-้่ฉฑ: df_customer_1, df_customer_nใฏไฝฟใใชใใฎใงๅ้คใใใ",
"_____no_output_____"
]
],
[
[
"del df_customer_1\ndel df_customer_n",
"_____no_output_____"
]
],
[
[
"---\n> P-089: ๅฃฒไธๅฎ็ธพใใใ้กงๅฎขใซๅฏพใใไบๆธฌใขใใซๆง็ฏใฎใใๅญฆ็ฟ็จใใผใฟใจใในใ็จใใผใฟใซๅๅฒใใใใใใใใ8:2ใฎๅฒๅใงใฉใณใใ ใซใใผใฟใๅๅฒใใใ",
"_____no_output_____"
]
],
[
[
"df_sales= df_receipt.groupby('customer_id').agg({'amount':sum}).reset_index()\ndf_tmp = pd.merge(df_customer, df_sales['customer_id'], \n how='inner', on='customer_id')\ndf_train, df_test = train_test_split(df_tmp, test_size=0.2, random_state=71)\nprint('ๅญฆ็ฟใใผใฟๅฒๅ: ', len(df_train) / len(df_tmp))\nprint('ใในใใใผใฟๅฒๅ: ', len(df_test) / len(df_tmp))",
"ๅญฆ็ฟใใผใฟๅฒๅ: 0.7999036840837949\nใในใใใผใฟๅฒๅ: 0.20009631591620516\n"
]
],
[
[
"---\n> P-090: ใฌใทใผใๆ็ดฐใใผใฟใใฌใผใ ๏ผdf_receipt๏ผใฏ2017ๅนด1ๆ1ๆฅใ2019ๅนด10ๆ31ๆฅใพใงใฎใใผใฟใๆใใฆใใใๅฃฒไธ้้ก๏ผamount๏ผใๆๆฌกใง้่จใใๅญฆ็ฟ็จใซ12ใถๆใใในใ็จใซ6ใถๆใฎใขใใซๆง็ฏ็จใใผใฟใ3ใปใใไฝๆใใใ",
"_____no_output_____"
]
],
[
[
"df_tmp = df_receipt[['sales_ymd', 'amount']].copy()\ndf_tmp['sales_ym'] = df_tmp['sales_ymd'].astype('str').str[0:6]\ndf_tmp = df_tmp.groupby('sales_ym').agg({'amount':'sum'}).reset_index()\n\n# ้ขๆฐๅใใใใจใง้ทๆ้ใใผใฟใซๅฏพใใๅคๆฐใฎใใผใฟใปใใใใซใผใใชใฉใงๅฆ็ใงใใใใใซใใ\ndef split_data(df, train_size, test_size, slide_window, start_point):\n train_start = start_point * slide_window\n test_start = train_start + train_size\n return df[train_start : test_start], df[test_start : test_start + test_size]\n\ndf_train_1, df_test_1 = split_data(df_tmp, train_size=12, \n test_size=6, slide_window=6, start_point=0)\ndf_train_2, df_test_2 = split_data(df_tmp, train_size=12, \n test_size=6, slide_window=6, start_point=1)\ndf_train_3, df_test_3 = split_data(df_tmp, train_size=12, \n test_size=6, slide_window=6, start_point=2)",
"_____no_output_____"
],
[
"df_train_1",
"_____no_output_____"
],
[
"df_test_1",
"_____no_output_____"
]
],
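As the comment in the split cell above notes, parameterizing `split_data` by `start_point` means the three explicit calls can also be generated in a loop. A small sketch reusing `df_tmp` and the same window sizes from above (the `splits` list is an illustrative name):

```python
# Sketch: build all three (train, test) pairs in one pass.
splits = [split_data(df_tmp, train_size=12, test_size=6,
                     slide_window=6, start_point=i)
          for i in range(3)]
df_train_1, df_test_1 = splits[0]  # identical to the explicit call above
```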
[
[
"---\n> P-091: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใฎๅ้กงๅฎขใซๅฏพใใๅฃฒไธๅฎ็ธพใใใ้กงๅฎขๆฐใจๅฃฒไธๅฎ็ธพใใชใ้กงๅฎขๆฐใ1:1ใจใชใใใใซใขใณใใผใตใณใใชใณใฐใงๆฝๅบใใใ",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\n#unbalancedใฎubUnderใไฝฟใฃใไพ\ndf_tmp = df_receipt.groupby('customer_id').agg({'amount':'sum'}).reset_index()\ndf_tmp = pd.merge(df_customer, df_tmp, how='left', on='customer_id')\ndf_tmp['buy_flg'] = df_tmp['amount'].apply(lambda x: 0 if np.isnan(x) else 1)\n\nprint('0ใฎไปถๆฐ', len(df_tmp.query('buy_flg == 0')))\nprint('1ใฎไปถๆฐ', len(df_tmp.query('buy_flg == 1')))\n\npositive_count = len(df_tmp.query('buy_flg == 1'))\n\nrs = RandomUnderSampler(random_state=71)\n\ndf_sample, _ = rs.fit_resample(df_tmp, df_tmp.buy_flg)\n\nprint('0ใฎไปถๆฐ', len(df_sample.query('buy_flg == 0')))\nprint('1ใฎไปถๆฐ', len(df_sample.query('buy_flg == 1')))",
"0ใฎไปถๆฐ 13665\n1ใฎไปถๆฐ 8306\n0ใฎไปถๆฐ 8306\n1ใฎไปถๆฐ 8306\n"
],
[
"# ใณใผใไพ2\n#unbalancedใฎubUnderใไฝฟใฃใไพ\ndf_tmp = df_customer.merge(df_receipt\n .groupby('customer_id')['amount'].sum()\n .reset_index(), \n how='left', \n on='customer_id')\ndf_tmp['buy_flg'] = np.where(df_tmp['amount'].isnull(), 0, 1)\n\nprint(\"ใตใณใใชใณใฐๅใฎbuy_flgใฎไปถๆฐ\")\nprint(df_tmp['buy_flg'].value_counts(), \"\\n\")\n\npositive_count = (df_tmp['buy_flg'] == 1).sum()\n\nrs = RandomUnderSampler(random_state=71)\n\ndf_sample, _ = rs.fit_resample(df_tmp, df_tmp.buy_flg)\n\nprint(\"ใตใณใใชใณใฐๅพใฎbuy_flgใฎไปถๆฐ\")\nprint(df_sample['buy_flg'].value_counts())",
"ใตใณใใชใณใฐๅใฎbuy_flgใฎไปถๆฐ\n0 13665\n1 8306\nName: buy_flg, dtype: int64 \n\nใตใณใใชใณใฐๅพใฎbuy_flgใฎไปถๆฐ\n0 8306\n1 8306\nName: buy_flg, dtype: int64\n"
]
],
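For comparison, the same 1:1 balance can be obtained with plain pandas instead of imbalanced-learn. A sketch under the same `df_tmp` / `buy_flg` setup as the cells above; with the random state fixed the class counts match, though the particular rows sampled may differ from RandomUnderSampler's:

```python
# Sketch: undersample the majority class using pandas only.
n_minority = df_tmp['buy_flg'].value_counts().min()
df_balanced = (df_tmp.groupby('buy_flg', group_keys=False)
                     .apply(lambda g: g.sample(n=n_minority, random_state=71)))
print(df_balanced['buy_flg'].value_counts())
```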
[
[
"---\n> P-092: ้กงๅฎขใใผใฟใใฌใผใ ๏ผdf_customer๏ผใงใฏใๆงๅฅใซ้ขใใๆ
ๅ ฑใ้ๆญฃ่ฆๅใฎ็ถๆ
ใงไฟๆใใใฆใใใใใใ็ฌฌไธๆญฃ่ฆๅใใใ",
"_____no_output_____"
]
],
[
[
"df_gender = df_customer[['gender_cd', 'gender']].drop_duplicates()\ndf_customer_s = df_customer.drop(columns='gender')",
"_____no_output_____"
]
],
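To check that this normalization is lossless, the original denormalized shape can be rebuilt by joining the two frames back together. A brief sketch, assuming each gender_cd maps to exactly one gender value (which the drop_duplicates above relies on):

```python
# Sketch: re-join the lookup table to verify nothing was lost.
df_restored = pd.merge(df_customer_s, df_gender, how='inner', on='gender_cd')
assert len(df_restored) == len(df_customer)
```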
[
[
"---\n\n> P-093: ๅๅใใผใฟใใฌใผใ ๏ผdf_product๏ผใงใฏๅใซใใดใชใฎใณใผใๅคใ ใใไฟๆใใใซใใดใชๅใฏไฟๆใใฆใใชใใใซใใดใชใใผใฟใใฌใผใ ๏ผdf_category๏ผใจ็ตใฟๅใใใฆ้ๆญฃ่ฆๅใใใซใใดใชๅใไฟๆใใๆฐใใชๅๅใใผใฟใใฌใผใ ใไฝๆใใใ",
"_____no_output_____"
]
],
[
[
"df_product_full = pd.merge(df_product, df_category[['category_small_cd', \n 'category_major_name',\n 'category_medium_name',\n 'category_small_name']], \n how = 'inner', on = 'category_small_cd')",
"_____no_output_____"
]
],
[
[
"---\n> P-094: ๅ
ใซไฝๆใใใซใใดใชๅไปใๅๅใใผใฟใไปฅไธใฎไปๆงใงใใกใคใซๅบๅใใใใชใใๅบๅๅ
ใฎใในใฏdata้
ไธใจใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏCSV๏ผใซใณใๅบๅใ๏ผ\n> - ใใใๆใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"# ใณใผใไพ1\ndf_product_full.to_csv('../data/P_df_product_full_UTF-8_header.csv', \n encoding='UTF-8', index=False)",
"_____no_output_____"
],
[
"# ใณใผใไพ2๏ผBOMไปใใงExcelใฎๆๅญๅใใ้ฒใ๏ผ\ndf_product_full.to_csv('../data/P_df_product_full_UTF-8_header.csv', \n encoding='utf_8_sig', index=False)",
"_____no_output_____"
]
],
[
[
"---\n> P-095: ๅ
ใซไฝๆใใใซใใดใชๅไปใๅๅใใผใฟใไปฅไธใฎไปๆงใงใใกใคใซๅบๅใใใใชใใๅบๅๅ
ใฎใในใฏdata้
ไธใจใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏCSV๏ผใซใณใๅบๅใ๏ผ\n> - ใใใๆใ\n> - ๆๅญใณใผใใฏCP932",
"_____no_output_____"
]
],
[
[
"df_product_full.to_csv('../data/P_df_product_full_CP932_header.csv', \n encoding='CP932', index=False)",
"_____no_output_____"
]
],
[
[
"---\n> P-096: ๅ
ใซไฝๆใใใซใใดใชๅไปใๅๅใใผใฟใไปฅไธใฎไปๆงใงใใกใคใซๅบๅใใใใชใใๅบๅๅ
ใฎใในใฏdata้
ไธใจใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏCSV๏ผใซใณใๅบๅใ๏ผ\n> - ใใใ็กใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"df_product_full.to_csv('../data/P_df_product_full_UTF-8_noh.csv', \n header=False ,encoding='UTF-8', index=False)",
"_____no_output_____"
]
],
[
[
"---\n> P-097: ๅ
ใซไฝๆใใไปฅไธๅฝขๅผใฎใใกใคใซใ่ชญใฟ่พผใฟใใใผใฟใใฌใผใ ใไฝๆใใใใพใใๅ
้ ญ3ไปถใ่กจ็คบใใใๆญฃใใใจใใพใใฆใใใใจใ็ขบ่ชใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏCSV๏ผใซใณใๅบๅใ๏ผ\n> - ใใใๆใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.read_csv('../data/P_df_product_full_UTF-8_header.csv',\n dtype={'category_major_cd':str,\n 'category_medium_cd':str,\n 'category_small_cd':str},\n encoding='UTF-8')\ndf_tmp.head(3)",
"_____no_output_____"
]
],
[
[
"---\n> P-098: ๅ
ใซไฝๆใใไปฅไธๅฝขๅผใฎใใกใคใซใ่ชญใฟ่พผใฟใใใผใฟใใฌใผใ ใไฝๆใใใใพใใๅ
้ ญ3ไปถใ่กจ็คบใใใๆญฃใใใจใใพใใฆใใใใจใ็ขบ่ชใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏCSV๏ผใซใณใๅบๅใ๏ผ\n> - ใใใ็กใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.read_csv('../data/P_df_product_full_UTF-8_noh.csv',\n dtype={1:str,\n 2:str,\n 3:str},\n encoding='UTF-8', header=None)\ndf_tmp.head(3)",
"_____no_output_____"
]
],
[
[
"---\n> P-099: ๅ
ใซไฝๆใใใซใใดใชๅไปใๅๅใใผใฟใไปฅไธใฎไปๆงใงใใกใคใซๅบๅใใใใชใใๅบๅๅ
ใฎใในใฏdata้
ไธใจใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏTSV๏ผใฟใๅบๅใ๏ผ\n> - ใใใๆใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"df_product_full.to_csv('../data/P_df_product_full_UTF-8_header.tsv', \n sep='\\t', encoding='UTF-8', index=False)",
"_____no_output_____"
]
],
[
[
"---\n> P-100: ๅ
ใซไฝๆใใไปฅไธๅฝขๅผใฎใใกใคใซใ่ชญใฟ่พผใฟใใใผใฟใใฌใผใ ใไฝๆใใใใพใใๅ
้ ญ3ไปถใ่กจ็คบใใใๆญฃใใใจใใพใใฆใใใใจใ็ขบ่ชใใใ\n>\n> - ใใกใคใซๅฝขๅผใฏTSV๏ผใฟใๅบๅใ๏ผ\n> - ใใใๆใ\n> - ๆๅญใณใผใใฏUTF-8",
"_____no_output_____"
]
],
[
[
"df_tmp = pd.read_table('../data/P_df_product_full_UTF-8_header.tsv', \n dtype={'category_major_cd':str,\n 'category_medium_cd':str,\n 'category_small_cd':str},\n encoding='UTF-8')\ndf_tmp.head(3)",
"_____no_output_____"
]
],
[
[
"# ใใใง๏ผ๏ผ๏ผๆฌ็ตใใใงใใใใคใใใใพใงใใ๏ผ",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbf8c4f1352a9feba2d8423ad37c8e591604bb75
| 2,483 |
ipynb
|
Jupyter Notebook
|
notebooks/bootstrap.ipynb
|
kimito/jetbox
|
c765d51e6376f597aeea0cf878bced92d734d022
|
[
"Apache-2.0"
] | 2 |
2021-06-20T09:41:38.000Z
|
2021-07-16T06:11:53.000Z
|
notebooks/bootstrap.ipynb
|
kimito/lunchjet
|
c765d51e6376f597aeea0cf878bced92d734d022
|
[
"Apache-2.0"
] | null | null | null |
notebooks/bootstrap.ipynb
|
kimito/lunchjet
|
c765d51e6376f597aeea0cf878bced92d734d022
|
[
"Apache-2.0"
] | null | null | null | 2,483 | 2,483 | 0.701168 |
[
[
[
"from google.colab import drive\ndrive.mount('/content/drive')",
"Mounted at /content/drive\n"
],
[
"%cd /content/drive/MyDrive/lunchbox/\n%ls",
"/content/drive/MyDrive/lunchbox\n\u001b[0m\u001b[01;34msrc\u001b[0m/ train_2021_05_22_12_03_59.tar.gz\n"
],
[
"!mkdir src && cd src && git clone https://github.com/kimito/lunchjet.git",
"mkdir: cannot create directory โsrcโ: File exists\n"
],
[
"%cd src/lunchjet/notebooks/\n%ls",
"/content/drive/MyDrive/lunchbox/src/lunchjet/notebooks\nbootstrap.ipynb\n"
],
[
"",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8df7ec951d69de2cc75a84c5b00a6da50ea05
| 48,251 |
ipynb
|
Jupyter Notebook
|
plots/multi-conlinearity.ipynb
|
WMDA/ENIGpy
|
687dd096bedbc7859e611e74742fc498482c6680
|
[
"MIT"
] | null | null | null |
plots/multi-conlinearity.ipynb
|
WMDA/ENIGpy
|
687dd096bedbc7859e611e74742fc498482c6680
|
[
"MIT"
] | null | null | null |
plots/multi-conlinearity.ipynb
|
WMDA/ENIGpy
|
687dd096bedbc7859e611e74742fc498482c6680
|
[
"MIT"
] | null | null | null | 241.255 | 25,358 | 0.926696 |
[
[
[
"import nibabel as nb\nimport numpy as np\nimport pandas as pd\nfrom statsmodels.stats.outliers_influence import variance_inflation_factor\nimport os\nimport seaborn as sns\nimport matplotlib.pyplot as plt \nfrom decouple import config",
"_____no_output_____"
]
],
[
[
"Get enviornmental variables and change to data folder",
"_____no_output_____"
]
],
[
[
"data = config('data')\nos.chdir(data) ",
"_____no_output_____"
]
],
[
[
"load the eres.mgh file and the behavioural results csv",
"_____no_output_____"
]
],
[
[
"img = nb.load('eres.mgh')\ndf = pd.read_csv('behavioural_results.csv')",
"_____no_output_____"
]
],
[
[
"Extract the data from the eres.mgh and create dataframe",
"_____no_output_____"
]
],
[
[
"data = img.get_fdata()\nvoxel_data = data[10000].flatten()\nvoxel_data = np.array(voxel_data,dtype=float) \nvoxel_df = pd.DataFrame(voxel_data)",
"_____no_output_____"
]
],
[
[
"Build a dataframe to create the correlation matricies",
"_____no_output_____"
]
],
[
[
"model = pd.concat([df[['G-Number','age_adjusted_group','Age']],voxel_df],axis=1).rename(columns={0:'voxel'})\ngroups = pd.get_dummies(model['age_adjusted_group'])\nmodel = pd.concat([model,groups],axis=1)\nX_age = model[['voxel','AAN','HC','WR','Age']] \nX_minage = model[['voxel','AAN','HC','WR']] ",
"_____no_output_____"
]
],
[
[
"Plot heatmaps",
"_____no_output_____"
]
],
[
[
"plt.suptitle('Heatmap with age in model')\nsns.heatmap(X_age.corr(), annot=True)",
"_____no_output_____"
],
[
"plt.suptitle('Heatmap without age in model')\nsns.heatmap(X_minage.corr(), annot=True)",
"_____no_output_____"
]
]
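variance_inflation_factor is imported at the top of this notebook but never called; below is a sketch of how it could put numbers on the collinearity that the heatmaps display, using the X_age design matrix built above (adding an explicit intercept column first is often recommended):

```python
# Sketch: variance inflation factor for each column of X_age.
vif = pd.DataFrame({
    'feature': X_age.columns,
    'VIF': [variance_inflation_factor(X_age.values, i)
            for i in range(X_age.shape[1])]
})
print(vif)
```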
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf8dfed2fc2c43f724f001b278304d73f9a1552
| 4,720 |
ipynb
|
Jupyter Notebook
|
supervised/uber-ludwig.ipynb
|
archydeberker/armoury
|
cdea51e6faf4659b97aa631cdcec80df7052d79b
|
[
"MIT"
] | 3 |
2018-09-26T03:00:25.000Z
|
2020-02-09T15:55:29.000Z
|
supervised/uber-ludwig.ipynb
|
archydeberker/armoury
|
cdea51e6faf4659b97aa631cdcec80df7052d79b
|
[
"MIT"
] | null | null | null |
supervised/uber-ludwig.ipynb
|
archydeberker/armoury
|
cdea51e6faf4659b97aa631cdcec80df7052d79b
|
[
"MIT"
] | null | null | null | 25.106383 | 260 | 0.588771 |
[
[
[
"# Ludwig from Uber\n[Ludwig](https://eng.uber.com/introducing-ludwig/) is a \"code-free\" system for training and deploying simple ML models when your data is in a tabular format. They have the [fanciest Github pages](https://uber.github.io/ludwig/) I've ever seen.\n\nThey also provide a Python API, but my experience playing with it in this notebook was not great. Better to try the CLI interface they advocate.",
"_____no_output_____"
],
[
"### Titanic example",
"_____no_output_____"
]
],
[
[
"import yaml\nimport pandas as pd\nimport numpy as np\n\nfrom ludwig.api import LudwigModel",
"_____no_output_____"
]
],
[
[
"If you use the CLI Ludwig will do your splits for you, but I'm not entirely sure how they persist the splits over calls to `ludwig train` and `ludwig predict`. \n\nSince we're using the Python API I think we have to do the splits ourselves.",
"_____no_output_____"
]
],
[
[
"titanic_df",
"_____no_output_____"
],
[
"titanic_df = pd.read_csv('http://biostat.mc.vanderbilt.edu/wiki/pub/Main/DataSets/titanic3.csv')\n\ntrain_idx = np.random.randint(0, len(titanic_df), np.int(0.8*len(titanic_df)))\ntrain_df = titanic_df.iloc[train_idx]\ntest_df = titanic_df.iloc[~titanic_df.index.isin(train_idx)]\nassert len(set(train_df.index).intersection(set(test_df.index))) is 0",
"_____no_output_____"
]
],
[
[
"Use the model definition from [the docs](https://uber.github.io/ludwig/examples/#kaggles-titanic-predicting-survivors), although I converted the feature names to lowercase to match this csv.\n\nThe model definition defines and *types* th input features,",
"_____no_output_____"
]
],
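The contents of titanic-model-def.yaml are not shown in this notebook. As a rough sketch of the documented Ludwig definition format, the loaded dict might look like the following; the features listed here are an illustrative subset of the titanic3.csv columns, not the file's actual contents:

```python
# Hypothetical model definition, equivalent in shape to the YAML file.
model_definition = {
    "input_features": [
        {"name": "pclass", "type": "category"},
        {"name": "sex", "type": "category"},
        {"name": "age", "type": "numerical"},
    ],
    "output_features": [
        {"name": "survived", "type": "binary"},
    ],
}
```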
[
[
"model_definition = yaml.safe_load(open('./titanic-model-def.yaml'))\nmodel_definition",
"_____no_output_____"
],
[
"# train a model\nmodel = LudwigModel(model_definition)\ntrain_stats = model.train(train_df)\n\n# obtain predictions\npredictions = model.predict(test_df)",
"_____no_output_____"
],
[
"predictions",
"_____no_output_____"
],
[
"print(\"Accuracy was \", sum(test_df.survived.values == predictions.survived_predictions.values) / len(predictions))",
"_____no_output_____"
]
],
[
[
"# Visualization\nThis training run cretated a folder called `results` which we can now interrogate. The interaction with the visualization tools seems a bit clunky via python. It's also really not clear where the different `experiment runs` come from and how they differ!",
"_____no_output_____"
]
],
[
[
"!pip install seaborn",
"_____no_output_____"
],
[
"from ludwig import visualize",
"_____no_output_____"
],
[
"visualize.learning_curves_cli(training_statistics=['./results/api_experiment_run_2/training_statistics.json'],\n output_feature_name='survived')",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbf8e6c4f0109658b32962f0e75b2751cb2198cf
| 5,474 |
ipynb
|
Jupyter Notebook
|
aiml40/absa/notebooks/setup.ipynb
|
nswitanek/ignite-learning-paths-training-aiml
|
018968f413f6c1aa11230c802f785fdaea54e480
|
[
"CC-BY-4.0",
"MIT"
] | 203 |
2019-10-07T10:44:09.000Z
|
2021-11-08T09:21:17.000Z
|
aiml40/absa/notebooks/setup.ipynb
|
nswitanek/ignite-learning-paths-training-aiml
|
018968f413f6c1aa11230c802f785fdaea54e480
|
[
"CC-BY-4.0",
"MIT"
] | 53 |
2019-10-08T15:15:04.000Z
|
2020-11-23T16:29:39.000Z
|
aiml40/absa/notebooks/setup.ipynb
|
nswitanek/ignite-learning-paths-training-aiml
|
018968f413f6c1aa11230c802f785fdaea54e480
|
[
"CC-BY-4.0",
"MIT"
] | 210 |
2019-10-04T14:41:49.000Z
|
2021-11-04T23:05:22.000Z
| 28.962963 | 132 | 0.584399 |
[
[
[
"# Setup the ABSA Demo",
"_____no_output_____"
],
[
"### Step 1 - Install aditional pip packages on your Compute instance",
"_____no_output_____"
]
],
[
[
"!pip install git+https://github.com/hnky/nlp-architect.git@absa",
"_____no_output_____"
],
[
"!pip install spacy==2.1.8",
"_____no_output_____"
]
],
[
[
"### Step 2 - Download Notebooks, Training Data, Training / Inference scripts",
"_____no_output_____"
]
],
[
[
"import azureml\nfrom azureml.core import Workspace, Datastore, Experiment, Environment, Model\nimport urllib.request\nfrom pathlib import Path",
"_____no_output_____"
],
[
"# This will open an device login prompt. Login with your credentials that have access to the workspace.\n\n# Connect to the workspace\nws = Workspace.from_config()\nprint(\"Using workspace:\",ws.name,\"in region\", ws.location)\n\n# Connect to the default datastore\nds = ws.get_default_datastore()\nprint(\"Datastore:\",ds.name)",
"_____no_output_____"
],
[
"# Create directories\nPath(\"dataset\").mkdir(parents=True, exist_ok=True)\nPath(\"notebooks\").mkdir(parents=True, exist_ok=True)\nPath(\"scripts\").mkdir(parents=True, exist_ok=True)\nPath(\"temp\").mkdir(parents=True, exist_ok=True)",
"_____no_output_____"
]
],
[
[
"The cell below will take some time to run as it is downloading a large dataset plus code files. Please allow around 10-15 mins",
"_____no_output_____"
]
],
[
[
"# Download all files needed\nbase_link = \"https://raw.githubusercontent.com/microsoft/ignite-learning-paths-training-aiml/main/aiml40/absa/\"\n\n# Download Data \nif not Path(\"dataset/glove.840B.300d.zip\").is_file():\n urllib.request.urlretrieve('http://nlp.stanford.edu/data/glove.840B.300d.zip', 'dataset/glove.840B.300d.zip')\n\nurllib.request.urlretrieve(base_link+'../dataset/clothing_absa_train.csv', 'dataset/clothing_absa_train.csv')\nurllib.request.urlretrieve(base_link+'../dataset/clothing-absa-validation.json', 'dataset/clothing-absa-validation.json')\nurllib.request.urlretrieve(base_link+'../dataset/clothing_absa_train_small.csv', 'dataset/clothing_absa_train_small.csv')\n\n# Download Notebooks\nurllib.request.urlretrieve(base_link+'notebooks/absa-hyperdrive.ipynb', 'notebooks/absa-hyperdrive.ipynb')\nurllib.request.urlretrieve(base_link+'notebooks/absa.ipynb', 'notebooks/absa.ipynb')\n\n# Download Scripts \nurllib.request.urlretrieve(base_link+'scripts/score.py', 'scripts/score.py')\nurllib.request.urlretrieve(base_link+'scripts/train.py', 'scripts/train.py')\n",
"_____no_output_____"
],
[
"# Upload data to the data store\nds.upload('dataset', target_path='clothing_data', overwrite=False, show_progress=True)",
"_____no_output_____"
],
[
"### Step 3 - Setup AMLS\nfrom azureml.core.compute import ComputeTarget, AmlCompute\nfrom azureml.core.compute_target import ComputeTargetException\n\ncluster_name = \"absa-cluster\"\n\ntry:\n cluster = ComputeTarget(workspace=ws, name=cluster_name)\n print('Using compute cluster:', cluster_name)\nexcept ComputeTargetException:\n compute_config = AmlCompute.provisioning_configuration(vm_size='STANDARD_D3_V2',\n vm_priority='lowpriority',\n min_nodes=0,\n max_nodes=8)\n cluster = ComputeTarget.create(ws, cluster_name, compute_config)\n cluster.wait_for_completion(show_output=True)\n",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbf8efdc7b444999b3c9571fd57feac36128da36
| 11,906 |
ipynb
|
Jupyter Notebook
|
notebooks/02f_pds.utils.ipynb
|
michaelaye/nbplanetary
|
3798cd00c654b6e80fe816ed30edea5cfa3b9330
|
[
"Apache-2.0"
] | 1 |
2022-02-22T19:57:50.000Z
|
2022-02-22T19:57:50.000Z
|
notebooks/02f_pds.utils.ipynb
|
michaelaye/nbplanetary
|
3798cd00c654b6e80fe816ed30edea5cfa3b9330
|
[
"Apache-2.0"
] | 28 |
2021-08-14T21:24:05.000Z
|
2022-03-24T17:48:50.000Z
|
notebooks/02f_pds.utils.ipynb
|
michaelaye/nbplanetary
|
3798cd00c654b6e80fe816ed30edea5cfa3b9330
|
[
"Apache-2.0"
] | null | null | null | 34.017143 | 122 | 0.501932 |
[
[
[
"# default_exp pds.utils\n# default_cls_lvl 3",
"_____no_output_____"
]
],
[
[
"# PDS Utils\n> Utilities used by the `pds` sub-package.",
"_____no_output_____"
]
],
[
[
"# hide\nfrom nbverbose.showdoc import show_doc # noqa",
"_____no_output_____"
],
[
"# export\nfrom typing import Union\n\nfrom fastcore.utils import Path\n\nimport pandas as pd\nimport pvl\nfrom planetarypy import utils",
"_____no_output_____"
],
[
"# export\nclass IndexLabel:\n \"Support working with label files of PDS Index tables.\"\n\n def __init__(\n self,\n # Path to the labelfile for a PDS Indexfile.\n # The actual table should reside in the same folder to be automatically parsed\n # when calling the `read_index_data` method.\n labelpath: Union[str, Path],\n ):\n self.path = Path(labelpath)\n \"search for table name pointer and store key and fpath.\"\n tuple = [i for i in self.pvl_lbl if i[0].startswith(\"^\")][0]\n self.tablename = tuple[0][1:]\n self.index_name = tuple[1]\n\n @property\n def index_path(self):\n p = self.path.parent / self.index_name\n if not p.exists():\n import warnings\n\n warnings.warn(\n \"Fudging path name to lower case, opposing label value. (PDS data inconsistency)\"\n )\n p = self.path.parent / self.index_name.lower()\n if not p.exists():\n warnings.warn(\"`index_path` still doesn't exist.\")\n return p\n\n @property\n def pvl_lbl(self):\n return pvl.load(str(self.path))\n\n @property\n def table(self):\n return self.pvl_lbl[self.tablename]\n\n @property\n def pvl_columns(self):\n return self.table.getlist(\"COLUMN\")\n\n @property\n def columns_dic(self):\n return {col[\"NAME\"]: col for col in self.pvl_columns}\n\n @property\n def colnames(self):\n \"\"\"Read the columns in an PDS index label file.\n\n The label file for the PDS indices describes the content\n of the index files.\n \"\"\"\n colnames = []\n for col in self.pvl_columns:\n colnames.extend(PVLColumn(col).name_as_list)\n return colnames\n\n @property\n def colspecs(self):\n colspecs = []\n columns = self.table.getlist(\"COLUMN\")\n for column in columns:\n pvlcol = PVLColumn(column)\n if pvlcol.items is None:\n colspecs.append(pvlcol.colspecs)\n else:\n colspecs.extend(pvlcol.colspecs)\n return colspecs\n\n def read_index_data(self, convert_times=True):\n return index_to_df(self.index_path, self, convert_times=convert_times)",
"_____no_output_____"
],
[
"# export\ndef index_to_df(\n # Path to the index TAB file\n indexpath: Union[str, Path],\n # Label object that has both the column names and the columns widths as attributes\n # 'colnames' and 'colspecs'\n label: IndexLabel,\n # Switch to control if to convert columns with \"TIME\" in name (unless COUNT is as well in name) to datetime\n convert_times=True,\n):\n \"\"\"The main reader function for PDS Indexfiles.\n\n In conjunction with an IndexLabel object that figures out the column widths,\n this reader should work for all PDS TAB files.\n \"\"\"\n indexpath = Path(indexpath)\n df = pd.read_fwf(\n indexpath, header=None, names=label.colnames, colspecs=label.colspecs\n )\n if convert_times:\n for column in [col for col in df.columns if \"TIME\" in col]:\n if column in [\"LOCAL_TIME\", \"DWELL_TIME\"]:\n continue\n try:\n df[column] = pd.to_datetime(df[column])\n except ValueError:\n df[column] = pd.to_datetime(\n df[column], format=utils.nasa_dt_format_with_ms, errors=\"coerce\"\n )\n except KeyError:\n raise KeyError(f\"{column} not in {df.columns}\")\n print(\"Done.\")\n return df",
"_____no_output_____"
],
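A sketch of how IndexLabel and index_to_df are meant to be used together; the label path here is hypothetical:

```python
# Sketch: parse a PDS index label, then load the TAB file it points to.
label = IndexLabel("/data/hirise/EDRCUMINDEX.LBL")  # hypothetical path
df = label.read_index_data()  # expects the TAB file next to the label
print(df.columns[:5])
```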
[
"# export\nclass PVLColumn:\n \"Manages just one of the columns in a table that is described via PVL.\"\n\n def __init__(self, pvlobj):\n self.pvlobj = pvlobj\n\n @property\n def name(self):\n return self.pvlobj[\"NAME\"]\n\n @property\n def name_as_list(self):\n \"needs to return a list for consistency for cases when it's an array.\"\n if self.items is None:\n return [self.name]\n else:\n return [self.name + \"_\" + str(i + 1) for i in range(self.items)]\n\n @property\n def start(self):\n \"Decrease by one as Python is 0-indexed.\"\n return self.pvlobj[\"START_BYTE\"] - 1\n\n @property\n def stop(self):\n return self.start + self.pvlobj[\"BYTES\"]\n\n @property\n def items(self):\n return self.pvlobj.get(\"ITEMS\")\n\n @property\n def item_bytes(self):\n return self.pvlobj.get(\"ITEM_BYTES\")\n\n @property\n def item_offset(self):\n return self.pvlobj.get(\"ITEM_OFFSET\")\n\n @property\n def colspecs(self):\n if self.items is None:\n return (self.start, self.stop)\n else:\n i = 0\n bucket = []\n for _ in range(self.items):\n off = self.start + self.item_offset * i\n bucket.append((off, off + self.item_bytes))\n i += 1\n return bucket\n\n def decode(self, linedata):\n if self.items is None:\n start, stop = self.colspecs\n return linedata[start:stop]\n else:\n bucket = []\n for (start, stop) in self.colspecs:\n bucket.append(linedata[start:stop])\n return bucket\n\n def __repr__(self):\n return self.pvlobj.__repr__()",
"_____no_output_____"
],
[
"# export\ndef decode_line(\n linedata: str, # One line of a .tab data file\n labelpath: Union[\n str, Path\n ], # Path to the appropriate label that describes the data.\n):\n \"Decode one line of tabbed data with the appropriate label file.\"\n label = IndexLabel(labelpath)\n for column in label.pvl_columns:\n pvlcol = PVLColumn(column)\n print(pvlcol.name, pvlcol.decode(linedata))",
"_____no_output_____"
],
[
"# export\ndef find_mixed_type_cols(\n # Dataframe to be searched for mixed data-types\n df: pd.DataFrame,\n # Switch to control if NaN values in these problem columns should be replaced by the string 'UNKNOWN'\n fix: bool = True,\n) -> list: # List of column names that have data type changes within themselves.\n \"\"\"For a given dataframe, find the columns that are of mixed type.\n\n Tool to help with the performance warning when trying to save a pandas DataFrame as a HDF.\n When a column changes datatype somewhere, pickling occurs, slowing down the reading process of the HDF file.\n \"\"\"\n result = []\n for col in df.columns:\n weird = (df[[col]].applymap(type) != df[[col]].iloc[0].apply(type)).any(axis=1)\n if len(df[weird]) > 0:\n print(col)\n result.append(col)\n if fix is True:\n for col in result:\n df[col].fillna(\"UNKNOWN\", inplace=True)\n return result",
"_____no_output_____"
],
[
"# export\ndef fix_hirise_edrcumindex(\n infname: Union[str, Path], # Path to broken EDRCUMINDEX.TAB\n outfname: Union[str, Path], # Path where to store the fixed TAB file\n):\n \"\"\"Fix HiRISE EDRCUMINDEX.\n\n The HiRISE EDRCUMINDEX has some broken lines where the SCAN_EXPOSURE_DURATION is of format\n F10.4 instead of the defined F9.4.\n This function simply replaces those incidences with one less decimal fraction, so 20000.0000\n becomes 20000.000.\n \"\"\"\n with open(str(infname)) as f:\n with open(str(outfname, \"w\")) as newf:\n for line in tqdm(f):\n exp = line.split(\",\")[21]\n if float(exp) > 9999.999:\n # catching the return of write into dummy variable\n _ = newf.write(line.replace(exp, exp[:9]))\n else:\n _ = newf.write(line)",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf8f49bd3f1fe725b8efecd4bb3c7722c4359ec
| 1,861 |
ipynb
|
Jupyter Notebook
|
Recitation_UMITEN.ipynb
|
aaronbened/Numerical-Methods-58011
|
673ceda611de2860b5cd2b2a6d0266f600a8fd7f
|
[
"Apache-2.0"
] | null | null | null |
Recitation_UMITEN.ipynb
|
aaronbened/Numerical-Methods-58011
|
673ceda611de2860b5cd2b2a6d0266f600a8fd7f
|
[
"Apache-2.0"
] | null | null | null |
Recitation_UMITEN.ipynb
|
aaronbened/Numerical-Methods-58011
|
673ceda611de2860b5cd2b2a6d0266f600a8fd7f
|
[
"Apache-2.0"
] | null | null | null | 25.493151 | 248 | 0.460505 |
[
[
[
"<a href=\"https://colab.research.google.com/github/aaronbened/Numerical-Methods-58011/blob/main/Recitation_UMITEN.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
],
[
"#Trapezoidal Rule",
"_____no_output_____"
]
],
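For reference, the cell below implements the composite trapezoidal rule with step width $h = (b-a)/n$:

$$\int_a^b f(x)\,dx \approx h\left[\frac{f(a)+f(b)}{2} + \sum_{i=1}^{n-1} f(a+ih)\right]$$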
[
[
"def f(x):\n return( 0.2 + 25*x - 200*x**2 + 675*x**3 - 900*x**4 + 400*x**5 ) #define the trigo function\na = 0\nb = 0.8\nn = 10\nh = (b-a)/n #width of the trapezoid\nS = h* (f(a)+f(b)) #beginning value of summation\nfor i in range(1,n):\n S+=f(a+i*h)\nIntegral = S*h\nprint('Integral = %0.4f' %Integral)",
"Integral = 1.6005\n"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code"
]
] |
cbf9108834c6bded4c73d1ffc9066d70281324d6
| 12,629 |
ipynb
|
Jupyter Notebook
|
Built_in_Functions/Built-in.ipynb
|
LivioAlvarenga/Built_in_Functions
|
ad28ad970cecda8e2916e49a8885b7b9e2a064d9
|
[
"MIT"
] | 1 |
2021-12-23T11:43:30.000Z
|
2021-12-23T11:43:30.000Z
|
Built_in_Functions/Built-in.ipynb
|
LivioAlvarenga/Python_3
|
ad28ad970cecda8e2916e49a8885b7b9e2a064d9
|
[
"MIT"
] | null | null | null |
Built_in_Functions/Built-in.ipynb
|
LivioAlvarenga/Python_3
|
ad28ad970cecda8e2916e49a8885b7b9e2a064d9
|
[
"MIT"
] | null | null | null | 23.783427 | 146 | 0.469 |
[
[
[
"# **Built in Functions**",
"_____no_output_____"
],
[
"# **bool()**\r\nValores vazios ou zeros sรฃo considerado False, do contrรกrio sรฃo considerados True (Truth Value Testing).\r\n\r\n\"Truth Value Testing\". Isto รฉ, decidir quando um valor รฉ considerado True ou False",
"_____no_output_____"
]
],
[
[
"print(bool(0))\r\nprint(bool(\"\"))\r\nprint(bool(None))\r\nprint(bool(1))\r\nprint(bool(-100))\r\nprint(bool(13.5))\r\nprint(bool(\"teste\"))\r\nprint(bool(True))",
"False\nFalse\nFalse\nTrue\nTrue\nTrue\nTrue\nTrue\n"
]
],
[
[
"# **f'' / .format()**",
"_____no_output_____"
]
],
[
[
"# antes da versรฃo 3.6 \r\na = ('Hello World!')\r\nprint('----> {} <----'.format(a))\r\n\r\n# ou f''\r\nprint(f'----> {a} <----')",
"----> Hello World! <----\n----> Hello World! <----\n"
],
[
"nome = 'Josรฉ'\r\nidade = 23\r\nsalario = 987.30\r\nprint(f'O {nome} tem {idade} anos e ganha R${salario:.2f}.') #Python 3.6+\r\nprint(f'O {nome:-^20} tem {idade} anos e ganha R${salario:.2f}.') #Python 3.6+\r\nprint(f'O {nome:-<20} tem {idade} anos e ganha R${salario:.2f}.') #Python 3.6+\r\nprint(f'O {nome:->20} tem {idade} anos e ganha R${salario:.2f}.') #Python 3.6+\r\nprint('O {} tem {} anos e ganha R${:.2f}.'.format(nome, idade, salario)) #Python 3\r\nprint('O %s tem %d anos.' % (nome, idade)) #Python 2",
"O Josรฉ tem 23 anos e ganha R$987.30.\nO --------Josรฉ-------- tem 23 anos e ganha R$987.30.\nO Josรฉ---------------- tem 23 anos e ganha R$987.30.\nO ----------------Josรฉ tem 23 anos e ganha R$987.30.\nO Josรฉ tem 23 anos e ganha R$987.30.\nO Josรฉ tem 23 anos.\n"
],
[
"# formataรงรฃo de traรงos + casas decimais\r\nvlr = 120\r\nvlr2 = 10.1\r\nprint(f'{vlr:->20.2f}')\r\nprint(f'{vlr2:-^20.2f}')\r\nprint(f'R${vlr:5.2f}')\r\nprint(f'R${vlr:6.2f}')\r\nprint(f'R${vlr:7.2f}')\r\nprint(f'R${vlr:8.2f}')\r\nprint(f'R${vlr:08.2f}')\r\nprint(f'R${vlr:010.4f}') # f para float\r\nprint(f'R${vlr:010d}')# d para inteiros\r\nprint(f'R${vlr:04d}')\r\nprint(f'R${vlr:4d}')",
"--------------120.00\n-------10.10--------\nR$120.00\nR$120.00\nR$ 120.00\nR$ 120.00\nR$00120.00\nR$00120.0000\nR$0000000120\nR$0120\nR$ 120\n"
],
[
"n1, n2, n3, n4 = 100, 1, 00.758, 15.77\r\n\r\nprint(f'n1 = {n1:6}\\nn1 = {n1:06}') # total de casas = 6 com opรงรฃo de colocar ou nรฃo zero\r\nprint(f'n2 = {n2:06}')\r\nprint(f'n2 = {n2: 6}')\r\nprint(f'n3 = {n3:06.3f}') # variavel + ':' + total de casas decimais + '.' + casas decimais a direita da ','\r\nprint(f'n4 = {n4:06.3f} ou {n4:.2f}')",
"n1 = 100\nn1 = 000100\nn2 = 000001\nn2 = 1\nn3 = 00.758\nn4 = 15.770 ou 15.77\n"
],
[
"# formataรงรฃo com tab \\t\r\nfor c in range(0,5):\r\n print(f'O {c}ยบ valor recebido รฉ \\t R$1000,00')\r\n print('Agora sem o tab')\r\n print(f'O {c}ยบ valor recebido รฉ R$1000,00')\r\n print('-' * 35)",
"O 0ยบ valor recebido รฉ \t R$1000,00\nAgora sem o tab\nO 0ยบ valor recebido รฉ R$1000,00\n-----------------------------------\nO 1ยบ valor recebido รฉ \t R$1000,00\nAgora sem o tab\nO 1ยบ valor recebido รฉ R$1000,00\n-----------------------------------\nO 2ยบ valor recebido รฉ \t R$1000,00\nAgora sem o tab\nO 2ยบ valor recebido รฉ R$1000,00\n-----------------------------------\nO 3ยบ valor recebido รฉ \t R$1000,00\nAgora sem o tab\nO 3ยบ valor recebido รฉ R$1000,00\n-----------------------------------\nO 4ยบ valor recebido รฉ \t R$1000,00\nAgora sem o tab\nO 4ยบ valor recebido รฉ R$1000,00\n-----------------------------------\n"
]
],
[
[
"# **.find() .rfind**",
"_____no_output_____"
]
],
[
[
"frase = ' Curso em Vรญdeo Python '\r\nprint(frase.find('Curso'))\r\nprint('A letra \"o\" aparece a ultima vez na posiรงรฃo {}.'.format(frase.lower().rfind('o')+1))",
"2\nA letra \"o\" aparece a ultima vez na posiรงรฃo 22.\n"
]
],
[
[
"# **print()**\r\nvalue รฉ o valor que queremos imprimir, as reticรชncias indicam que a funรงรฃo pode receber mais de um valor, basta separรก-los por vรญrgula.\r\n\r\nsep รฉ o separador entre os valores, por padrรฃo o separador รฉ um espaรงo em branco.\r\n\r\nend รฉ o que acontecerรก ao final da funรงรฃo, por padrรฃo hรก uma quebra de linha, uma nova linha (\\n). ",
"_____no_output_____"
],
[
"## fomatando o print",
"_____no_output_____"
]
],
[
[
"nome = 'Livio Alvarenga'\r\nprint(f'Prazer em te conhecer\\n{nome}!') #\\n executa um enter\r\nprint(f'Prazer em te conhecer {nome:20}!')\r\nprint(f'Prazer em te conhecer {nome:>20}!')\r\nprint(f'Prazer em te conhecer {nome:<20}!')\r\nprint(f'Prazer em te conhecer {nome:^20}!')\r\nprint(f'Prazer em te conhecer {nome:=^21}!')",
"Prazer em te conhecer\nLivio Alvarenga!\nPrazer em te conhecer Livio Alvarenga !\nPrazer em te conhecer Livio Alvarenga!\nPrazer em te conhecer Livio Alvarenga !\nPrazer em te conhecer Livio Alvarenga !\nPrazer em te conhecer ===Livio Alvarenga===!\n"
],
[
"print(f'{\"FIM DO PROGRAMA\":-^30}')\r\nprint(f'{\"FIM DO PROGRAMA\":^30}')",
"-------FIM DO PROGRAMA--------\n FIM DO PROGRAMA \n"
],
[
"frase = ' Curso em Vรญdeo Python '\r\nprint(frase[3])\r\nprint(frase[:3])\r\nprint(frase[3:])\r\nprint(frase[0:10:2])\r\nprint(\"\"\"imprimindo um texto longo!!! imprimindo um texto longo!!!\r\nimprimindo um texto longo!!! imprimindo um texto longo!!!\r\nimprimindo um texto longo!!! imprimindo um texto longo!!!\"\"\")",
"u\n C\nurso em Vรญdeo Python \n Croe\nimprimindo um texto longo!!! imprimindo um texto longo!!!\nimprimindo um texto longo!!! imprimindo um texto longo!!!\nimprimindo um texto longo!!! imprimindo um texto longo!!!\n"
]
],
[
[
"## print sep e end",
"_____no_output_____"
]
],
[
[
"# print com end\r\nt1 = 't1'\r\nt2 = 't2'\r\nt3 = 't3'\r\nprint('{} --> {}'.format(t1, t2), end='')\r\nprint(f' --> {t3}', end='')\r\nprint(' --> FIM')",
"t1 --> t2 --> t3 --> FIM\n"
],
[
"print(\"Brasil\", \"ganhou\", 5, \"titulos mundiais\", sep=\"-\")",
"Brasil-ganhou-5-titulos mundiais\n"
]
],
[
[
"## Imprimindo com pprint( )",
"_____no_output_____"
]
],
[
[
"from pprint import pprint\r\n\r\n# ! Imprimindo com pprint + width\r\n\r\ncliente = {'nome': 'Livio', 'Idade': 40, 'Cidade': 'Belo Horizonte'}\r\n\r\npprint(cliente, width=40)",
"{'Cidade': 'Belo Horizonte',\n 'Idade': 40,\n 'nome': 'Livio'}\n"
]
],
[
[
"# **round()**",
"_____no_output_____"
]
],
[
[
"# Retorna o valor com arredondamento\r\nround(3.14151922,2)",
"_____no_output_____"
]
],
[
[
"# os.path.isdir\r\n\r\nEste mรฉtodo vai nos retornar um booleano, True ou False, que vai dizer se o diretรณrio existe ou nรฃo",
"_____no_output_____"
]
],
[
[
"from os.path import isdir\r\n\r\ndiretorio = \"c:\\\\\"\r\n\r\nif isdir(diretorio):\r\n print(f\"O diretรณrio {diretorio} existe!\")\r\nelse:\r\n print(\"O diretรณrio nรฃo existe!\")",
"O diretรณrio c:\\ existe!\n"
],
[
"diretorio = \"xx:\\\\\"\r\n\r\nif isdir(diretorio):\r\n print(f\"O diretรณrio {diretorio} existe!\")\r\nelse:\r\n print(\"O diretรณrio nรฃo existe!\")",
"O diretรณrio nรฃo existe!\n"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf91800a06e2175a624eb591c017f7143427b7a
| 7,093 |
ipynb
|
Jupyter Notebook
|
test/notebooks/validate_gf_eia923.ipynb
|
erictleung/pudl
|
32bfbf3a959114f766b630f5b873a93b7a930c71
|
[
"MIT"
] | null | null | null |
test/notebooks/validate_gf_eia923.ipynb
|
erictleung/pudl
|
32bfbf3a959114f766b630f5b873a93b7a930c71
|
[
"MIT"
] | null | null | null |
test/notebooks/validate_gf_eia923.ipynb
|
erictleung/pudl
|
32bfbf3a959114f766b630f5b873a93b7a930c71
|
[
"MIT"
] | null | null | null | 27.925197 | 363 | 0.619907 |
[
[
[
"# Validation of gf_eia923\nThis notebook runs sanity checks on the Generation Fuel data that are reported in EIA Form 923. These are the same tests which are run by the gf_eia923 validation tests by PyTest. The notebook and visualizations are meant to be used as a diagnostic tool, to help understand what's wrong when the PyTest based data validations fail for some reason.",
"_____no_output_____"
]
],
[
[
"%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"import sys\nimport pandas as pd\nimport sqlalchemy as sa\nimport pudl",
"_____no_output_____"
],
[
"import warnings\nimport logging\nlogger = logging.getLogger()\nlogger.setLevel(logging.INFO)\nhandler = logging.StreamHandler(stream=sys.stdout)\nformatter = logging.Formatter('%(message)s')\nhandler.setFormatter(formatter)\nlogger.handlers = [handler]",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt\nimport matplotlib as mpl\n%matplotlib inline",
"_____no_output_____"
],
[
"plt.style.use('ggplot')\nmpl.rcParams['figure.figsize'] = (10,4)\nmpl.rcParams['figure.dpi'] = 150\npd.options.display.max_columns = 56",
"_____no_output_____"
],
[
"pudl_settings = pudl.workspace.setup.get_defaults()\nferc1_engine = sa.create_engine(pudl_settings['ferc1_db'])\npudl_engine = sa.create_engine(pudl_settings['pudl_db'])\npudl_settings",
"_____no_output_____"
]
],
[
[
"## Get the original EIA 923 data\nFirst we pull the original (post-ETL) EIA 923 data out of the database. We will use the values in this dataset as a baseline for checking that latter aggregated data and derived values remain valid. We will also eyeball these values here to make sure they are within the expected range. This may take a minute or two depending on the speed of your machine.",
"_____no_output_____"
]
],
[
[
"pudl_out_orig = pudl.output.pudltabl.PudlTabl(pudl_engine, freq=None)\ngf_eia923_orig = pudl_out_orig.gf_eia923()",
"_____no_output_____"
]
],
[
[
"# Validation Against Fixed Bounds \nSome of the variables reported in this table have a fixed range of reasonable values, like the heat content per unit of a given fuel type. These varaibles can be tested for validity against external standards directly. In general we have two kinds of tests in this section:\n* **Tails:** are the exteme values too extreme? Typically, this is at the 5% and 95% level, but depending on the distribution, sometimes other thresholds are used.\n* **Middle:** Is the central value of the distribution where it should be?\n\n### Fields that need checking:\nThese are all contained in the `frc_eia923` table data validations, and those should just be re-used if possible. Ugh, names not all the same though. Annoying.\n* `fuel_mmbtu_per_unit` (BIT, SUB, LIG, coal, DFO, oil, gas)",
"_____no_output_____"
]
],
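The plotting helpers used below visualize these checks; the underlying "tails and middle" test amounts to comparing empirical quantiles against allowed bounds. A minimal sketch of that idea (the function name and thresholds here are illustrative, not PUDL's actual validation API):

```python
# Sketch: does a given quantile of a column fall inside expected bounds?
def check_quantile(df, column, quantile, low, high):
    value = df[column].quantile(quantile)
    ok = low <= value <= high
    print(f"{column} q={quantile}: {value:.2f} "
          f"({'ok' if ok else 'OUT OF BOUNDS'} vs [{low}, {high}])")
    return ok
```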
[
[
"gf_eia923_orig.sample(10)",
"_____no_output_____"
]
],
[
[
"## Coal Heat Content",
"_____no_output_____"
]
],
[
[
"pudl.validate.plot_vs_bounds(gf_eia923_orig, pudl.validate.gf_eia923_coal_heat_content)",
"_____no_output_____"
]
],
[
[
"## Oil Heat Content",
"_____no_output_____"
]
],
[
[
"pudl.validate.plot_vs_bounds(gf_eia923_orig, pudl.validate.gf_eia923_oil_heat_content)",
"_____no_output_____"
]
],
[
[
"## Gas Heat Content",
"_____no_output_____"
]
],
[
[
"pudl.validate.plot_vs_bounds(gf_eia923_orig, pudl.validate.gf_eia923_gas_heat_content)",
"_____no_output_____"
]
],
[
[
"# Validate Monthly Aggregation\nIt's possible that the distribution will change as a function of aggregation, or we might make an error in the aggregation process. These tests check that a collection of quantiles for the original and the data aggregated by month have internally consistent values.",
"_____no_output_____"
]
],
[
[
"pudl_out_month = pudl.output.pudltabl.PudlTabl(pudl_engine, freq=\"MS\")\ngf_eia923_month = pudl_out_month.gf_eia923()",
"_____no_output_____"
],
[
"pudl.validate.plot_vs_agg(gf_eia923_orig, gf_eia923_month, pudl.validate.gf_eia923_agg)",
"_____no_output_____"
]
],
[
[
"# Validate Annual Aggregation\nIt's possible that the distribution will change as a function of aggregation, or we might make an error in the aggregation process. These tests check that a collection of quantiles for the original and the data aggregated by year have internally consistent values.",
"_____no_output_____"
]
],
[
[
"pudl_out_year = pudl.output.pudltabl.PudlTabl(pudl_engine, freq=\"AS\")\ngf_eia923_year = pudl_out_year.gf_eia923()",
"_____no_output_____"
],
[
"pudl.validate.plot_vs_agg(gf_eia923_orig, gf_eia923_year, pudl.validate.gf_eia923_agg)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf9241ea62fbb7b31ffbf4313a589124aa7c3e5
| 20,749 |
ipynb
|
Jupyter Notebook
|
ex2.3.5.ipynb
|
howeverforever/Numerical-Optimization-Exp
|
3a6957f2a3ea54e150dde44fe7c5baae3cf59397
|
[
"MIT"
] | 1 |
2019-04-18T17:34:43.000Z
|
2019-04-18T17:34:43.000Z
|
ex2.3.5.ipynb
|
howeverforever/Numerical-Optimization-Exp
|
3a6957f2a3ea54e150dde44fe7c5baae3cf59397
|
[
"MIT"
] | null | null | null |
ex2.3.5.ipynb
|
howeverforever/Numerical-Optimization-Exp
|
3a6957f2a3ea54e150dde44fe7c5baae3cf59397
|
[
"MIT"
] | null | null | null | 26.001253 | 643 | 0.374331 |
[
[
[
"<h1>Table of Contents<span class=\"tocSkip\"></span></h1>\n<div class=\"toc\" style=\"margin-top: 1em;\"><ul class=\"toc-item\"><li><ul class=\"toc-item\"><li><span><a href=\"#(a)\" data-toc-modified-id=\"(a)-0.1\"><span class=\"toc-item-num\">0.1 </span>(a)</a></span></li><li><span><a href=\"#(b)\" data-toc-modified-id=\"(b)-0.2\"><span class=\"toc-item-num\">0.2 </span>(b)</a></span></li><li><span><a href=\"#(c)\" data-toc-modified-id=\"(c)-0.3\"><span class=\"toc-item-num\">0.3 </span>(c)</a></span></li><li><span><a href=\"#(d)\" data-toc-modified-id=\"(d)-0.4\"><span class=\"toc-item-num\">0.4 </span>(d)</a></span></li></ul></li></ul></div>",
"_____no_output_____"
],
[
"Use Newtonโs method to find solutions accurate to within $10^{โ4}$ for the following problems.",
"_____no_output_____"
]
],
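For reference, each Newton1D class below iterates the standard Newton update, stopping once the step size drops below TOR:

$$x_{k+1} = x_k - \frac{f(x_k)}{f'(x_k)}$$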
[
[
"import numpy as np\nfrom numpy import linalg\nfrom abc import abstractmethod\nimport pandas as pd\nimport math\n\npd.options.display.float_format = '{:,.8f}'.format\nnp.set_printoptions(suppress=True, precision=8)\n\nTOR = pow(10.0, -4)\nMAX_ITR = 150",
"_____no_output_____"
],
[
"class NewtonMethod(object):\n\n def __init__(self):\n return\n\n @abstractmethod\n def f(self, x):\n return NotImplementedError('Implement f()!')\n\n @abstractmethod\n def jacobian(self, x):\n return NotImplementedError('Implement jacobian()!')\n\n @abstractmethod\n def run(self, x):\n return NotImplementedError('Implement run()!')",
"_____no_output_____"
]
],
[
[
"## (a) \n$$x^3 โ 2x^2 โ 5 = 0, [1, 4]$$",
"_____no_output_____"
]
],
[
[
"class Newton1D(NewtonMethod):\n\n def __init__(self):\n super(NewtonMethod, self).__init__()\n\n def f(self, x):\n return pow(x, 3) - 2 * pow(x, 2) - 5\n\n def jacobian(self, x):\n return 3 * pow(x, 2) - 4 * x\n\n def run(self, x0):\n df = pd.DataFrame(columns=['f(x)'])\n row = len(df)\n x = x0\n df.loc[row] = [x]\n for k in range(MAX_ITR):\n try:\n y = x - self.f(x) / self.jacobian(x)\n except ValueError:\n break\n residual = math.fabs(x - y)\n x = y\n\n row = len(df)\n df.loc[row] = [y]\n if residual < TOR or x > 1e9:\n break\n return df",
"_____no_output_____"
],
[
"Newton1D().run(2.5).astype(np.float64)",
"_____no_output_____"
]
],
[
[
"## (b) \n$$x^3 + 3x^2 โ 1 = 0, [-3, -2]$$",
"_____no_output_____"
]
],
[
[
"class Newton1D(NewtonMethod):\n\n def __init__(self):\n super(NewtonMethod, self).__init__()\n\n def f(self, x):\n return pow(x, 3) + 3 * pow(x, 2) - 1\n\n def jacobian(self, x):\n return 3 * pow(x, 2) - 6 * x\n\n def run(self, x0):\n df = pd.DataFrame(columns=['f(x)'])\n row = len(df)\n x = x0\n df.loc[row] = [x]\n for k in range(MAX_ITR):\n try:\n y = x - self.f(x) / self.jacobian(x)\n except ValueError:\n break\n residual = math.fabs(x - y)\n x = y\n\n row = len(df)\n df.loc[row] = [y]\n if residual < TOR or x > 1e9:\n break\n return df",
"_____no_output_____"
],
[
"Newton1D().run(-2.5).astype(np.float64)",
"_____no_output_____"
]
],
[
[
"## (c) \n$$xโ\\cos x=0, [0, \\frac{\\pi}{2}]$$",
"_____no_output_____"
]
],
[
[
"class Newton1D(NewtonMethod):\n\n def __init__(self):\n super(NewtonMethod, self).__init__()\n\n def f(self, x):\n return x - math.cos(x)\n\n def jacobian(self, x):\n return 1 + math.sin(x)\n\n def run(self, x0):\n df = pd.DataFrame(columns=['f(x)'])\n row = len(df)\n x = x0\n df.loc[row] = [x]\n for k in range(MAX_ITR):\n try:\n y = x - self.f(x) / self.jacobian(x)\n except ValueError:\n break\n residual = math.fabs(x - y)\n x = y\n\n row = len(df)\n df.loc[row] = [y]\n if residual < TOR or x > 1e9:\n break\n return df",
"_____no_output_____"
],
[
"Newton1D().run(math.pi / 4.0).astype(np.float64)",
"_____no_output_____"
]
],
[
[
"## (d)\n$$x โ 0.8 โ 0.2 \\sin x = 0, [0, \\frac{\\pi}{2}]$$",
"_____no_output_____"
]
],
[
[
"class Newton1D(NewtonMethod):\n\n def __init__(self):\n super(NewtonMethod, self).__init__()\n\n def f(self, x):\n return x - 0.8 - 0.2 * math.sin(x)\n\n def jacobian(self, x):\n return 1 - 0.2 * math.cos(x)\n\n def run(self, x0):\n df = pd.DataFrame(columns=['f(x)'])\n row = len(df)\n x = x0\n df.loc[row] = [x]\n for k in range(MAX_ITR):\n try:\n y = x - self.f(x) / self.jacobian(x)\n except ValueError:\n break\n residual = math.fabs(x - y)\n x = y\n\n row = len(df)\n df.loc[row] = [y]\n if residual < TOR or x > 1e9:\n break\n return df",
"_____no_output_____"
],
[
"Newton1D().run(math.pi / 4.0).astype(np.float64)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf932d05f25789f3546536d231082ed328ccc7c
| 955,554 |
ipynb
|
Jupyter Notebook
|
Tutorials/03_Linear_Regression_Models.ipynb
|
NimbleBoxAI/PyTorch
|
00b776f51cd715c9a48131533d7b9830113f524b
|
[
"MIT"
] | 61 |
2020-02-27T10:48:45.000Z
|
2022-03-09T14:56:24.000Z
|
Basic/03_Linear_Regression_Models.ipynb
|
royayon/PyTorch
|
1f129670ede2bdc5baf3568e9dcc011aa15f5b58
|
[
"MIT"
] | null | null | null |
Basic/03_Linear_Regression_Models.ipynb
|
royayon/PyTorch
|
1f129670ede2bdc5baf3568e9dcc011aa15f5b58
|
[
"MIT"
] | 25 |
2020-03-11T16:38:48.000Z
|
2022-02-01T02:36:20.000Z
| 508.273404 | 16,640 | 0.942867 |
[
[
[
"import torch\nfrom torch.autograd import Variable\n\nfrom torch import nn",
"_____no_output_____"
],
[
"import matplotlib.pyplot as plt\n%matplotlib inline\ntorch.manual_seed(3)",
"_____no_output_____"
]
],
[
[
"\n# make data",
"_____no_output_____"
]
],
[
[
"x_train = torch.Tensor([[1],[2],[3]])\ny_train = torch.Tensor([[1],[2],[3]])\n\nx, y = Variable(x_train), Variable(y_train)\n\nplt.scatter(x.data.numpy(), y.data.numpy())\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Naive Model",
"_____no_output_____"
],
[
"\n## Define Linear model",
"_____no_output_____"
]
],
[
[
"x, y",
"_____no_output_____"
],
[
"W = Variable(torch.rand(1,1))\nW",
"_____no_output_____"
],
[
"x.mm(W) ",
"_____no_output_____"
]
],
[
[
"\n## Define cost function\n\nloss(x,y)=1/nโ|xiโyi|2loss(x,y)=1/nโ|xiโyi|2",
"_____no_output_____"
]
],
[
[
"cost_func = nn.MSELoss()\n\ncost_func",
"_____no_output_____"
]
],
[
[
"\n## Training Linear Regression",
"_____no_output_____"
]
],
[
[
"plt.ion()\n\nlr = 0.01\n\nfor step in range(300):\n prediction = x.mm(W)\n cost = cost_func(prediction, y)\n gradient = (prediction-y).view(-1).dot(x.view(-1)) / len(x)\n W -= lr * gradient\n \n if step % 10 == 0:\n plt.cla()\n plt.scatter(x.data.numpy(), y.data.numpy())\n plt.plot(x.data.numpy(), prediction.data.numpy(), 'r-')\n plt.title('step %d, cost=%.4f, w=%.4f,grad=%.4f' % (step,cost.data, W.data[0], gradient.data))\n plt.show()\n \n# if step %10 == 0:\n# print(step, \"going cost\")\n# print(cost)\n# print((prediction-y).view(-1))\n# print((x.view(-1)))\n# print(gradient)\n# print(W)\nplt.ioff()",
"_____no_output_____"
],
[
"x_test = Variable(torch.Tensor([[5]]))\ny_test = x_test.mm(W)\ny_test",
"_____no_output_____"
]
],
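The hand-written gradient in the training loop above comes from differentiating the mean-squared error of the bias-free model $\hat{y} = xW$:

$$\frac{\partial}{\partial W}\,\frac{1}{n}\sum_i (x_i W - y_i)^2 = \frac{2}{n}\sum_i (x_i W - y_i)\,x_i$$

The code drops the constant factor of 2, which is simply absorbed into the learning rate.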
[
[
"\n# w/ nn Module",
"_____no_output_____"
],
[
"## Define Linear Model",
"_____no_output_____"
]
],
[
[
"model = nn.Linear(1, 1, bias=True)\n\nprint(model)\n\nmodel.weight, model.bias",
"Linear(in_features=1, out_features=1, bias=True)\n"
],
[
"cost_func = nn.MSELoss()",
"_____no_output_____"
],
[
"for i in model.parameters():\n print(i)",
"Parameter containing:\ntensor([[-0.7889]], requires_grad=True)\nParameter containing:\ntensor([-0.4283], requires_grad=True)\n"
],
[
"optimizer = torch.optim.SGD(model.parameters(), lr=0.01)",
"_____no_output_____"
]
],
[
[
"\n## Training w/ nn module",
"_____no_output_____"
]
],
[
[
"model(x)",
"_____no_output_____"
],
[
"plt.ion()\n\nfor step in range(300):\n prediction = model(x)\n cost = cost_func(prediction, y)\n \n optimizer.zero_grad()\n cost.backward()\n optimizer.step()\n \n if step % 10 == 0:\n plt.cla()\n plt.scatter(x.data.numpy(), y.data.numpy())\n plt.plot(x.data.numpy(), prediction.data.numpy(), 'b--')\n plt.title('cost=%.4f, w=%.4f, b=%.4f' % (cost.data,model.weight.data[0][0],model.bias.data))\n plt.show()\n\nplt.ioff()",
"_____no_output_____"
],
[
"x_test = Variable(torch.Tensor([[7]]))\ny_test = model(x_test)\n\nprint('input : %.4f, output:%.4f' % (x_test.data[0][0], y_test.data[0][0]))",
"input : 7.0000, output:6.6965\n"
],
[
"for step in range(300):\n prediction = model(x)\n cost = cost_func(prediction, y)\n \n optimizer.zero_grad()\n cost.backward()\n optimizer.step()\n\nx_test = Variable(torch.Tensor([[7]]))\ny_test = model(x_test)\n\nprint('input : %.4f, output:%.4f' % (x_test.data[0][0], y_test.data[0][0]))",
"input : 7.0000, output:6.8526\n"
],
[
"model.weight, model.bias",
"_____no_output_____"
]
],
[
[
"\n### Has \"nn.MSELoss()\" Convex Cost Space?",
"_____no_output_____"
]
],
[
[
"W_val, cost_val = [], []\n\nfor i in range(-30, 51):\n W = i * 0.1\n model.weight.data.fill_(W)\n cost = cost_func(model(x),y)\n \n W_val.append(W)\n cost_val.append(cost.data)\n\nplt.plot(W_val, cost_val, 'ro')\nplt.show()",
"_____no_output_____"
]
],
[
[
"\n# Multivariate Linear model",
"_____no_output_____"
]
],
[
[
"import numpy as np",
"_____no_output_____"
]
],
[
[
"\n## make Data",
"_____no_output_____"
]
],
[
[
"xy = np.loadtxt('data-01-test-score.csv', delimiter=',', dtype=np.float32)\nx_data = xy[:, 0:-1]\ny_data = xy[:, [-1]]\n\nprint('shape: ', x_data.shape, '\\nlength:', len(x_data), '\\n', x_data )\nprint('shape: ', y_data.shape, '\\nlength:', len(y_data), '\\n', y_data )",
"shape: (25, 3) \nlength: 25 \n [[ 73. 80. 75.]\n [ 93. 88. 93.]\n [ 89. 91. 90.]\n [ 96. 98. 100.]\n [ 73. 66. 70.]\n [ 53. 46. 55.]\n [ 69. 74. 77.]\n [ 47. 56. 60.]\n [ 87. 79. 90.]\n [ 79. 70. 88.]\n [ 69. 70. 73.]\n [ 70. 65. 74.]\n [ 93. 95. 91.]\n [ 79. 80. 73.]\n [ 70. 73. 78.]\n [ 93. 89. 96.]\n [ 78. 75. 68.]\n [ 81. 90. 93.]\n [ 88. 92. 86.]\n [ 78. 83. 77.]\n [ 82. 86. 90.]\n [ 86. 82. 89.]\n [ 78. 83. 85.]\n [ 76. 83. 71.]\n [ 96. 93. 95.]]\nshape: (25, 1) \nlength: 25 \n [[152.]\n [185.]\n [180.]\n [196.]\n [142.]\n [101.]\n [149.]\n [115.]\n [175.]\n [164.]\n [141.]\n [141.]\n [184.]\n [152.]\n [148.]\n [192.]\n [147.]\n [183.]\n [177.]\n [159.]\n [177.]\n [175.]\n [175.]\n [149.]\n [192.]]\n"
],
[
"x, y = Variable(torch.from_numpy(x_data)), Variable(torch.from_numpy(y_data))\nx, y",
"_____no_output_____"
]
],
[
[
"\n## make Model",
"_____no_output_____"
]
],
[
[
"mv_model = nn.Linear(3, 1, bias=True)\n\nprint(mv_model)",
"Linear(in_features=3, out_features=1, bias=True)\n"
],
[
"print('weigh : ', mv_model.weight)\nprint('bias : ', mv_model.bias)",
"weigh : Parameter containing:\ntensor([[-0.5462, -0.0328, -0.5079]], requires_grad=True)\nbias : Parameter containing:\ntensor([0.3139], requires_grad=True)\n"
],
[
"cost_func = nn.MSELoss()\n\noptimizer = torch.optim.SGD(mv_model.parameters(), lr=1e-5)",
"_____no_output_____"
]
],
[
[
"\n## Training Model",
"_____no_output_____"
]
],
[
[
"for step in range(2000):\n optimizer.zero_grad()\n \n prediction = mv_model(x)\n cost = cost_func(prediction, y)\n cost.backward()\n \n optimizer.step()\n \n if step % 50 == 0:\n print(step, \"Cost: \", cost.data.numpy(), \"\\nPrediction:\\n\", prediction.data.t().numpy())",
"0 Cost: 63235.746 \nPrediction:\n [[ -80.27773 -100.60726 -96.996895 -106.12925 -77.27918 -58.080135\n -78.91204 -57.67022 -95.51112 -89.83046 -76.7492 -77.639465\n -99.82084 -82.53922 -79.93342 -102.16385 -79.28944 -94.11811\n -94.45171 -84.123055 -93.00944 -94.55529 -88.18653 -79.98301\n -103.4257 ]]\n50 Cost: 18.114258 \nPrediction:\n [[156.05579 183.55751 182.9336 198.77402 139.43988 101.76946 149.3429\n 111.57456 170.13354 156.31465 143.15779 139.25809 189.34947 157.79968\n 149.39186 186.222 149.59128 179.80757 181.2073 162.5206 174.65862\n 172.05891 166.99348 158.21191 191.04263]]\n100 Cost: 17.597153 \nPrediction:\n [[155.98497 183.58311 182.89873 198.7732 139.45627 101.85179 149.36647\n 111.62254 170.22855 156.49464 143.17534 139.33849 189.27736 157.69406\n 149.43619 186.27318 149.47993 179.83368 181.1156 162.4333 174.68892\n 172.11212 167.0024 158.05605 191.0343 ]]\n150 Cost: 17.10247 \nPrediction:\n [[155.91576 183.60812 182.86464 198.77242 139.47223 101.93227\n 149.38956 111.669586 170.32143 156.67068 143.19252 139.41708\n 189.20683 157.59073 149.47957 186.3232 149.37097 179.85928\n 181.02594 162.34796 174.71861 172.16412 167.01118 157.90361\n 191.02612 ]]\n200 Cost: 16.629223 \nPrediction:\n [[155.8481 183.63255 182.8313 198.77168 139.48778 102.01093 149.41222\n 111.7157 170.41222 156.84283 143.20934 139.49394 189.13785 157.48964\n 149.52205 186.3721 149.2643 179.88445 180.93822 162.26448 174.7477\n 172.21498 167.0198 157.75453 191.01807]]\n250 Cost: 16.176481 \nPrediction:\n [[155.78198 183.65639 182.7987 198.77097 139.50291 102.08783 149.43442\n 111.76089 170.50096 157.01118 143.22578 139.5691 189.07036 157.39073\n 149.56364 186.4199 149.15991 179.90913 180.85243 162.18285 174.7762\n 172.2647 167.02829 157.60872 191.01018]]\n300 Cost: 15.74337 \nPrediction:\n [[155.71733 183.67966 182.76682 198.7703 139.51764 102.163 149.45619\n 111.8052 170.5877 157.1758 143.2419 139.6426 189.00433 157.29395\n 149.60435 186.46664 149.05774 179.93338 180.76854 162.10303 174.80412\n 172.31328 167.03665 157.46614 191.00241]]\n350 Cost: 15.329032 \nPrediction:\n [[155.65416 183.70238 182.73563 198.7696 139.532 102.23647 149.47752\n 111.84863 170.6725 157.33678 143.25766 139.71443 188.93976 157.19926\n 149.64421 186.51231 148.95775 179.95717 180.68648 162.02498 174.83147\n 172.36078 167.04488 157.32669 190.99478]]\n400 Cost: 14.932669 \nPrediction:\n [[155.59239 183.72453 182.70512 198.769 139.54596 102.30831 149.49844\n 111.89121 170.75539 157.49419 143.27309 139.78465 188.87657 157.10661\n 149.68323 186.55695 148.85986 179.98051 180.60622 161.94862 174.85825\n 172.4072 167.05293 157.19029 190.98727]]\n450 Cost: 14.553439 \nPrediction:\n [[155.53203 183.74619 182.6753 198.7684 139.55957 102.378525\n 149.51897 111.93295 170.83643 157.64813 143.2882 139.85335\n 188.81477 157.01599 149.72145 186.60062 148.76407 180.00346\n 180.52773 161.874 174.8845 172.4526 167.0609 157.05692\n 190.97992 ]]\n500 Cost: 14.19065 \nPrediction:\n [[155.47302 183.76733 182.64613 198.76784 139.57281 102.44716\n 149.5391 111.973885 170.91563 157.79868 143.30298 139.9205\n 188.75433 156.92732 149.75887 186.6433 148.6703 180.026\n 180.45096 161.801 174.91022 172.497 167.06873 156.92647\n 190.97269 ]]\n550 Cost: 13.843588 \nPrediction:\n [[155.41534 183.78795 182.6176 198.76727 139.5857 102.514244\n 149.5588 112.01401 170.99306 157.94588 143.31744 139.98615\n 188.69519 156.84055 149.79549 186.68498 148.57852 180.04811\n 180.37587 161.7296 174.93541 172.54036 167.0764 156.79887\n 190.96556 ]]\n600 Cost: 13.511551 
\nPrediction:\n [[155.35898 183.80807 182.5897 198.76675 139.59824 102.57981\n 149.57817 112.053345 171.06874 158.08983 143.33162 140.05035\n 188.63734 156.75568 149.83136 186.72577 148.4887 180.06984\n 180.30243 161.6598 174.9601 172.58278 167.08398 156.6741\n 190.95859 ]]\n650 Cost: 13.1939125 \nPrediction:\n [[155.30386 183.82773 182.56241 198.76625 139.61044 102.64391\n 149.59712 112.091896 171.1427 158.23059 143.34547 140.1131\n 188.58075 156.6726 149.86646 186.76561 148.40076 180.09116\n 180.23062 161.59154 174.98428 172.62422 167.09142 156.55206\n 190.95169 ]]\n700 Cost: 12.890031 \nPrediction:\n [[155.25 183.84691 182.53572 198.7658 139.62233 102.70656 149.61572\n 112.12969 171.21501 158.36826 143.35905 140.17447 188.52539 156.59135\n 149.90085 186.80457 148.31471 180.1121 180.16039 161.52481 175.00797\n 172.66475 167.09875 156.43272 190.94493]]\n750 Cost: 12.599316 \nPrediction:\n [[155.19737 183.86563 182.50963 198.76534 139.6339 102.76779 149.63396\n 112.16676 171.28569 158.50287 143.37233 140.23447 188.47125 156.51186\n 149.93451 186.84264 148.23047 180.13266 180.09167 161.45952 175.03117\n 172.70436 167.10596 156.31601 190.93831]]\n800 Cost: 12.321178 \nPrediction:\n [[155.1459 183.8839 182.4841 198.76491 139.64514 102.82765 149.65184\n 112.2031 171.35477 158.63449 143.38533 140.29312 188.41826 156.43405\n 149.96745 186.87985 148.148 180.15285 180.02448 161.39569 175.05391\n 172.74306 167.11305 156.20186 190.93175]]\n850 Cost: 12.055095 \nPrediction:\n [[155.09563 183.90173 182.45915 198.76451 139.6561 102.88617 149.66939\n 112.23874 171.42232 158.76323 143.39807 140.35048 188.36646 156.35796\n 149.99974 186.91624 148.0673 180.17268 179.95877 161.33328 175.07622\n 172.78093 167.12003 156.09023 190.92537]]\n900 Cost: 11.800547 \nPrediction:\n [[155.04646 183.91914 182.43474 198.76413 139.66675 102.94336\n 149.68658 112.273674 171.48833 158.88908 143.41052 140.40654\n 188.31577 156.2835 150.03133 186.9518 147.9883 180.19215\n 179.8945 161.27225 175.09804 172.81792 167.12688 155.98103\n 190.91905 ]]\n950 Cost: 11.556996 \nPrediction:\n [[154.99841 183.93611 182.41087 198.76375 139.67711 102.99926 149.70343\n 112.30792 171.55281 159.01219 143.42271 140.46136 188.26616 156.21062\n 150.06227 186.98656 147.91096 180.21127 179.83163 161.21254 175.11943\n 172.85408 167.13362 155.87425 190.91286]]\n1000 Cost: 11.323995 \nPrediction:\n [[154.95145 183.95267 182.38751 198.7634 139.68716 103.05389 149.71996\n 112.3415 171.61588 159.13254 143.43463 140.51495 188.21765 156.13931\n 150.09254 187.02054 147.83527 180.23004 179.77014 161.15417 175.14038\n 172.88942 167.14026 155.7698 190.90674]]\n1050 Cost: 11.101091 \nPrediction:\n [[154.90558 183.96887 182.36469 198.76309 139.69699 103.1073 149.73619\n 112.37445 171.67752 159.25024 143.44633 140.56737 188.1702 156.06958\n 150.12222 187.05376 147.76118 180.2485 179.71002 161.09712 175.16092\n 172.92398 167.14682 155.6677 190.90076]]\n1100 Cost: 10.88784 \nPrediction:\n [[154.86072 183.98465 182.34236 198.76279 139.70651 103.1595 149.75209\n 112.40674 171.73776 159.36536 143.45776 140.61859 188.12378 156.00133\n 150.15128 187.08621 147.68866 180.2666 179.65121 161.0413 175.18105\n 172.95778 167.15324 155.56781 190.89487]]\n1150 Cost: 10.683798 \nPrediction:\n [[154.81686 184.00005 182.32051 198.76248 139.71577 103.21052 149.76768\n 112.43841 171.79665 159.47789 143.46896 140.66867 188.07835 155.93454\n 150.1797 187.11794 147.61766 180.2844 179.59367 160.98671 175.20076\n 172.99078 167.15956 155.47011 190.88908]]\n1200 Cost: 10.488602 
\nPrediction:\n [[154.774 184.01508 182.29915 198.76222 139.72481 103.26039 149.78299\n 112.46945 171.8542 159.58797 143.47993 140.71765 188.03392 155.86922\n 150.20755 187.14894 147.54817 180.30186 179.53741 160.93333 175.22008\n 173.02306 167.16579 155.37457 190.88338]]\n1250 Cost: 10.301836 \nPrediction:\n [[154.73212 184.02974 182.27826 198.76196 139.73357 103.30913 149.798\n 112.49989 171.91046 159.69559 143.49066 140.76553 187.99046 155.80527\n 150.23483 187.17926 147.48013 180.31903 179.48239 160.88115 175.23901\n 173.0546 167.17192 155.28113 190.87779]]\n1300 Cost: 10.123156 \nPrediction:\n [[154.6912 184.04404 182.25784 198.76173 139.7421 103.35676\n 149.81271 112.529755 171.96545 159.80083 143.50116 140.81236\n 187.94794 155.74272 150.26154 187.2089 147.41354 180.33588\n 179.42859 160.83012 175.25757 173.08545 167.17796 155.18974\n 190.87228 ]]\n1350 Cost: 9.952217 \nPrediction:\n [[154.6512 184.05803 182.23787 198.76152 139.75038 103.403336\n 149.82715 112.559044 172.0192 159.90376 143.51144 140.85811\n 187.90636 155.68152 150.28769 187.23785 147.34834 180.35246\n 179.37596 160.78021 175.27574 173.11559 167.18388 155.10037\n 190.8669 ]]\n1400 Cost: 9.78867 \nPrediction:\n [[154.61209 184.07162 182.21832 198.7613 139.75842 103.44884 149.84131\n 112.58775 172.07172 160.0044 143.52151 140.90286 187.86566 155.62163\n 150.3133 187.26616 147.28455 180.36871 179.32448 160.73141 175.29356\n 173.14505 167.18973 155.01297 190.86157]]\n1450 Cost: 9.632195 \nPrediction:\n [[154.57388 184.08493 182.19923 198.76112 139.76625 103.493324\n 149.8552 112.61591 172.12306 160.1028 143.53137 140.94661\n 187.82585 155.56303 150.33836 187.29384 147.22208 180.38469\n 179.27414 160.6837 175.31102 173.17386 167.19548 154.92749\n 190.85634 ]]\n1500 Cost: 9.482492 \nPrediction:\n [[154.53651 184.0979 182.18054 198.76094 139.77386 103.5368 149.8688\n 112.64353 172.17325 160.19902 143.541 140.98938 187.78693 155.50569\n 150.36292 187.32086 147.1609 180.40038 179.22488 160.63702 175.32812\n 173.202 167.20111 154.84387 190.8512 ]]\n1550 Cost: 9.339257 \nPrediction:\n [[154.50003 184.11055 182.16228 198.76079 139.78125 103.57929 149.88217\n 112.67062 172.22227 160.29314 143.55046 141.03119 187.74883 155.4496\n 150.38698 187.3473 147.10106 180.4158 179.17673 160.59138 175.3449\n 173.22952 167.2067 154.76212 190.84615]]\n1600 Cost: 9.202201 \nPrediction:\n [[154.46436 184.1229 182.14438 198.76065 139.78842 103.620804\n 149.89526 112.697174 172.2702 160.38516 143.5597 141.07207\n 187.71156 155.39468 150.41052 187.37314 147.04243 180.43094\n 179.12961 160.54677 175.36133 173.25641 167.21217 154.68214\n 190.84119 ]]\n1650 Cost: 9.071091 \nPrediction:\n [[154.42949 184.13493 182.1269 198.76051 139.79541 103.66141\n 149.90813 112.723236 172.31706 160.47514 143.56876 141.11205\n 187.67511 155.34099 150.43358 187.39839 146.98508 180.44585\n 179.08357 160.50313 175.37744 173.2827 167.21758 154.60396\n 190.83629 ]]\n1700 Cost: 8.945654 \nPrediction:\n [[154.39543 184.1467 182.10983 198.7604 139.80219 103.70108\n 149.92075 112.748795 172.36285 160.56314 143.57762 141.15112\n 187.63945 155.28842 150.45618 187.42307 146.92892 180.46045\n 179.03851 160.46048 175.39322 173.3084 167.22289 154.52747\n 190.83148 ]]\n1750 Cost: 8.825626 \nPrediction:\n [[154.36214 184.15813 182.09311 198.76028 139.80878 103.73983 149.93312\n 112.77384 172.40761 160.64919 143.5863 141.18932 187.60455 155.237\n 150.47829 187.44717 146.87393 180.47481 178.99443 160.41875 175.40869\n 173.33351 167.2281 154.45267 190.82677]]\n1800 Cost: 8.710775 
\nPrediction:\n [[154.32957 184.1693 182.07674 198.76018 139.81516 103.77772 149.94524\n 112.79842 172.45131 160.73329 143.59477 141.22664 187.5704 155.18666\n 150.49992 187.47075 146.82008 180.4889 178.95131 160.37793 175.42384\n 173.35805 167.23323 154.37949 190.8221 ]]\n1850 Cost: 8.600917 \nPrediction:\n [[154.29779 184.1802 182.06078 198.7601 139.82138 103.81476 149.95715\n 112.82253 172.49408 160.81558 143.6031 141.26317 187.53703 155.13742\n 150.52115 187.49379 146.76741 180.50278 178.90918 160.33804 175.4387\n 173.38203 167.2383 154.30795 190.81754]]\n1900 Cost: 8.495797 \nPrediction:\n [[154.26671 184.19084 182.04512 198.76004 139.82741 103.85093\n 149.96883 112.846176 172.53584 160.89603 143.61124 141.29887\n 187.50438 155.08925 150.54193 187.51631 146.71582 180.5164\n 178.86794 160.29903 175.45326 173.40549 167.24327 154.23799\n 190.81306 ]]\n1950 Cost: 8.39521 \nPrediction:\n [[154.23634 184.20122 182.02985 198.75998 139.83325 103.886314\n 149.98029 112.86938 172.57668 160.97469 143.61922 141.33376\n 187.4724 155.0421 150.56227 187.53831 146.66533 180.52979\n 178.82762 160.26088 175.46754 173.42842 167.24818 154.16956\n 190.80862 ]]\n"
],
[
"mv_model.state_dict()",
"_____no_output_____"
]
],
[
[
"\n## test",
"_____no_output_____"
]
],
[
[
"print(\"Model score : \",mv_model(Variable(torch.Tensor([[73,80,75]]))).data.numpy())\nprint(\"Real score : 73,80,75,152\")",
"Model score : [[154.20667]]\nReal score : 73,80,75,152\n"
],
[
"accuracy_list = []\nfor i,real_y in enumerate(y):\n accuracy = (mv_model((x[i])).data.numpy() - real_y.data.numpy())\n accuracy_list.append(np.absolute(accuracy))\n\nfor accuracy in accuracy_list:\n print(accuracy)\n\nprint(\"sum accuracy : \",sum(accuracy_list))\nprint(\"avg accuracy : \",sum(accuracy_list)/len(y))",
"[2.206665]\n[0.7886658]\n[2.0148926]\n[2.7599335]\n[2.1610565]\n[2.9208755]\n[0.9915314]\n[2.1078796]\n[2.3834229]\n[2.9483948]\n[2.6270142]\n[0.36787415]\n[3.4411316]\n[2.9959717]\n[2.582199]\n[4.4401703]\n[0.38412476]\n[2.4570618]\n[1.7881927]\n[1.223587]\n[1.5184631]\n[1.5491943]\n[7.7469788]\n[5.1026306]\n[1.1957245]\nsum accuracy : [60.703636]\navg accuracy : [2.4281454]\n"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf93336cec22cbe10ad208fd965ebb0da61e661
| 11,109 |
ipynb
|
Jupyter Notebook
|
praktikum/Basic MNIST Preprocessing with Variational Quantum Circuits_RAW.ipynb
|
FelixBieswanger/masKIT
|
c649234f15e72050535106dc3d5636608668a6a3
|
[
"MIT"
] | null | null | null |
praktikum/Basic MNIST Preprocessing with Variational Quantum Circuits_RAW.ipynb
|
FelixBieswanger/masKIT
|
c649234f15e72050535106dc3d5636608668a6a3
|
[
"MIT"
] | null | null | null |
praktikum/Basic MNIST Preprocessing with Variational Quantum Circuits_RAW.ipynb
|
FelixBieswanger/masKIT
|
c649234f15e72050535106dc3d5636608668a6a3
|
[
"MIT"
] | null | null | null | 30.105691 | 438 | 0.57665 |
[
[
[
"import random\nimport pennylane as qml\nfrom pennylane import numpy as np\nimport sys\nsys.path.insert(0,'..')\nfrom maskit.datasets import load_data",
"_____no_output_____"
],
[
"# Setting seeds for reproducible results\nnp.random.seed(1337)\nrandom.seed(1337)",
"_____no_output_____"
]
],
[
[
"# Loading the data\n\nData of interest is MNIST data. As we want to go for reproducible results, we\nwill first go with the option `shuffle=False`. For the rest of the parameters,\nwe now go with the default options. This gives us data for two classes, the\nwritten numbers 6 and 9. We also only get a limited number of sampes, that is\n100 samples for training and 50 for testing. For further details see the\nappropriate docstring.",
"_____no_output_____"
]
],
[
[
"data = load_data(\"mnist\", shuffle=False, target_length=2)",
"Metal device set to: Apple M1 Pro\n\nsystemMemory: 16.00 GB\nmaxCacheSize: 5.33 GB\n\n"
]
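,
[
"# A quick hedged sanity check (attribute names taken from their use later in\n# this notebook): confirm the 100 train / 50 test samples described above.\nprint(data.train_data.shape, data.train_target.shape)\nprint(data.test_data.shape, data.test_target.shape)",
"_____no_output_____"
]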
],
[
[
"# Setting up a Variational Quantum Circuit for training\n\nThere is an example on the [PennyLane website](https://pennylane.ai/qml/demos/tutorial_variational_classifier.html#iris-classification) for iris data showing a setup for a variational classifier. That is variational quantum circuits that can be trained from labelled (classical) data.",
"_____no_output_____"
]
],
[
[
"wires = 4\nlayers = 4\nepochs = 5\nparameters = np.random.uniform(low=-np.pi, high=np.pi, size=(layers, wires, 2))",
"_____no_output_____"
],
[
"def variational_circuit(params):\n for layer in range(layers):\n for wire in range(wires):\n qml.RX(params[layer][wire][0], wires=wire)\n qml.RY(params[layer][wire][1], wires=wire)\n for wire in range(0, wires - 1, 2):\n qml.CZ(wires=[wire, wire + 1])\n for wire in range(1, wires - 1, 2):\n qml.CZ(wires=[wire, wire + 1])\n return qml.expval(qml.PauliZ(0))",
"_____no_output_____"
],
[
"def variational_training_circuit(params, data):\n qml.templates.embeddings.AngleEmbedding(\n features=data, wires=range(wires), rotation=\"X\"\n )\n return variational_circuit(params)",
"_____no_output_____"
],
[
"dev = qml.device('default.qubit', wires=wires, shots=1000)\ncircuit = qml.QNode(func=variational_circuit, device=dev)\ntraining_circuit = qml.QNode(func=variational_training_circuit, device=dev)",
"_____no_output_____"
],
[
"circuit(parameters)",
"_____no_output_____"
],
[
"training_circuit(parameters, data.train_data[0])",
"_____no_output_____"
],
[
"print(training_circuit.draw())",
"_____no_output_____"
],
[
"# some helpers\ndef correctly_classified(params, data, target):\n prediction = training_circuit(params, data)\n if prediction < 0 and target[0] > 0:\n return True\n elif prediction > 0 and target[1] > 0:\n return True\n return False\n\ndef overall_cost_and_correct(cost_fn, params, data, targets):\n cost = correct_count = 0\n for datum, target in zip(data, targets):\n cost += cost_fn(params, datum, target)\n correct_count += int(correctly_classified(params, datum, target))\n return cost, correct_count",
"_____no_output_____"
],
[
"# Playing with different cost functions\ndef crossentropy_cost(params, data, target):\n prediction = training_circuit(params, data)\n scaled_prediction = prediction + 1 / 2\n predictions = np.array([1 - scaled_prediction, scaled_prediction])\n return cross_entropy(predictions, target)\n\ndef distributed_cost(params, data, target):\n \"\"\"Cost function distributes probabilities to both classes.\"\"\"\n prediction = training_circuit(params, data)\n scaled_prediction = prediction + 1 / 2\n predictions = np.array([1 - scaled_prediction, scaled_prediction])\n return np.sum(np.abs(target - predictions))\n\ndef cost(params, data, target):\n \"\"\"Cost function penalizes choosing wrong class.\"\"\"\n prediction = training_circuit(params, data)\n predictions = np.array([0, prediction]) if prediction > 0 else np.array([prediction * -1, 0])\n return np.sum(np.abs(target - predictions))",
"_____no_output_____"
],
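[
"# Hedged comparison (added for illustration, not in the original): evaluate two\n# of the cost functions above on a single training sample to see their scales.\nsample, target = data.train_data[0], data.train_target[0]\nprint('cost:', cost(parameters, sample, target))\nprint('distributed_cost:', distributed_cost(parameters, sample, target))",
"_____no_output_____"
],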
[
"optimizer = qml.AdamOptimizer()\ncost_fn = cost",
"_____no_output_____"
],
[
"start_cost, correct_count = overall_cost_and_correct(cost_fn, parameters, data.test_data, data.test_target)\nprint(f\"start cost: {start_cost}, with {correct_count}/{len(data.test_target)} correct samples\")",
"start cost: 56.230000000000004, with 24/50 correct samples\n"
],
[
"params = parameters.copy()\nfor _ in range(epochs):\n for datum, target in zip(data.train_data, data.train_target):\n params = optimizer.step(lambda weights: cost_fn(weights, datum, target), params)\n\n cost, correct_count = overall_cost_and_correct(cost_fn, params, data.test_data, data.test_target)\n print(f\"epoch{_} cost: {cost}, with {correct_count}/{len(data.test_target)} correct samples\")",
"epoch0 cost: 43.85400000000002, with 29/50 correct samples\nepoch1 cost: 38.38000000000001, with 34/50 correct samples\nepoch2 cost: 33.187999999999995, with 38/50 correct samples\nepoch3 cost: 31.218000000000004, with 39/50 correct samples\nepoch4 cost: 31.368, with 39/50 correct samples\n"
],
[
"final_cost, correct_count = overall_cost_and_correct(cost_fn, params, data.test_data, data.test_target)\nprint(f\"final cost: {final_cost}, with {correct_count}/{len(data.test_target)} correct samples\")",
"final cost: 31.08, with 40/50 correct samples\n"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf93a26d14b808fd82067376ac46e87a4a5c758
| 83,930 |
ipynb
|
Jupyter Notebook
|
#3 Data Manipulation & Visualization/Visualization/#3.2.3 - Simple Scatter Plots.ipynb
|
Sphincz/dsml
|
a292fd717fc01980c08f4ea23fde910d37fbd1cb
|
[
"MIT"
] | null | null | null |
#3 Data Manipulation & Visualization/Visualization/#3.2.3 - Simple Scatter Plots.ipynb
|
Sphincz/dsml
|
a292fd717fc01980c08f4ea23fde910d37fbd1cb
|
[
"MIT"
] | null | null | null |
#3 Data Manipulation & Visualization/Visualization/#3.2.3 - Simple Scatter Plots.ipynb
|
Sphincz/dsml
|
a292fd717fc01980c08f4ea23fde910d37fbd1cb
|
[
"MIT"
] | null | null | null | 499.583333 | 62,144 | 0.948326 |
[
[
[
"# Simple Scatter Plots",
"_____no_output_____"
],
[
"Another commonly used plot type is the simple scatter plot, a close cousin of the line plot.\nInstead of points being joined by line segments, here the points are represented individually with a dot, circle, or other shape.\nWeโll start by setting up the notebook for plotting and importing the functions we will use:",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nimport matplotlib.pyplot as plt\nplt.style.use('seaborn-whitegrid')\nimport numpy as np",
"_____no_output_____"
]
],
[
[
"## Scatter Plots with ``plt.plot``\n\nIn the previous section we looked at ``plt.plot``/``ax.plot`` to produce line plots.\nIt turns out that this same function can produce scatter plots as well:",
"_____no_output_____"
]
],
[
[
"x = np.linspace(0, 10, 30)\ny = np.sin(x)\n\nplt.plot(x, y, 'o', color='black');",
"_____no_output_____"
]
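,
[
"# A hedged aside (not in the original text): plt.plot accepts many marker\n# styles; here are a few of them shown together.\nrng = np.random.RandomState(0)\nfor marker in ['o', '.', ',', 'x', '+', 'v', '^', '<', '>', 's', 'd']:\n    plt.plot(rng.rand(5), rng.rand(5), marker,\n             label=\"marker='{0}'\".format(marker))\nplt.legend(numpoints=1)\nplt.xlim(0, 1.8);",
"_____no_output_____"
]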
],
[
[
"This type of flexibility in the ``plt.plot`` function allows for a wide variety of possible visualization options.\nFor a full description of the options available, refer to the ``plt.plot`` documentation.",
"_____no_output_____"
],
[
"## Scatter Plots with ``plt.scatter``\n\nA second, more powerful method of creating scatter plots is the ``plt.scatter`` function, which can be used very similarly to the ``plt.plot`` function:",
"_____no_output_____"
]
],
[
[
"plt.scatter(x, y, marker='o');",
"_____no_output_____"
]
],
[
[
"The primary difference of ``plt.scatter`` from ``plt.plot`` is that it can be used to create scatter plots where the properties of each individual point (size, face color, edge color, etc.) can be individually controlled or mapped to data.\n\nLet's show this by creating a random scatter plot with points of many colors and sizes.\nIn order to better see the overlapping results, we'll also use the ``alpha`` keyword to adjust the transparency level:",
"_____no_output_____"
]
],
[
[
"rng = np.random.RandomState(0)\nx = rng.randn(100)\ny = rng.randn(100)\ncolors = rng.rand(100)\nsizes = 1000 * rng.rand(100)\n\nplt.scatter(x, y, c=colors, s=sizes, alpha=0.3,\n cmap='viridis')\nplt.colorbar(); # show color scale",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbf94ad43f35d879579d0850703583b0ec000b66
| 7,951 |
ipynb
|
Jupyter Notebook
|
Brewery DF.ipynb
|
sponre01/project-one
|
e6ae83d78f16819f447f5654c83581cddecc5cf5
|
[
"MIT"
] | 4 |
2019-01-04T01:05:55.000Z
|
2019-03-16T15:37:59.000Z
|
Brewery DF.ipynb
|
sponre01/project-one
|
e6ae83d78f16819f447f5654c83581cddecc5cf5
|
[
"MIT"
] | 4 |
2019-01-04T01:34:46.000Z
|
2019-01-15T19:32:46.000Z
|
Brewery DF.ipynb
|
sponre01/project-one
|
e6ae83d78f16819f447f5654c83581cddecc5cf5
|
[
"MIT"
] | null | null | null | 29.231618 | 314 | 0.566092 |
[
[
[
"# Import Dependencies",
"_____no_output_____"
]
],
[
[
"from config import api_key\nimport matplotlib.pyplot as plt\nimport pandas as pd\nimport numpy as np\nimport requests\nimport datetime\nimport json",
"_____no_output_____"
]
],
[
[
"# Use API to get .json",
"_____no_output_____"
]
],
[
[
"endpoint = 'breweries'\npage = 1\nurl = f\"https://sandbox-api.brewerydb.com/v2/{endpoint}/?key={api_key}&p={page}&withLocations=Y&withSocialAccounts=Y\"\nbrewery_data = requests.get(url).json()\n#print(json.dumps(brewery_data, indent=4, sort_keys=True))",
"_____no_output_____"
]
],
[
[
"# Create DataFrame",
"_____no_output_____"
],
[
"- Initially, we pull just a few interesting columns for the dataframe, most importantly, the established dates and lat/lon coordinates for each brewery\n- We will add distance columns later after doing some math\n- Change the Established Date column to numeric in order to use in the scatter plot",
"_____no_output_____"
]
],
[
[
"brewery_dict = []\n\nfor result in range(0,19):\n try: \n brewery_info = {\n 'Brewery Name': brewery_data['data'][result]['name'],\n 'Brewery ID': brewery_data['data'][result]['id'], \n 'Established Date': brewery_data['data'][result]['established'], \n 'Is in business?': brewery_data['data'][result]['isInBusiness'], \n 'Website': brewery_data['data'][result]['website'],\n 'Country': brewery_data['data'][result]['locations'][0]['country']['isoCode'],\n 'City':brewery_data['data'][result]['locations'][0]['locality'],\n 'Latitude':brewery_data['data'][result]['locations'][0]['latitude'],\n 'Longitude':brewery_data['data'][result]['locations'][0]['longitude'],\n 'Primary Location':brewery_data['data'][result]['locations'][0]['isPrimary'],\n 'Distance from Chicago (km)':'',\n 'Distance from Pottsville (km)':''\n } \n except:\n print('id not found')\n brewery_dict.append(brewery_info)",
"_____no_output_____"
],
[
"brewery_df = pd.DataFrame(brewery_dict)\nbrewery_df['Established Date']=pd.to_numeric(brewery_df['Established Date'])\n#brewery_df",
"_____no_output_____"
]
],
[
[
"# Determine Distances from Chicago",
"_____no_output_____"
],
[
"- use geopy to determine distances via lat/long data\n- Chicago is one of the hot-spots for early American breweries, made possible by the German immigrant community\n- Pottsville (Becky's hometown) is home to the oldest brewery in America - Yeungling!\n- update the dataframe, clean it and export as a csv",
"_____no_output_____"
]
],
[
[
"#!pip install geopy",
"_____no_output_____"
],
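[
"# Hedged mini-example of the geopy call used below: geodesic distance in km\n# between two (lat, lon) tuples (here, Chicago and Pottsville themselves).\nimport geopy.distance\nprint(geopy.distance.distance((41.8781, -87.6298), (40.6856, -76.1955)).km)",
"_____no_output_____"
],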
[
"import geopy.distance\n\nChi_coords = (41.8781, -87.6298)\nPottsville_coords = (40.6856, -76.1955)\n\nfor x in range(0,19):\n Brewery_coords = (brewery_df['Latitude'][x], brewery_df['Longitude'][x])\n brewery_df['Distance from Chicago (km)'][x] = geopy.distance.distance(Chi_coords, Brewery_coords).km\n brewery_df['Distance from Pottsville (km)'][x] = geopy.distance.distance(Pottsville_coords, Brewery_coords).km",
"_____no_output_____"
],
[
"brewery_df = brewery_df.drop_duplicates(subset=['Brewery ID'], keep='first')\n\nbrewery_df",
"_____no_output_____"
],
[
"brewery_df.to_csv(\"data/brewery_data.csv\", encoding=\"utf-8\", index=False)",
"_____no_output_____"
]
],
[
[
"# Figures",
"_____no_output_____"
],
[
"- I expect a greater number of older breweries closer to Chicago, given that some of the first instances of brewing in America occured here.\n- With such few breweries available for free (boo sandbox), the scatter plot looks a little sparse. However, the general trend gives us preliminary data that shows that there may be a coorlation! If I wanted to do more with this, this would be good enough to convince me to splurge the $20 for full access\n\n- plot for Pottsville is just for fun",
"_____no_output_____"
]
],
[
[
"#Chicago\nplt.scatter(brewery_df['Distance from Chicago (km)'], brewery_df['Established Date'], \n alpha=0.5, edgecolor ='black', color=\"blue\",s=100)\n\n#Chart elements\nplt.title(f\"Distance from Chicago vs. Established Year\")\nplt.xlabel('Distance from Chicago (km)')\nplt.ylabel('Established Year')\nplt.grid(True)\n\n#Save and print\nplt.savefig(\"images/Distance from Chicago vs. Established Year.png\")\nplt.show()",
"_____no_output_____"
],
[
"#Pottsville\nplt.scatter(brewery_df['Distance from Pottsville (km)'], brewery_df['Established Date'], alpha=0.5, edgecolor ='black', color=\"red\",s=100)\n\n#Chart elements\nplt.title(f\"Distance from Pottsville vs. Established Year\")\nplt.xlabel('Distance from Pottsville (km)')\nplt.ylabel('Established Year')\nplt.grid(True)\n\n#Save and print\n#plt.savefig(\"images/Distance from Pottsville vs. Established Year.png\")\nplt.show()",
"_____no_output_____"
],
[
"#Empty Plot\nplt.scatter(brewery_df['Distance from Chicago (km)'], brewery_df['Established Date'], alpha=0.5, edgecolor ='none', color=\"none\",s=100)\n\n#Chart elements\nplt.title(f\"Distance from Chicago vs. Established Year\")\nplt.xlabel('Distance from Chicago (km)')\nplt.ylabel('Established Year')\nplt.grid(True)\n\n#Save and print\nplt.savefig(\"images/Empty plot.png\")\nplt.show()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbf9527c206b59033d56041bd86ea28b200c440d
| 98,269 |
ipynb
|
Jupyter Notebook
|
week7/InClass Assignment.ipynb
|
ZoeLeBlanc/ZoeLeBlanc-IntroToDH2020
|
4225ac910b65ff2b54cea98906033df84cbe5701
|
[
"MIT"
] | 2 |
2020-07-24T18:03:49.000Z
|
2020-10-23T22:33:43.000Z
|
week7/InClass Assignment.ipynb
|
ZoeLeBlanc/ZoeLeBlanc-IntroToDH2020
|
4225ac910b65ff2b54cea98906033df84cbe5701
|
[
"MIT"
] | null | null | null |
week7/InClass Assignment.ipynb
|
ZoeLeBlanc/ZoeLeBlanc-IntroToDH2020
|
4225ac910b65ff2b54cea98906033df84cbe5701
|
[
"MIT"
] | null | null | null | 95.406796 | 25,648 | 0.776684 |
[
[
[
"import pandas as pd\nimport matplotlib as plt",
"_____no_output_____"
],
[
"film_data = pd.read_csv('meta_data7.csv', encoding = \"ISO-8859-1\")\nch_mapping = pd.read_csv('character_mapping.csv', encoding = \"ISO-8859-1\")\nch_list = pd.read_csv('character_list5.csv', encoding = \"ISO-8859-1\")",
"_____no_output_____"
],
[
"film_data",
"_____no_output_____"
],
[
"ch_mapping",
"_____no_output_____"
],
[
"ch_list",
"_____no_output_____"
],
[
"ch_list.imdb_character_name.isna().any()",
"_____no_output_____"
],
[
"ch_list[ch_list.imdb_character_name.isna()]",
"_____no_output_____"
]
],
[
[
"How could we tell if the amount of dialogue was increasing over time in movies? How might this influence the assessment about the breakdown of gender dialogue?\nHow could test if there was any relationship between the film's gross value and the amount of dialogue in the film?",
"_____no_output_____"
]
],
[
[
"# How could we tell if the amount of dialogue was increasing over time in movies?",
"_____no_output_____"
],
[
"ch_list.columns.tolist()",
"_____no_output_____"
],
[
"ch_list.gender.unique().tolist()",
"_____no_output_____"
],
[
"script_dialogue = ch_list.groupby('script_id')['words'].sum().reset_index(name=\"total_dialogue\")",
"_____no_output_____"
],
[
"film_dialogue = pd.merge(film_data, script_dialogue, on='script_id')",
"_____no_output_____"
],
[
"len(film_data), len(script_dialogue)",
"_____no_output_____"
],
[
"film_dialogue",
"_____no_output_____"
],
[
"film_dialogue.plot.scatter(x='year', y='total_dialogue')",
"_____no_output_____"
],
[
"film_dialogue.plot.scatter(x='gross', y='total_dialogue')",
"_____no_output_____"
],
[
"gross_dialogue = film_dialogue[film_dialogue.gross > 0]",
"_____no_output_____"
],
[
"gross_dialogue.plot.scatter(x='gross', y='total_dialogue')",
"_____no_output_____"
]
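,
[
"# A hedged follow-up (added, not in the original): put numbers on the scatter\n# plots above with Pearson correlations between year/gross and total dialogue.\nprint(film_dialogue[['year', 'total_dialogue']].corr())\nprint(gross_dialogue[['gross', 'total_dialogue']].corr())",
"_____no_output_____"
]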
]
] |
[
"code",
"markdown",
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf955c856cf620b6e3cb3b094f6e36506b4f637
| 692,389 |
ipynb
|
Jupyter Notebook
|
assignments/2019/assignment2/TensorFlow.ipynb
|
adychn/cs231n.github.io
|
f04e6852ef727d46bc3d0cf6d0ca47cc62af542d
|
[
"MIT"
] | null | null | null |
assignments/2019/assignment2/TensorFlow.ipynb
|
adychn/cs231n.github.io
|
f04e6852ef727d46bc3d0cf6d0ca47cc62af542d
|
[
"MIT"
] | null | null | null |
assignments/2019/assignment2/TensorFlow.ipynb
|
adychn/cs231n.github.io
|
f04e6852ef727d46bc3d0cf6d0ca47cc62af542d
|
[
"MIT"
] | null | null | null | 363.649685 | 300,090 | 0.078316 |
[
[
[
"# What's this TensorFlow business?\n\nYou've written a lot of code in this assignment to provide a whole host of neural network functionality. Dropout, Batch Norm, and 2D convolutions are some of the workhorses of deep learning in computer vision. You've also worked hard to make your code efficient and vectorized.\n\nFor the last part of this assignment, though, we're going to leave behind your beautiful codebase and instead migrate to one of two popular deep learning frameworks: in this instance, TensorFlow (or PyTorch, if you choose to work with that notebook).",
"_____no_output_____"
],
[
"#### What is it?\nTensorFlow is a system for executing computational graphs over Tensor objects, with native support for performing backpropogation for its Variables. In it, we work with Tensors which are n-dimensional arrays analogous to the numpy ndarray.\n\n#### Why?\n\n* Our code will now run on GPUs! Much faster training. Writing your own modules to run on GPUs is beyond the scope of this class, unfortunately.\n* We want you to be ready to use one of these frameworks for your project so you can experiment more efficiently than if you were writing every feature you want to use by hand. \n* We want you to stand on the shoulders of giants! TensorFlow and PyTorch are both excellent frameworks that will make your lives a lot easier, and now that you understand their guts, you are free to use them :) \n* We want you to be exposed to the sort of deep learning code you might run into in academia or industry. ",
"_____no_output_____"
],
[
"## How will I learn TensorFlow?\n\nTensorFlow has many excellent tutorials available, including those from [Google themselves](https://www.tensorflow.org/get_started/get_started).\n\nOtherwise, this notebook will walk you through much of what you need to do to train models in TensorFlow. See the end of the notebook for some links to helpful tutorials if you want to learn more or need further clarification on topics that aren't fully explained here.\n\n**NOTE: This notebook is meant to teach you the latest version of Tensorflow 2.0. Most examples on the web today are still in 1.x, so be careful not to confuse the two when looking up documentation**.\n\n## Install Tensorflow 2.0\nTensorflow 2.0 is still not in a fully 100% stable release, but it's still usable and more intuitive than TF 1.x. Please make sure you have it installed before moving on in this notebook! Here are some steps to get started:\n\n1. Have the latest version of Anaconda installed on your machine.\n2. Create a new conda environment starting from Python 3.7. In this setup example, we'll call it `tf_20_env`.\n3. Run the command: `source activate tf_20_env`\n4. Then pip install TF 2.0 as described here: https://www.tensorflow.org/install/pip \n\nA guide on creating Anaconda enviornments: https://uoa-eresearch.github.io/eresearch-cookbook/recipe/2014/11/20/conda/\n\nThis will give you an new enviornemnt to play in TF 2.0. Generally, if you plan to also use TensorFlow in your other projects, you might also want to keep a seperate Conda environment or virtualenv in Python 3.7 that has Tensorflow 1.9, so you can switch back and forth at will. ",
"_____no_output_____"
],
[
"# Table of Contents\n\nThis notebook has 5 parts. We will walk through TensorFlow at **three different levels of abstraction**, which should help you better understand it and prepare you for working on your project.\n\n1. Part I, Preparation: load the CIFAR-10 dataset.\n2. Part II, Barebone TensorFlow: **Abstraction Level 1**, we will work directly with low-level TensorFlow graphs. \n3. Part III, Keras Model API: **Abstraction Level 2**, we will use `tf.keras.Model` to define arbitrary neural network architecture. \n4. Part IV, Keras Sequential + Functional API: **Abstraction Level 3**, we will use `tf.keras.Sequential` to define a linear feed-forward network very conveniently, and then explore the functional libraries for building unique and uncommon models that require more flexibility.\n5. Part V, CIFAR-10 open-ended challenge: please implement your own network to get as high accuracy as possible on CIFAR-10. You can experiment with any layer, optimizer, hyperparameters or other advanced features. \n\nWe will discuss Keras in more detail later in the notebook.\n\nHere is a table of comparison:\n\n| API | Flexibility | Convenience |\n|---------------|-------------|-------------|\n| Barebone | High | Low |\n| `tf.keras.Model` | High | Medium |\n| `tf.keras.Sequential` | Low | High |",
"_____no_output_____"
],
[
"# Part I: Preparation\n\nFirst, we load the CIFAR-10 dataset. This might take a few minutes to download the first time you run it, but after that the files should be cached on disk and loading should be faster.\n\nIn previous parts of the assignment we used CS231N-specific code to download and read the CIFAR-10 dataset; however the `tf.keras.datasets` package in TensorFlow provides prebuilt utility functions for loading many common datasets.\n\nFor the purposes of this assignment we will still write our own code to preprocess the data and iterate through it in minibatches. The `tf.data` package in TensorFlow provides tools for automating this process, but working with this package adds extra complication and is beyond the scope of this notebook. However using `tf.data` can be much more efficient than the simple approach used in this notebook, so you should consider using it for your project.",
"_____no_output_____"
]
],
[
[
"import os\nimport tensorflow as tf\nimport numpy as np\nimport math\nimport timeit\nimport matplotlib.pyplot as plt\n\n%matplotlib inline\n\n\nprint(tf.__version__) # need tf 2.0",
"2.0.0\n"
],
[
"def load_cifar10(num_training=49000, num_validation=1000, num_test=10000):\n \"\"\"\n Fetch the CIFAR-10 dataset from the web and perform preprocessing to prepare\n it for the two-layer neural net classifier. These are the same steps as\n we used for the SVM, but condensed to a single function.\n \"\"\"\n # Load the raw CIFAR-10 dataset and use appropriate data types and shapes\n cifar10 = tf.keras.datasets.cifar10.load_data()\n (X_train, y_train), (X_test, y_test) = cifar10\n X_train = np.asarray(X_train, dtype=np.float32)\n y_train = np.asarray(y_train, dtype=np.int32).flatten()\n X_test = np.asarray(X_test, dtype=np.float32)\n y_test = np.asarray(y_test, dtype=np.int32).flatten()\n\n # Subsample the data\n mask = range(num_training, num_training + num_validation)\n X_val = X_train[mask]\n y_val = y_train[mask]\n mask = range(num_training)\n X_train = X_train[mask]\n y_train = y_train[mask]\n mask = range(num_test)\n X_test = X_test[mask]\n y_test = y_test[mask]\n\n # Normalize the data: subtract the mean pixel and divide by std\n mean_pixel = X_train.mean(axis=(0, 1, 2), keepdims=True)\n std_pixel = X_train.std(axis=(0, 1, 2), keepdims=True)\n X_train = (X_train - mean_pixel) / std_pixel\n X_val = (X_val - mean_pixel) / std_pixel\n X_test = (X_test - mean_pixel) / std_pixel\n\n return X_train, y_train, X_val, y_val, X_test, y_test\n\n# If there are errors with SSL downloading involving self-signed certificates,\n# it may be that your Python version was recently installed on the current machine.\n# See: https://github.com/tensorflow/tensorflow/issues/10779\n# To fix, run the command: /Applications/Python\\ 3.7/Install\\ Certificates.command\n# ...replacing paths as necessary.\n\n# Invoke the above function to get our data.\nNHW = (0, 1, 2)\nX_train, y_train, X_val, y_val, X_test, y_test = load_cifar10()\nprint('Train data shape: ', X_train.shape)\nprint('Train labels shape: ', y_train.shape, y_train.dtype)\nprint('Validation data shape: ', X_val.shape)\nprint('Validation labels shape: ', y_val.shape)\nprint('Test data shape: ', X_test.shape)\nprint('Test labels shape: ', y_test.shape)",
"Train data shape: (49000, 32, 32, 3)\nTrain labels shape: (49000,) int32\nValidation data shape: (1000, 32, 32, 3)\nValidation labels shape: (1000,)\nTest data shape: (10000, 32, 32, 3)\nTest labels shape: (10000,)\n"
],
[
"class Dataset(object):\n def __init__(self, X, y, batch_size, shuffle=False):\n \"\"\"\n Construct a Dataset object to iterate over data X and labels y\n \n Inputs:\n - X: Numpy array of data, of any shape\n - y: Numpy array of labels, of any shape but with y.shape[0] == X.shape[0]\n - batch_size: Integer giving number of elements per minibatch\n - shuffle: (optional) Boolean, whether to shuffle the data on each epoch\n \"\"\"\n assert X.shape[0] == y.shape[0], 'Got different numbers of data and labels'\n self.X, self.y = X, y\n self.batch_size, self.shuffle = batch_size, shuffle\n\n def __iter__(self):\n N, B = self.X.shape[0], self.batch_size\n idxs = np.arange(N)\n if self.shuffle:\n np.random.shuffle(idxs)\n return iter((self.X[i:i+B], self.y[i:i+B]) for i in range(0, N, B))\n\n\ntrain_dset = Dataset(X_train, y_train, batch_size=64, shuffle=True)\nval_dset = Dataset(X_val, y_val, batch_size=64, shuffle=False)\ntest_dset = Dataset(X_test, y_test, batch_size=64)",
"_____no_output_____"
],
[
"# We can iterate through a dataset like this:\nfor t, (x, y) in enumerate(train_dset):\n print(t, x.shape, y.shape)\n if t > 5: break",
"0 (64, 32, 32, 3) (64,)\n1 (64, 32, 32, 3) (64,)\n2 (64, 32, 32, 3) (64,)\n3 (64, 32, 32, 3) (64,)\n4 (64, 32, 32, 3) (64,)\n5 (64, 32, 32, 3) (64,)\n6 (64, 32, 32, 3) (64,)\n"
]
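,
[
"# A hedged sketch of the tf.data pipeline mentioned above (optional here, and\n# just one reasonable configuration): shuffle and batch the same numpy arrays.\ntrain_tfdata = tf.data.Dataset.from_tensor_slices((X_train, y_train))\ntrain_tfdata = train_tfdata.shuffle(buffer_size=10000).batch(64)\nfor t, (x_b, y_b) in enumerate(train_tfdata.take(3)):\n    print(t, x_b.shape, y_b.shape)",
"_____no_output_____"
]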
],
[
[
"You can optionally **use GPU by setting the flag to True below**. It's not neccessary to use a GPU for this assignment; if you are working on Google Cloud then we recommend that you do not use a GPU, as it will be significantly more expensive.",
"_____no_output_____"
]
],
[
[
"# Set up some global variables\nUSE_GPU = True\n\nif USE_GPU:\n device = '/device:GPU:0'\nelse:\n device = '/cpu:0'\n\n# Constant to control how often we print when training models\nprint_every = 100\n\nprint('Using device: ', device)",
"Using device: /device:GPU:0\n"
]
],
[
[
"# Part II: Barebones TensorFlow\nTensorFlow ships with various high-level APIs which make it very convenient to define and train neural networks; we will cover some of these constructs in Part III and Part IV of this notebook. In this section we will start by building a model with basic TensorFlow constructs to help you better understand what's going on under the hood of the higher-level APIs.\n\n**\"Barebones Tensorflow\" is important to understanding the building blocks of TensorFlow, but much of it involves concepts from TensorFlow 1.x.** We will be working with legacy modules such as `tf.Variable`.\n\nTherefore, please read and understand the differences between legacy (1.x) TF and the new (2.0) TF.\n\n### Historical background on TensorFlow 1.x\n\nTensorFlow 1.x is primarily a framework for working with **static computational graphs**. Nodes in the computational graph are Tensors which will hold n-dimensional arrays when the graph is run; edges in the graph represent functions that will operate on Tensors when the graph is run to actually perform useful computation.\n\nBefore Tensorflow 2.0, we had to configure the graph into two phases. There are plenty of tutorials online that explain this two-step process. The process generally looks like the following for TF 1.x:\n1. **Build a computational graph that describes the computation that you want to perform**. This stage doesn't actually perform any computation; it just builds up a symbolic representation of your computation. This stage will typically define one or more `placeholder` objects that represent inputs to the computational graph.\n2. **Run the computational graph many times.** Each time the graph is run (e.g. for one gradient descent step) you will specify which parts of the graph you want to compute, and pass a `feed_dict` dictionary that will give concrete values to any `placeholder`s in the graph.\n\n### The new paradigm in Tensorflow 2.0\nNow, with Tensorflow 2.0, we can simply adopt a functional form that is more Pythonic and similar in spirit to PyTorch and direct Numpy operation. Instead of the 2-step paradigm with computation graphs, making it (among other things) easier to debug TF code. You can read more details at https://www.tensorflow.org/guide/eager.\n\nThe main difference between the TF 1.x and 2.0 approach is that the 2.0 approach doesn't make use of `tf.Session`, `tf.run`, `placeholder`, `feed_dict`. To get more details of what's different between the two version and how to convert between the two, check out the official migration guide: https://www.tensorflow.org/alpha/guide/migration_guide\n\nLater, in the rest of this notebook we'll focus on this new, simpler approach.",
"_____no_output_____"
],
[
"### TensorFlow warmup: Flatten Function\n\nWe can see this in action by defining a simple `flatten` function that will reshape image data for use in a fully-connected network.\n\nIn TensorFlow, data for convolutional feature maps is typically stored in a Tensor of shape N x H x W x C where:\n\n- N is the number of datapoints (minibatch size)\n- H is the height of the feature map\n- W is the width of the feature map\n- C is the number of channels in the feature map\n\nThis is the right way to represent the data when we are doing something like a 2D convolution, that needs spatial understanding of where the intermediate features are relative to each other. When we use fully connected affine layers to process the image, however, we want each datapoint to be represented by a single vector -- it's no longer useful to segregate the different channels, rows, and columns of the data. So, we use a \"flatten\" operation to collapse the `H x W x C` values per representation into a single long vector. \n\nNotice the `tf.reshape` call has the target shape as `(N, -1)`, meaning it will reshape/keep the first dimension to be N, and then infer as necessary what the second dimension is in the output, so we can collapse the remaining dimensions from the input properly.\n\n**NOTE**: TensorFlow and PyTorch differ on the default Tensor layout; TensorFlow uses N x H x W x C but PyTorch uses N x C x H x W.",
"_____no_output_____"
]
],
[
[
"def flatten(x):\n \"\"\" \n Input:\n - TensorFlow Tensor of shape (N, D1, ..., DM)\n \n Output:\n - TensorFlow Tensor of shape (N, D1 * ... * DM)\n \"\"\"\n N = tf.shape(x)[0]\n return tf.reshape(x, (N, -1))",
"_____no_output_____"
],
[
"def test_flatten():\n # Construct concrete values of the input data x using numpy\n x_np = np.arange(24).reshape((2, 3, 4))\n print('x_np:\\n', x_np, '\\n')\n # Compute a concrete output value.\n x_flat_np = flatten(x_np)\n print('x_flat_np:\\n', x_flat_np, '\\n')\n\ntest_flatten()",
"x_np:\n [[[ 0 1 2 3]\n [ 4 5 6 7]\n [ 8 9 10 11]]\n\n [[12 13 14 15]\n [16 17 18 19]\n [20 21 22 23]]] \n\nx_flat_np:\n tf.Tensor(\n[[ 0 1 2 3 4 5 6 7 8 9 10 11]\n [12 13 14 15 16 17 18 19 20 21 22 23]], shape=(2, 12), dtype=int32) \n\n"
]
],
[
[
"### Barebones TensorFlow: Define a Two-Layer Network\nWe will now implement our first neural network with TensorFlow: a fully-connected ReLU network with two hidden layers and no biases on the CIFAR10 dataset. For now we will use only low-level TensorFlow operators to define the network; later we will see how to use the higher-level abstractions provided by `tf.keras` to simplify the process.\n\nWe will define the forward pass of the network in the function `two_layer_fc`; this will accept TensorFlow Tensors for the inputs and weights of the network, and return a TensorFlow Tensor for the scores. \n\nAfter defining the network architecture in the `two_layer_fc` function, we will test the implementation by checking the shape of the output.\n\n**It's important that you read and understand this implementation.**",
"_____no_output_____"
]
],
[
[
"def two_layer_fc(x, params):\n \"\"\"\n A fully-connected neural network; the architecture is:\n fully-connected layer -> ReLU -> fully connected layer.\n Note that we only need to define the forward pass here; TensorFlow will take\n care of computing the gradients for us.\n \n The input to the network will be a minibatch of data, of shape\n (N, d1, ..., dM) where d1 * ... * dM = D. The hidden layer will have H units,\n and the output layer will produce scores for C classes.\n\n Inputs:\n - x: A TensorFlow Tensor of shape (N, d1, ..., dM) giving a minibatch of\n input data.\n - params: A list [w1, w2] of TensorFlow Tensors giving weights for the\n network, where w1 has shape (D, H) and w2 has shape (H, C).\n \n Returns:\n - scores: A TensorFlow Tensor of shape (N, C) giving classification scores\n for the input data x.\n \"\"\"\n w1, w2 = params # Unpack the parameters\n x = flatten(x) # Flatten the input; now x has shape (N, D)\n h = tf.nn.relu(tf.matmul(x, w1)) # Hidden layer: h has shape (N, H)\n scores = tf.matmul(h, w2) # Compute scores of shape (N, C)\n return scores",
"_____no_output_____"
],
[
"def two_layer_fc_test():\n hidden_layer_size = 42\n\n # Scoping our TF operations under a tf.device context manager \n # lets us tell TensorFlow where we want these Tensors to be\n # multiplied and/or operated on, e.g. on a CPU or a GPU.\n with tf.device(device): \n x = tf.zeros((64, 32, 32, 3))\n w1 = tf.zeros((32 * 32 * 3, hidden_layer_size))\n w2 = tf.zeros((hidden_layer_size, 10))\n\n # Call our two_layer_fc function for the forward pass of the network.\n scores = two_layer_fc(x, [w1, w2])\n\n print(scores.shape)\n\ntwo_layer_fc_test()",
"(64, 10)\n"
]
],
[
[
"### Barebones TensorFlow: Three-Layer ConvNet\nHere you will complete the implementation of the function `three_layer_convnet` which will perform the forward pass of a three-layer convolutional network. The network should have the following architecture:\n\n1. A convolutional layer (with bias) with `channel_1` filters, each with shape `KW1 x KH1`, and zero-padding of two\n2. ReLU nonlinearity\n3. A convolutional layer (with bias) with `channel_2` filters, each with shape `KW2 x KH2`, and zero-padding of one\n4. ReLU nonlinearity\n5. Fully-connected layer with bias, producing scores for `C` classes.\n\n**HINT**: For convolutions: https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/nn/conv2d; be careful with padding!\n\n**HINT**: For biases: https://www.tensorflow.org/performance/xla/broadcasting",
"_____no_output_____"
]
],
[
[
"def three_layer_convnet(x, params):\n \"\"\"\n A three-layer convolutional network with the architecture described above.\n \n Inputs:\n - x: A TensorFlow Tensor of shape (N, H, W, 3) giving a minibatch of images\n - params: A list of TensorFlow Tensors giving the weights and biases for the\n network; should contain the following:\n - conv_w1: TensorFlow Tensor of shape (KH1, KW1, 3, channel_1) giving\n weights for the first convolutional layer.\n - conv_b1: TensorFlow Tensor of shape (channel_1,) giving biases for the\n first convolutional layer.\n - conv_w2: TensorFlow Tensor of shape (KH2, KW2, channel_1, channel_2)\n giving weights for the second convolutional layer\n - conv_b2: TensorFlow Tensor of shape (channel_2,) giving biases for the\n second convolutional layer.\n - fc_w: TensorFlow Tensor giving weights for the fully-connected layer.\n Can you figure out what the shape should be?\n - fc_b: TensorFlow Tensor giving biases for the fully-connected layer.\n Can you figure out what the shape should be?\n \"\"\"\n conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b = params\n scores = None\n ############################################################################\n # TODO: Implement the forward pass for the three-layer ConvNet. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n\n conv1 = tf.nn.conv2d(x, conv_w1, 1, [[0, 0], [2, 2], [2, 2], [0, 0]]) + conv_b1\n relu1 = tf.nn.relu(conv1)\n conv2 = tf.nn.conv2d(relu1, conv_w2, 1, [[0, 0], [1, 1], [1, 1], [0, 0]]) + conv_b2\n relu2 = tf.nn.relu(conv2)\n relu2_flat = flatten(relu2) \n scores = tf.matmul(relu2_flat, fc_w) + fc_b\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return scores",
"_____no_output_____"
]
],
[
[
"After defing the forward pass of the three-layer ConvNet above, run the following cell to test your implementation. Like the two-layer network, we run the graph on a batch of zeros just to make sure the function doesn't crash, and produces outputs of the correct shape.\n\nWhen you run this function, `scores_np` should have shape `(64, 10)`.",
"_____no_output_____"
]
],
[
[
"def three_layer_convnet_test():\n \n with tf.device(device):\n x = tf.zeros((64, 32, 32, 3))\n conv_w1 = tf.zeros((5, 5, 3, 6))\n conv_b1 = tf.zeros((6,))\n conv_w2 = tf.zeros((3, 3, 6, 9))\n conv_b2 = tf.zeros((9,))\n fc_w = tf.zeros((32 * 32 * 9, 10))\n fc_b = tf.zeros((10,))\n params = [conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b]\n scores = three_layer_convnet(x, params)\n\n # Inputs to convolutional layers are 4-dimensional arrays with shape\n # [batch_size, height, width, channels]\n print('scores_np has shape: ', scores.shape)\n\nthree_layer_convnet_test()",
"scores_np has shape: (64, 10)\n"
]
],
[
[
"### Barebones TensorFlow: Training Step\n\nWe now define the `training_step` function performs a single training step. This will take three basic steps:\n\n1. Compute the loss\n2. Compute the gradient of the loss with respect to all network weights\n3. Make a weight update step using (stochastic) gradient descent.\n\n\nWe need to use a few new TensorFlow functions to do all of this:\n- For computing the cross-entropy loss we'll use `tf.nn.sparse_softmax_cross_entropy_with_logits`: https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/nn/sparse_softmax_cross_entropy_with_logits\n\n- For averaging the loss across a minibatch of data we'll use `tf.reduce_mean`:\nhttps://www.tensorflow.org/versions/r2.0/api_docs/python/tf/reduce_mean\n\n- For computing gradients of the loss with respect to the weights we'll use `tf.GradientTape` (useful for Eager execution): https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/GradientTape\n\n- We'll mutate the weight values stored in a TensorFlow Tensor using `tf.assign_sub` (\"sub\" is for subtraction): https://www.tensorflow.org/api_docs/python/tf/assign_sub \n",
"_____no_output_____"
]
],
[
[
"def training_step(model_fn, x, y, params, learning_rate):\n \n with tf.GradientTape() as tape:\n scores = model_fn(x, params) # Forward pass of the model\n loss = tf.nn.sparse_softmax_cross_entropy_with_logits(labels=y, logits=scores)\n total_loss = tf.reduce_mean(loss)\n grad_params = tape.gradient(total_loss, params)\n\n # Make a vanilla gradient descent step on all of the model parameters\n # Manually update the weights using assign_sub()\n for w, grad_w in zip(params, grad_params):\n w.assign_sub(learning_rate * grad_w)\n \n return total_loss",
"_____no_output_____"
],
[
"def train_part2(model_fn, init_fn, learning_rate):\n \"\"\"\n Train a model on CIFAR-10.\n \n Inputs:\n - model_fn: A Python function that performs the forward pass of the model\n using TensorFlow; it should have the following signature:\n scores = model_fn(x, params) where x is a TensorFlow Tensor giving a\n minibatch of image data, params is a list of TensorFlow Tensors holding\n the model weights, and scores is a TensorFlow Tensor of shape (N, C)\n giving scores for all elements of x.\n - init_fn: A Python function that initializes the parameters of the model.\n It should have the signature params = init_fn() where params is a list\n of TensorFlow Tensors holding the (randomly initialized) weights of the\n model.\n - learning_rate: Python float giving the learning rate to use for SGD.\n \"\"\"\n \n \n params = init_fn() # Initialize the model parameters \n \n for t, (x_np, y_np) in enumerate(train_dset):\n # Run the graph on a batch of training data.\n loss = training_step(model_fn, x_np, y_np, params, learning_rate)\n \n # Periodically print the loss and check accuracy on the val set.\n if t % print_every == 0:\n print('Iteration %d, loss = %.4f' % (t, loss))\n check_accuracy(val_dset, x_np, model_fn, params)",
"_____no_output_____"
],
[
"def check_accuracy(dset, x, model_fn, params):\n \"\"\"\n Check accuracy on a classification model, e.g. for validation.\n \n Inputs:\n - dset: A Dataset object against which to check accuracy\n - x: A TensorFlow placeholder Tensor where input images should be fed\n - model_fn: the Model we will be calling to make predictions on x\n - params: parameters for the model_fn to work with\n \n Returns: Nothing, but prints the accuracy of the model\n \"\"\"\n num_correct, num_samples = 0, 0\n for x_batch, y_batch in dset:\n scores_np = model_fn(x_batch, params).numpy()\n y_pred = scores_np.argmax(axis=1)\n num_samples += x_batch.shape[0]\n num_correct += (y_pred == y_batch).sum()\n acc = float(num_correct) / num_samples\n print('Got %d / %d correct (%.2f%%)' % (num_correct, num_samples, 100 * acc))",
"_____no_output_____"
]
],
[
[
"### Barebones TensorFlow: Initialization\nWe'll use the following utility method to initialize the weight matrices for our models using Kaiming's normalization method.\n\n[1] He et al, *Delving Deep into Rectifiers: Surpassing Human-Level Performance on ImageNet Classification\n*, ICCV 2015, https://arxiv.org/abs/1502.01852",
"_____no_output_____"
]
],
[
[
"def create_matrix_with_kaiming_normal(shape):\n if len(shape) == 2:\n fan_in, fan_out = shape[0], shape[1]\n elif len(shape) == 4:\n fan_in, fan_out = np.prod(shape[:3]), shape[3]\n return tf.keras.backend.random_normal(shape) * np.sqrt(2.0 / fan_in)",
"_____no_output_____"
]
],
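[
[
"As a quick, purely illustrative sanity check (not required by the assignment), the cell below verifies that a conv-shaped weight drawn with `create_matrix_with_kaiming_normal` has standard deviation close to sqrt(2 / fan_in).",
"_____no_output_____"
]
],
[
[
"# For a (5, 5, 3, 6) conv kernel, fan_in = 5 * 5 * 3 = 75, so we expect\n# std ~= sqrt(2 / 75) ~= 0.163. Assumes np and tf are imported above.\nw = create_matrix_with_kaiming_normal((5, 5, 3, 6))\nprint(w.shape)                    # (5, 5, 3, 6)\nprint(float(np.std(w.numpy())))   # roughly 0.163",
"_____no_output_____"
]
],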
[
[
"### Barebones TensorFlow: Train a Two-Layer Network\nWe are finally ready to use all of the pieces defined above to train a two-layer fully-connected network on CIFAR-10.\n\nWe just need to define a function to initialize the weights of the model, and call `train_part2`.\n\nDefining the weights of the network introduces another important piece of TensorFlow API: `tf.Variable`. A TensorFlow Variable is a Tensor whose value is stored in the graph and persists across runs of the computational graph; however unlike constants defined with `tf.zeros` or `tf.random_normal`, the values of a Variable can be mutated as the graph runs; these mutations will persist across graph runs. Learnable parameters of the network are usually stored in Variables.\n\nYou don't need to tune any hyperparameters, but you should achieve validation accuracies above 40% after one epoch of training.",
"_____no_output_____"
]
],
[
[
"def two_layer_fc_init():\n \"\"\"\n Initialize the weights of a two-layer network, for use with the\n two_layer_network function defined above. \n You can use the `create_matrix_with_kaiming_normal` helper!\n \n Inputs: None\n \n Returns: A list of:\n - w1: TensorFlow tf.Variable giving the weights for the first layer\n - w2: TensorFlow tf.Variable giving the weights for the second layer\n \"\"\"\n hidden_layer_size = 4000\n w1 = tf.Variable(create_matrix_with_kaiming_normal((3 * 32 * 32, 4000)))\n w2 = tf.Variable(create_matrix_with_kaiming_normal((4000, 10)))\n return [w1, w2]\n\nlearning_rate = 1e-2\ntrain_part2(two_layer_fc, two_layer_fc_init, learning_rate)",
"Iteration 0, loss = 3.1796\nGot 138 / 1000 correct (13.80%)\nIteration 100, loss = 1.8606\nGot 371 / 1000 correct (37.10%)\nIteration 200, loss = 1.4913\nGot 389 / 1000 correct (38.90%)\nIteration 300, loss = 1.8782\nGot 363 / 1000 correct (36.30%)\nIteration 400, loss = 1.8509\nGot 426 / 1000 correct (42.60%)\nIteration 500, loss = 1.8515\nGot 419 / 1000 correct (41.90%)\nIteration 600, loss = 1.9118\nGot 428 / 1000 correct (42.80%)\nIteration 700, loss = 1.8966\nGot 450 / 1000 correct (45.00%)\n"
]
],
[
[
"### Barebones TensorFlow: Train a three-layer ConvNet\nWe will now use TensorFlow to train a three-layer ConvNet on CIFAR-10.\n\nYou need to implement the `three_layer_convnet_init` function. Recall that the architecture of the network is:\n\n1. Convolutional layer (with bias) with 32 5x5 filters, with zero-padding 2\n2. ReLU\n3. Convolutional layer (with bias) with 16 3x3 filters, with zero-padding 1\n4. ReLU\n5. Fully-connected layer (with bias) to compute scores for 10 classes\n\nYou don't need to do any hyperparameter tuning, but you should see validation accuracies above 43% after one epoch of training.",
"_____no_output_____"
]
],
[
[
"def three_layer_convnet_init():\n \"\"\"\n Initialize the weights of a Three-Layer ConvNet, for use with the\n three_layer_convnet function defined above.\n You can use the `create_matrix_with_kaiming_normal` helper!\n \n Inputs: None\n \n Returns a list containing:\n - conv_w1: TensorFlow tf.Variable giving weights for the first conv layer\n - conv_b1: TensorFlow tf.Variable giving biases for the first conv layer\n - conv_w2: TensorFlow tf.Variable giving weights for the second conv layer\n - conv_b2: TensorFlow tf.Variable giving biases for the second conv layer\n - fc_w: TensorFlow tf.Variable giving weights for the fully-connected layer\n - fc_b: TensorFlow tf.Variable giving biases for the fully-connected layer\n \"\"\"\n params = None\n ############################################################################\n # TODO: Initialize the parameters of the three-layer network. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n # a sample input is 32 x 32 x 3\n conv_w1 = tf.Variable(create_matrix_with_kaiming_normal((5, 5, 3, 32)))\n conv_b1 = tf.Variable(create_matrix_with_kaiming_normal((1, 32)))\n conv_w2 = tf.Variable(create_matrix_with_kaiming_normal((3, 3, 32, 16)))\n conv_b2 = tf.Variable(create_matrix_with_kaiming_normal((1, 16)))\n fc_w = tf.Variable(create_matrix_with_kaiming_normal((32 * 32 * 16, 10))) # the input size after two convs is 32 x 32 x 16.\n fc_b = tf.Variable(create_matrix_with_kaiming_normal((1, 10)))\n \n params = [conv_w1, conv_b1, conv_w2, conv_b2, fc_w, fc_b]\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return params\n\nlearning_rate = 3e-3\ntrain_part2(three_layer_convnet, three_layer_convnet_init, learning_rate)",
"Iteration 0, loss = 4.8878\nGot 79 / 1000 correct (7.90%)\nIteration 100, loss = 1.9799\nGot 348 / 1000 correct (34.80%)\nIteration 200, loss = 1.7410\nGot 391 / 1000 correct (39.10%)\nIteration 300, loss = 1.7193\nGot 390 / 1000 correct (39.00%)\nIteration 400, loss = 1.7154\nGot 427 / 1000 correct (42.70%)\nIteration 500, loss = 1.6703\nGot 438 / 1000 correct (43.80%)\nIteration 600, loss = 1.6941\nGot 446 / 1000 correct (44.60%)\nIteration 700, loss = 1.6794\nGot 460 / 1000 correct (46.00%)\n"
]
],
[
[
"# Part III: Keras Model Subclassing API\n\nImplementing a neural network using the low-level TensorFlow API is a good way to understand how TensorFlow works, but it's a little inconvenient - we had to manually keep track of all Tensors holding learnable parameters. This was fine for a small network, but could quickly become unweildy for a large complex model.\n\nFortunately TensorFlow 2.0 provides higher-level APIs such as `tf.keras` which make it easy to build models out of modular, object-oriented layers. Further, TensorFlow 2.0 uses eager execution that evaluates operations immediately, without explicitly constructing any computational graphs. This makes it easy to write and debug models, and reduces the boilerplate code.\n\nIn this part of the notebook we will define neural network models using the `tf.keras.Model` API. To implement your own model, you need to do the following:\n\n1. Define a new class which subclasses `tf.keras.Model`. Give your class an intuitive name that describes it, like `TwoLayerFC` or `ThreeLayerConvNet`.\n2. In the initializer `__init__()` for your new class, define all the layers you need as class attributes. The `tf.keras.layers` package provides many common neural-network layers, like `tf.keras.layers.Dense` for fully-connected layers and `tf.keras.layers.Conv2D` for convolutional layers. Under the hood, these layers will construct `Variable` Tensors for any learnable parameters. **Warning**: Don't forget to call `super(YourModelName, self).__init__()` as the first line in your initializer!\n3. Implement the `call()` method for your class; this implements the forward pass of your model, and defines the *connectivity* of your network. Layers defined in `__init__()` implement `__call__()` so they can be used as function objects that transform input Tensors into output Tensors. Don't define any new layers in `call()`; any layers you want to use in the forward pass should be defined in `__init__()`.\n\nAfter you define your `tf.keras.Model` subclass, you can instantiate it and use it like the model functions from Part II.\n\n### Keras Model Subclassing API: Two-Layer Network\n\nHere is a concrete example of using the `tf.keras.Model` API to define a two-layer network. There are a few new bits of API to be aware of here:\n\nWe use an `Initializer` object to set up the initial values of the learnable parameters of the layers; in particular `tf.initializers.VarianceScaling` gives behavior similar to the Kaiming initialization method we used in Part II. You can read more about it here: https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/initializers/VarianceScaling\n\nWe construct `tf.keras.layers.Dense` objects to represent the two fully-connected layers of the model. In addition to multiplying their input by a weight matrix and adding a bias vector, these layer can also apply a nonlinearity for you. For the first layer we specify a ReLU activation function by passing `activation='relu'` to the constructor; the second layer uses softmax activation function. Finally, we use `tf.keras.layers.Flatten` to flatten the output from the previous fully-connected layer.",
"_____no_output_____"
]
],
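[
[
"As a small, hedged illustration of point 3 above (layer objects are callable), the cell below builds a single `tf.keras.layers.Dense` layer and applies it to a dummy input; the sizes are arbitrary.",
"_____no_output_____"
]
],
[
[
"# A layer object maps an input Tensor to an output Tensor; its Variables\n# are created lazily on the first call. Assumes tf is imported above.\nlayer = tf.keras.layers.Dense(4, activation='relu')\nout = layer(tf.zeros((2, 3)))\nprint(out.shape)                                      # (2, 4)\nprint([v.shape for v in layer.trainable_variables])   # kernel (3, 4), bias (4,)",
"_____no_output_____"
]
],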
[
[
"class TwoLayerFC(tf.keras.Model):\n def __init__(self, hidden_size, num_classes):\n super().__init__() #super(TwoLayerFC, self).__init__() \n \n initializer = tf.initializers.VarianceScaling(scale=2.0)\n self.flatten = tf.keras.layers.Flatten()\n self.fc1 = tf.keras.layers.Dense(hidden_size, activation='relu',\n kernel_initializer=initializer)\n self.fc2 = tf.keras.layers.Dense(num_classes, activation='softmax',\n kernel_initializer=initializer)\n \n \n def call(self, x, training=False):\n x = self.flatten(x)\n x = self.fc1(x)\n x = self.fc2(x)\n return x\n\n\ndef test_TwoLayerFC():\n \"\"\" A small unit test to exercise the TwoLayerFC model above. \"\"\"\n input_size, hidden_size, num_classes = 50, 42, 10\n x = tf.zeros((64, input_size))\n model = TwoLayerFC(hidden_size, num_classes)\n with tf.device(device):\n scores = model(x)\n print(scores.shape)\n \ntest_TwoLayerFC()",
"(64, 10)\n"
]
],
[
[
"### Keras Model Subclassing API: Three-Layer ConvNet\nNow it's your turn to implement a three-layer ConvNet using the `tf.keras.Model` API. Your model should have the same architecture used in Part II:\n\n1. Convolutional layer with 5 x 5 kernels, with zero-padding of 2\n2. ReLU nonlinearity\n3. Convolutional layer with 3 x 3 kernels, with zero-padding of 1\n4. ReLU nonlinearity\n5. Fully-connected layer to give class scores\n6. Softmax nonlinearity\n\nYou should initialize the weights of your network using the same initialization method as was used in the two-layer network above.\n\n**Hint**: Refer to the documentation for `tf.keras.layers.Conv2D` and `tf.keras.layers.Dense`:\n\nhttps://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/layers/Conv2D\n\nhttps://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/layers/Dense",
"_____no_output_____"
]
],
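[
[
"One hedged hint before you start: with stride 1, Keras `padding='same'` preserves the spatial size, which corresponds to zero-padding 2 for a 5 x 5 kernel and zero-padding 1 for a 3 x 3 kernel. The small check below (illustrative filter counts only) confirms this.",
"_____no_output_____"
]
],
[
[
"# padding='same' with stride 1 keeps height/width unchanged, matching the\n# zero-padding amounts in the spec above. Assumes tf is imported above.\nx = tf.zeros((1, 32, 32, 3))\nconv5 = tf.keras.layers.Conv2D(6, (5, 5), padding='same')\nconv3 = tf.keras.layers.Conv2D(6, (3, 3), padding='same')\nprint(conv5(x).shape)   # (1, 32, 32, 6)\nprint(conv3(x).shape)   # (1, 32, 32, 6)",
"_____no_output_____"
]
],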
[
[
"class ThreeLayerConvNet(tf.keras.Model):\n def __init__(self, channel_1, channel_2, num_classes):\n super().__init__()\n ########################################################################\n # TODO: Implement the __init__ method for a three-layer ConvNet. You #\n # should instantiate layer objects to be used in the forward pass. #\n ########################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n initializer = tf.initializers.VarianceScaling(scale=2.0)\n \n self.conv1 = tf.keras.layers.Conv2D(channel_1, (5, 5), padding='same', activation='relu', kernel_initializer=initializer)\n self.conv2 = tf.keras.layers.Conv2D(channel_2, (3, 3), padding='same', activation='relu', kernel_initializer=initializer)\n self.flatten = tf.keras.layers.Flatten()\n self.fc = tf.keras.layers.Dense(num_classes, activation='softmax')\n \n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ########################################################################\n # END OF YOUR CODE #\n ########################################################################\n \n def call(self, x, training=False):\n scores = None\n ########################################################################\n # TODO: Implement the forward pass for a three-layer ConvNet. You #\n # should use the layer objects defined in the __init__ method. #\n ########################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n \n x = self.conv1(x)\n x = self.conv2(x)\n x = self.flatten(x)\n scores = self.fc(x)\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ########################################################################\n # END OF YOUR CODE #\n ######################################################################## \n return scores",
"_____no_output_____"
]
],
[
[
"Once you complete the implementation of the `ThreeLayerConvNet` above you can run the following to ensure that your implementation does not crash and produces outputs of the expected shape.",
"_____no_output_____"
]
],
[
[
"def test_ThreeLayerConvNet(): \n channel_1, channel_2, num_classes = 12, 8, 10\n model = ThreeLayerConvNet(channel_1, channel_2, num_classes)\n with tf.device(device):\n x = tf.zeros((64, 3, 32, 32))\n scores = model(x)\n print(scores.shape)\n\ntest_ThreeLayerConvNet()",
"(64, 10)\n"
]
],
[
[
"### Keras Model Subclassing API: Eager Training\n\nWhile keras models have a builtin training loop (using the `model.fit`), sometimes you need more customization. Here's an example, of a training loop implemented with eager execution.\n\nIn particular, notice `tf.GradientTape`. Automatic differentiation is used in the backend for implementing backpropagation in frameworks like TensorFlow. During eager execution, `tf.GradientTape` is used to trace operations for computing gradients later. A particular `tf.GradientTape` can only compute one gradient; subsequent calls to tape will throw a runtime error. \n\nTensorFlow 2.0 ships with easy-to-use built-in metrics under `tf.keras.metrics` module. Each metric is an object, and we can use `update_state()` to add observations and `reset_state()` to clear all observations. We can get the current result of a metric by calling `result()` on the metric object.",
"_____no_output_____"
]
],
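[
[
"Here is a minimal, illustrative sketch (toy labels, not assignment code) of the `tf.keras.metrics` pattern described above.",
"_____no_output_____"
]
],
[
[
"# update_state() accumulates observations, result() reads the running value,\n# and reset_states() clears them. Assumes tf is imported above.\nacc = tf.keras.metrics.SparseCategoricalAccuracy()\nacc.update_state([1, 2], tf.one_hot([1, 0], 3))  # one correct, one wrong\nprint(acc.result().numpy())                      # 0.5\nacc.reset_states()                               # clear all observations",
"_____no_output_____"
]
],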
[
[
"def train_part34(model_init_fn, optimizer_init_fn, num_epochs=1, is_training=False):\n \"\"\"\n Simple training loop for use with models defined using tf.keras. It trains\n a model for one epoch on the CIFAR-10 training set and periodically checks\n accuracy on the CIFAR-10 validation set.\n \n Inputs:\n - model_init_fn: A function that takes no parameters; when called it\n constructs the model we want to train: model = model_init_fn()\n - optimizer_init_fn: A function which takes no parameters; when called it\n constructs the Optimizer object we will use to optimize the model:\n optimizer = optimizer_init_fn()\n - num_epochs: The number of epochs to train for\n \n Returns: Nothing, but prints progress during trainingn\n \"\"\" \n with tf.device(device):\n\n # Compute the loss like we did in Part II\n loss_fn = tf.keras.losses.SparseCategoricalCrossentropy()\n \n model = model_init_fn()\n optimizer = optimizer_init_fn()\n \n train_loss = tf.keras.metrics.Mean(name='train_loss')\n train_accuracy = tf.keras.metrics.SparseCategoricalAccuracy(name='train_accuracy')\n \n val_loss = tf.keras.metrics.Mean(name='val_loss')\n val_accuracy = tf.keras.metrics.SparseCategoricalAccuracy(name='val_accuracy')\n \n t = 0\n for epoch in range(num_epochs):\n \n # Reset the metrics - https://www.tensorflow.org/alpha/guide/migration_guide#new-style_metrics\n train_loss.reset_states()\n train_accuracy.reset_states()\n \n for x_np, y_np in train_dset:\n with tf.GradientTape() as tape:\n \n # Use the model function to build the forward pass.\n scores = model(x_np, training=is_training)\n loss = loss_fn(y_np, scores)\n \n gradients = tape.gradient(loss, model.trainable_variables)\n optimizer.apply_gradients(zip(gradients, model.trainable_variables))\n \n # Update the metrics\n train_loss.update_state(loss)\n train_accuracy.update_state(y_np, scores)\n \n if t % print_every == 0:\n val_loss.reset_states()\n val_accuracy.reset_states()\n for test_x, test_y in val_dset:\n # During validation at end of epoch, training set to False\n prediction = model(test_x, training=False)\n t_loss = loss_fn(test_y, prediction)\n\n val_loss.update_state(t_loss)\n val_accuracy.update_state(test_y, prediction)\n \n template = 'Iteration {}, Epoch {}, Loss: {}, Accuracy: {}, Val Loss: {}, Val Accuracy: {}'\n print (template.format(t, epoch+1,\n train_loss.result(),\n train_accuracy.result()*100,\n val_loss.result(),\n val_accuracy.result()*100))\n t += 1",
"_____no_output_____"
]
],
[
[
"### Keras Model Subclassing API: Train a Two-Layer Network\nWe can now use the tools defined above to train a two-layer network on CIFAR-10. We define the `model_init_fn` and `optimizer_init_fn` that construct the model and optimizer respectively when called. Here we want to train the model using stochastic gradient descent with no momentum, so we construct a `tf.keras.optimizers.SGD` function; you can [read about it here](https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/optimizers/SGD).\n\nYou don't need to tune any hyperparameters here, but you should achieve validation accuracies above 40% after one epoch of training.",
"_____no_output_____"
]
],
[
[
"hidden_size, num_classes = 4000, 10\nlearning_rate = 1e-2\n\ndef model_init_fn():\n return TwoLayerFC(hidden_size, num_classes)\n\ndef optimizer_init_fn():\n return tf.keras.optimizers.SGD(learning_rate=learning_rate)\n\ntrain_part34(model_init_fn, optimizer_init_fn, num_epochs=1)",
"Iteration 0, Epoch 1, Loss: 2.8405842781066895, Accuracy: 10.9375, Val Loss: 2.9787204265594482, Val Accuracy: 11.800000190734863\nIteration 100, Epoch 1, Loss: 2.2377357482910156, Accuracy: 27.691831588745117, Val Loss: 1.9073878526687622, Val Accuracy: 38.29999923706055\nIteration 200, Epoch 1, Loss: 2.0827560424804688, Accuracy: 31.778606414794922, Val Loss: 1.834181785583496, Val Accuracy: 39.89999771118164\nIteration 300, Epoch 1, Loss: 2.000749349594116, Accuracy: 33.80917739868164, Val Loss: 1.8754873275756836, Val Accuracy: 36.099998474121094\nIteration 400, Epoch 1, Loss: 1.9329524040222168, Accuracy: 35.582916259765625, Val Loss: 1.7133322954177856, Val Accuracy: 41.20000076293945\nIteration 500, Epoch 1, Loss: 1.8873575925827026, Accuracy: 36.76708984375, Val Loss: 1.6488999128341675, Val Accuracy: 41.60000228881836\nIteration 600, Epoch 1, Loss: 1.8568066358566284, Accuracy: 37.72878646850586, Val Loss: 1.6973183155059814, Val Accuracy: 41.60000228881836\nIteration 700, Epoch 1, Loss: 1.830464243888855, Accuracy: 38.46290969848633, Val Loss: 1.6309181451797485, Val Accuracy: 42.5\n"
]
],
[
[
"### Keras Model Subclassing API: Train a Three-Layer ConvNet\nHere you should use the tools we've defined above to train a three-layer ConvNet on CIFAR-10. Your ConvNet should use 32 filters in the first convolutional layer and 16 filters in the second layer.\n\nTo train the model you should use gradient descent with Nesterov momentum 0.9. \n\n**HINT**: https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/optimizers/SGD\n\nYou don't need to perform any hyperparameter tuning, but you should achieve validation accuracies above 50% after training for one epoch.",
"_____no_output_____"
]
],
[
[
"learning_rate = 3e-3\nchannel_1, channel_2, num_classes = 32, 16, 10\n\ndef model_init_fn():\n model = None\n ############################################################################\n # TODO: Complete the implementation of model_fn. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n\n model = ThreeLayerConvNet(channel_1, channel_2, num_classes)\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return model\n\ndef optimizer_init_fn():\n optimizer = None\n ############################################################################\n # TODO: Complete the implementation of model_fn. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n\n optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate, nesterov=True, momentum=0.9)\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return optimizer\n\ntrain_part34(model_init_fn, optimizer_init_fn, num_epochs=1)",
"Iteration 0, Epoch 1, Loss: 3.154758930206299, Accuracy: 10.9375, Val Loss: 5.975311279296875, Val Accuracy: 8.399999618530273\nIteration 100, Epoch 1, Loss: 2.041959047317505, Accuracy: 31.07982635498047, Val Loss: 1.68193519115448, Val Accuracy: 40.70000076293945\nIteration 200, Epoch 1, Loss: 1.8226760625839233, Accuracy: 37.31343078613281, Val Loss: 1.5124176740646362, Val Accuracy: 47.400001525878906\nIteration 300, Epoch 1, Loss: 1.714808464050293, Accuracy: 40.785919189453125, Val Loss: 1.4400432109832764, Val Accuracy: 49.0\nIteration 400, Epoch 1, Loss: 1.6360254287719727, Accuracy: 42.94731903076172, Val Loss: 1.3696250915527344, Val Accuracy: 52.29999923706055\nIteration 500, Epoch 1, Loss: 1.5823934078216553, Accuracy: 44.61701583862305, Val Loss: 1.3425835371017456, Val Accuracy: 51.5\nIteration 600, Epoch 1, Loss: 1.5470515489578247, Accuracy: 45.705074310302734, Val Loss: 1.3165476322174072, Val Accuracy: 52.79999923706055\nIteration 700, Epoch 1, Loss: 1.5153937339782715, Accuracy: 46.83710479736328, Val Loss: 1.3428152799606323, Val Accuracy: 52.79999923706055\n"
]
],
[
[
"# Part IV: Keras Sequential API\nIn Part III we introduced the `tf.keras.Model` API, which allows you to define models with any number of learnable layers and with arbitrary connectivity between layers.\n\nHowever for many models you don't need such flexibility - a lot of models can be expressed as a sequential stack of layers, with the output of each layer fed to the next layer as input. If your model fits this pattern, then there is an even easier way to define your model: using `tf.keras.Sequential`. You don't need to write any custom classes; you simply call the `tf.keras.Sequential` constructor with a list containing a sequence of layer objects.\n\nOne complication with `tf.keras.Sequential` is that you must define the shape of the input to the model by passing a value to the `input_shape` of the first layer in your model.\n\n### Keras Sequential API: Two-Layer Network\nIn this subsection, we will rewrite the two-layer fully-connected network using `tf.keras.Sequential`, and train it using the training loop defined above.\n\nYou don't need to perform any hyperparameter tuning here, but you should see validation accuracies above 40% after training for one epoch.",
"_____no_output_____"
]
],
[
[
"learning_rate = 1e-2\n\ndef model_init_fn():\n input_shape = (32, 32, 3)\n hidden_layer_size, num_classes = 4000, 10\n initializer = tf.initializers.VarianceScaling(scale=2.0)\n layers = [\n tf.keras.layers.Flatten(input_shape=input_shape),\n tf.keras.layers.Dense(hidden_layer_size, activation='relu',\n kernel_initializer=initializer),\n tf.keras.layers.Dense(num_classes, activation='softmax', \n kernel_initializer=initializer),\n ]\n model = tf.keras.Sequential(layers)\n return model\n\ndef optimizer_init_fn():\n return tf.keras.optimizers.SGD(learning_rate=learning_rate) \n\ntrain_part34(model_init_fn, optimizer_init_fn)",
"Iteration 0, Epoch 1, Loss: 3.049715995788574, Accuracy: 9.375, Val Loss: 2.892936944961548, Val Accuracy: 11.699999809265137\nIteration 100, Epoch 1, Loss: 2.2131028175354004, Accuracy: 29.068687438964844, Val Loss: 1.8789817094802856, Val Accuracy: 39.89999771118164\nIteration 200, Epoch 1, Loss: 2.063352584838867, Accuracy: 32.75808334350586, Val Loss: 1.8269221782684326, Val Accuracy: 39.79999923706055\nIteration 300, Epoch 1, Loss: 1.9906622171401978, Accuracy: 34.55668640136719, Val Loss: 1.851961374282837, Val Accuracy: 39.20000076293945\nIteration 400, Epoch 1, Loss: 1.9242477416992188, Accuracy: 36.264808654785156, Val Loss: 1.7083684206008911, Val Accuracy: 43.70000076293945\nIteration 500, Epoch 1, Loss: 1.8799132108688354, Accuracy: 37.34406280517578, Val Loss: 1.6594358682632446, Val Accuracy: 44.29999923706055\nIteration 600, Epoch 1, Loss: 1.848968744277954, Accuracy: 38.2149543762207, Val Loss: 1.6731466054916382, Val Accuracy: 43.29999923706055\nIteration 700, Epoch 1, Loss: 1.8228310346603394, Accuracy: 38.906471252441406, Val Loss: 1.6165885925292969, Val Accuracy: 45.69999694824219\n"
]
],
[
[
"### Abstracting Away the Training Loop\nIn the previous examples, we used a customised training loop to train models (e.g. `train_part34`). Writing your own training loop is only required if you need more flexibility and control during training your model. Alternately, you can also use built-in APIs like `tf.keras.Model.fit()` and `tf.keras.Model.evaluate` to train and evaluate a model. Also remember to configure your model for training by calling `tf.keras.Model.compile.\n\nYou don't need to perform any hyperparameter tuning here, but you should see validation and test accuracies above 42% after training for one epoch.",
"_____no_output_____"
]
],
[
[
"model = model_init_fn()\nmodel.compile(optimizer=tf.keras.optimizers.SGD(learning_rate=learning_rate),\n loss='sparse_categorical_crossentropy',\n metrics=[tf.keras.metrics.sparse_categorical_accuracy])\nmodel.fit(X_train, y_train, batch_size=64, epochs=1, validation_data=(X_val, y_val))\nmodel.evaluate(X_test, y_test)",
"Train on 49000 samples, validate on 1000 samples\n49000/49000 [==============================] - 3s 57us/sample - loss: 1.8204 - sparse_categorical_accuracy: 0.3874 - val_loss: 1.6748 - val_sparse_categorical_accuracy: 0.4180\n10000/1 [======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
=================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================] - 1s 77us/sample - loss: 1.6879 - sparse_categorical_accuracy: 0.4223\n"
]
],
[
[
"### Keras Sequential API: Three-Layer ConvNet\nHere you should use `tf.keras.Sequential` to reimplement the same three-layer ConvNet architecture used in Part II and Part III. As a reminder, your model should have the following architecture:\n\n1. Convolutional layer with 32 5x5 kernels, using zero padding of 2\n2. ReLU nonlinearity\n3. Convolutional layer with 16 3x3 kernels, using zero padding of 1\n4. ReLU nonlinearity\n5. Fully-connected layer giving class scores\n6. Softmax nonlinearity\n\nYou should initialize the weights of the model using a `tf.initializers.VarianceScaling` as above.\n\nYou should train the model using Nesterov momentum 0.9.\n\nYou don't need to perform any hyperparameter search, but you should achieve accuracy above 45% after training for one epoch.",
"_____no_output_____"
]
],
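[
[
"A quick aside (an added sketch, not part of the original assignment prompt): with stride 1, `padding='same'` pads by `(k - 1) // 2` on each side, i.e. zero padding of 2 for a 5x5 kernel and 1 for a 3x3 kernel, matching the spec above. The cell below assumes TensorFlow 2.x is already imported as `tf` (as elsewhere in this notebook) and simply verifies that the spatial dimensions are preserved.",
"_____no_output_____"
]
],
[
[
"# Sanity-check sketch (assumes TF 2.x imported as tf): 'same' padding with\n# stride 1 preserves the 32x32 spatial size, i.e. it implements the zero\n# padding sizes listed in the architecture spec above.\nx = tf.zeros((1, 32, 32, 3))  # one dummy CIFAR-10-shaped image (NHWC)\nprint(tf.keras.layers.Conv2D(32, (5, 5), padding='same')(x).shape)  # (1, 32, 32, 32)\nprint(tf.keras.layers.Conv2D(16, (3, 3), padding='same')(x).shape)  # (1, 32, 32, 16)",
"_____no_output_____"
]
],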
[
[
"def model_init_fn():\n model = None\n ############################################################################\n # TODO: Construct a three-layer ConvNet using tf.keras.Sequential. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n\n input_shape = (32, 32, 3)\n num_classes = 10\n \n initializer = tf.initializers.VarianceScaling(scale=2.0) \n layers = [\n tf.keras.layers.Conv2D(32, (5, 5), padding='same', activation='relu', kernel_initializer=initializer, \n input_shape=input_shape),\n tf.keras.layers.Conv2D(16, (3, 3), padding='same', activation='relu', kernel_initializer=initializer),\n tf.keras.layers.Flatten(),\n tf.keras.layers.Dense(num_classes, activation='softmax', kernel_initializer=initializer),\n ]\n model = tf.keras.Sequential(layers)\n \n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return model\n\nlearning_rate = 5e-4\ndef optimizer_init_fn():\n optimizer = None\n ############################################################################\n # TODO: Complete the implementation of model_fn. #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n\n optimizer = tf.keras.optimizers.SGD(learning_rate=learning_rate, nesterov=True, momentum=0.9)\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n return optimizer\n\ntrain_part34(model_init_fn, optimizer_init_fn)",
"Iteration 0, Epoch 1, Loss: 2.9712705612182617, Accuracy: 9.375, Val Loss: 3.2196428775787354, Val Accuracy: 12.700000762939453\nIteration 100, Epoch 1, Loss: 2.0532257556915283, Accuracy: 29.981433868408203, Val Loss: 1.8071397542953491, Val Accuracy: 36.29999923706055\nIteration 200, Epoch 1, Loss: 1.899694800376892, Accuracy: 34.16511154174805, Val Loss: 1.6743762493133545, Val Accuracy: 41.0\nIteration 300, Epoch 1, Loss: 1.8233054876327515, Accuracy: 36.59675979614258, Val Loss: 1.6346663236618042, Val Accuracy: 45.0\nIteration 400, Epoch 1, Loss: 1.7578901052474976, Accuracy: 38.88325881958008, Val Loss: 1.5955396890640259, Val Accuracy: 45.20000076293945\nIteration 500, Epoch 1, Loss: 1.7127082347869873, Accuracy: 40.272579193115234, Val Loss: 1.5494569540023804, Val Accuracy: 46.29999923706055\nIteration 600, Epoch 1, Loss: 1.6833456754684448, Accuracy: 41.28015899658203, Val Loss: 1.5043326616287231, Val Accuracy: 48.79999923706055\nIteration 700, Epoch 1, Loss: 1.6563769578933716, Accuracy: 42.189727783203125, Val Loss: 1.480652928352356, Val Accuracy: 48.0\n"
]
],
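[
[
"Before training, it can help to sanity-check the architecture. The cell below is an added sketch (not part of the original assignment) that instantiates a fresh model and prints a layer-by-layer summary; with 'same' padding the `Flatten` layer should emit 32*32*16 = 16384 features, for 170,906 trainable parameters in total.",
"_____no_output_____"
]
],
[
[
"# Added sketch: inspect output shapes and parameter counts of an untrained\n# model. Expected: Conv2D 5x5 -> 2,432 params, Conv2D 3x3 -> 4,624 params,\n# Dense -> 163,850 params (16384 * 10 + 10); 170,906 in total.\nmodel_init_fn().summary()",
"_____no_output_____"
]
],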
[
[
"We will also train this model with the built-in training loop APIs provided by TensorFlow.",
"_____no_output_____"
]
],
[
[
"model = model_init_fn()\nmodel.compile(optimizer='sgd',\n loss='sparse_categorical_crossentropy',\n metrics=[tf.keras.metrics.sparse_categorical_accuracy])\nmodel.fit(X_train, y_train, batch_size=64, epochs=1, validation_data=(X_val, y_val))\nmodel.evaluate(X_test, y_test)",
"Train on 49000 samples, validate on 1000 samples\n49000/49000 [==============================] - 4s 89us/sample - loss: 1.5548 - sparse_categorical_accuracy: 0.4531 - val_loss: 1.4325 - val_sparse_categorical_accuracy: 0.4920\n10000/1 [======================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
===================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================================
==========================================================] - 1s 93us/sample - loss: 1.3228 - sparse_categorical_accuracy: 0.4854\n"
]
],
[
[
"## Part IV: Functional API\n### Demonstration with a Two-Layer Network \n\nIn the previous section, we saw how we can use `tf.keras.Sequential` to stack layers to quickly build simple models. But this comes at the cost of losing flexibility.\n\nOften we will have to write complex models that have non-sequential data flows: a layer can have **multiple inputs and/or outputs**, such as stacking the output of 2 previous layers together to feed as input to a third! (Some examples are residual connections and dense blocks.)\n\nIn such cases, we can use Keras functional API to write models with complex topologies such as:\n\n 1. Multi-input models\n 2. Multi-output models\n 3. Models with shared layers (the same layer called several times)\n 4. Models with non-sequential data flows (e.g. residual connections)\n\nWriting a model with Functional API requires us to create a `tf.keras.Model` instance and explicitly write input tensors and output tensors for this model. ",
"_____no_output_____"
]
],
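To make the non-sequential case concrete before the two-layer demo below, here is a minimal sketch (an illustration added here, not part of the assignment) of a functional-API model with a residual connection; the layer sizes are arbitrary choices:

```python
import tensorflow as tf

def tiny_residual_model(input_shape=(32, 32, 3), num_classes=10):
    """Minimal functional-API sketch of a non-sequential data flow."""
    inputs = tf.keras.Input(shape=input_shape)
    x = tf.keras.layers.Conv2D(16, (3, 3), padding='same', activation='relu')(inputs)
    y = tf.keras.layers.Conv2D(16, (3, 3), padding='same')(x)
    # The Add layer consumes *two* inputs -- a data flow Sequential cannot express.
    z = tf.keras.layers.ReLU()(tf.keras.layers.Add()([x, y]))
    z = tf.keras.layers.Flatten()(z)
    scores = tf.keras.layers.Dense(num_classes, activation='softmax')(z)
    return tf.keras.Model(inputs=inputs, outputs=scores)
```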
[
[
"def two_layer_fc_functional(input_shape, hidden_size, num_classes): \n initializer = tf.initializers.VarianceScaling(scale=2.0)\n inputs = tf.keras.Input(shape=input_shape)\n flattened_inputs = tf.keras.layers.Flatten()(inputs)\n fc1_output = tf.keras.layers.Dense(hidden_size, activation='relu',\n kernel_initializer=initializer)(flattened_inputs)\n scores = tf.keras.layers.Dense(num_classes, activation='softmax',\n kernel_initializer=initializer)(fc1_output)\n\n # Instantiate the model given inputs and outputs.\n model = tf.keras.Model(inputs=inputs, outputs=scores)\n return model\n\ndef test_two_layer_fc_functional():\n \"\"\" A small unit test to exercise the TwoLayerFC model above. \"\"\"\n input_size, hidden_size, num_classes = 50, 42, 10\n input_shape = (50,)\n \n x = tf.zeros((64, input_size))\n model = two_layer_fc_functional(input_shape, hidden_size, num_classes)\n \n with tf.device(device):\n scores = model(x)\n print(scores.shape)\n \ntest_two_layer_fc_functional()",
"(64, 10)\n"
]
],
[
[
"### Keras Functional API: Train a Two-Layer Network\nYou can now train this two-layer network constructed using the functional API.\n\nYou don't need to perform any hyperparameter tuning here, but you should see validation accuracies above 40% after training for one epoch.",
"_____no_output_____"
]
],
[
[
"input_shape = (32, 32, 3)\nhidden_size, num_classes = 4000, 10\nlearning_rate = 1e-2\n\ndef model_init_fn():\n return two_layer_fc_functional(input_shape, hidden_size, num_classes)\n\ndef optimizer_init_fn():\n return tf.keras.optimizers.SGD(learning_rate=learning_rate)\n\ntrain_part34(model_init_fn, optimizer_init_fn)",
"Iteration 0, Epoch 1, Loss: 2.8402795791625977, Accuracy: 17.1875, Val Loss: 2.810950517654419, Val Accuracy: 15.09999942779541\nIteration 100, Epoch 1, Loss: 2.226226568222046, Accuracy: 28.496286392211914, Val Loss: 1.9141902923583984, Val Accuracy: 38.60000228881836\nIteration 200, Epoch 1, Loss: 2.080077886581421, Accuracy: 32.06623077392578, Val Loss: 1.8725242614746094, Val Accuracy: 38.29999923706055\nIteration 300, Epoch 1, Loss: 2.0010735988616943, Accuracy: 34.29194259643555, Val Loss: 1.8864822387695312, Val Accuracy: 37.400001525878906\nIteration 400, Epoch 1, Loss: 1.9321510791778564, Accuracy: 36.06218719482422, Val Loss: 1.7192223072052002, Val Accuracy: 41.70000076293945\nIteration 500, Epoch 1, Loss: 1.886867642402649, Accuracy: 37.12263107299805, Val Loss: 1.6607885360717773, Val Accuracy: 44.0\nIteration 600, Epoch 1, Loss: 1.8582278490066528, Accuracy: 37.86917495727539, Val Loss: 1.698154330253601, Val Accuracy: 42.39999771118164\nIteration 700, Epoch 1, Loss: 1.8322076797485352, Accuracy: 38.547611236572266, Val Loss: 1.6500277519226074, Val Accuracy: 42.39999771118164\n"
]
],
[
[
"# Part V: CIFAR-10 open-ended challenge\n\nIn this section you can experiment with whatever ConvNet architecture you'd like on CIFAR-10.\n\nYou should experiment with architectures, hyperparameters, loss functions, regularization, or anything else you can think of to train a model that achieves **at least 70%** accuracy on the **validation** set within 10 epochs. You can use the built-in train function, the `train_part34` function from above, or implement your own training loop.\n\nDescribe what you did at the end of the notebook.\n\n### Some things you can try:\n- **Filter size**: Above we used 5x5 and 3x3; is this optimal?\n- **Number of filters**: Above we used 16 and 32 filters. Would more or fewer do better?\n- **Pooling**: We didn't use any pooling above. Would this improve the model?\n- **Normalization**: Would your model be improved with batch normalization, layer normalization, group normalization, or some other normalization strategy?\n- **Network architecture**: The ConvNet above has only three layers of trainable parameters. Would a deeper model do better?\n- **Global average pooling**: Instead of flattening after the final convolutional layer, would global average pooling do better? This strategy is used for example in Google's Inception network and in Residual Networks.\n- **Regularization**: Would some kind of regularization improve performance? Maybe weight decay or dropout?\n\n### NOTE: Batch Normalization / Dropout\nIf you are using Batch Normalization and Dropout, remember to pass `is_training=True` if you use the `train_part34()` function. BatchNorm and Dropout layers have different behaviors at training and inference time. `training` is a specific keyword argument reserved for this purpose in any `tf.keras.Model`'s `call()` function. Read more about this here : https://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/layers/BatchNormalization#methods\nhttps://www.tensorflow.org/versions/r2.0/api_docs/python/tf/keras/layers/Dropout#methods\n\n### Tips for training\nFor each network architecture that you try, you should tune the learning rate and other hyperparameters. When doing this there are a couple important things to keep in mind: \n\n- If the parameters are working well, you should see improvement within a few hundred iterations\n- Remember the coarse-to-fine approach for hyperparameter tuning: start by testing a large range of hyperparameters for just a few training iterations to find the combinations of parameters that are working at all.\n- Once you have found some sets of parameters that seem to work, search more finely around these parameters. You may need to train for more epochs.\n- You should use the validation set for hyperparameter search, and save your test set for evaluating your architecture on the best parameters as selected by the validation set.\n\n### Going above and beyond\nIf you are feeling adventurous there are many other features you can implement to try and improve your performance. 
You are **not required** to implement any of these, but don't miss the fun if you have time!\n\n- Alternative optimizers: you can try Adam, Adagrad, RMSprop, etc.\n- Alternative activation functions such as leaky ReLU, parametric ReLU, ELU, or MaxOut.\n- Model ensembles\n- Data augmentation\n- New Architectures\n - [ResNets](https://arxiv.org/abs/1512.03385) where the input from the previous layer is added to the output.\n - [DenseNets](https://arxiv.org/abs/1608.06993) where inputs into previous layers are concatenated together.\n - [This blog has an in-depth overview](https://chatbotslife.com/resnets-highwaynets-and-densenets-oh-my-9bb15918ee32)\n \n### Have fun and happy training! ",
"_____no_output_____"
]
],
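As one concrete illustration of the global average pooling suggestion in the list above, here is a minimal sketch (layer sizes are arbitrary and this is not a tuned model, just an example of the idea):

```python
import tensorflow as tf

gap_sketch = tf.keras.Sequential([
    tf.keras.layers.Conv2D(32, (3, 3), padding='same', activation='relu',
                           input_shape=(32, 32, 3)),
    # Average each 32x32 feature map down to one number per channel:
    # (N, 32, 32, 32) -> (N, 32), far fewer weights than Flatten + Dense.
    tf.keras.layers.GlobalAveragePooling2D(),
    tf.keras.layers.Dense(10, activation='softmax'),
])
```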
[
[
"class CustomConvNet(tf.keras.Model):\n def __init__(self):\n super().__init__()\n ############################################################################\n # TODO: Construct a model that performs well on CIFAR-10 #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n # input is [N, 32, 32, 3]\n initializer = tf.initializers.VarianceScaling(scale=2.0)\n \n self.conv11 = tf.keras.layers.Conv2D(512, (3, 3), padding='same', kernel_initializer=initializer) # 32, 32, 128\n self.prelu11 = tf.keras.layers.PReLU(alpha_initializer=initializer)\n self.bn11 = tf.keras.layers.BatchNormalization()\n \n self.conv12 = tf.keras.layers.Conv2D(256, (3, 3), padding='same', kernel_initializer=initializer) # 32, 32, 128\n self.prelu12 = tf.keras.layers.PReLU(alpha_initializer=initializer)\n self.bn12 = tf.keras.layers.BatchNormalization()\n \n self.conv13 = tf.keras.layers.Conv2D(128, (3, 3), padding='same', kernel_initializer=initializer) # 32, 32, 128\n self.prelu13 = tf.keras.layers.PReLU(alpha_initializer=initializer)\n self.bn13 = tf.keras.layers.BatchNormalization()\n \n self.conv2 = tf.keras.layers.Conv2D(64, (3, 3), padding='same', kernel_initializer=initializer) # 32, 32, 64\n self.prelu2 = tf.keras.layers.PReLU(alpha_initializer=initializer)\n self.bn2 = tf.keras.layers.BatchNormalization()\n self.maxpool2 = tf.keras.layers.MaxPool2D((2, 2), padding='same') # 16, 16, 64\n \n self.conv3 = tf.keras.layers.Conv2D(32, (3, 3), padding='same', kernel_initializer=initializer) # 16, 16, 32\n self.prelu3 = tf.keras.layers.PReLU(alpha_initializer=initializer)\n self.bn3 = tf.keras.layers.BatchNormalization()\n self.maxpool3 = tf.keras.layers.MaxPool2D((2, 2), padding='same') # 8, 8, 32\n\n self.flatten = tf.keras.layers.Flatten()\n self.fc = tf.keras.layers.Dense(10, activation='softmax', kernel_initializer=initializer)\n \n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n \n def call(self, input_tensor, training=False):\n ############################################################################\n # TODO: Construct a model that performs well on CIFAR-10 #\n ############################################################################\n # *****START OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n x = input_tensor\n \n x = self.conv11(x)\n x = self.prelu11(x)\n x = self.bn11(x, training)\n \n x = self.conv12(x)\n x = self.prelu12(x)\n x = self.bn12(x, training)\n \n x = self.conv13(x)\n x = self.prelu13(x)\n x = self.bn13(x, training)\n \n x = self.conv2(x)\n x = self.prelu2(x)\n x = self.bn2(x, training)\n x = self.maxpool2(x)\n \n x = self.conv3(x)\n x = self.prelu3(x)\n x = self.bn3(x, training)\n x = self.maxpool3(x)\n \n x = self.flatten(x)\n x = self.fc(x)\n\n # *****END OF YOUR CODE (DO NOT DELETE/MODIFY THIS LINE)*****\n ############################################################################\n # END OF YOUR CODE #\n ############################################################################\n \n return x\n ",
"_____no_output_____"
],
[
"device = '/device:GPU:0' # Change this to a CPU/GPU as you wish!\n# device = '/cpu:0' # Change this to a CPU/GPU as you wish!\nprint_every = 300\nnum_epochs = 10\n\n# model = CustomConvNet()\n# model = CustomResNet()\n\ndef model_init_fn():\n return CustomConvNet()\n\ndef optimizer_init_fn():\n learning_rate = 1e-3\n return tf.keras.optimizers.Adam(learning_rate) \n\ntrain_part34(model_init_fn, optimizer_init_fn, num_epochs=num_epochs, is_training=True)",
"Iteration 0, Epoch 1, Loss: 4.086295127868652, Accuracy: 12.5, Val Loss: 4.507420539855957, Val Accuracy: 13.199999809265137\nIteration 300, Epoch 1, Loss: 1.6533818244934082, Accuracy: 43.55793380737305, Val Loss: 1.471709132194519, Val Accuracy: 50.19999694824219\nIteration 600, Epoch 1, Loss: 1.4560245275497437, Accuracy: 49.78681564331055, Val Loss: 1.2510604858398438, Val Accuracy: 57.20000457763672\nIteration 900, Epoch 2, Loss: 1.007521152496338, Accuracy: 64.97685241699219, Val Loss: 1.0304434299468994, Val Accuracy: 63.900001525878906\nIteration 1200, Epoch 2, Loss: 0.9641342163085938, Accuracy: 66.3936767578125, Val Loss: 0.9729200601577759, Val Accuracy: 65.80000305175781\nIteration 1500, Epoch 2, Loss: 0.926266610622406, Accuracy: 67.551025390625, Val Loss: 0.9301822781562805, Val Accuracy: 67.19999694824219\nIteration 1800, Epoch 3, Loss: 0.7934252619743347, Accuracy: 72.49070739746094, Val Loss: 0.8698236346244812, Val Accuracy: 70.30000305175781\nIteration 2100, Epoch 3, Loss: 0.763357937335968, Accuracy: 73.67091369628906, Val Loss: 0.83391273021698, Val Accuracy: 71.5\nIteration 2400, Epoch 4, Loss: 0.6582642793655396, Accuracy: 77.12378692626953, Val Loss: 0.7901626825332642, Val Accuracy: 73.69999694824219\nIteration 2700, Epoch 4, Loss: 0.6490508913993835, Accuracy: 77.60545349121094, Val Loss: 0.8368052840232849, Val Accuracy: 71.80000305175781\nIteration 3000, Epoch 4, Loss: 0.6276915669441223, Accuracy: 78.28724670410156, Val Loss: 0.8429279923439026, Val Accuracy: 72.79999542236328\nIteration 3300, Epoch 5, Loss: 0.5404477119445801, Accuracy: 81.48734283447266, Val Loss: 0.8269108533859253, Val Accuracy: 74.5\nIteration 3600, Epoch 5, Loss: 0.5194686055183411, Accuracy: 82.16072845458984, Val Loss: 0.9136738181114197, Val Accuracy: 71.0\nIteration 3900, Epoch 6, Loss: 0.42960742115974426, Accuracy: 86.35562896728516, Val Loss: 0.8334789276123047, Val Accuracy: 73.69999694824219\nIteration 4200, Epoch 6, Loss: 0.409514844417572, Accuracy: 86.14386749267578, Val Loss: 0.859889030456543, Val Accuracy: 72.0\nIteration 4500, Epoch 6, Loss: 0.3866899609565735, Accuracy: 86.96441650390625, Val Loss: 0.9733158349990845, Val Accuracy: 71.80000305175781\nIteration 4800, Epoch 7, Loss: 0.31196486949920654, Accuracy: 89.12347412109375, Val Loss: 0.9835796356201172, Val Accuracy: 71.9000015258789\nIteration 5100, Epoch 7, Loss: 0.29267334938049316, Accuracy: 90.00928497314453, Val Loss: 1.1022939682006836, Val Accuracy: 72.79999542236328\nIteration 5400, Epoch 8, Loss: 0.24182185530662537, Accuracy: 91.34615325927734, Val Loss: 1.1322561502456665, Val Accuracy: 71.69999694824219\n"
]
],
[
[
"## Describe what you did \n\nIn the cell below you should write an explanation of what you did, any additional features that you implemented, and/or any graphs that you made in the process of training and evaluating your network.",
"_____no_output_____"
],
[
"Add more layers, conv layers with deeper channel and padding to keep the same size, only downsample when doing maxpool. Use PReLU instead of ReLU with Kaiming initialization like in his paper. Added batch norm after each conv-relu pair.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
]
] |
cbf95e4d3f0d2ddf700f70a3721a6e8b80d672f6
| 11,477 |
ipynb
|
Jupyter Notebook
|
reference/Simon's Algorithm.ipynb
|
LSaldyt/curry-examples
|
3a5ed137ff7a040cb6505813af849465cc39d59d
|
[
"MIT"
] | null | null | null |
reference/Simon's Algorithm.ipynb
|
LSaldyt/curry-examples
|
3a5ed137ff7a040cb6505813af849465cc39d59d
|
[
"MIT"
] | null | null | null |
reference/Simon's Algorithm.ipynb
|
LSaldyt/curry-examples
|
3a5ed137ff7a040cb6505813af849465cc39d59d
|
[
"MIT"
] | null | null | null | 37.877888 | 762 | 0.52566 |
[
[
[
"import itertools\nimport numpy as np\nimport pyquil.api as api\nfrom pyquil.gates import *\nfrom pyquil.quil import Program\nfrom gaussian_elimination import *",
"_____no_output_____"
]
],
[
[
"##### Problem Setup",
"_____no_output_____"
],
[
"The setup for Simon's problem consists of a given black-box operator that is a generalization from those given in the Deutsch and Deutsch-Jozsa problems, and maps $\\mathbf{f}: \\{0, 1\\}^n \\rightarrow \\{0, 1\\}^m$, such that<br>\n<br>\n$$ U_f : \\left\\vert \\mathbf{x} \\right\\rangle \\left\\vert \\mathbf{b} \\right\\rangle \\rightarrow \\left\\vert \\mathbf{x} \\right\\rangle \\left\\vert \\mathbf{b} \\oplus \\mathbf{f}(\\mathbf{x}) \\right\\rangle$$\n<br>\nwhere $\\mathbf{f}(\\mathbf{x}) \\in \\{0, 1\\}^m \\, \\, \\forall \\mathbf{x} \\in \\{ 0, 1 \\}^n$, $\\mathbf{b} \\in \\{0, 1\\}^m$, and the $\\oplus$ sign represents mod 2 addition on each of the components separately. The problem consists of finding \n$\\mathbf{s} \\in \\{0, 1\\}^n$ such that<br>\n<br>\n$$\\mathbf{f} (\\mathbf{x} \\oplus \\mathbf{s}) = \\mathbf{f} (\\mathbf{x})$$\nso that the function $\\mathbf{f}$ is periodic with period $\\mathbf{s}$.\n\nWe solve by first preparing the state $\\left\\vert\\mathbf{x} \\right\\rangle \\left\\vert 0 \\right\\rangle$, applying the black-box to produce the state $\\left\\vert\\mathbf{x}\\right\\rangle \\left\\vert \\mathbf{f}(\\mathbf{x})\\right\\rangle$, then applying $H^{\\otimes n}$ to the first register $\\left\\vert\\mathbf{x}\\right\\rangle$, then measuring it and recording the value $\\mathbf{w}_i$, repeating these steps until $\\text{span}\\{\\mathbf{w}_i\\}$ equals $n-1$, at which point we solve the equation $\\mathbf{W}\\mathbf{s}^{T} = \\mathbf{0}^{T}$ via Gaussian elimination to obtain $\\mathbf{s}$ as the unique non-zero solution. To see _why_ this works, the reader is referred to \"An introduction to quantum computing\" by P. Kaye et al.",
"_____no_output_____"
],
[
"##### Implementation Notes",
"_____no_output_____"
],
[
"We can generalize the black-box operator from the Deutsch-Jozsa problem to construct the one required here\n$$U_f = \\sum_{\\mathbf{x}=0}^{2^{n} - 1} \\left\\vert \\mathbf{x} \\right\\rangle \\left\\langle \\mathbf{x} \\right\\vert \\otimes \\left[ I + f_{i} (\\mathbf{x}) \\left( X - I \\right) \\right]^{\\otimes_{i=m-1}^{i=0}}$$\nFor example, if $m=2$, then\n$$ \\left[ I + f_{i} (\\mathbf{x}) \\left( X - I \\right) \\right]^{\\otimes_{i=m-1}^{i=0}} = \\left[ I + f_1(\\mathbf{x}) \\left( X - I \\right) \\right] \\otimes \\left[ I + f_0(\\mathbf{x}) \\left( X - I \\right)\\right]$$\n<br>\nand further if $n=3$, $\\mathbf{x} = 010$, and $\\mathbf{f}(\\mathbf{x}) = 10$, then\n$$ \\left[ I + f_{i} (\\mathbf{x}) \\left( X - I \\right) \\right]^{\\otimes_{i=m-1}^{i=0}} = \\left[ I + f_1(010) \\left( X - I\\right)\\right] \\otimes \\left[ I + f_0(010) \\left( X - I\\right)\\right] \\\\\n= \\left[ I + (1)(X-I)\\right] \\otimes \\left[ I + (0) (X-I)\\right] \\\\\n= X \\otimes I$$\n<br>\nThe sampling of the $\\mathbf{w}_{i}$ is done in such a way as to ensure the reduced row-echelon form of the collective $\\mathbf{W}$ matrix (note that since we're working with mod 2 arithmetic, we automatically have reduced row-echelon, and not just row-echelon form). Back-substitution is modified to work with mod 2 arithmetic. The entire process is implemented in gaussian_elimination.py, and for an excellent discussion of the mathematical details involved, the reader is referred to Section 18.13 of \"<c|Q|c> : A Course in Quantum Computing (for the Community College)\", Vol. 1 by Michael Loceff.",
"_____no_output_____"
],
[
"### Simon's Algorithm using (n+m) qubits",
"_____no_output_____"
]
],
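As a quick numerical sanity check of the worked example above (an addition for illustration, not part of the original notebook), we can confirm with NumPy that the target-register factor for f(010) = 10 indeed reduces to X ⊗ I:

```python
import numpy as np

I = np.eye(2)
X = np.array([[0, 1], [1, 0]])

# f(010) = '10', so f_1 = 1 and f_0 = 0
f1, f0 = 1, 0
factor = np.kron(I + f1 * (X - I), I + f0 * (X - I))
assert np.allclose(factor, np.kron(X, I))
print("Target factor equals X (x) I:", np.allclose(factor, np.kron(X, I)))
```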
[
[
"def qubit_strings(n):\n qubit_strings = []\n for q in itertools.product(['0', '1'], repeat=n):\n qubit_strings.append(''.join(q))\n return qubit_strings",
"_____no_output_____"
],
[
"def black_box_map(n, m, s):\n \"\"\"\n Black-box map f:{0,1}^n -> {0,1}^m, randomly taking values,\n and periodic with period s\n \"\"\"\n # ensure s lives in {0,1}^n\n if len(s) != n:\n raise AssertionError(\"Length of period vector should equal n\")\n # control qubits\n cont_qubs = qubit_strings(n)\n # target qubits\n targ_qubs = qubit_strings(m)\n \n # initialize empty dictionary to store map values\n d_blackbox = {}\n # initialize counter over control qubits\n i = 0\n # randomly select values from {0,1}^m for the periodic function\n while set(cont_qubs) - set(d_blackbox.keys()) != set():\n # pick a random target\n rand_targ = np.random.choice(targ_qubs)\n # set the same value for x and x + s\n d_blackbox[cont_qubs[i]] = rand_targ\n d_blackbox[add_vec_mod2(cont_qubs[i], s)] = rand_targ\n # avoid iterating over keys already assigned values\n while cont_qubs[i] in d_blackbox.keys():\n i = i + 1\n if i >= n:\n break\n \n return d_blackbox",
"_____no_output_____"
],
[
"def qubit_ket(qub_string):\n \"\"\"\n Form a basis ket out of n-bit string specified by the input 'qub_string', e.g.\n '001' -> |001>\n \"\"\"\n e0 = np.array([[1], [0]])\n e1 = np.array([[0], [1]])\n d_qubstring = {'0': e0, '1': e1}\n\n # initialize ket\n ket = d_qubstring[qub_string[0]]\n for i in range(1, len(qub_string)):\n ket = np.kron(ket, d_qubstring[qub_string[i]])\n \n return ket",
"_____no_output_____"
],
[
"def projection_op(qub_string):\n \"\"\"\n Creates a projection operator out of the basis element specified by 'qub_string', e.g.\n '101' -> |101> <101|\n \"\"\"\n ket = qubit_ket(qub_string)\n bra = np.transpose(ket) # all entries real, so no complex conjugation necessary\n proj = np.kron(ket, bra)\n return proj",
"_____no_output_____"
],
[
"def black_box(n, m, s):\n \"\"\"\n Inputs:-\n n: no. of control qubits\n m: no. of target qubits\n s: bit-string equal to the period of the black-box map\n \n Output:-\n Unitary representation of the black-box operator\n \"\"\"\n d_bb = black_box_map(n, m, s)\n # initialize unitary matrix\n N = 2**(n+m)\n unitary_rep = np.zeros(shape=(N, N))\n # populate unitary matrix\n for k, v in d_bb.items():\n # initialize target qubit operator\n targ_op = np.eye(2) + int(v[0])*(-np.eye(2) + np.array([[0, 1], [1, 0]]))\n # fill out the rest of the target qubit operator\n for i in range(1, m):\n cont_op = np.eye(2) + int(v[i])*(-np.eye(2) + np.array([[0, 1], [1, 0]]))\n targ_op = np.kron(targ_op, cont_op)\n # complete the unitary operator for current control qubit-register\n unitary_rep += np.kron(projection_op(k), targ_op)\n \n return unitary_rep",
"_____no_output_____"
],
[
"qvm = api.QVMConnection()\n# pick number of control qubits to be used\nn = 4\n# pick number of target qubits to be used\nm = 2\n# specify the period as an n bit-string\ns = '1011'\n# make sure s has the correct length\nif len(s) != n:\n raise ValueError(\"s does not have correct bit-string length\")\n# make sure s is non-zero\nif s == '0' * n:\n raise ValueError(\"s should not be zero vector\")\n# create the unitary black_box operator\nblackbox = black_box(n, m, s)\n# initialize the augmented matrix to be solved via Gaussian elimination\nW = []\n# initialize counter\ncounter = 0\n# run main loop\nwhile rank(W) < n-1:\n # initialize the program\n p = Program()\n\n # Define U_f\n p.defgate(\"U_f\", blackbox)\n\n # Prepare the initial state (1/sqrt[2])*(|0> + |1>)^(\\otimes n) \\otimes |0>^(\\otimes m)\n for m_ in range(m):\n p.inst(I(m_))\n for n_ in range(m, n+m):\n p.inst(H(n_))\n\n # Apply U_f\n p.inst((\"U_f\",) + tuple(range(n+m)[::-1]))\n\n # Apply final H^(\\otimes n)\n for n_ in range(m, n+m):\n p.inst(H(n_))\n\n # Final measurement\n classical_regs = list(range(n))\n for i, n_ in enumerate(list(range(m, n+m))[::-1]):\n p.measure(n_, classical_regs[i])\n\n measure_n_qubits = qvm.run(p, classical_regs)\n\n # flatten out list\n z = [item for sublist in measure_n_qubits for item in sublist]\n z.append(0)\n \n # add (or not) the new sample z to W\n W = new_sample(W, z)\n \n # increment counter\n counter = counter + 1\n del p\n \nprint (\"The period vector is found to be: \", solve_reduced_row_echelon_form(W))",
"The period vector is found to be: [1, 0, 1, 1]\n"
]
]
] |
[
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf96257f5c91ef3d587dee8dade6c10b4560e83
| 426,137 |
ipynb
|
Jupyter Notebook
|
assignment1/.ipynb_checkpoints/features-checkpoint.ipynb
|
adoval4/CS231n-Convolutional-Neural-Networks-for-Visual-Recognition
|
044f5b50bddd8626ebda6fa7ac71fed2292da1ad
|
[
"MIT"
] | 1 |
2018-03-16T19:46:48.000Z
|
2018-03-16T19:46:48.000Z
|
assignment1/.ipynb_checkpoints/features-checkpoint.ipynb
|
adoval4/CS231n-Convolutional-Neural-Networks-for-Visual-Recognition
|
044f5b50bddd8626ebda6fa7ac71fed2292da1ad
|
[
"MIT"
] | null | null | null |
assignment1/.ipynb_checkpoints/features-checkpoint.ipynb
|
adoval4/CS231n-Convolutional-Neural-Networks-for-Visual-Recognition
|
044f5b50bddd8626ebda6fa7ac71fed2292da1ad
|
[
"MIT"
] | null | null | null | 638.886057 | 325,672 | 0.932259 |
[
[
[
"# Image features exercise\n*Complete and hand in this completed worksheet (including its outputs and any supporting code outside of the worksheet) with your assignment submission. For more details see the [assignments page](http://vision.stanford.edu/teaching/cs231n/assignments.html) on the course website.*\n\nWe have seen that we can achieve reasonable performance on an image classification task by training a linear classifier on the pixels of the input image. In this exercise we will show that we can improve our classification performance by training linear classifiers not on raw pixels but on features that are computed from the raw pixels.\n\nAll of your work for this exercise will be done in this notebook.",
"_____no_output_____"
]
],
[
[
"import random\nimport numpy as np\nfrom cs231n.data_utils import load_CIFAR10\nimport matplotlib.pyplot as plt\n\nfrom __future__ import print_function\n\n%matplotlib inline\nplt.rcParams['figure.figsize'] = (10.0, 8.0) # set default size of plots\nplt.rcParams['image.interpolation'] = 'nearest'\nplt.rcParams['image.cmap'] = 'gray'\n\n# for auto-reloading extenrnal modules\n# see http://stackoverflow.com/questions/1907993/autoreload-of-modules-in-ipython\n%load_ext autoreload\n%autoreload 2",
"_____no_output_____"
]
],
[
[
"## Load data\nSimilar to previous exercises, we will load CIFAR-10 data from disk.",
"_____no_output_____"
]
],
[
[
"from cs231n.features import color_histogram_hsv, hog_feature\n\ndef get_CIFAR10_data(num_training=49000, num_validation=1000, num_test=1000):\n # Load the raw CIFAR-10 data\n cifar10_dir = 'cs231n/datasets/cifar-10-batches-py'\n X_train, y_train, X_test, y_test = load_CIFAR10(cifar10_dir)\n \n # Subsample the data\n mask = list(range(num_training, num_training + num_validation))\n X_val = X_train[mask]\n y_val = y_train[mask]\n mask = list(range(num_training))\n X_train = X_train[mask]\n y_train = y_train[mask]\n mask = list(range(num_test))\n X_test = X_test[mask]\n y_test = y_test[mask]\n \n return X_train, y_train, X_val, y_val, X_test, y_test\n\nX_train, y_train, X_val, y_val, X_test, y_test = get_CIFAR10_data()",
"_____no_output_____"
]
],
[
[
"## Extract Features\nFor each image we will compute a Histogram of Oriented\nGradients (HOG) as well as a color histogram using the hue channel in HSV\ncolor space. We form our final feature vector for each image by concatenating\nthe HOG and color histogram feature vectors.\n\nRoughly speaking, HOG should capture the texture of the image while ignoring\ncolor information, and the color histogram represents the color of the input\nimage while ignoring texture. As a result, we expect that using both together\nought to work better than using either alone. Verifying this assumption would\nbe a good thing to try for the bonus section.\n\nThe `hog_feature` and `color_histogram_hsv` functions both operate on a single\nimage and return a feature vector for that image. The extract_features\nfunction takes a set of images and a list of feature functions and evaluates\neach feature function on each image, storing the results in a matrix where\neach column is the concatenation of all feature vectors for a single image.",
"_____no_output_____"
]
],
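For intuition, the following is a rough sketch of what `extract_features` does internally (the authoritative implementation lives in `cs231n/features.py`; details such as the progress message are assumptions here):

```python
import numpy as np

def extract_features_sketch(imgs, feature_fns, verbose=False):
    """Evaluate every feature function on every image; concatenate per image."""
    num_images = imgs.shape[0]
    # Probe the first image to find the total feature dimensionality.
    first_feats = [np.asarray(fn(imgs[0])).ravel() for fn in feature_fns]
    total_dim = sum(f.size for f in first_feats)
    feats = np.zeros((num_images, total_dim))
    for i in range(num_images):
        feats[i] = np.concatenate(
            [np.asarray(fn(imgs[i])).ravel() for fn in feature_fns])
        if verbose and i > 0 and i % 1000 == 0:
            print('Done extracting features for %d / %d images' % (i, num_images))
    return feats
```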
[
[
"from cs231n.features import *\n\nnum_color_bins = 10 # Number of bins in the color histogram\nfeature_fns = [hog_feature, lambda img: color_histogram_hsv(img, nbin=num_color_bins)]\nX_train_feats = extract_features(X_train, feature_fns, verbose=True)\nX_val_feats = extract_features(X_val, feature_fns)\nX_test_feats = extract_features(X_test, feature_fns)\n\n# Preprocessing: Subtract the mean feature\nmean_feat = np.mean(X_train_feats, axis=0, keepdims=True)\nX_train_feats -= mean_feat\nX_val_feats -= mean_feat\nX_test_feats -= mean_feat\n\n# Preprocessing: Divide by standard deviation. This ensures that each feature\n# has roughly the same scale.\nstd_feat = np.std(X_train_feats, axis=0, keepdims=True)\nX_train_feats /= std_feat\nX_val_feats /= std_feat\nX_test_feats /= std_feat\n\n# Preprocessing: Add a bias dimension\nX_train_feats = np.hstack([X_train_feats, np.ones((X_train_feats.shape[0], 1))])\nX_val_feats = np.hstack([X_val_feats, np.ones((X_val_feats.shape[0], 1))])\nX_test_feats = np.hstack([X_test_feats, np.ones((X_test_feats.shape[0], 1))])",
"Done extracting features for 1000 / 49000 images\nDone extracting features for 2000 / 49000 images\nDone extracting features for 3000 / 49000 images\nDone extracting features for 4000 / 49000 images\nDone extracting features for 5000 / 49000 images\nDone extracting features for 6000 / 49000 images\nDone extracting features for 7000 / 49000 images\nDone extracting features for 8000 / 49000 images\nDone extracting features for 9000 / 49000 images\nDone extracting features for 10000 / 49000 images\nDone extracting features for 11000 / 49000 images\nDone extracting features for 12000 / 49000 images\nDone extracting features for 13000 / 49000 images\nDone extracting features for 14000 / 49000 images\nDone extracting features for 15000 / 49000 images\nDone extracting features for 16000 / 49000 images\nDone extracting features for 17000 / 49000 images\nDone extracting features for 18000 / 49000 images\nDone extracting features for 19000 / 49000 images\nDone extracting features for 20000 / 49000 images\nDone extracting features for 21000 / 49000 images\nDone extracting features for 22000 / 49000 images\nDone extracting features for 23000 / 49000 images\nDone extracting features for 24000 / 49000 images\nDone extracting features for 25000 / 49000 images\nDone extracting features for 26000 / 49000 images\nDone extracting features for 27000 / 49000 images\nDone extracting features for 28000 / 49000 images\nDone extracting features for 29000 / 49000 images\nDone extracting features for 30000 / 49000 images\nDone extracting features for 31000 / 49000 images\nDone extracting features for 32000 / 49000 images\nDone extracting features for 33000 / 49000 images\nDone extracting features for 34000 / 49000 images\nDone extracting features for 35000 / 49000 images\nDone extracting features for 36000 / 49000 images\nDone extracting features for 37000 / 49000 images\nDone extracting features for 38000 / 49000 images\nDone extracting features for 39000 / 49000 images\nDone extracting features for 40000 / 49000 images\nDone extracting features for 41000 / 49000 images\nDone extracting features for 42000 / 49000 images\nDone extracting features for 43000 / 49000 images\nDone extracting features for 44000 / 49000 images\nDone extracting features for 45000 / 49000 images\nDone extracting features for 46000 / 49000 images\nDone extracting features for 47000 / 49000 images\nDone extracting features for 48000 / 49000 images\n"
]
],
[
[
"## Train SVM on features\nUsing the multiclass SVM code developed earlier in the assignment, train SVMs on top of the features extracted above; this should achieve better results than training SVMs directly on top of raw pixels.",
"_____no_output_____"
]
],
[
[
"# Use the validation set to tune the learning rate and regularization strength\n\nfrom cs231n.classifiers.linear_classifier import LinearSVM\n\nlearning_rates = [1e-9, 1e-8, 1e-7]\nregularization_strengths = [5e4, 5e5, 5e6]\n\nresults = {}\nbest_val = -1\nbest_svm = None\n\npass\n################################################################################\n# TODO: #\n# Use the validation set to set the learning rate and regularization strength. #\n# This should be identical to the validation that you did for the SVM; save #\n# the best trained classifer in best_svm. You might also want to play #\n# with different numbers of bins in the color histogram. If you are careful #\n# you should be able to get accuracy of near 0.44 on the validation set. #\n################################################################################\nfor learning_rate in learning_rates:\n for reg in regularization_strengths:\n \n print('lr %e reg %e' % (learning_rate, reg,))\n \n svm = LinearSVM()\n loss_hist = svm.train(X_train_feats, y_train, learning_rate=learning_rate, reg=reg,\n num_iters=1500, verbose=True)\n \n y_train_pred = svm.predict(X_train_feats)\n y_val_pred = svm.predict(X_val_feats)\n \n accuracy_train = np.mean(y_train == y_train_pred)\n accuracy_val = np.mean(y_val == y_val_pred)\n \n results[(learning_rate, reg)] = (accuracy_train, accuracy_val)\n \n if best_val < accuracy_val:\n best_val = accuracy_val\n best_svm = svm\n################################################################################\n# END OF YOUR CODE #\n################################################################################\n\n# Print out results.\nfor lr, reg in sorted(results):\n train_accuracy, val_accuracy = results[(lr, reg)]\n print('lr %e reg %e train accuracy: %f val accuracy: %f' % (\n lr, reg, train_accuracy, val_accuracy))\n \nprint('best validation accuracy achieved during cross-validation: %f' % best_val)",
"lr 1.000000e-09 reg 5.000000e+04\niteration 0 / 1500: loss 45.926306\niteration 100 / 1500: loss 45.554489\niteration 200 / 1500: loss 45.183917\niteration 300 / 1500: loss 44.821866\niteration 400 / 1500: loss 44.484931\niteration 500 / 1500: loss 44.124498\niteration 600 / 1500: loss 43.783974\niteration 700 / 1500: loss 43.428738\niteration 800 / 1500: loss 43.101879\niteration 900 / 1500: loss 42.753336\niteration 1000 / 1500: loss 42.419674\niteration 1100 / 1500: loss 42.086988\niteration 1200 / 1500: loss 41.751431\niteration 1300 / 1500: loss 41.435967\niteration 1400 / 1500: loss 41.095771\nlr 1.000000e-09 reg 5.000000e+05\niteration 0 / 1500: loss 414.627935\niteration 100 / 1500: loss 376.044424\niteration 200 / 1500: loss 341.082728\niteration 300 / 1500: loss 309.478557\niteration 400 / 1500: loss 280.888943\niteration 500 / 1500: loss 254.999288\niteration 600 / 1500: loss 231.583832\niteration 700 / 1500: loss 210.403827\niteration 800 / 1500: loss 191.226810\niteration 900 / 1500: loss 173.878564\niteration 1000 / 1500: loss 158.184842\niteration 1100 / 1500: loss 143.983799\niteration 1200 / 1500: loss 131.139730\niteration 1300 / 1500: loss 119.517390\niteration 1400 / 1500: loss 108.985160\nlr 1.000000e-09 reg 5.000000e+06\niteration 0 / 1500: loss 4153.086168\niteration 100 / 1500: loss 1529.710713\niteration 200 / 1500: loss 567.031729\niteration 300 / 1500: loss 213.774634\niteration 400 / 1500: loss 84.145800\niteration 500 / 1500: loss 36.575035\niteration 600 / 1500: loss 19.118216\niteration 700 / 1500: loss 12.713243\niteration 800 / 1500: loss 10.362637\niteration 900 / 1500: loss 9.500105\niteration 1000 / 1500: loss 9.183302\niteration 1100 / 1500: loss 9.067288\niteration 1200 / 1500: loss 9.024710\niteration 1300 / 1500: loss 9.009060\niteration 1400 / 1500: loss 9.003322\nlr 1.000000e-08 reg 5.000000e+04\niteration 0 / 1500: loss 49.084934\niteration 100 / 1500: loss 45.275134\niteration 200 / 1500: loss 41.813541\niteration 300 / 1500: loss 38.684660\niteration 400 / 1500: loss 35.864991\niteration 500 / 1500: loss 33.304159\niteration 600 / 1500: loss 30.988808\niteration 700 / 1500: loss 28.881095\niteration 800 / 1500: loss 26.995657\niteration 900 / 1500: loss 25.281102\niteration 1000 / 1500: loss 23.728634\niteration 1100 / 1500: loss 22.338749\niteration 1200 / 1500: loss 21.067701\niteration 1300 / 1500: loss 19.921585\niteration 1400 / 1500: loss 18.887040\nlr 1.000000e-08 reg 5.000000e+05\niteration 0 / 1500: loss 393.122096\niteration 100 / 1500: loss 149.955394\niteration 200 / 1500: loss 60.719932\niteration 300 / 1500: loss 27.978500\niteration 400 / 1500: loss 15.966029\niteration 500 / 1500: loss 11.555313\niteration 600 / 1500: loss 9.937852\niteration 700 / 1500: loss 9.344030\niteration 800 / 1500: loss 9.126315\niteration 900 / 1500: loss 9.046275\niteration 1000 / 1500: loss 9.016956\niteration 1100 / 1500: loss 9.006148\niteration 1200 / 1500: loss 9.002199\niteration 1300 / 1500: loss 9.000751\niteration 1400 / 1500: loss 9.000247\nlr 1.000000e-08 reg 5.000000e+06\niteration 0 / 1500: loss 3793.621751\niteration 100 / 1500: loss 9.132679\niteration 200 / 1500: loss 8.999999\niteration 300 / 1500: loss 8.999994\niteration 400 / 1500: loss 8.999993\niteration 500 / 1500: loss 8.999993\niteration 600 / 1500: loss 8.999994\niteration 700 / 1500: loss 8.999992\niteration 800 / 1500: loss 8.999995\niteration 900 / 1500: loss 8.999993\niteration 1000 / 1500: loss 8.999994\niteration 1100 / 1500: loss 8.999994\niteration 1200 / 1500: loss 
8.999994\niteration 1300 / 1500: loss 8.999992\niteration 1400 / 1500: loss 8.999994\nlr 1.000000e-07 reg 5.000000e+04\niteration 0 / 1500: loss 50.379821\niteration 100 / 1500: loss 24.187335\niteration 200 / 1500: loss 14.567724\niteration 300 / 1500: loss 11.045816\niteration 400 / 1500: loss 9.749179\niteration 500 / 1500: loss 9.274037\niteration 600 / 1500: loss 9.099745\niteration 700 / 1500: loss 9.036239\niteration 800 / 1500: loss 9.012568\niteration 900 / 1500: loss 9.004513\niteration 1000 / 1500: loss 9.001106\niteration 1100 / 1500: loss 9.000029\niteration 1200 / 1500: loss 8.999610\niteration 1300 / 1500: loss 8.999325\niteration 1400 / 1500: loss 8.999321\nlr 1.000000e-07 reg 5.000000e+05\niteration 0 / 1500: loss 388.581246\niteration 100 / 1500: loss 9.013197\niteration 200 / 1500: loss 8.999935\niteration 300 / 1500: loss 8.999929\niteration 400 / 1500: loss 8.999955\niteration 500 / 1500: loss 8.999937\niteration 600 / 1500: loss 8.999932\niteration 700 / 1500: loss 8.999940\niteration 800 / 1500: loss 8.999924\niteration 900 / 1500: loss 8.999936\niteration 1000 / 1500: loss 8.999924\niteration 1100 / 1500: loss 8.999926\niteration 1200 / 1500: loss 8.999930\niteration 1300 / 1500: loss 8.999946\niteration 1400 / 1500: loss 8.999910\nlr 1.000000e-07 reg 5.000000e+06\niteration 0 / 1500: loss 3932.836250\niteration 100 / 1500: loss 8.999994\niteration 200 / 1500: loss 8.999995\niteration 300 / 1500: loss 8.999995\niteration 400 / 1500: loss 8.999995\niteration 500 / 1500: loss 8.999997\niteration 600 / 1500: loss 8.999996\niteration 700 / 1500: loss 8.999995\niteration 800 / 1500: loss 8.999995\niteration 900 / 1500: loss 8.999996\niteration 1000 / 1500: loss 8.999997\niteration 1100 / 1500: loss 8.999996\niteration 1200 / 1500: loss 8.999996\niteration 1300 / 1500: loss 8.999994\niteration 1400 / 1500: loss 8.999995\nlr 1.000000e-09 reg 5.000000e+04 train accuracy: 0.086510 val accuracy: 0.083000\nlr 1.000000e-09 reg 5.000000e+05 train accuracy: 0.101449 val accuracy: 0.107000\nlr 1.000000e-09 reg 5.000000e+06 train accuracy: 0.119122 val accuracy: 0.112000\nlr 1.000000e-08 reg 5.000000e+04 train accuracy: 0.097673 val accuracy: 0.093000\nlr 1.000000e-08 reg 5.000000e+05 train accuracy: 0.351020 val accuracy: 0.354000\nlr 1.000000e-08 reg 5.000000e+06 train accuracy: 0.411286 val accuracy: 0.405000\nlr 1.000000e-07 reg 5.000000e+04 train accuracy: 0.411143 val accuracy: 0.409000\nlr 1.000000e-07 reg 5.000000e+05 train accuracy: 0.414061 val accuracy: 0.415000\nlr 1.000000e-07 reg 5.000000e+06 train accuracy: 0.372694 val accuracy: 0.389000\nbest validation accuracy achieved during cross-validation: 0.415000\n"
],
[
"# Evaluate your trained SVM on the test set\ny_test_pred = best_svm.predict(X_test_feats)\ntest_accuracy = np.mean(y_test == y_test_pred)\nprint(test_accuracy)",
"0.412\n"
],
[
"# An important way to gain intuition about how an algorithm works is to\n# visualize the mistakes that it makes. In this visualization, we show examples\n# of images that are misclassified by our current system. The first column\n# shows images that our system labeled as \"plane\" but whose true label is\n# something other than \"plane\".\n\nexamples_per_class = 8\nclasses = ['plane', 'car', 'bird', 'cat', 'deer', 'dog', 'frog', 'horse', 'ship', 'truck']\nfor cls, cls_name in enumerate(classes):\n idxs = np.where((y_test != cls) & (y_test_pred == cls))[0]\n idxs = np.random.choice(idxs, examples_per_class, replace=False)\n for i, idx in enumerate(idxs):\n plt.subplot(examples_per_class, len(classes), i * len(classes) + cls + 1)\n plt.imshow(X_test[idx].astype('uint8'))\n plt.axis('off')\n if i == 0:\n plt.title(cls_name)\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Inline question 1:\nDescribe the misclassification results that you see. Do they make sense?",
"_____no_output_____"
],
[
"## Neural Network on image features\nEarlier in this assigment we saw that training a two-layer neural network on raw pixels achieved better classification performance than linear classifiers on raw pixels. In this notebook we have seen that linear classifiers on image features outperform linear classifiers on raw pixels. \n\nFor completeness, we should also try training a neural network on image features. This approach should outperform all previous approaches: you should easily be able to achieve over 55% classification accuracy on the test set; our best model achieves about 60% classification accuracy.",
"_____no_output_____"
]
],
[
[
"print(X_train_feats.shape)",
"(49000, 155)\n"
],
[
"from cs231n.classifiers.neural_net import TwoLayerNet\n\ninput_dim = X_train_feats.shape[1]\nhidden_dim = 1024\nnum_classes = 10\n\nnet = TwoLayerNet(input_dim, hidden_dim, num_classes)\nbest_net = None\n\n################################################################################\n# TODO: Train a two-layer neural network on image features. You may want to #\n# cross-validate various parameters as in previous sections. Store your best #\n# model in the best_net variable. #\n################################################################################\n# Train the network\n_reg=0\n\n_learning_rate=1e-4\n_learning_rate_decay=0.95\n_num_iters=1000\n\nnet = TwoLayerNet(input_dim, hidden_dim, num_classes)\n\n# Train the network\nstats = net.train(X_train_feats, y_train, X_val_feats, y_val,\n num_iters=_num_iters, batch_size=200,\n learning_rate=_learning_rate, learning_rate_decay=_learning_rate_decay,\n reg=_reg, verbose=True)\n\n\n# Predict on the validation set\nval_acc = (net.predict(X_val_feats) == y_val).mean()\nprint('Validation accuracy: ', val_acc)\n\n# Plot the loss function and train / validation accuracies\nplt.subplot(2, 1, 1)\nplt.plot(stats['loss_history'])\nplt.title('Loss history')\nplt.xlabel('Iteration')\nplt.ylabel('Loss')\n\nplt.subplot(2, 1, 2)\nplt.plot(stats['train_acc_history'], label='train')\nplt.plot(stats['val_acc_history'], label='val')\nplt.title('Classification accuracy history')\nplt.xlabel('Epoch')\nplt.ylabel('Clasification accuracy')\nplt.show()\n################################################################################\n# END OF YOUR CODE #\n################################################################################",
"iteration 0 / 1000: loss 2.302585\niteration 100 / 1000: loss 2.302586\niteration 200 / 1000: loss 2.302586\niteration 300 / 1000: loss 2.302584\niteration 400 / 1000: loss 2.302586\niteration 500 / 1000: loss 2.302585\niteration 600 / 1000: loss 2.302586\niteration 700 / 1000: loss 2.302583\niteration 800 / 1000: loss 2.302591\niteration 900 / 1000: loss 2.302585\nValidation accuracy: 0.078\n"
],
[
"# Run your neural net classifier on the test set. You should be able to\n# get more than 55% accuracy.\n\ntest_acc = (net.predict(X_test_feats) == y_test).mean()\nprint(test_acc)",
"_____no_output_____"
]
],
[
[
"# Bonus: Design your own features!\n\nYou have seen that simple image features can improve classification performance. So far we have tried HOG and color histograms, but other types of features may be able to achieve even better classification performance.\n\nFor bonus points, design and implement a new type of feature and use it for image classification on CIFAR-10. Explain how your feature works and why you expect it to be useful for image classification. Implement it in this notebook, cross-validate any hyperparameters, and compare its performance to the HOG + Color histogram baseline.",
"_____no_output_____"
],
[
"# Bonus: Do something extra!\nUse the material and code we have presented in this assignment to do something interesting. Was there another question we should have asked? Did any cool ideas pop into your head as you were working on the assignment? This is your chance to show off!",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
]
] |
cbf966071a73d4ab0f4ffdac2773b5cc13615e1f
| 40,104 |
ipynb
|
Jupyter Notebook
|
workshops/Sagemaker_Pipelines_Automated_Retraining/sagemaker_pipelines_automated_retraining.ipynb
|
voitau/aws-healthcare-lifescience-ai-ml-sample-notebooks
|
e9b435483cc45bc3187fbdfd9c30b2506a7daeb6
|
[
"MIT-0"
] | 19 |
2021-12-20T20:15:49.000Z
|
2022-03-29T00:22:35.000Z
|
workshops/Sagemaker_Pipelines_Automated_Retraining/sagemaker_pipelines_automated_retraining.ipynb
|
voitau/aws-healthcare-lifescience-ai-ml-sample-notebooks
|
e9b435483cc45bc3187fbdfd9c30b2506a7daeb6
|
[
"MIT-0"
] | 3 |
2022-01-12T17:55:49.000Z
|
2022-02-03T15:32:59.000Z
|
workshops/Sagemaker_Pipelines_Automated_Retraining/sagemaker_pipelines_automated_retraining.ipynb
|
voitau/aws-healthcare-lifescience-ai-ml-sample-notebooks
|
e9b435483cc45bc3187fbdfd9c30b2506a7daeb6
|
[
"MIT-0"
] | 6 |
2022-01-08T01:10:13.000Z
|
2022-03-30T17:06:44.000Z
| 86.431034 | 26,400 | 0.830216 |
[
[
[
"# Automate Retraining of Models using SageMaker Pipelines and Lambda\n\n# Learning Objectives\n1. Construct a [SageMaker Pipeline](https://aws.amazon.com/sagemaker/pipelines/) that consists of a data preprocessing step and a model training step.\n2. Execute a SageMaker Pipeline manually\n3. Build infrastructure, using [CloudFormation](https://aws.amazon.com/cloudformation/) and [AWS Lambda](https://aws.amazon.com/lambda/) to allow the Pipeline steps be executed in an event-driven manner when new data is dropped in S3.\n\n\n## Introduction\nThis workshop shows how you can build and deploy SageMaker Pipelines for multistep processes. In this example, we will build a pipeline that:\n\n 1. Deduplicates the underlying data\n \n 2. Trains a built-in SageMaker algorithm (XGBoost) \n\nA common workflow is that models need to be retrained when new data arrives. This notebook also shows how you can set up a Lambda function that will retrigger the retraining pipeline when new data comes in.\n\nPlease use the `Python 3 (Data Science)` kernel for this workshop.",
"_____no_output_____"
]
],
[
[
"import boto3\nimport json\nimport logging\nimport os\nimport pandas\nimport sagemaker\nfrom sagemaker.workflow.parameters import ParameterString\nfrom sagemaker.workflow.steps import ProcessingStep, TrainingStep\nfrom sagemaker.sklearn.processing import SKLearnProcessor\nfrom sagemaker.workflow.pipeline import Pipeline\nfrom sagemaker.inputs import TrainingInput\nfrom sagemaker.processing import ProcessingInput, ProcessingOutput\nfrom sagemaker.estimator import Estimator\nfrom time import gmtime, strftime\n\n# set logs if not done already\nlogger = logging.getLogger(\"log\")\nif not logger.handlers:\n logger.setLevel(logging.INFO)\n logger.addHandler(logging.StreamHandler())",
"_____no_output_____"
]
],
[
[
"First, get permissions and other information. We will also create a pipeline name",
"_____no_output_____"
]
],
[
[
"session = sagemaker.Session()\ndefault_bucket = session.default_bucket()\nrole = sagemaker.get_execution_role()\nregion = boto3.Session().region_name\ns3_client = boto3.client(\"s3\", region_name=region)\n\ncurrent_timestamp = strftime(\"%m-%d-%H-%M\", gmtime())\npipeline_name = f\"my-pipeline-{current_timestamp}\"\nprefix = f\"pipeline-lab{current_timestamp}\"",
"_____no_output_____"
]
],
[
[
"## Transfer Data into Your Account",
"_____no_output_____"
]
],
[
[
"copy_source = {\n \"Bucket\": \"aws-hcls-ml\",\n \"Key\": \"workshop/immersion_day_workshop_data_DO_NOT_DELETE/data/ObesityDataSet_with_duplicates.csv\",\n}\ns3_client.copy(\n copy_source, default_bucket, f\"{prefix}/ObesityDataSet_with_duplicates.csv\"\n)\n\ncopy_source = {\n \"Bucket\": \"aws-hcls-ml\",\n \"Key\": \"workshop/immersion_day_workshop_data_DO_NOT_DELETE/kick_off_sagemaker_pipelines_lambda/other_material/lambda.zip\",\n}\ns3_client.copy(copy_source, default_bucket, f\"{prefix}/lambda.zip\")",
"_____no_output_____"
]
],
[
[
"## Define the Pipeline\n\nFirst we will create a preprocessing step. The preprocessing step simply removes duplicated rows from the dataset. The `preprocessing.py` script will be written locally, and then built as a SageMaker Pipelines step.",
"_____no_output_____"
]
],
[
[
"input_data = ParameterString(\n name=\"InputData\",\n default_value=f\"s3://{default_bucket}/{prefix}/ObesityDataSet_with_duplicates.csv\",\n)",
"_____no_output_____"
],
[
"%%writefile preprocessing.py\nimport pandas\nimport os\nbase_dir = \"/opt/ml/processing/input\"\nthe_files = os.listdir(base_dir)\nthe_file=[i for i in the_files if \".csv\" in i][0] #get the first csv\nprint(the_file)\ndf_1=pandas.read_csv(f'{base_dir}/{the_file}',engine='python')\ndf_2=df_1.drop_duplicates()\ndf_2.to_csv(f'/opt/ml/processing/output/deduped_{the_file}.csv') \n",
"_____no_output_____"
],
[
"# Specify the container and framework options\n\nsklearn_processor = SKLearnProcessor(\n framework_version=\"0.23-1\",\n instance_type=\"ml.t3.medium\",\n instance_count=1,\n base_job_name=\"sklearn-abalone-process\",\n role=role,\n)",
"_____no_output_____"
]
],
[
[
"Now will will turn the preprocessing step as a SageMaker Processing Step with SageMaker Pipelines.",
"_____no_output_____"
]
],
[
[
"step_process = ProcessingStep(\n name=\"deduplication-process\",\n processor=sklearn_processor,\n inputs=[\n ProcessingInput(source=input_data, destination=\"/opt/ml/processing/input\"),\n ],\n outputs=[\n ProcessingOutput(output_name=\"deduplicated\", source=\"/opt/ml/processing/output\")\n ],\n code=\"preprocessing.py\",\n)",
"_____no_output_____"
]
],
[
[
"## Define the Model\nNow we will create a SageMaker model. We will use the SageMaker built-in XGBoost Algorithm.",
"_____no_output_____"
]
],
[
[
"# Define the model training parameters\nmodel_path = f\"s3://{default_bucket}/{prefix}/myPipelineTrain\"\n\nimage_uri = sagemaker.image_uris.retrieve(\n framework=\"xgboost\",\n region=region,\n version=\"1.0-1\",\n py_version=\"py3\",\n instance_type=\"ml.m5.large\",\n)\nxgb_train = Estimator(\n image_uri=image_uri,\n instance_type=\"ml.m5.large\",\n instance_count=1,\n output_path=model_path,\n role=role,\n)\nxgb_train.set_hyperparameters(\n objective=\"reg:linear\",\n num_round=50,\n max_depth=5,\n eta=0.2,\n gamma=4,\n min_child_weight=6,\n subsample=0.7,\n silent=0,\n)",
"_____no_output_____"
]
],
[
[
"Turn the model training into a SageMaker Pipeline Training Step.",
"_____no_output_____"
]
],
[
[
"# Define the training steps\n\nstep_train = TrainingStep(\n name=\"model-training\",\n estimator=xgb_train,\n inputs={\n \"train\": TrainingInput(\n s3_data=step_process.properties.ProcessingOutputConfig.Outputs[\n \"deduplicated\"\n ].S3Output.S3Uri,\n content_type=\"text/csv\",\n ),\n \"validation\": TrainingInput(\n s3_data=step_process.properties.ProcessingOutputConfig.Outputs[\n \"deduplicated\"\n ].S3Output.S3Uri,\n content_type=\"text/csv\",\n ),\n },\n)",
"_____no_output_____"
]
],
[
[
"## Create and Start the Pipeline",
"_____no_output_____"
]
],
[
[
"# Create a two-step data processing and model training pipeline\n\npipeline_name = \"ObesityModelRetrainingPipeLine\"\npipeline = Pipeline(\n name=pipeline_name,\n parameters=[\n input_data,\n ],\n steps=[step_process, step_train],\n)\npipeline.upsert(role_arn=role)\npipeline_execution = pipeline.start()",
"_____no_output_____"
],
[
"# Wait 15 minutes for the pipeline to finish running. In the meantime, you can monitor its progress in SageMaker Studio\npipeline_execution.wait()",
"_____no_output_____"
]
],
[
[
"## Deploy a CloudFormation Template to retrain the Pipeline\n\nNow we will deploy a cloudformation template that will allow for automated calling of the Pipeline when new files are dropped in an S3 bucket.\n\nThe architecture looks like this:\n\n",
"_____no_output_____"
],
[
"NOTE: In order to run the following steps you must first associate the following IAM policies to your SageMaker execution role:\n- cloudformation:CreateStack\n- cloudformation:DeleteStack\n- cloudformation:DescribeStacks\n- iam:CreateRole\n- iam:DeleteRole\n- iam:DeleteRolePolicy\n- iam:GetRole\n- iam:GetRolePolicy\n- iam:PassRole\n- iam:PutRolePolicy\n- lambda:AddPermission\n- lambda:CreateFunction\n- lambda:GetFunction\n- lambda:DeleteFuncton",
"_____no_output_____"
]
],
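For reference, the Lambda handler packaged in `lambda.zip` might look roughly like the sketch below. This is an assumption for illustration — the shipped code may differ; `PIPELINE_NAME` and the parameter name `InputData` are taken from the pipeline defined earlier in this notebook. It reacts to an S3 put event by calling the `StartPipelineExecution` API:

```python
import boto3

PIPELINE_NAME = "ObesityModelRetrainingPipeLine"  # assumed to match the pipeline above
sm_client = boto3.client("sagemaker")

def lambda_handler(event, context):
    # Pull the bucket and key out of the S3 event notification
    record = event["Records"][0]["s3"]
    input_uri = f"s3://{record['bucket']['name']}/{record['object']['key']}"
    # Start a new pipeline execution, pointing InputData at the new file
    response = sm_client.start_pipeline_execution(
        PipelineName=PIPELINE_NAME,
        PipelineParameters=[{"Name": "InputData", "Value": input_uri}],
    )
    return {"PipelineExecutionArn": response["PipelineExecutionArn"]}
```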
[
[
"# Create a new CloudFormation stack to trigger retraining with new data\n\nstack_name = \"sagemaker-automated-retraining\"\n\nwith open(\"cfn_sagemaker_pipelines.yaml\") as f:\n template_str = f.read()\ncfn = boto3.client(\"cloudformation\")\ncfn.create_stack(\n StackName=stack_name,\n TemplateBody=template_str,\n Capabilities=[\"CAPABILITY_IAM\"],\n Parameters=[\n {\"ParameterKey\": \"StaticCodeBucket\", \"ParameterValue\": default_bucket},\n {\"ParameterKey\": \"StaticCodeKey\", \"ParameterValue\": f\"{prefix}/lambda.zip\"},\n ],\n)",
"_____no_output_____"
],
[
"# Wait until stack creation is complete\nwaiter = cfn.get_waiter(\"stack_create_complete\")\nwaiter.wait(StackName=stack_name)",
"_____no_output_____"
],
[
"# Identify the S3 bucket for triggering the training pipeline\ninput_bucket_name = cfn.describe_stacks(StackName=stack_name)[\"Stacks\"][0][\"Outputs\"][0][\"OutputValue\"]",
"_____no_output_____"
],
[
"# Copy the training data to the input bucket to start a new pipeline execution\ncopy_source = {\n \"Bucket\": default_bucket,\n \"Key\": f\"{prefix}/ObesityDataSet_with_duplicates.csv\",\n}\ns3_client.copy(copy_source, input_bucket_name, \"ObesityDataSet_with_duplicates.csv\")",
"_____no_output_____"
]
],
[
[
"### (Optional)",
"_____no_output_____"
],
[
"1. Inspect that the `InputBucket` has new data\n2. Examine the `SageMaker Pipelines` execution from the SageMaker Studio console",
"_____no_output_____"
]
],
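A sketch of how to do both checks programmatically, reusing the `s3_client`, `pipeline_name`, and `input_bucket_name` objects defined earlier (the response keys follow the standard boto3 shapes):

```python
import boto3

# 1. Inspect the trigger bucket's contents
for obj in s3_client.list_objects_v2(Bucket=input_bucket_name).get("Contents", []):
    print(obj["Key"], obj["Size"])

# 2. List the most recent executions of the retraining pipeline
sm_client = boto3.client("sagemaker")
response = sm_client.list_pipeline_executions(PipelineName=pipeline_name)
for summary in response["PipelineExecutionSummaries"]:
    print(summary["PipelineExecutionArn"], summary["PipelineExecutionStatus"])
```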
[
[
"#!aws s3 rm --recursive s3://{input_bucket_name}",
"_____no_output_____"
]
],
[
[
"## Closing",
"_____no_output_____"
],
[
"In this notebook we demonstrated how to create a SageMaker pipeline for data processing and model training and triggered it using an S3 event.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
cbf96a78d0fd8dc9ce98ec45ceb9cacf07ece0be
| 4,392 |
ipynb
|
Jupyter Notebook
|
notebooks/DEVELOPMENT-real_time_analysis_w_microphone.ipynb
|
cshawnkeech/rainforest_audio
|
95cd61304740bd6da46a559cfe7feda4e6539763
|
[
"MIT"
] | 1 |
2021-05-05T01:38:46.000Z
|
2021-05-05T01:38:46.000Z
|
notebooks/DEVELOPMENT-real_time_analysis_w_microphone.ipynb
|
cshawnkeech/rainforest_audio
|
95cd61304740bd6da46a559cfe7feda4e6539763
|
[
"MIT"
] | null | null | null |
notebooks/DEVELOPMENT-real_time_analysis_w_microphone.ipynb
|
cshawnkeech/rainforest_audio
|
95cd61304740bd6da46a559cfe7feda4e6539763
|
[
"MIT"
] | null | null | null | 24 | 540 | 0.548497 |
[
[
[
"# Realtime Predict Test",
"_____no_output_____"
]
],
[
[
"mel_model = tf.keras.models.load_model('../data/saved_models/mel_1_sec_model_(95acc)/mel_1_sec_model')",
"_____no_output_____"
],
[
"import src.audio_functions as afun\nimport src.audio_prep as aprep",
"_____no_output_____"
],
[
"sr = 48000\ntest_tp_slice = librosa_test[:sr]\n",
"_____no_output_____"
],
[
"ipd.Audio(test_tp_slice, \n rate=sr,\n normalize=True)",
"_____no_output_____"
],
[
"test_tp_slice.shape",
"_____no_output_____"
],
[
"test_spect = aprep.mel_1s_snapshot(test_tp_slice)",
"_____no_output_____"
],
[
"test_spect.shape",
"_____no_output_____"
],
[
"# mel_reshape = [tf.expand_dims(i, -1) for i in stft_dict['mel']]\n# mel_reshape_np = np.array(mel_reshape)\n\n\ntest_spect_reshape = tf.expand_dims(test_spect, -1)\ntest_spect_np = np.array(test_spect_reshape)",
"_____no_output_____"
],
[
"pred = tp_model.predict(test_spect)",
"_____no_output_____"
],
[
"import serve_prediction as serve_p\nprint(test_tp_slice.shape)\n\nnew_pred = serve_p.get_pred(test_tp_slice)\nprint(new_pred.shape)",
"_____no_output_____"
],
[
"prediction = tp_model(test_spect_np)\n\n\nplt.bar([x for x in unique_classes], tf.nn.softmax(prediction[1]))\nplt.xticks(ticks = [i for i in range(24)], labels=[str(i) for i in range(24)])\nplt.title('Which Species?')\nplt.ylabel('Probability')\nplt.xlabel('Species')\nplt.ylim(0,1)",
"_____no_output_____"
],
[
"import src.audio_functions as a_fun\n\na_fun.stream_seconds()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf9743aa52836fc2cb26715dad35c050b0c254e
| 16,620 |
ipynb
|
Jupyter Notebook
|
House Prices: Advanced Regression Techniques/examples_from_others/.ipynb_checkpoints/kernel_3_how_to_handle_missing_value-checkpoint.ipynb
|
hanmeng31/Kaggle_Practices
|
989932842e63e194a1bebf74cb307387f9faf915
|
[
"MIT"
] | null | null | null |
House Prices: Advanced Regression Techniques/examples_from_others/.ipynb_checkpoints/kernel_3_how_to_handle_missing_value-checkpoint.ipynb
|
hanmeng31/Kaggle_Practices
|
989932842e63e194a1bebf74cb307387f9faf915
|
[
"MIT"
] | null | null | null |
House Prices: Advanced Regression Techniques/examples_from_others/.ipynb_checkpoints/kernel_3_how_to_handle_missing_value-checkpoint.ipynb
|
hanmeng31/Kaggle_Practices
|
989932842e63e194a1bebf74cb307387f9faf915
|
[
"MIT"
] | null | null | null | 46.038781 | 1,111 | 0.639531 |
[
[
[
"*This tutorial is part Level 2 in the [Learn Machine Learning](https://www.kaggle.com/learn/machine-learning) curriculum. This tutorial picks up where Level 1 finished, so you will get the most out of it if you've done the exercise from Level 1.*\n\nIn this step, you will learn three approaches to dealing with missing values. You will then learn to compare the effectiveness of these approaches on any given dataset.* \n\n# Introduction\n\nThere are many ways data can end up with missing values. For example\n- A 2 bedroom house wouldn't include an answer for _How large is the third bedroom_\n- Someone being surveyed may choose not to share their income\n\nPython libraries represent missing numbers as **nan** which is short for \"not a number\". You can detect which cells have missing values, and then count how many there are in each column with the command:\n```\nmissing_val_count_by_column = (data.isnull().sum())\nprint(missing_val_count_by_column[missing_val_count_by_column > 0\n```\n\nMost libraries (including scikit-learn) will give you an error if you try to build a model using data with missing values. So you'll need to choose one of the strategies below.\n\n---\n## Solutions\n\n\n## 1) A Simple Option: Drop Columns with Missing Values\nIf your data is in a DataFrame called `original_data`, you can drop columns with missing values. One way to do that is\n```\ndata_without_missing_values = original_data.dropna(axis=1)\n```\n\nIn many cases, you'll have both a training dataset and a test dataset. You will want to drop the same columns in both DataFrames. In that case, you would write\n\n```\ncols_with_missing = [col for col in original_data.columns \n if original_data[col].isnull().any()]\nredued_original_data = original_data.drop(cols_with_missing, axis=1)\nreduced_test_data = test_data.drop(cols_with_missing, axis=1)\n```\nIf those columns had useful information (in the places that were not missing), your model loses access to this information when the column is dropped. Also, if your test data has missing values in places where your training data did not, this will result in an error. \n\nSo, it's somewhat usually not the best solution. However, it can be useful when most values in a column are missing.\n\n\n\n## 2) A Better Option: Imputation\nImputation fills in the missing value with some number. The imputed value won't be exactly right in most cases, but it usually gives more accurate models than dropping the column entirely.\n\nThis is done with\n```\nfrom sklearn.impute import SimpleImputer\nmy_imputer = SimpleImputer()\ndata_with_imputed_values = my_imputer.fit_transform(original_data)\n```\nThe default behavior fills in the mean value for imputation. Statisticians have researched more complex strategies, but those complex strategies typically give no benefit once you plug the results into sophisticated machine learning models.\n\nOne (of many) nice things about Imputation is that it can be included in a scikit-learn Pipeline. Pipelines simplify model building, model validation and model deployment.\n\n## 3) An Extension To Imputation\nImputation is the standard approach, and it usually works well. However, imputed values may by systematically above or below their actual values (which weren't collected in the dataset). Or rows with missing values may be unique in some other way. In that case, your model would make better predictions by considering which values were originally missing. 
Here's how it might look:\n```\n# make copy to avoid changing original data (when Imputing)\nnew_data = original_data.copy()\n\n# make new columns indicating what will be imputed\ncols_with_missing = (col for col in new_data.columns \n if new_data[col].isnull().any())\nfor col in cols_with_missing:\n new_data[col + '_was_missing'] = new_data[col].isnull()\n\n# Imputation\nmy_imputer = SimpleImputer()\nnew_data = pd.DataFrame(my_imputer.fit_transform(new_data))\nnew_data.columns = original_data.columns\n```\n\nIn some cases this approach will meaningfully improve results. In other cases, it doesn't help at all.\n\n---\n# Example (Comparing All Solutions)\n\nWe will see am example predicting housing prices from the Melbourne Housing data. To master missing value handling, fork this notebook and repeat the same steps with the Iowa Housing data. Find information about both in the **Data** section of the header menu.\n\n\n### Basic Problem Set-up",
"_____no_output_____"
]
],
[
[
"import pandas as pd\n\n# Load data\nmelb_data = pd.read_csv('../data/train.csv')\n\nprint(melb_data.columns)\n\nfrom sklearn.ensemble import RandomForestRegressor\nfrom sklearn.metrics import mean_absolute_error\nfrom sklearn.model_selection import train_test_split\n\nmelb_target = melb_data.SalePrice\nmelb_predictors = melb_data.drop(['SalePrice'], axis=1)\n\n# For the sake of keeping the example simple, we'll use only numeric predictors. \nmelb_numeric_predictors = melb_predictors.select_dtypes(exclude=['object'])\n",
"Index(['Id', 'MSSubClass', 'MSZoning', 'LotFrontage', 'LotArea', 'Street',\n 'Alley', 'LotShape', 'LandContour', 'Utilities', 'LotConfig',\n 'LandSlope', 'Neighborhood', 'Condition1', 'Condition2', 'BldgType',\n 'HouseStyle', 'OverallQual', 'OverallCond', 'YearBuilt', 'YearRemodAdd',\n 'RoofStyle', 'RoofMatl', 'Exterior1st', 'Exterior2nd', 'MasVnrType',\n 'MasVnrArea', 'ExterQual', 'ExterCond', 'Foundation', 'BsmtQual',\n 'BsmtCond', 'BsmtExposure', 'BsmtFinType1', 'BsmtFinSF1',\n 'BsmtFinType2', 'BsmtFinSF2', 'BsmtUnfSF', 'TotalBsmtSF', 'Heating',\n 'HeatingQC', 'CentralAir', 'Electrical', '1stFlrSF', '2ndFlrSF',\n 'LowQualFinSF', 'GrLivArea', 'BsmtFullBath', 'BsmtHalfBath', 'FullBath',\n 'HalfBath', 'BedroomAbvGr', 'KitchenAbvGr', 'KitchenQual',\n 'TotRmsAbvGrd', 'Functional', 'Fireplaces', 'FireplaceQu', 'GarageType',\n 'GarageYrBlt', 'GarageFinish', 'GarageCars', 'GarageArea', 'GarageQual',\n 'GarageCond', 'PavedDrive', 'WoodDeckSF', 'OpenPorchSF',\n 'EnclosedPorch', '3SsnPorch', 'ScreenPorch', 'PoolArea', 'PoolQC',\n 'Fence', 'MiscFeature', 'MiscVal', 'MoSold', 'YrSold', 'SaleType',\n 'SaleCondition', 'SalePrice'],\n dtype='object')\n"
]
],
[
[
"### Create Function to Measure Quality of An Approach\nWe divide our data into **training** and **test**. If the reason for this is unfamiliar, review [Welcome to Data Science](https://www.kaggle.com/dansbecker/welcome-to-data-science-1).\n\nWe've loaded a function `score_dataset(X_train, X_test, y_train, y_test)` to compare the quality of diffrent approaches to missing values. This function reports the out-of-sample MAE score from a RandomForest.",
"_____no_output_____"
]
],
[
[
"from sklearn.ensemble import RandomForestRegressor\nfrom sklearn.metrics import mean_absolute_error\nfrom sklearn.model_selection import train_test_split\n\nX_train, X_test, y_train, y_test = train_test_split(melb_numeric_predictors, \n melb_target,\n train_size=0.7, \n test_size=0.3, \n random_state=0)\n\ndef score_dataset(X_train, X_test, y_train, y_test):\n model = RandomForestRegressor()\n model.fit(X_train, y_train)\n preds = model.predict(X_test)\n return mean_absolute_error(y_test, preds)",
"_____no_output_____"
]
],
[
[
"### Get Model Score from Dropping Columns with Missing Values",
"_____no_output_____"
]
],
[
[
"cols_with_missing = [col for col in X_train.columns \n if X_train[col].isnull().any()]\nreduced_X_train = X_train.drop(cols_with_missing, axis=1)\nreduced_X_test = X_test.drop(cols_with_missing, axis=1)\nprint(\"Mean Absolute Error from dropping columns with Missing Values:\")\nprint(score_dataset(reduced_X_train, reduced_X_test, y_train, y_test))",
"Mean Absolute Error from dropping columns with Missing Values:\n19434.55662100457\n"
]
],
[
[
"### Get Model Score from Imputation",
"_____no_output_____"
]
],
[
[
"from sklearn.impute import SimpleImputer\n\nmy_imputer = SimpleImputer()\nimputed_X_train = my_imputer.fit_transform(X_train)\nimputed_X_test = my_imputer.transform(X_test)\nprint(\"Mean Absolute Error from Imputation:\")\nprint(score_dataset(imputed_X_train, imputed_X_test, y_train, y_test))",
"_____no_output_____"
]
],
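As an aside, `SimpleImputer` is not limited to the column mean. A small sketch of two common variations: median imputation, and letting scikit-learn append the "was missing" indicator columns itself via `add_indicator` (available in scikit-learn 0.21+), which anticipates the extension scored below:

```python
import numpy as np
from sklearn.impute import SimpleImputer

X_demo = np.array([[1.0, 2.0], [np.nan, 3.0], [7.0, np.nan]])

# Fill with the column median instead of the mean
median_imputer = SimpleImputer(strategy="median")
print(median_imputer.fit_transform(X_demo))

# Append boolean indicator columns marking which entries were imputed
indicator_imputer = SimpleImputer(add_indicator=True)
print(indicator_imputer.fit_transform(X_demo))
```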
[
[
"### Get Score from Imputation with Extra Columns Showing What Was Imputed",
"_____no_output_____"
]
],
[
[
"imputed_X_train_plus = X_train.copy()\nimputed_X_test_plus = X_test.copy()\n\ncols_with_missing = (col for col in X_train.columns \n if X_train[col].isnull().any())\nfor col in cols_with_missing:\n imputed_X_train_plus[col + '_was_missing'] = imputed_X_train_plus[col].isnull()\n imputed_X_test_plus[col + '_was_missing'] = imputed_X_test_plus[col].isnull()\n\n# Imputation\nmy_imputer = SimpleImputer()\nimputed_X_train_plus = my_imputer.fit_transform(imputed_X_train_plus)\nimputed_X_test_plus = my_imputer.transform(imputed_X_test_plus)\n\nprint(\"Mean Absolute Error from Imputation while Track What Was Imputed:\")\nprint(score_dataset(imputed_X_train_plus, imputed_X_test_plus, y_train, y_test))",
"_____no_output_____"
]
],
[
[
"# Conclusion\nAs is common, imputing missing values allowed us to improve our model compared to dropping those columns. We got an additional boost by tracking what values had been imputed.",
"_____no_output_____"
],
[
"# Your Turn\n1) Find some columns with missing values in your dataset.\n\n2) Use the Imputer class so you can impute missing values\n\n3) Add columns with missing values to your predictors. \n\nIf you find the right columns, you may see an improvement in model scores. That said, the Iowa data doesn't have a lot of columns with missing values. So, whether you see an improvement at this point depends on some other details of your model.\n\nOnce you've added the Imputer, keep using those columns for future steps. In the end, it will improve your model (and in most other datasets, it is a big improvement). \n\n# Keep Going\nOnce you've added the Imputer and included columns with missing values, you are ready to [add categorical variables](https://www.kaggle.com/dansbecker/using-categorical-data-with-one-hot-encoding), which is non-numeric data representing categories (like the name of the neighborhood a house is in).\n\n---\n\nPart of the **[Learn Machine Learning](https://www.kaggle.com/learn/machine-learning)** track.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
cbf9780bb361d6cc4a7dcc296db0773befc26ae7
| 17,152 |
ipynb
|
Jupyter Notebook
|
colab/04_built_in.ipynb
|
mfernandes61/python-intro-gapminder
|
894579dc093b03cdc211e095a5aaf7e401b525bf
|
[
"CC-BY-4.0"
] | null | null | null |
colab/04_built_in.ipynb
|
mfernandes61/python-intro-gapminder
|
894579dc093b03cdc211e095a5aaf7e401b525bf
|
[
"CC-BY-4.0"
] | null | null | null |
colab/04_built_in.ipynb
|
mfernandes61/python-intro-gapminder
|
894579dc093b03cdc211e095a5aaf7e401b525bf
|
[
"CC-BY-4.0"
] | null | null | null | 38.115556 | 251 | 0.473356 |
[
[
[
"<a href=\"https://colab.research.google.com/github/mfernandes61/python-intro-gapminder/blob/binder/colab/04_built_in.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"",
"_____no_output_____"
]
],
[
[
"---\ntitle: \"Built-in Functions and Help\"\nteaching: 15\nexercises: 10\nquestions:\n- \"How can I use built-in functions?\"\n- \"How can I find out what they do?\"\n- \"What kind of errors can occur in programs?\"\nobjectives:\n- \"Explain the purpose of functions.\"\n- \"Correctly call built-in Python functions.\"\n- \"Correctly nest calls to built-in functions.\"\n- \"Use help to display documentation for built-in functions.\"\n- \"Correctly describe situations in which SyntaxError and NameError occur.\"\nkeypoints:\n- \"Use comments to add documentation to programs.\"\n- \"A function may take zero or more arguments.\"\n- \"Commonly-used built-in functions include `max`, `min`, and `round`.\"\n- \"Functions may only work for certain (combinations of) arguments.\"\n- \"Functions may have default values for some arguments.\"\n- \"Use the built-in function `help` to get help for a function.\"\n- \"The Jupyter Notebook has two ways to get help.\"\n- \"Every function returns something.\"\n- \"Python reports a syntax error when it can't understand the source of a program.\"\n- \"Python reports a runtime error when something goes wrong while a program is executing.\"\n- \"Fix syntax errors by reading the source code, and runtime errors by tracing the program's execution.\"\n---\n## Use comments to add documentation to programs.\n\n~~~\n# This sentence isn't executed by Python.\nadjustment = 0.5 # Neither is this - anything after '#' is ignored.\n~~~\n{: .language-python}\n\n## A function may take zero or more arguments.\n\n* We have seen some functions already --- now let's take a closer look.\n* An *argument* is a value passed into a function.\n* `len` takes exactly one.\n* `int`, `str`, and `float` create a new value from an existing one.\n* `print` takes zero or more.\n* `print` with no arguments prints a blank line.\n * Must always use parentheses, even if they're empty,\n so that Python knows a function is being called.\n\n~~~\nprint('before')\nprint()\nprint('after')\n~~~\n{: .language-python}\n~~~\nbefore\n\nafter\n~~~\n{: .output}\n\n## Every function returns something.\n\n* Every function call produces some result.\n* If the function doesn't have a useful result to return,\n it usually returns the special value `None`. 
`None` is a Python\n object that stands in anytime there is no value.\n\n~~~\nresult = print('example')\nprint('result of print is', result)\n~~~\n{: .language-python}\n~~~\nexample\nresult of print is None\n~~~\n{: .output}\n\n## Commonly-used built-in functions include `max`, `min`, and `round`.\n\n* Use `max` to find the largest value of one or more values.\n* Use `min` to find the smallest.\n* Both work on character strings as well as numbers.\n * \"Larger\" and \"smaller\" use (0-9, A-Z, a-z) to compare letters.\n\n~~~\nprint(max(1, 2, 3))\nprint(min('a', 'A', '0'))\n~~~\n{: .language-python}\n~~~\n3\n0\n~~~\n{: .output}\n\n## Functions may only work for certain (combinations of) arguments.\n\n* `max` and `min` must be given at least one argument.\n * \"Largest of the empty set\" is a meaningless question.\n* And they must be given things that can meaningfully be compared.\n\n~~~\nprint(max(1, 'a'))\n~~~\n{: .language-python}\n~~~\nTypeError Traceback (most recent call last)\n<ipython-input-52-3f049acf3762> in <module>\n----> 1 print(max(1, 'a'))\n\nTypeError: '>' not supported between instances of 'str' and 'int'\n~~~\n{: .error}\n\n## Functions may have default values for some arguments.\n\n* `round` will round off a floating-point number.\n* By default, rounds to zero decimal places.\n\n~~~\nround(3.712)\n~~~\n{: .language-python}\n~~~\n4\n~~~\n{: .output}\n\n* We can specify the number of decimal places we want.\n\n~~~\nround(3.712, 1)\n~~~\n{: .language-python}\n~~~\n3.7\n~~~\n{: .output}\n\n## Functions attached to objects are called methods\n\n* Functions take another form that will be common in the pandas episodes.\n* Methods have parentheses like functions, but come after the variable.\n* Some methods are used for internal Python operations, and are marked with double underlines.\n\n~~~\nmy_string = 'Hello world!' # creation of a string object \n\nprint(len(my_string)) # the len function takes a string as an argument and returns the length of the string\n\nprint(my_string.swapcase()) # calling the swapcase method on the my_string object\n\nprint(my_string.__len__()) # calling the internal __len__ method on the my_string object, used by len(my_string)\n\n~~~\n{: .language-python}\n\n~~~\n12\nhELLO WORLD!\n12\n~~~\n{: .output}\n\n* You might even see them chained together. They operate left to right.\n\n~~~\nprint(my_string.isupper()) # Not all the letters are uppercase\nprint(my_string.upper()) # This capitalizes all the letters\n\nprint(my_string.upper().isupper()) # Now all the letters are uppercase\n~~~\n{: .language-python}\n\n~~~\nFalse\nHELLO WORLD\nTrue\n~~~\n{: .output}\n\n## Use the built-in function `help` to get help for a function.\n\n* Every built-in function has online documentation.\n\n~~~\nhelp(round)\n~~~\n{: .language-python}\n~~~\nHelp on built-in function round in module builtins:\n\nround(number, ndigits=None)\n Round a number to a given precision in decimal digits.\n \n The return value is an integer if ndigits is omitted or None. Otherwise\n the return value has the same type as the number. ndigits may be negative.\n~~~\n{: .output}\n\n## The Jupyter Notebook has two ways to get help.\n\n* Option 1: Place the cursor near where the function is invoked in a cell\n (i.e., the function name or its parameters),\n * Hold down <kbd>Shift</kbd>, and press <kbd>Tab</kbd>.\n * Do this several times to expand the information returned.\n* Option 2: Type the function name in a cell with a question mark after it. 
Then run the cell.\n\n\n## Python reports a syntax error when it can't understand the source of a program.\n\n* Won't even try to run the program if it can't be parsed.\n\n~~~\n# Forgot to close the quote marks around the string.\nname = 'Feng\n~~~\n{: .language-python}\n~~~\n File \"<ipython-input-56-f42768451d55>\", line 2\n name = 'Feng\n ^\nSyntaxError: EOL while scanning string literal\n~~~\n{: .error}\n\n~~~\n# An extra '=' in the assignment.\nage = = 52\n~~~\n{: .language-python}\n~~~\n File \"<ipython-input-57-ccc3df3cf902>\", line 2\n age = = 52\n ^\nSyntaxError: invalid syntax\n~~~\n{: .error}\n\n* Look more closely at the error message:\n\n~~~\nprint(\"hello world\"\n~~~\n{: .language-python}\n~~~\n File \"<ipython-input-6-d1cc229bf815>\", line 1\n print (\"hello world\"\n ^\nSyntaxError: unexpected EOF while parsing\n~~~\n{: .error}\n\n* The message indicates a problem on first line of the input (\"line 1\").\n * In this case the \"ipython-input\" section of the file name tells us that\n we are working with input into IPython,\n the Python interpreter used by the Jupyter Notebook.\n* The `-6-` part of the filename indicates that\n the error occurred in cell 6 of our Notebook.\n* Next is the problematic line of code,\n indicating the problem with a `^` pointer.\n\n## <a name='runtime-error'></a> Python reports a runtime error when something goes wrong while a program is executing.\n\n~~~\nage = 53\nremaining = 100 - aege # mis-spelled 'age'\n~~~\n{: .language-python}\n~~~\nNameError Traceback (most recent call last)\n<ipython-input-59-1214fb6c55fc> in <module>\n 1 age = 53\n----> 2 remaining = 100 - aege # mis-spelled 'age'\n\nNameError: name 'aege' is not defined\n~~~\n{: .error}\n\n* Fix syntax errors by reading the source and runtime errors by tracing execution.\n\n> ## What Happens When\n>\n> 1. Explain in simple terms the order of operations in the following program:\n> when does the addition happen, when does the subtraction happen,\n> when is each function called, etc.\n> 2. What is the final value of `radiance`?\n>\n> ~~~\n> radiance = 1.0\n> radiance = max(2.1, 2.0 + min(radiance, 1.1 * radiance - 0.5))\n> ~~~\n> {: .language-python}\n> > ## Solution\n> > 1. Order of operations:\n> > 1. `1.1 * radiance = 1.1`\n> > 2. `1.1 - 0.5 = 0.6`\n> > 3. `min(radiance, 0.6) = 0.6`\n> > 4. `2.0 + 0.6 = 2.6`\n> > 5. `max(2.1, 2.6) = 2.6`\n> > 2. At the end, `radiance = 2.6`\n> {: .solution}\n{: .challenge}\n\n> ## Spot the Difference\n>\n> 1. Predict what each of the `print` statements in the program below will print.\n> 2. Does `max(len(rich), poor)` run or produce an error message?\n> If it runs, does its result make any sense?\n>\n> ~~~\n> easy_string = \"abc\"\n> print(max(easy_string))\n> rich = \"gold\"\n> poor = \"tin\"\n> print(max(rich, poor))\n> print(max(len(rich), len(poor)))\n> ~~~\n> {: .language-python}\n> > ## Solution\n> > ~~~\n> > print(max(easy_string))\n> > ~~~\n> > {: .language-python}\n> > ~~~\n> > c\n> > ~~~\n> > {: .output}\n> > ~~~\n> > print(max(rich, poor))\n> > ~~~\n> > {: .language-python}\n> > ~~~\n> > tin\n> > ~~~\n> > {: .output}\n> > ~~~\n> > print(max(len(rich), len(poor)))\n> > ~~~\n> > {: .language-python}\n> > ~~~\n> > 4\n> > ~~~\n> > {: .output}\n> > `max(len(rich), poor)` throws a TypeError. 
This turns into `max(4, 'tin')` and \n> > as we discussed earlier a string and integer cannot meaningfully be compared.\n> > ~~~\n> > TypeError Traceback (most recent call last)\n> > <ipython-input-65-bc82ad05177a> in <module>\n> > ----> 1 max(len(rich), poor)\n> > \n> > TypeError: '>' not supported between instances of 'str' and 'int'\n> > ~~~\n> > {: .error }\n> {: .solution}\n{: .challenge}\n\n> ## Why Not?\n>\n> Why is it that `max` and `min` do not return `None` when they are called with no arguments?\n>\n> > ## Solution\n> > `max` and `min` return TypeErrors in this case because the correct number of parameters\n> > was not supplied. If it just returned `None`, the error would be much harder to trace as it\n> > would likely be stored into a variable and used later in the program, only to likely throw\n> > a runtime error.\n> {: .solution}\n{: .challenge}\n\n> ## Last Character of a String\n>\n> If Python starts counting from zero,\n> and `len` returns the number of characters in a string,\n> what index expression will get the last character in the string `name`?\n> (Note: we will see a simpler way to do this in a later episode.)\n>\n> > ## Solution\n> >\n> > `name[len(name) - 1]`\n> {: .solution}\n{: .challenge}\n\n> ## Explore the Python docs!\n>\n> The [official Python documentation](https://docs.python.org/3/) is arguably the most complete\n> source of information about the language. It is available in different languages and contains a lot of useful\n> resources. The [Built-in Functions page](https://docs.python.org/3/library/functions.html) contains a catalogue of\n> all of these functions, including the ones that we've covered in this lesson. Some of these are more advanced and \n> unnecessary at the moment, but others are very simple and useful.\n> \n{: .callout}",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
cbf98ef83895a790cee8cff761b34af828413bcb
| 53,588 |
ipynb
|
Jupyter Notebook
|
notebooks/SortSeq_Automatic_TF_finding.ipynb
|
RPGroup-PBoC/RNAseq_SortSeq
|
e525c38af983fb0d3f97fb1187ad705cd5bf17ec
|
[
"Unlicense",
"MIT"
] | null | null | null |
notebooks/SortSeq_Automatic_TF_finding.ipynb
|
RPGroup-PBoC/RNAseq_SortSeq
|
e525c38af983fb0d3f97fb1187ad705cd5bf17ec
|
[
"Unlicense",
"MIT"
] | null | null | null |
notebooks/SortSeq_Automatic_TF_finding.ipynb
|
RPGroup-PBoC/RNAseq_SortSeq
|
e525c38af983fb0d3f97fb1187ad705cd5bf17ec
|
[
"Unlicense",
"MIT"
] | null | null | null | 46.598261 | 652 | 0.533664 |
[
[
[
"## Tutorial : Automatically determining TF binding site locations",
"_____no_output_____"
],
[
"The code in this tutorial is released under the [MIT License](https://opensource.org/licenses/MIT). All the content in this notebook is under a [CC-by 4.0 License](https://creativecommons.org/licenses/by/4.0/). \n\nCreated by Bill Ireland, Suzy Beleer and Manu Flores. ",
"_____no_output_____"
]
],
[
[
"#Import basic stuff\nimport matplotlib.pyplot as plt\nimport numpy as np\n\n#import the custom analysis software\nimport scipy as sp\nimport seaborn as sns\nimport viz\n\n# Set PBoC plotting style \nviz.pboc_style_mpl()\n\n# Activate a setting that causes all plots to be inside the notebook rather than in pop-ups.\n%matplotlib inline\n# Get svg graphics from the notebook\n%config InlineBackend.figure_format = 'svg' ",
"_____no_output_____"
]
],
[
[
"To determine locations of binding sites automatically, we must take the information footprints and expression shift plots (which we demonstrate how to generate in the information footprint tutorial). We determine which ones are truly part of binding sites. To do this we will first determine which base pairs have a signficant impact on gene expression. Then if there are 5 or more significant base pairs within a 15 base pair region, then we tentatively classify that area as a binding site. These areas need to be reviewed by hand.\n\nTo determine which base pairs have significant impacts on gene expression, we will use the MCMC sampling done when inferring the expression shift to determine the uncertainty in each measurement.\n\nFirst we will load in the MCMC samples",
"_____no_output_____"
]
],
[
[
"#We will declare the path where all the data for this notebook is stored.\npath = '../datasets/'",
"_____no_output_____"
],
[
"#we will look at the aphA gene in a low oxygen growth condition.\ngenelabel = 'aphA'\n#We load in each sample in the MCMC run. These are also stored in the datasets/ folder. We store each\n#MCMC run as a pickle file (.npy) or an sqlite database (.sql)\nMCMC_samples = np.load(path + 'aphAheat_database.npy')\n#remove burnin samples. At the start of any MCMC run there will be a 'burnin' period where the sampler\n#will not be in a region of high likelihood. In our case we will be safely past it after 60000 iterations.\n#We thin samples (only save 1 out of ever 60 samples). So we will throw out the first 1000 saved samples\n#to avoid the burnin period.\nMCMC_burnin = MCMC_samples[1000:,:]\nparameter_to_check = 0",
"_____no_output_____"
]
],
[
[
"We can then look at the distributions of the MCMC samples for a given parameter. We can then construct a confidence interval for the parameter.",
"_____no_output_____"
]
],
[
[
"fig,ax = plt.subplots(figsize=(10,3))\nplt.hist(MCMC_burnin[:,parameter_to_check])\nax.set_ylabel('Number of MCMC Counts')\nax.set_xlabel('Parameter Value (A.U.)')\nplt.show()",
"findfont: Font family ['sans-serif'] not found. Falling back to DejaVu Sans.\nfindfont: Font family ['sans-serif'] not found. Falling back to DejaVu Sans.\n"
]
],
[
[
"The confidence interval is then",
"_____no_output_____"
]
],
[
[
"#We take the mean of all MCMC samples to get the parameter value.\nmean_value = np.mean(MCMC_burnin[:,parameter_to_check],axis=0)\n\n#We then determine statistical significance, if the mean value is greater than zero, we will check\n#if five percent or more samples are less than zero.\nif mean_value > 0:\n #We generate the confidence interval by checking where the 1th percentile of all values are.\n CI = np.percentile(MCMC_burnin[:,parameter_to_check],[1,100])\nelse:\n #We generate the confidence interval by checking where the 99th percentile of all values are.\n CI = np.percentile(MCMC_burnin[:,parameter_to_check],[0,99])",
"_____no_output_____"
],
[
"#we can display the confidence interval now.\nprint(CI)",
"[0.00015512 0.11985067]\n"
]
],
[
[
"We see that 0 is within the conficent interval, so it does not have a significant effect on expression. We then determine similar information for each base pair.",
"_____no_output_____"
]
],
[
[
"#initialize an array to store whether or not a given base pair is significant. If it is we will store 'True'.\n#Otherwise we will store 'False'\nall_significance = np.zeros((160))\n\n#loop through the 160 base pair region.\nfor i in range(160):\n \n #determine confidence interval as in the above panel.\n mean_value = np.mean(MCMC_burnin[:,i],axis=0)\n if mean_value > 0:\n CI = np.percentile(MCMC_burnin[:,i],[5,100])\n else:\n CI = np.percentile(MCMC_burnin[:,i],[0,95])\n #we now check if 0 in the confidence interval. If it is not, we label the significance of\n #the base pair location as 'True'.\n if 0 > CI[0] and 0 < CI[1]:\n all_significance[i] = False\n else:\n all_significance[i] = True",
"_____no_output_____"
]
],
[
[
"We will plot the results with significant base pair in red.",
"_____no_output_____"
]
],
[
[
"fig,ax = plt.subplots(figsize=(10,1))\nplt.imshow(all_significance[np.newaxis,:],aspect='auto',cmap='coolwarm')\nplt.yticks([])\nax.set_xlabel('Base Pair')\nax.set_xticks(np.arange(0,160,10))\nax.set_xticklabels(np.arange(-115,45,10))\nplt.show()",
"_____no_output_____"
]
],
[
[
"We then check if there are 5 or more significant base pairs in a 15 base pair region. If so, we will declare it part of a binding site.",
"_____no_output_____"
]
],
[
[
"# we are looking at 15 base pair windows so we only need 145 entries.\nTF_locations = np.zeros(145)\n\nfor i in range(145):\n #we get the total number of significant base pairs and see if that is 5 or more.\n if all_significance[i:i+15].sum() > 4:\n TF_locations[i] = True\n else:\n TF_locations[i] = False",
"_____no_output_____"
]
],
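An equivalent, loop-free way to get these windowed counts is to convolve the significance vector with a length-15 box filter. Note that `mode='valid'` yields 160 - 15 + 1 = 146 windows, one more than the loop above (which stops one window short of the end); a sketch:

```python
window = 15
# Sum of significant base pairs within every full 15-bp window
window_counts = np.convolve(all_significance, np.ones(window), mode='valid')
TF_locations_alt = window_counts > 4  # True where 5 or more base pairs are significant
print(TF_locations_alt.shape)  # (146,)
```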
[
[
"Now we can plot the final results.",
"_____no_output_____"
]
],
[
[
"fig,ax = plt.subplots(figsize=(10,1))\nplt.imshow(TF_locations[np.newaxis,:],aspect='auto',cmap='coolwarm')\nplt.yticks([])\nax.set_xlabel('Base Pair')\nax.set_xticks(np.arange(0,145,10))\nax.set_xticklabels(np.arange(-108,38,10))\nplt.show()",
"_____no_output_____"
]
],
[
[
"We see that there are multiple locations identified by this method. We can see regions from -82 to -70, -53 to -47, -41 to -37, -33 to to 1, 3 to 8, and 25 to 34. The region from -82 to -70 corresponds to a confirmed DeoR binding site and the -53 to -47 binding region corresponds to a part of a known FNR binding site. All regulatory regions from -41 to 1 correspond to an RNAP binding site. \n\nHowever, the downstream regions are unlikely to correspond to true TF binding sites. This automated method includes the discovered binding sites of the *aphA* gene but also includes some secondary RNAP binding sites and some likely false positives, such as the region downstream of the TSS (2 to 10 bp). The results show that this method is useful but also we need to review all results.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbf998f65bc0a4c3d4af299b0028b4f29087f6af
| 27,852 |
ipynb
|
Jupyter Notebook
|
visualize.ipynb
|
ka40/AE4350-assignment
|
06e701cfffded6f7e9c34a64616df17c0cbed241
|
[
"MIT"
] | null | null | null |
visualize.ipynb
|
ka40/AE4350-assignment
|
06e701cfffded6f7e9c34a64616df17c0cbed241
|
[
"MIT"
] | null | null | null |
visualize.ipynb
|
ka40/AE4350-assignment
|
06e701cfffded6f7e9c34a64616df17c0cbed241
|
[
"MIT"
] | null | null | null | 242.191304 | 24,554 | 0.917852 |
[
[
[
"from baselines.common import plot_util as pu",
"_____no_output_____"
]
],
[
[
"If you want to average results for multiple seeds, LOG_DIRS must contain subfolders in the following format: ```<name_exp0>-0```, ```<name_exp0>-1```, ```<name_exp1>-0```, ```<name_exp1>-1```. Where names correspond to experiments you want to compare separated with random seeds by dash.",
"_____no_output_____"
]
],
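For example, the directory names below would be averaged into two curves, one per experiment name; roughly, `average_group` strips the seed off after the final dash, as in this sketch (the experiment names here are made up):

```python
# Subfolders of LOG_DIRS, e.g. logs/reacher/reacher-ppo-0, logs/reacher/reacher-ppo-1, ...
runs = ['reacher-ppo-0', 'reacher-ppo-1', 'reacher-trpo-0', 'reacher-trpo-1']

groups = {}
for run in runs:
    name, _, seed = run.rpartition('-')  # split off the trailing seed
    groups.setdefault(name, []).append(seed)

print(groups)  # {'reacher-ppo': ['0', '1'], 'reacher-trpo': ['0', '1']}
```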
[
[
"LOG_DIRS = 'logs/reacher/'\n# Uncomment below to see the effect of the timit limits flag\n# LOG_DIRS = 'time_limit_logs/reacher'",
"_____no_output_____"
],
[
"results = pu.load_results(LOG_DIRS)",
"/home/kostrikov/GitHub/baselines/baselines/bench/monitor.py:163: UserWarning: Pandas doesn't allow columns to be created via a new attribute name - see https://pandas.pydata.org/pandas-docs/stable/indexing.html#attribute-access\n df.headers = headers # HACK to preserve backwards compatibility\n"
],
[
"fig = pu.plot_results(results, average_group=True, split_fn=lambda _: '', shaded_std=False)",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbf99e7efe8c40a9b1386e3e17c9be4901d510bf
| 8,504 |
ipynb
|
Jupyter Notebook
|
day01.ipynb
|
jastermax/mypython
|
52f2ca7998926346030c64c73be9d00d5e49ee5f
|
[
"Apache-2.0"
] | null | null | null |
day01.ipynb
|
jastermax/mypython
|
52f2ca7998926346030c64c73be9d00d5e49ee5f
|
[
"Apache-2.0"
] | null | null | null |
day01.ipynb
|
jastermax/mypython
|
52f2ca7998926346030c64c73be9d00d5e49ee5f
|
[
"Apache-2.0"
] | null | null | null | 17.076305 | 79 | 0.425329 |
[
[
[
"print(\"Hello World\")",
"Hello World\n"
],
[
"radius=100\narea=radius*radius*3.14\nprint(area)",
"31400.0\n"
],
[
"#ๅจjupyterไธญ็จshift+Tab่ทณๅบ่งฃ้ๆๆกฃ\nx=eval(input(\"่ฏท่พๅ
ฅไธไธชๆฐๅญ๏ผ\"))\nx=x*x*x\nprint(x)\n",
"่ฏท่พๅ
ฅไธไธชๆฐๅญ๏ผ10\n1000\n"
],
[
"x=input(\"่ฏท่พๅ
ฅไธไธชๆฐๅญ๏ผ\")\nx=x*3\nprint(x)\n",
"่ฏท่พๅ
ฅไธไธชๆฐๅญ๏ผ10\n101010\n"
],
[
"a=eval(input(\"่ฏท่พๅ
ฅไธไธชๆฐๅญ\"))\n#print(type(a))\na=a*(1-0.1)\nprint(a)",
"่ฏท่พๅ
ฅไธไธชๆฐๅญ100\n90.0\n"
],
[
"Joker,Misst,hahah,lalal='lalal',120,120.2,True\nprint(Joker,Misst,hahah,lalal)",
"lalal 120 120.2 True\n"
],
[
"number1=100\nnumber2=500\nprint(number1+number2)",
"600\n"
],
[
"number1=100.0\nnumber2=500.0\nprint(number1+number2)",
"600.0\n"
],
[
"number1=100.0\nnumber2=520.0\nprint(number2//number1)",
"5.0\n"
],
[
"number1=2\nnumber2=5\nprint(number2**number1)",
"25\n"
],
[
"print(25//4)",
"6\n"
],
[
"res=eval(input())\nif res%2==0:\n print(\"ๅถๆฐ๏ผ\")\nelse:\n print(\"ๅฅๆฐ๏ผ\")",
"13\nๅฅๆฐ๏ผ\n"
],
[
"res=eval(input(\"่ฏท่พๅ
ฅไธไธชๆดๆฐๆๅฐๆฐ๏ผ\"))\nif type(res)==int:\n if res%2==0:\n print(\"ๅถๆฐ๏ผ\")\n else:\n print(\"ๅฅๆฐ๏ผ\")\nelse:\n if type(res)==float:\n if res%2==0.0:\n print(\"ๅถๆฐ๏ผ\")\n else:\n print(\"ๅฅๆฐ๏ผ\")",
"4.0\nๅถๆฐ๏ผ\n"
],
[
"sen=eval(input(\"่ฏท่พๅ
ฅไธไธช็งๆฐ\"))\nfen=sen//60\nmiao=sen%60\nprint(\"{}็ง็ญไบ{}ๅ{}็ง\".format(sen,fen,miao))\nprint(\"%d็ง็ญไบ%dๅ%d็ง\"%(sen,fen,miao))",
"่ฏท่พๅ
ฅไธไธช็งๆฐ500\n500็ง็ญไบ8ๅ20็ง\n500็ง็ญไบ8ๅ20็ง\n"
],
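As a small aside (not part of the original lesson): `divmod` returns the quotient and remainder in a single call, a tidy alternative to the `//` and `%` pair used above:

```python
sen = 500
fen, miao = divmod(sen, 60)  # (8, 20)
print("{} seconds equals {} minutes {} seconds".format(sen, fen, miao))
```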
[
"i=18\ni**=2\nprint(i)",
"324\n"
],
[
"time1=eval(input(\"่ฏท่พๅ
ฅไปๅคฉๆๆๅ ๏ผ\"))\ntime2=eval(input(\"่ฟๅ ๅคฉ๏ผ\"))\ntime1=time1+time2\nls=[\"ๆๆๆฅ\",\"ๆๆไธ\",\"ๆๆไบ\",\"ๆๆไธ\",\"ๆๆๅ\",\"ๆๆไบ\",\"ๆๆๅ
ญ\"]\ns=time1%7\nprint(\"่ฟๅ ๅคฉๆฏ๏ผ\",ls[s])",
"่ฏท่พๅ
ฅไปๅคฉๆๆๅ ๏ผ6\n่ฟๅ ๅคฉ๏ผ10\n่ฟๅ ๅคฉๆฏ๏ผ ๆๆไบ\n"
],
[
"round(27/4,0)",
"_____no_output_____"
],
[
"a=eval(input(\"่ฏท่พๅ
ฅไธไธชไธไฝๆดๆฐ\"))\nb=a//100\nc=a//10%10\nd=a%10\nif(b**3+c**3+d**3==a):\n print(\"%dๆฏๆฐดไป่ฑๆฐ\"%(a))\nelse:\n print(\"%dไธๆฏๆฐดไป่ฑๆฐ\"%(a))",
"่ฏท่พๅ
ฅไธไธชไธไฝๆดๆฐ153\n153ๆฏๆฐดไป่ฑๆฐ\n"
],
[
"for a in range(100,999):\n b=a//100\n c=a//10%10\n d=a%10\n if(b**3+c**3+d**3==a):\n print(\"%dๆฏๆฐดไป่ฑๆฐ\"%(a))\n ",
"153ๆฏๆฐดไป่ฑๆฐ\n370ๆฏๆฐดไป่ฑๆฐ\n371ๆฏๆฐดไป่ฑๆฐ\n407ๆฏๆฐดไป่ฑๆฐ\n"
],
[
"s=0.06e-2\na=197.55e+2\ns1=a*s\nprint(s1)",
"11.853\n"
],
[
"daikuanshu=eval(input(\"่ฏท่พๅ
ฅ่ดทๆฌพๆฐ๏ผ\"))\nyuelilu=eval(input(\"ๆๅฉ็\"))\nnianxian=eval(input(\"ๅนด้\"))\nmonthlyPayment=(daikuanshu*yuelilu/(1-(1/(1+yuelilu)**(nianxian*12))))\ntotalpayment=monthlyPayment*nianxian*12\nprint(totalpayment)",
"่ฏท่พๅ
ฅ่ดทๆฌพๆฐ๏ผ100\nๆๅฉ็100\nๅนด้100\n12000000.0\n"
],
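The cell above implements the standard annuity (amortized loan) payment formula; writing the principal as $P$ (`daikuanshu`), the monthly rate as $r$ (`yuelilu`) and the term as $n$ years (`nianxian`):

```latex
\text{monthlyPayment} = \frac{P \, r}{1 - (1 + r)^{-12n}},
\qquad
\text{totalPayment} = 12n \cdot \text{monthlyPayment}
```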
[
"import time\nprint(time.time())\nsen=int(time.time())\nshi=sen//3600\nfen=sen//60\n#miao=sen%60\nprint(\"%dๆถ%dๅ%d็ง\"%(shi%24,fen%(60),miao))\n",
"1531729248.5890002\n8ๆถ20ๅ54็ง\n"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf9b6ea4c48dbcf756907eb710e56cc4740ab34
| 64,485 |
ipynb
|
Jupyter Notebook
|
notebooks/v0.1 - Description.ipynb
|
PedrosaFelipe/airbnb_predict
|
3c9259177d9281b77658901038f8df9421e5aff5
|
[
"MIT"
] | null | null | null |
notebooks/v0.1 - Description.ipynb
|
PedrosaFelipe/airbnb_predict
|
3c9259177d9281b77658901038f8df9421e5aff5
|
[
"MIT"
] | null | null | null |
notebooks/v0.1 - Description.ipynb
|
PedrosaFelipe/airbnb_predict
|
3c9259177d9281b77658901038f8df9421e5aff5
|
[
"MIT"
] | null | null | null | 63.344794 | 32,988 | 0.722509 |
[
[
[
"## 0.0. Objetivo do Problema:\n",
"_____no_output_____"
],
[
"-- 1.0. Previsao do primeiro destino que um novo usuรกrio irรก escolher.\n\n-- Porque?\n -- Qual tipo de modelo de negรณcio do Airbnb? \n \n \n - Marketplace ( Conectar pessoas que oferecem acomodacao, com pessoas que estao procurando acomodacao)\n - Oferta ( pessoas oferecendo acomodacao )\n - Tamanho do portfรณlio.\n - Diversidade/Densidade de Portfรณlio.\n - Preco Medio\n\n - Demanda ( pessoas procurando acomodacao )\n - Numero de Usuรกrios\n - LTV ( Lifetime Value )\n - CAC ( Client Acquisition Cost )\n\n\n Gross Revenue = ( Fee * Numero cliente ) - CAC ",
"_____no_output_____"
],
[
"## 0.1. Proposta de soluรงรฃo:\n",
"_____no_output_____"
],
[
"--- Modelo de Predizao do primeiro destino de um novo usario.\n\n- 1.0. Predicoes e salva em tabela do banco de dados. \n- 2.0. API \n --- Input: usuario e suas caracteristicas\n --- Output: usuario e suas caracteristicas com a **predicao do destino**\n\n--- 16 ciclos",
"_____no_output_____"
],
[
"# <font color ='red'> 1.0. Imports </font> ",
"_____no_output_____"
]
],
[
[
"import pandas as pd\n\nfrom sklearn import model_selection as ms\nfrom sklearn import preprocessing as pp\nfrom sklearn import metrics as m\n\nfrom scikitplot import metrics as mt\n\nfrom keras import models as ml\nfrom keras import layers as l\n\nimport warnings\n\nwarnings.filterwarnings(\"ignore\")\n\n",
"2022-03-02 15:40:10.777427: W tensorflow/stream_executor/platform/default/dso_loader.cc:64] Could not load dynamic library 'libcudart.so.11.0'; dlerror: libcudart.so.11.0: cannot open shared object file: No such file or directory\n2022-03-02 15:40:10.777487: I tensorflow/stream_executor/cuda/cudart_stub.cc:29] Ignore above cudart dlerror if you do not have a GPU set up on your machine.\n"
]
],
[
[
"## 1.1. Helper Function",
"_____no_output_____"
],
[
"## 1.2. Loading Data",
"_____no_output_____"
]
],
[
[
"df_raw = pd.read_csv('~/repositorio/airbnb_predict/data/raw/train_users_2.csv', low_memory=True)\ndf_sessions = pd.read_csv('~/repositorio/airbnb_predict/data/raw/sessions.csv', low_memory=True)",
"_____no_output_____"
]
],
[
[
"# 2.0. Data Description\n",
"_____no_output_____"
]
],
[
[
"df2 = df_raw.copy()",
"_____no_output_____"
],
[
"print('Number of rows: {}'.format(df2.shape[0]))\nprint('Number of columns: {}'.format(df2.shape[1]))",
"Number of rows: 213451\nNumber of columns: 16\n"
]
],
[
[
"## 2.1. Data Type\n",
"_____no_output_____"
]
],
[
[
"df2.dtypes",
"_____no_output_____"
]
],
[
[
"## 2.2. NA Check\n",
"_____no_output_____"
]
],
[
[
"df2.isna().sum()",
"_____no_output_____"
],
[
"# remove missing value completly\n\ndf2 = df2.dropna()",
"_____no_output_____"
]
],
[
[
"## 2.3. Change Data Type\n",
"_____no_output_____"
]
],
[
[
"# 'date_account_created'\ndf2['date_account_created'] = pd.to_datetime(df2['date_account_created'])\n\n# 'timestamp_first_active'\ndf2['timestamp_first_active'] = pd.to_datetime(df2['timestamp_first_active'], format = '%Y%m%d%H%M%S')\n\n# 'date_first_booking'\ndf2['date_first_booking'] = pd.to_datetime(df2['date_first_booking'])\n\n# 'age'\ndf2['age'] = df2['age'].astype('int64')\n",
"_____no_output_____"
]
],
[
[
"## 2.4. Check Balanced Data\n",
"_____no_output_____"
]
],
[
[
"df2['country_destination'].value_counts(normalize=True)",
"_____no_output_____"
]
],
[
[
"# 3.0. Data Filtering\n",
"_____no_output_____"
]
],
[
[
"df3 = df2.copy()",
"_____no_output_____"
]
],
[
[
"## 3.1. Filtering Rows\n",
"_____no_output_____"
],
[
"## 3.2. Columns Selection",
"_____no_output_____"
],
[
"# 4.0. Data Preparation\n",
"_____no_output_____"
]
],
[
[
"df4 = df3.copy()",
"_____no_output_____"
],
[
"# dummy variable\ndf4_dummy = pd.get_dummies(df4.drop(['id','country_destination'], axis =1))\n\n# join id and country destination\ndf4 = pd.concat([df4[['id','country_destination']],df4_dummy], axis =1)",
"_____no_output_____"
]
],
[
[
"# 5.0. Feature Selection\n",
"_____no_output_____"
]
],
[
[
"df5 = df4.copy()",
"_____no_output_____"
],
[
"cols_drop = ['date_account_created','timestamp_first_active','date_first_booking'] # original dates",
"_____no_output_____"
],
[
"df5 = df5.drop(cols_drop, axis =1)",
"_____no_output_____"
],
[
"X = df5.drop(['id','country_destination'], axis = 1)\nY = df5['country_destination'].copy()",
"_____no_output_____"
]
],
[
[
"# 6.0. Machine Learning Model - Neural Network MLP\n",
"_____no_output_____"
]
],
[
[
"# Split dataset into training and test\n\nX_train, X_test , y_train, y_test = ms.train_test_split(X, Y, test_size = 0.2 , random_state=32)",
"_____no_output_____"
],
[
"ohe = pp.OneHotEncoder()\ny_train_nn = ohe.fit_transform(y_train.values.reshape(-1,1)).toarray()",
"_____no_output_____"
],
[
"# model definition\nmodel = ml.Sequential()\nmodel.add(l.Dense(128, input_dim = X_train.shape[1], activation= 'relu'))\nmodel.add(l.Dense(11, activation= 'softmax'))\n\n# model compile\nmodel.compile(loss = 'categorical_crossentropy' , optimizer='adam', metrics=['accuracy'])\n\n# tain model\nmodel.fit(X_train, y_train_nn, epochs=100)",
"Epoch 1/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.3830 - accuracy: 0.7007\nEpoch 2/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.3010 - accuracy: 0.7068\nEpoch 3/100\n1705/1705 [==============================] - 2s 965us/step - loss: 1.2819 - accuracy: 0.7080\nEpoch 4/100\n1705/1705 [==============================] - 2s 974us/step - loss: 1.2399 - accuracy: 0.7081\nEpoch 5/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.2175 - accuracy: 0.7080\nEpoch 6/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1932 - accuracy: 0.7085\nEpoch 7/100\n1705/1705 [==============================] - 2s 963us/step - loss: 1.1721 - accuracy: 0.7084\nEpoch 8/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1647 - accuracy: 0.7088\nEpoch 9/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1550 - accuracy: 0.7087\nEpoch 10/100\n1705/1705 [==============================] - 2s 982us/step - loss: 1.1451 - accuracy: 0.7087\nEpoch 11/100\n1705/1705 [==============================] - 2s 969us/step - loss: 1.1394 - accuracy: 0.7089\nEpoch 12/100\n1705/1705 [==============================] - 2s 959us/step - loss: 1.1358 - accuracy: 0.7091\nEpoch 13/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1324 - accuracy: 0.7091\nEpoch 14/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.1304 - accuracy: 0.7095\nEpoch 15/100\n1705/1705 [==============================] - 2s 959us/step - loss: 1.1288 - accuracy: 0.7095\nEpoch 16/100\n1705/1705 [==============================] - 2s 965us/step - loss: 1.1285 - accuracy: 0.7095\nEpoch 17/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.1276 - accuracy: 0.7096\nEpoch 18/100\n1705/1705 [==============================] - 2s 974us/step - loss: 1.1266 - accuracy: 0.7097\nEpoch 19/100\n1705/1705 [==============================] - 2s 966us/step - loss: 1.1255 - accuracy: 0.7097\nEpoch 20/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1250 - accuracy: 0.7098\nEpoch 21/100\n1705/1705 [==============================] - 2s 961us/step - loss: 1.1234 - accuracy: 0.7098\nEpoch 22/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1233 - accuracy: 0.7098\nEpoch 23/100\n1705/1705 [==============================] - 2s 965us/step - loss: 1.1229 - accuracy: 0.7098\nEpoch 24/100\n1705/1705 [==============================] - 2s 998us/step - loss: 1.1223 - accuracy: 0.7097\nEpoch 25/100\n1705/1705 [==============================] - 2s 957us/step - loss: 1.1210 - accuracy: 0.7097\nEpoch 26/100\n1705/1705 [==============================] - 2s 959us/step - loss: 1.1204 - accuracy: 0.7098\nEpoch 27/100\n1705/1705 [==============================] - 2s 962us/step - loss: 1.1197 - accuracy: 0.7098\nEpoch 28/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.1201 - accuracy: 0.7097\nEpoch 29/100\n1705/1705 [==============================] - 2s 956us/step - loss: 1.1182 - accuracy: 0.7100\nEpoch 30/100\n1705/1705 [==============================] - 2s 958us/step - loss: 1.1171 - accuracy: 0.7100\nEpoch 31/100\n1705/1705 [==============================] - 2s 956us/step - loss: 1.1169 - accuracy: 0.7100\nEpoch 32/100\n1705/1705 [==============================] - 2s 953us/step - loss: 1.1153 - accuracy: 0.7101\nEpoch 33/100\n1705/1705 [==============================] - 2s 956us/step - loss: 1.1156 - accuracy: 0.7102\nEpoch 
34/100\n1705/1705 [==============================] - 2s 961us/step - loss: 1.1148 - accuracy: 0.7102\nEpoch 35/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1139 - accuracy: 0.7103\nEpoch 36/100\n1705/1705 [==============================] - 2s 963us/step - loss: 1.1136 - accuracy: 0.7102\nEpoch 37/100\n1705/1705 [==============================] - 2s 957us/step - loss: 1.1126 - accuracy: 0.7105\nEpoch 38/100\n1705/1705 [==============================] - 2s 957us/step - loss: 1.1116 - accuracy: 0.7106\nEpoch 39/100\n1705/1705 [==============================] - 2s 959us/step - loss: 1.1119 - accuracy: 0.7105\nEpoch 40/100\n1705/1705 [==============================] - 2s 961us/step - loss: 1.1113 - accuracy: 0.7104\nEpoch 41/100\n1705/1705 [==============================] - 2s 960us/step - loss: 1.1108 - accuracy: 0.7105\nEpoch 42/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1098 - accuracy: 0.7106\nEpoch 43/100\n1705/1705 [==============================] - 2s 995us/step - loss: 1.1097 - accuracy: 0.7105\nEpoch 44/100\n1705/1705 [==============================] - 2s 962us/step - loss: 1.1088 - accuracy: 0.7108\nEpoch 45/100\n1705/1705 [==============================] - 2s 985us/step - loss: 1.1083 - accuracy: 0.7108\nEpoch 46/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1080 - accuracy: 0.7107\nEpoch 47/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1086 - accuracy: 0.7106\nEpoch 48/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1073 - accuracy: 0.7108\nEpoch 49/100\n1705/1705 [==============================] - 2s 984us/step - loss: 1.1064 - accuracy: 0.7109\nEpoch 50/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1061 - accuracy: 0.7111\nEpoch 51/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1053 - accuracy: 0.7110\nEpoch 52/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1052 - accuracy: 0.7109\nEpoch 53/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1044 - accuracy: 0.7111\nEpoch 54/100\n1705/1705 [==============================] - 2s 974us/step - loss: 1.1047 - accuracy: 0.7110\nEpoch 55/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1043 - accuracy: 0.7111\nEpoch 56/100\n1705/1705 [==============================] - 2s 965us/step - loss: 1.1039 - accuracy: 0.7109\nEpoch 57/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.1037 - accuracy: 0.7111\nEpoch 58/100\n1705/1705 [==============================] - 2s 965us/step - loss: 1.1036 - accuracy: 0.7110\nEpoch 59/100\n1705/1705 [==============================] - 2s 963us/step - loss: 1.1032 - accuracy: 0.7114\nEpoch 60/100\n1705/1705 [==============================] - 2s 971us/step - loss: 1.1022 - accuracy: 0.7113\nEpoch 61/100\n1705/1705 [==============================] - 2s 1ms/step - loss: 1.1024 - accuracy: 0.7113\nEpoch 62/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.1020 - accuracy: 0.7116\nEpoch 63/100\n1705/1705 [==============================] - 2s 963us/step - loss: 1.1017 - accuracy: 0.7116\nEpoch 64/100\n1705/1705 [==============================] - 2s 976us/step - loss: 1.1027 - accuracy: 0.7115\nEpoch 65/100\n1705/1705 [==============================] - 2s 969us/step - loss: 1.1007 - accuracy: 0.7116\nEpoch 66/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.1001 - accuracy: 0.7117\nEpoch 
67/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1007 - accuracy: 0.7115\nEpoch 68/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1003 - accuracy: 0.7118\nEpoch 69/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.1000 - accuracy: 0.7118\nEpoch 70/100\n1705/1705 [==============================] - 2s 967us/step - loss: 1.0998 - accuracy: 0.7119\nEpoch 71/100\n1705/1705 [==============================] - 2s 982us/step - loss: 1.0984 - accuracy: 0.7117\nEpoch 72/100\n1705/1705 [==============================] - 2s 968us/step - loss: 1.0987 - accuracy: 0.7117\nEpoch 73/100\n1705/1705 [==============================] - 2s 973us/step - loss: 1.0991 - accuracy: 0.7120\nEpoch 74/100\n1705/1705 [==============================] - 2s 962us/step - loss: 1.0982 - accuracy: 0.7120\nEpoch 75/100\n1705/1705 [==============================] - 2s 964us/step - loss: 1.0983 - accuracy: 0.7119\nEpoch 76/100\n1705/1705 [==============================] - 2s 966us/step - loss: 1.0978 - accuracy: 0.7116\nEpoch 77/100\n1705/1705 [==============================] - 2s 966us/step - loss: 1.0974 - accuracy: 0.7122\nEpoch 78/100\n"
]
],
[
[
"# 7.0. NN Performance\n",
"_____no_output_____"
]
],
[
[
"# prediction\npred_nn = model.predict(X_test)\n\n# invert Predict\nyhat_nn = ohe.inverse_transform(pred_nn)\n\n# prediction prepare\ny_test_nn = y_test.to_numpy()\nyhat_nn = yhat_nn.reshape(1,-1)[0]",
"_____no_output_____"
],
[
"# accuracy\nacc_nn = m.accuracy_score(y_test_nn, yhat_nn)\nprint('Accuracy: {}'.format(acc_nn))\n\n# confusion matrix\nmt.plot_confusion_matrix(y_test_nn , yhat_nn, normalize=False, figsize=(12,12))",
"Accuracy: 0.7022368903557022\n"
]
]
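To tie this back to the API proposed in section 0.1, a minimal sketch of a serving function built on the objects trained above (`model`, `ohe`, and the training column order `X.columns`); the function name and the input shape are assumptions for illustration:

```python
import pandas as pd

def predict_destination(user_features: dict) -> str:
    """Return the predicted first destination for a single new user."""
    # One-hot encode the single row and align it to the training feature columns
    row = pd.get_dummies(pd.DataFrame([user_features]))
    row = row.reindex(columns=X.columns, fill_value=0)
    # Softmax probabilities -> country label via the fitted OneHotEncoder
    probs = model.predict(row.values)
    return ohe.inverse_transform(probs)[0][0]
```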
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbf9bbc87993bdc54b03bf78ccf1f63cf7404976
| 353,466 |
ipynb
|
Jupyter Notebook
|
session4/L2_Simple regression.ipynb
|
jordanopensource/data-science-bootcamp
|
91424dc81455be62bceb6e0fb4fac8edaca44aed
|
[
"MIT"
] | 4 |
2015-12-30T08:39:36.000Z
|
2016-03-11T13:28:40.000Z
|
session4/L2_Simple regression.ipynb
|
jordanopensource/data-science-bootcamp
|
91424dc81455be62bceb6e0fb4fac8edaca44aed
|
[
"MIT"
] | null | null | null |
session4/L2_Simple regression.ipynb
|
jordanopensource/data-science-bootcamp
|
91424dc81455be62bceb6e0fb4fac8edaca44aed
|
[
"MIT"
] | 9 |
2015-12-02T08:19:28.000Z
|
2019-03-21T00:07:43.000Z
| 851.725301 | 40,936 | 0.940204 |
[
[
[
"## Simple regression",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n\nimport numpy as np\nimport matplotlib.pyplot as plt\n# Import relevant modules\nimport pymc \nimport numpy as np",
"_____no_output_____"
],
[
"def generateData(size, true_intercept, true_slope, order, noiseSigma):\n x = np.linspace(0, 1, size)\n # y = a + b*x\n true_y = true_intercept + true_slope * (x ** order) \n # add noise\n y = true_y + np.random.normal(scale=noiseSigma, size=size)\n return x, y, true_y\n\ndef plotData(x, y, true_y):\n fig = plt.figure(figsize=(7, 7))\n ax = fig.add_subplot(111, xlabel='x', ylabel='y', title='Generated data and underlying model')\n ax.plot(x, y, 'x', label='sampled data')\n ax.plot(x, true_y, label='true regression line', lw=2.)\n plt.legend(loc=0);\n",
"_____no_output_____"
]
],
[
[
"### Fit linear model",
"_____no_output_____"
]
],
[
[
"(x, y, true_y) = generateData(size = 200, true_intercept = 1, true_slope = 20, order = 1, noiseSigma=1.0)\nplotData(x, y, true_y)",
"_____no_output_____"
],
[
"#Fit linear model\n\nsigma = pymc.HalfCauchy('sigma', 10, 1.)\nintercept = pymc.Normal('Intercept', 0, 1)\nx_coeff = pymc.Normal('x', 0, 1)\n\[email protected]\ndef m(intercept= intercept, x_coeff=x_coeff):\n return intercept + (x ** 1) * x_coeff\n\nlikelihood = pymc.Normal(name='y', mu=m, tau=1.0/sigma, value=y, observed=True)\n",
"_____no_output_____"
],
[
"# Plot the model dependencies \nimport pymc.graph\nfrom IPython.display import display_png\ngraph = pymc.graph.graph(S)\ndisplay_png(graph.create_png(), raw=True)",
"_____no_output_____"
],
[
"# Run inference \nmcmc = pymc.MCMC([likelihood, sigma, intercept, x_coeff])\nmcmc.sample(iter=10000, burn=500, thin=2)\npymc.Matplot.plot(mcmc)",
" [-----------------100%-----------------] 10000 of 10000 complete in 1.3 secPlotting Intercept\nPlotting sigma\nPlotting x\n"
]
],
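[
[
"# Posterior means of the fitted parameters (a quick sketch; the trace names\n# match the pymc variables defined above):\nprint('intercept ~', mcmc.trace('Intercept')[:].mean())\nprint('slope     ~', mcmc.trace('x')[:].mean())",
"_____no_output_____"
]
],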
[
[
"### Exercise fit cubic model",
"_____no_output_____"
]
],
[
[
"# your code here",
"_____no_output_____"
]
],
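[
[
"# One possible solution sketch for the cubic-model exercise: it reuses the\n# generateData/plotData helpers and the same priors as the linear fit above,\n# only the regression function changes to y = a + b*x**3.\n(x, y, true_y) = generateData(size=200, true_intercept=1, true_slope=20, order=3, noiseSigma=1.0)\nplotData(x, y, true_y)\n\nsigma = pymc.HalfCauchy('sigma', 10, 1.)\nintercept = pymc.Normal('Intercept', 0, 1)\nx_coeff = pymc.Normal('x', 0, 1)\n\[email protected]\ndef m(intercept=intercept, x_coeff=x_coeff):\n    return intercept + (x ** 3) * x_coeff\n\nlikelihood = pymc.Normal(name='y', mu=m, tau=1.0/sigma, value=y, observed=True)\nmcmc = pymc.MCMC([likelihood, sigma, intercept, x_coeff])\nmcmc.sample(iter=10000, burn=500, thin=2)\npymc.Matplot.plot(mcmc)",
"_____no_output_____"
]
],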
[
[
"### Model selection",
"_____no_output_____"
]
],
[
[
"(x, y, true_y) = generateData(size = 200, true_intercept = 1, true_slope = 20, order = 3, noiseSigma=2.0)\nplotData(x, y, true_y)",
"_____no_output_____"
],
[
"#Model selection\n\nbeta = pymc.Beta('beta', 1.0, 1.0)\nber = pymc.Bernoulli('ber', beta)\n\nsigma = pymc.HalfCauchy('sigma', 10, 1.)\nintercept = pymc.Normal('Intercept', 0, 1)\nx_coeff = pymc.Normal('x', 0, 1)\n\[email protected]\ndef m(intercept= intercept, x_coeff=x_coeff, ber=ber):\n if ber:\n return intercept + (x ** 3) * x_coeff\n else:\n return intercept + (x ** 1) * x_coeff\nlikelihood = pymc.Normal(name='y', mu=m, tau=1.0/sigma, value=y, observed=True)\n\nmcmc = pymc.MCMC([likelihood, sigma, intercept, x_coeff, beta, ber])\nmcmc.sample(iter=10000, burn=500, thin=2)\npymc.Matplot.plot(mcmc)",
" [-----------------100%-----------------] 10000 of 10000 complete in 3.0 sec"
],
[
"plt.hist(np.array(mcmc.trace(\"ber\")[:], dtype=np.int))\nplt.xlim([0, 1.5])",
"_____no_output_____"
]
],
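[
[
"# The posterior mean of the Bernoulli indicator estimates the probability\n# that the cubic model explains the data (a sketch):\nber_trace = np.array(mcmc.trace('ber')[:], dtype=int)\nprint('P(cubic | data) ~', ber_trace.mean())",
"_____no_output_____"
]
],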
[
[
"### Exercise: find noise effect on the model linearity",
"_____no_output_____"
]
],
[
[
"# your code here",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbf9c53a9788bc65ac697d27c5458931b9ce5015
| 17,415 |
ipynb
|
Jupyter Notebook
|
example/geocoder.ipynb
|
doyeonkp/BigData_2017
|
51e71afb364d70e30e52b546331e0a44f1265230
|
[
"MIT"
] | null | null | null |
example/geocoder.ipynb
|
doyeonkp/BigData_2017
|
51e71afb364d70e30e52b546331e0a44f1265230
|
[
"MIT"
] | null | null | null |
example/geocoder.ipynb
|
doyeonkp/BigData_2017
|
51e71afb364d70e30e52b546331e0a44f1265230
|
[
"MIT"
] | null | null | null | 41.662679 | 90 | 0.50244 |
[
[
[
"import requests\nurl = 'https://maps.googleapis.com/maps/api/geocode/json'\npar = {'address': '325 W Shaw Ln, East Lansing, MI'}\nr = requests.get(url, params=par)\nresults = r.json()['results']",
"_____no_output_____"
],
[
"results",
"_____no_output_____"
],
[
"import geocoder\ng = geocoder.google('325 W Shaw Ln, Esat Lansing, MI 48824')\ng.latlng",
"_____no_output_____"
],
[
"g = geocoder.google([41.8781, -87.6298],method='reverse')\ng.address",
"_____no_output_____"
],
[
"type(g)",
"_____no_output_____"
],
[
"g.content",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbf9c6a989341487f290c7d2c3ca8811b94de137
| 23,855 |
ipynb
|
Jupyter Notebook
|
tutorial_files/5_hierarchical_generators.ipynb
|
ucb-art/BAG_XBase_demo
|
15132f98abf9f5928e8eb4b3541628d90f4e7a25
|
[
"BSD-3-Clause"
] | 3 |
2018-11-04T18:50:30.000Z
|
2020-06-02T23:03:08.000Z
|
tutorial_files/5_hierarchical_generators.ipynb
|
ucb-art/BAG_XBase_demo
|
15132f98abf9f5928e8eb4b3541628d90f4e7a25
|
[
"BSD-3-Clause"
] | null | null | null |
tutorial_files/5_hierarchical_generators.ipynb
|
ucb-art/BAG_XBase_demo
|
15132f98abf9f5928e8eb4b3541628d90f4e7a25
|
[
"BSD-3-Clause"
] | 8 |
2019-01-30T18:21:57.000Z
|
2020-06-15T16:04:33.000Z
| 45.094518 | 502 | 0.596353 |
[
[
[
"# Module 5: Hierarchical Generators\nThis module covers writing layout/schematic generators that instantiate other generators. We will write a two-stage amplifier generator, which instatiates the common-source amplifier followed by the source-follower amplifier.",
"_____no_output_____"
],
[
"## AmpChain Layout Example\nFirst, we will write a layout generator for the two-stage amplifier. The layout floorplan is drawn for you below:\n<img src=\"bootcamp_pics/5_hierarchical_generator/hierachical_generator_1.PNG\" alt=\"Drawing\" style=\"width: 400px;\"/>\nThis floorplan abuts the `AmpCS` instance next to `AmpSF` instance, the `VSS` ports are simply shorted together, and the top `VSS` port of `AmpSF` is ignored (they are connected together internally by dummy connections). The intermediate node of the two-stage amplifier is connected using a vertical routing track in the middle of the two amplifier blocks. `VDD` ports are connected to the top-most M6 horizontal track, and other ports are simply exported in-place.\n\nThe layout generator is reproduced below, with some parts missing (which you will fill out later). We will walk through the important sections of the code.\n```python\nclass AmpChain(TemplateBase):\n def __init__(self, temp_db, lib_name, params, used_names, **kwargs):\n TemplateBase.__init__(self, temp_db, lib_name, params, used_names, **kwargs)\n self._sch_params = None\n\n @property\n def sch_params(self):\n return self._sch_params\n\n @classmethod\n def get_params_info(cls):\n return dict(\n cs_params='common source amplifier parameters.',\n sf_params='source follower parameters.',\n show_pins='True to draw pin geometries.',\n )\n\n def draw_layout(self):\n \"\"\"Draw the layout of a transistor for characterization.\n \"\"\"\n\n # make copies of given dictionaries to avoid modifying external data.\n cs_params = self.params['cs_params'].copy()\n sf_params = self.params['sf_params'].copy()\n show_pins = self.params['show_pins']\n\n # disable pins in subcells\n cs_params['show_pins'] = False\n sf_params['show_pins'] = False\n\n # create layout masters for subcells we will add later\n cs_master = self.new_template(params=cs_params, temp_cls=AmpCS)\n # TODO: create sf_master. 
Use AmpSFSoln class\n sf_master = None\n\n if sf_master is None:\n return\n\n # add subcell instances\n cs_inst = self.add_instance(cs_master, 'XCS')\n # add source follower to the right of common source\n x0 = cs_inst.bound_box.right_unit\n sf_inst = self.add_instance(sf_master, 'XSF', loc=(x0, 0), unit_mode=True)\n\n # get VSS wires from AmpCS/AmpSF\n cs_vss_warr = cs_inst.get_all_port_pins('VSS')[0]\n sf_vss_warrs = sf_inst.get_all_port_pins('VSS')\n # only connect bottom VSS wire of source follower\n if sf_vss_warrs[0].track_id.base_index < sf_vss_warrs[1].track_id.base_index:\n sf_vss_warr = sf_vss_warrs[0]\n else:\n sf_vss_warr = sf_vss_warrs[1]\n\n # connect VSS of the two blocks together\n vss = self.connect_wires([cs_vss_warr, sf_vss_warr])[0]\n\n # get layer IDs from VSS wire\n hm_layer = vss.layer_id\n vm_layer = hm_layer + 1\n top_layer = vm_layer + 1\n\n # calculate template size\n tot_box = cs_inst.bound_box.merge(sf_inst.bound_box)\n self.set_size_from_bound_box(top_layer, tot_box, round_up=True)\n\n # get subcell ports as WireArrays so we can connect them\n vmid0 = cs_inst.get_all_port_pins('vout')[0]\n vmid1 = sf_inst.get_all_port_pins('vin')[0]\n vdd0 = cs_inst.get_all_port_pins('VDD')[0]\n vdd1 = sf_inst.get_all_port_pins('VDD')[0]\n\n # get vertical VDD TrackIDs\n vdd0_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd0.middle))\n vdd1_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd1.middle))\n\n # connect VDD of each block to vertical M5\n vdd0 = self.connect_to_tracks(vdd0, vdd0_tid)\n vdd1 = self.connect_to_tracks(vdd1, vdd1_tid)\n # connect M5 VDD to top M6 horizontal track\n vdd_tidx = self.grid.get_num_tracks(self.size, top_layer) - 1\n vdd_tid = TrackID(top_layer, vdd_tidx)\n vdd = self.connect_to_tracks([vdd0, vdd1], vdd_tid)\n\n # TODO: connect vmid0 and vmid1 to vertical track in the middle of two templates\n # hint: use x0\n vmid = None\n\n if vmid is None:\n return\n\n # add pins on wires\n self.add_pin('vmid', vmid, show=show_pins)\n self.add_pin('VDD', vdd, show=show_pins)\n self.add_pin('VSS', vss, show=show_pins)\n # re-export pins on subcells.\n self.reexport(cs_inst.get_port('vin'), show=show_pins)\n self.reexport(cs_inst.get_port('vbias'), net_name='vb1', show=show_pins)\n # TODO: reexport vout and vbias of source follower\n # TODO: vbias should be renamed to vb2\n\n # compute schematic parameters.\n self._sch_params = dict(\n cs_params=cs_master.sch_params,\n sf_params=sf_master.sch_params,\n )\n```",
"_____no_output_____"
],
[
"## AmpChain Constructor\n```python\nclass AmpChain(TemplateBase):\n def __init__(self, temp_db, lib_name, params, used_names, **kwargs):\n TemplateBase.__init__(self, temp_db, lib_name, params, used_names, **kwargs)\n self._sch_params = None\n\n @property\n def sch_params(self):\n return self._sch_params\n\n @classmethod\n def get_params_info(cls):\n return dict(\n cs_params='common source amplifier parameters.',\n sf_params='source follower parameters.',\n show_pins='True to draw pin geometries.',\n )\n```\nFirst, notice that instead of subclassing `AnalogBase`, the `AmpChain` class subclasses `TemplateBase`. This is because we are not trying to draw transistor rows inside this layout generator; we just want to place and route multiple layout instances together. `TemplateBase` is the base class for all layout generators and it provides most placement and routing methods you need.\n\nNext, notice that the parameters for `AmpChain` are simply parameter dictionaries for the two sub-generators. The ability to use complex data structures as generator parameters solves the parameter explosion problem when writing generators with many levels of hierarchy.",
"_____no_output_____"
],
[
"## Creating Layout Master\n```python\n# create layout masters for subcells we will add later\ncs_master = self.new_template(params=cs_params, temp_cls=AmpCS)\n# TODO: create sf_master. Use AmpSFSoln class\nsf_master = None\n```\nHere, the `new_template()` function creates a new layout master, `cs_master`, which represents a generated layout cellview from the `AmpCS` layout generator. We can later instances of this master in the current layout, which are references to the generated `AmpCS` layout cellview, perhaps shifted and rotated. The main take away is that the `new_template()` function does not add any layout geometries to the current layout, but rather create a separate layout cellview which we may use later.",
"_____no_output_____"
],
[
"## Creating Layout Instance\n```python\n# add subcell instances\ncs_inst = self.add_instance(cs_master, 'XCS')\n# add source follower to the right of common source\nx0 = cs_inst.bound_box.right_unit\nsf_inst = self.add_instance(sf_master, 'XSF', loc=(x0, 0), unit_mode=True)\n```\n\nThe `add_instance()` method adds an instance of the given layout master to the current cellview. By default, if no location or orientation is given, it puts the instance at the origin with no rotation. the `bound_box` attribute can then be used on the instance to get the bounding box of the instance. Here, the bounding box is used to determine the X coordinate of the source-follower.",
"_____no_output_____"
],
[
"## Get Instance Ports\n```python\n# get subcell ports as WireArrays so we can connect them\nvmid0 = cs_inst.get_all_port_pins('vout')[0]\nvmid1 = sf_inst.get_all_port_pins('vin')[0]\nvdd0 = cs_inst.get_all_port_pins('VDD')[0]\nvdd1 = sf_inst.get_all_port_pins('VDD')[0]\n```\nafter adding an instance, the `get_all_port_pins()` function can be used to obtain a list of all pins as `WireArray` objects with the given name. In this case, we know that there's exactly one pin, so we use Python list indexing to obtain first element of the list.",
"_____no_output_____"
],
[
"## Routing Grid Object\n```python\n# get vertical VDD TrackIDs\nvdd0_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd0.middle))\nvdd1_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd1.middle))\n```\n\nthe `self.grid` attribute of `TemplateBase` is a `RoutingGrid` objects, which provides many useful functions related to the routing grid. In this particular scenario, `coord_to_nearest_track()` is used to determine the vertical track index closest to the center of the `VDD` ports. These vertical tracks will be used later to connect the `VDD` ports together.",
"_____no_output_____"
],
[
"## Re-export Pins on Instances\n```python\n # re-export pins on subcells.\nself.reexport(cs_inst.get_port('vin'), show=show_pins)\nself.reexport(cs_inst.get_port('vbias'), net_name='vb1', show=show_pins)\n# TODO: reexport vout and vbias of source follower\n# TODO: vbias should be renamed to vb2\n```\n`TemplateBase` also provides a `reexport()` function, which is a convenience function to re-export an instance port in-place. The `net_name` optional parameter can be used to change the port name. In this example, the `vbias` port of common-source amplifier is renamed to `vb1`.",
"_____no_output_____"
],
[
"## Layout Exercises\nNow you should know everything you need to finish the two-stage amplifier layout generator. Fill in the missing pieces to do the following:\n\n1. Create layout master for `AmpSF` using the `AmpSFSoln` class.\n2. Using `RoutingGrid`, determine the vertical track index in the middle of the two amplifier blocks, and connect `vmid` wires together using this track.\n * Hint: variable `x0` is the X coordinate of the boundary between the two blocks.\n3. Re-export `vout` and `vbias` of the source-follower. Rename `vbias` to `vb2`.\n\nOnce you're done, evaluate the cell below, which will generate the layout and run LVS. If everything is done correctly, a layout should be generated inthe `DEMO_AMP_CHAIN` library, and LVS should pass.",
"_____no_output_____"
]
],
[
[
"from bag.layout.routing import TrackID\nfrom bag.layout.template import TemplateBase\n\nfrom xbase_demo.demo_layout.core import AmpCS, AmpSFSoln\n\n\nclass AmpChain(TemplateBase):\n def __init__(self, temp_db, lib_name, params, used_names, **kwargs):\n TemplateBase.__init__(self, temp_db, lib_name, params, used_names, **kwargs)\n self._sch_params = None\n\n @property\n def sch_params(self):\n return self._sch_params\n\n @classmethod\n def get_params_info(cls):\n return dict(\n cs_params='common source amplifier parameters.',\n sf_params='source follower parameters.',\n show_pins='True to draw pin geometries.',\n )\n\n def draw_layout(self):\n \"\"\"Draw the layout of a transistor for characterization.\n \"\"\"\n\n # make copies of given dictionaries to avoid modifying external data.\n cs_params = self.params['cs_params'].copy()\n sf_params = self.params['sf_params'].copy()\n show_pins = self.params['show_pins']\n\n # disable pins in subcells\n cs_params['show_pins'] = False\n sf_params['show_pins'] = False\n\n # create layout masters for subcells we will add later\n cs_master = self.new_template(params=cs_params, temp_cls=AmpCS)\n # TODO: create sf_master. Use AmpSFSoln class\n sf_master = None\n\n if sf_master is None:\n return\n\n # add subcell instances\n cs_inst = self.add_instance(cs_master, 'XCS')\n # add source follower to the right of common source\n x0 = cs_inst.bound_box.right_unit\n sf_inst = self.add_instance(sf_master, 'XSF', loc=(x0, 0), unit_mode=True)\n\n # get VSS wires from AmpCS/AmpSF\n cs_vss_warr = cs_inst.get_all_port_pins('VSS')[0]\n sf_vss_warrs = sf_inst.get_all_port_pins('VSS')\n # only connect bottom VSS wire of source follower\n if len(sf_vss_warrs) < 2 or sf_vss_warrs[0].track_id.base_index < sf_vss_warrs[1].track_id.base_index:\n sf_vss_warr = sf_vss_warrs[0]\n else:\n sf_vss_warr = sf_vss_warrs[1]\n\n # connect VSS of the two blocks together\n vss = self.connect_wires([cs_vss_warr, sf_vss_warr])[0]\n\n # get layer IDs from VSS wire\n hm_layer = vss.layer_id\n vm_layer = hm_layer + 1\n top_layer = vm_layer + 1\n\n # calculate template size\n tot_box = cs_inst.bound_box.merge(sf_inst.bound_box)\n self.set_size_from_bound_box(top_layer, tot_box, round_up=True)\n\n # get subcell ports as WireArrays so we can connect them\n vmid0 = cs_inst.get_all_port_pins('vout')[0]\n vmid1 = sf_inst.get_all_port_pins('vin')[0]\n vdd0 = cs_inst.get_all_port_pins('VDD')[0]\n vdd1 = sf_inst.get_all_port_pins('VDD')[0]\n\n # get vertical VDD TrackIDs\n vdd0_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd0.middle))\n vdd1_tid = TrackID(vm_layer, self.grid.coord_to_nearest_track(vm_layer, vdd1.middle))\n\n # connect VDD of each block to vertical M5\n vdd0 = self.connect_to_tracks(vdd0, vdd0_tid)\n vdd1 = self.connect_to_tracks(vdd1, vdd1_tid)\n # connect M5 VDD to top M6 horizontal track\n vdd_tidx = self.grid.get_num_tracks(self.size, top_layer) - 1\n vdd_tid = TrackID(top_layer, vdd_tidx)\n vdd = self.connect_to_tracks([vdd0, vdd1], vdd_tid)\n\n # TODO: connect vmid0 and vmid1 to vertical track in the middle of two templates\n # hint: use x0\n vmid = None\n\n if vmid is None:\n return\n\n # add pins on wires\n self.add_pin('vmid', vmid, show=show_pins)\n self.add_pin('VDD', vdd, show=show_pins)\n self.add_pin('VSS', vss, show=show_pins)\n # re-export pins on subcells.\n self.reexport(cs_inst.get_port('vin'), show=show_pins)\n self.reexport(cs_inst.get_port('vbias'), net_name='vb1', show=show_pins)\n # TODO: reexport vout and vbias of source follower\n # 
TODO: vbias should be renamed to vb2\n\n # compute schematic parameters.\n self._sch_params = dict(\n cs_params=cs_master.sch_params,\n sf_params=sf_master.sch_params,\n )\n\n\nimport os\n\n# import bag package\nimport bag\nfrom bag.io import read_yaml\n\n# import BAG demo Python modules\nimport xbase_demo.core as demo_core\n\n# load circuit specifications from file\nspec_fname = os.path.join(os.environ['BAG_WORK_DIR'], 'specs_demo/demo.yaml')\ntop_specs = read_yaml(spec_fname)\n\n# obtain BagProject instance\nlocal_dict = locals()\nif 'bprj' in local_dict:\n print('using existing BagProject')\n bprj = local_dict['bprj']\nelse:\n print('creating BagProject')\n bprj = bag.BagProject()\n\ndemo_core.run_flow(bprj, top_specs, 'amp_chain_soln', AmpChain, run_lvs=True, lvs_only=True)",
"_____no_output_____"
]
],
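[
[
"One way to fill in the remaining TODOs (a sketch that only uses calls already shown in this tutorial):\n\n```python\n# create the source-follower master, mirroring the AmpCS call\nsf_master = self.new_template(params=sf_params, temp_cls=AmpSFSoln)\n\n# connect vmid0/vmid1 on the vertical track closest to the block boundary x0\nvmid_tidx = self.grid.coord_to_nearest_track(vm_layer, x0, unit_mode=True)\nvmid = self.connect_to_tracks([vmid0, vmid1], TrackID(vm_layer, vmid_tidx))\n\n# re-export the source-follower pins, renaming vbias to vb2\nself.reexport(sf_inst.get_port('vout'), show=show_pins)\nself.reexport(sf_inst.get_port('vbias'), net_name='vb2', show=show_pins)\n```\nThe `unit_mode=True` flag is an assumption here, since `x0` was obtained in resolution units from `bound_box.right_unit`.",
"_____no_output_____"
]
],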
[
[
"## AmpChain Schematic Template\nNow let's move on to schematic generator. As before, we need to create the schematic template first. A half-complete schematic template is provided for you in library `demo_templates`, cell `amp_chain`, shown below:\n<img src=\"bootcamp_pics/5_hierarchical_generator/hierachical_generator_2.PNG\" alt=\"Drawing\" style=\"width: 400px;\"/>\n\nThe schematic template for a hierarchical generator is very simple; you simply need to instantiate the schematic templates of the sub-blocks (***Not the generated schematic!***). For the exercise, instantiate the `amp_sf` schematic template from the `demo_templates` library, named it `XSF`, connect it, then evaluate the following cell to import the `amp_chain` netlist to Python.\n",
"_____no_output_____"
]
],
[
[
"import bag\n\n# obtain BagProject instance\nlocal_dict = locals()\nif 'bprj' in local_dict:\n print('using existing BagProject')\n bprj = local_dict['bprj']\nelse:\n print('creating BagProject')\n bprj = bag.BagProject()\n \nprint('importing netlist from virtuoso')\nbprj.import_design_library('demo_templates')\nprint('netlist import done')",
"_____no_output_____"
]
],
[
[
"## AmpChain Schematic Generator\nWith schematic template done, you are ready to write the schematic generator. It is also very simple, you just need to call the `design()` method, which you implemented previously, on each instance in the schematic. Complete the following schematic generator, then evaluate the cell to push it through the design flow.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\n\nimport os\n\nfrom bag.design import Module\n\n\n# noinspection PyPep8Naming\nclass demo_templates__amp_chain(Module):\n \"\"\"Module for library demo_templates cell amp_chain.\n\n Fill in high level description here.\n \"\"\"\n\n # hard coded netlist flie path to get jupyter notebook working.\n yaml_file = os.path.join(os.environ['BAG_WORK_DIR'], 'BAG_XBase_demo', \n 'BagModules', 'demo_templates', 'netlist_info', 'amp_chain.yaml') \n\n def __init__(self, bag_config, parent=None, prj=None, **kwargs):\n Module.__init__(self, bag_config, self.yaml_file, parent=parent, prj=prj, **kwargs)\n\n @classmethod\n def get_params_info(cls):\n # type: () -> Dict[str, str]\n \"\"\"Returns a dictionary from parameter names to descriptions.\n\n Returns\n -------\n param_info : Optional[Dict[str, str]]\n dictionary from parameter names to descriptions.\n \"\"\"\n return dict(\n cs_params='common-source amplifier parameters dictionary.',\n sf_params='source-follwer amplifier parameters dictionary.',\n )\n \n def design(self, cs_params=None, sf_params=None):\n self.instances['XCS'].design(**cs_params)\n # TODO: design XSF\n\n \nimport os\n\n# import bag package\nimport bag\nfrom bag.io import read_yaml\n\n# import BAG demo Python modules\nimport xbase_demo.core as demo_core\nfrom xbase_demo.demo_layout.core import AmpChainSoln\n\n# load circuit specifications from file\nspec_fname = os.path.join(os.environ['BAG_WORK_DIR'], 'specs_demo/demo.yaml')\ntop_specs = read_yaml(spec_fname)\n\n# obtain BagProject instance\nlocal_dict = locals()\nif 'bprj' in local_dict:\n print('using existing BagProject')\n bprj = local_dict['bprj']\nelse:\n print('creating BagProject')\n bprj = bag.BagProject()\n\ndemo_core.run_flow(bprj, top_specs, 'amp_chain', AmpChainSoln, sch_cls=demo_templates__amp_chain, run_lvs=True)",
"_____no_output_____"
]
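,
[
"# A possible completion for the '# TODO: design XSF' above (a sketch),\n# mirroring the call made for XCS:\n#\n#     self.instances['XSF'].design(**sf_params)",
"_____no_output_____"
]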
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbf9c9575261a6627ba4d261a19d12e12118d824
| 226,959 |
ipynb
|
Jupyter Notebook
|
Keras/04_Mnist-Ensemble.ipynb
|
Kidel/Deep-Learning-CNN-for-Image-Recognition
|
d0810efc935c357374f7f9aabe1aeb6a7c74b504
|
[
"MIT"
] | 33 |
2017-01-22T11:09:26.000Z
|
2021-09-28T12:49:35.000Z
|
Keras/04_Mnist-Ensemble.ipynb
|
kant/Deep-Learning-CNN-for-Image-Recognition
|
d0810efc935c357374f7f9aabe1aeb6a7c74b504
|
[
"MIT"
] | null | null | null |
Keras/04_Mnist-Ensemble.ipynb
|
kant/Deep-Learning-CNN-for-Image-Recognition
|
d0810efc935c357374f7f9aabe1aeb6a7c74b504
|
[
"MIT"
] | 29 |
2017-01-22T11:20:30.000Z
|
2022-02-07T09:44:12.000Z
| 305.052419 | 43,096 | 0.911557 |
[
[
[
"# MNIST Convolutional Neural Network - Ensemble Learning\nGaetano Bonofiglio, Veronica Iovinella\n\nIn this notebook we will verify if our single-column architecture can get any advantage from using **ensemble learning**, so a multi-column architecture. \n\nWe will train multiple networks identical to the best one defined in notebook 03, feeding them with pre-processed images shuffled and distorted using a different pseudo-random seed. This should give us a good ensemble of networks that we can average for each classification. \n\nA prediction doesn't take more time compared to a single-column, but training time scales by a factor of N, where N is the number of columns. Networks could be trained in parallel, but not on our current hardware that is saturated by the training of a single one.",
"_____no_output_____"
],
[
"## Imports",
"_____no_output_____"
]
],
[
[
"import os.path\nfrom IPython.display import Image\n\nfrom util import Util\nu = Util()\n\nimport numpy as np\n# Explicit random seed for reproducibility\nnp.random.seed(1337) ",
"Using TensorFlow backend.\n"
],
[
"from keras.callbacks import ModelCheckpoint\nfrom keras.models import Sequential\nfrom keras.layers import Dense, Dropout, Activation, Flatten\nfrom keras.layers import Convolution2D, MaxPooling2D\nfrom keras.layers import Merge\nfrom keras.utils import np_utils\nfrom keras.preprocessing.image import ImageDataGenerator\nfrom keras import backend as K",
"_____no_output_____"
],
[
"from keras.datasets import mnist",
"_____no_output_____"
]
],
[
[
"## Definitions\nFor this experiment we are using 5 networks, but usually a good number is in the range of 35 (but with more dataset alterations then we do).",
"_____no_output_____"
]
],
[
[
"batch_size = 1024\nnb_classes = 10\nnb_epoch = 650\n# checkpoint path\ncheckpoints_dir = \"checkpoints\"\n\n# number of networks for ensamble learning\nnumber_of_models = 5",
"_____no_output_____"
],
[
"# input image dimensions\nimg_rows, img_cols = 28, 28\n# number of convolutional filters to use\nnb_filters1 = 20\nnb_filters2 = 40\n# size of pooling area for max pooling\npool_size1 = (2, 2)\npool_size2 = (3, 3)\n# convolution kernel size\nkernel_size1 = (4, 4)\nkernel_size2 = (5, 5)\n# dense layer size\ndense_layer_size1 = 200\n# dropout rate\ndropout = 0.15\n# activation type\nactivation = 'relu'",
"_____no_output_____"
]
],
[
[
"## Data load",
"_____no_output_____"
]
],
[
[
"# the data, shuffled and split between train and test sets\n(X_train, y_train), (X_test, y_test) = mnist.load_data()",
"_____no_output_____"
],
[
"u.plot_images(X_train[0:9], y_train[0:9])",
"_____no_output_____"
],
[
"if K.image_dim_ordering() == 'th':\n X_train = X_train.reshape(X_train.shape[0], 1, img_rows, img_cols)\n X_test = X_test.reshape(X_test.shape[0], 1, img_rows, img_cols)\n input_shape = (1, img_rows, img_cols)\nelse:\n X_train = X_train.reshape(X_train.shape[0], img_rows, img_cols, 1)\n X_test = X_test.reshape(X_test.shape[0], img_rows, img_cols, 1)\n input_shape = (img_rows, img_cols, 1)",
"_____no_output_____"
],
[
"X_train = X_train.astype('float32')\nX_test = X_test.astype('float32')\nX_train /= 255\nX_test /= 255\nprint('X_train shape:', X_train.shape)\nprint(X_train.shape[0], 'train samples')\nprint(X_test.shape[0], 'test samples')",
"X_train shape: (60000, 28, 28, 1)\n60000 train samples\n10000 test samples\n"
],
[
"# convert class vectors to binary class matrices\nY_train = np_utils.to_categorical(y_train, nb_classes)\nY_test = np_utils.to_categorical(y_test, nb_classes)",
"_____no_output_____"
]
],
[
[
"## Image preprocessing",
"_____no_output_____"
]
],
[
[
"datagen = ImageDataGenerator(\n rotation_range=30,\n width_shift_range=0.1,\n height_shift_range=0.1,\n zoom_range=0.1,\n horizontal_flip=False)\n\n# compute quantities required for featurewise normalization\n# (std, mean, and principal components if ZCA whitening is applied)\ndatagen.fit(X_train)",
"_____no_output_____"
]
],
[
[
"## Model definition - Single column\nThis time we are going to define a helper functions to initialize the model, since we're going to use it on a list of models.",
"_____no_output_____"
]
],
[
[
"def initialize_network(model, dropout1=dropout, dropout2=dropout):\n model.add(Convolution2D(nb_filters1, kernel_size1[0], kernel_size1[1],\n border_mode='valid',\n input_shape=input_shape, name='covolution_1_' + str(nb_filters1) + '_filters'))\n model.add(Activation(activation, name='activation_1_' + activation))\n model.add(MaxPooling2D(pool_size=pool_size1, name='max_pooling_1_' + str(pool_size1) + '_pool_size'))\n model.add(Convolution2D(nb_filters2, kernel_size2[0], kernel_size2[1]))\n model.add(Activation(activation, name='activation_2_' + activation))\n model.add(MaxPooling2D(pool_size=pool_size2, name='max_pooling_1_' + str(pool_size2) + '_pool_size'))\n model.add(Dropout(dropout))\n\n model.add(Flatten())\n model.add(Dense(dense_layer_size1, name='fully_connected_1_' + str(dense_layer_size1) + '_neurons'))\n model.add(Activation(activation, name='activation_3_' + activation))\n model.add(Dropout(dropout))\n model.add(Dense(nb_classes, name='output_' + str(nb_classes) + '_neurons'))\n model.add(Activation('softmax', name='softmax'))\n\n model.compile(loss='categorical_crossentropy',\n optimizer='adadelta',\n metrics=['accuracy', 'precision', 'recall'])",
"_____no_output_____"
],
[
"# pseudo random generation of seeds\nseeds = np.random.randint(10000, size=number_of_models)\n\n# initializing all the models\nmodels = [None] * number_of_models\n\nfor i in range(number_of_models):\n models[i] = Sequential()\n initialize_network(models[i])",
"_____no_output_____"
]
],
[
[
"## Training and evaluation - Single column\nAgain we are going to define a helper functions to train the model, since we're going to use them on a list.",
"_____no_output_____"
]
],
[
[
"def try_load_checkpoints(model, checkpoints_filepath, warn=False):\n # loading weights from checkpoints \n if os.path.exists(checkpoints_filepath):\n model.load_weights(checkpoints_filepath)\n elif warn: \n print('Warning: ' + checkpoints_filepath + ' could not be loaded')\n\ndef fit(model, checkpoints_name='test', seed=1337, initial_epoch=0, \n verbose=1, window_size=(-1), plot_history=False, evaluation=True):\n \n if window_size == (-1):\n window = 1 + np.random.randint(14)\n else:\n window = window_size\n if window >= nb_epoch:\n window = nb_epoch - 1\n \n print(\"Not pre-processing \" + str(window) + \" epoch(s)\")\n \n checkpoints_filepath = os.path.join(checkpoints_dir, '04_MNIST_weights.best_' + checkpoints_name + '.hdf5')\n\n try_load_checkpoints(model, checkpoints_filepath, True)\n \n # checkpoint\n checkpoint = ModelCheckpoint(checkpoints_filepath, monitor='val_precision', verbose=verbose, save_best_only=True, mode='max')\n callbacks_list = [checkpoint]\n\n # fits the model on batches with real-time data augmentation, for nb_epoch-100 epochs\n history = model.fit_generator(datagen.flow(X_train, Y_train, \n batch_size=batch_size, \n # save_to_dir='distorted_data', \n # save_format='png'\n seed=1337),\n samples_per_epoch=len(X_train), nb_epoch=(nb_epoch-window), verbose=0, \n validation_data=(X_test, Y_test), callbacks=callbacks_list)\n\n # ensuring best val_precision reached during training\n try_load_checkpoints(model, checkpoints_filepath)\n\n # fits the model on clear training set, for nb_epoch-700 epochs\n history_cont = model.fit(X_train, Y_train, batch_size=batch_size, nb_epoch=window,\n verbose=0, validation_data=(X_test, Y_test), callbacks=callbacks_list)\n\n # ensuring best val_precision reached during training\n try_load_checkpoints(model, checkpoints_filepath)\n \n if plot_history:\n print(\"History: \")\n u.plot_history(history)\n u.plot_history(history, 'precision')\n print(\"Continuation of training with no pre-processing:\")\n u.plot_history(history_cont)\n u.plot_history(history_cont, 'precision')\n if evaluation:\n print('Evaluating model ' + str(index))\n score = model.evaluate(X_test, Y_test, verbose=0)\n print('Test accuracy:', score[1]*100, '%')\n print('Test error:', (1-score[2])*100, '%')\n \n return history, history_cont",
"_____no_output_____"
],
[
"for index in range(number_of_models):\n print(\"Training model \" + str(index) + \" ...\")\n \n if index == 0:\n window_size = 20\n plot_history = True\n else:\n window_size = (-1)\n plot_history = False\n \n history, history_cont = fit(models[index], \n str(index), \n seed=seeds[index],\n initial_epoch=0,\n verbose=0, \n window_size=window_size, \n plot_history=plot_history)\n print(\"Done.\\n\\n\")",
"Training model 0 ...\nNot pre-processing 20 epoch(s)\nWarning: checkpoints\\04_MNIST_weights.best_0.hdf5 could not be loaded\nHistory: \n"
]
],
[
[
"Just by the different seeds, error changes **from 0.5% to 0.42%** (our best result so far with a single column). The training took 12 hours. ",
"_____no_output_____"
],
[
"## Model definition - Multi column\nThe MCDNN is obtained by creating a new model that only has 1 layer, Merge, that does the average of the outputs of the models in the given list. No training is required since we're only doing the average.",
"_____no_output_____"
]
],
[
[
"merged_model = Sequential()\nmerged_model.add(Merge(models, mode='ave'))\n\nmerged_model.compile(loss='categorical_crossentropy',\n optimizer='adadelta',\n metrics=['accuracy', 'precision', 'recall'])",
"_____no_output_____"
]
],
[
[
"## Evaluation - Multi column",
"_____no_output_____"
]
],
[
[
"print('Evaluating ensemble')\nscore = merged_model.evaluate([np.asarray(X_test)] * number_of_models, \n Y_test, \n verbose=0)\nprint('Test accuracy:', score[1]*100, '%')\nprint('Test error:', (1-score[2])*100, '%')",
"Evaluating ensemble\nTest accuracy: 99.53 %\nTest error: 0.400967769623 %\n"
]
],
[
[
"The error improved from 0.42% with the best network of the ensemble, to 0.4%, that is out best result so far. ",
"_____no_output_____"
]
],
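[
[
"# Sanity check (a sketch): the Merge layer's average can be reproduced manually\n# by averaging each column's softmax output and taking the argmax.\nprobs = np.mean([mdl.predict(np.asarray(X_test)) for mdl in models], axis=0)\nmanual_acc = np.mean(np.argmax(probs, axis=1) == y_test)\nprint('Manual ensemble accuracy:', manual_acc * 100, '%')",
"_____no_output_____"
]
],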
[
[
"# The predict_classes function outputs the highest probability class\n# according to the trained classifier for each input example.\npredicted_classes = merged_model.predict_classes([np.asarray(X_test)] * number_of_models)\n\n# Check which items we got right / wrong\ncorrect_indices = np.nonzero(predicted_classes == y_test)[0]\nincorrect_indices = np.nonzero(predicted_classes != y_test)[0]",
"10000/10000 [==============================] - 2s \n"
],
[
"u.plot_images(X_test[correct_indices[:9]], y_test[correct_indices[:9]], \n predicted_classes[correct_indices[:9]])",
"_____no_output_____"
],
[
"u.plot_images(X_test[incorrect_indices[:9]], y_test[incorrect_indices[:9]], \n predicted_classes[incorrect_indices[:9]])",
"_____no_output_____"
],
[
"u.plot_confusion_matrix(y_test, nb_classes, predicted_classes)",
"[[ 978 0 0 0 0 0 1 1 0 0]\n [ 0 1132 0 0 0 0 0 3 0 0]\n [ 0 1 1025 2 0 0 0 3 1 0]\n [ 0 0 0 1010 0 0 0 0 0 0]\n [ 0 0 0 0 977 0 1 0 0 4]\n [ 0 0 0 4 0 885 2 1 0 0]\n [ 3 1 1 0 1 2 949 0 1 0]\n [ 0 2 2 0 0 0 0 1023 0 1]\n [ 0 0 2 0 0 1 0 0 970 1]\n [ 0 0 0 0 3 0 0 1 1 1004]]\n"
]
],
[
[
"## Results\nTraining 5 networks took 12 hours, of course 5 times longer then a single one. The improvement was of 0.05% error, that is quite good considering this dataset (a human has 0.2% test error on MNIST). \n\nTo further increase the precision we would need over 30 columns trained on different widths.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
]
] |
cbf9dec5e54da939d3038a18baa65137ab3d3eb9
| 34,868 |
ipynb
|
Jupyter Notebook
|
notebooks/pymt_nwis.ipynb
|
gantian127/pymt_nwis
|
4537b33b80c38e66998fbd1f0a7e309aad340a72
|
[
"MIT"
] | null | null | null |
notebooks/pymt_nwis.ipynb
|
gantian127/pymt_nwis
|
4537b33b80c38e66998fbd1f0a7e309aad340a72
|
[
"MIT"
] | null | null | null |
notebooks/pymt_nwis.ipynb
|
gantian127/pymt_nwis
|
4537b33b80c38e66998fbd1f0a7e309aad340a72
|
[
"MIT"
] | null | null | null | 126.333333 | 26,960 | 0.88227 |
[
[
[
"<img src=\"https://github.com/gantian127/pymt_nwis/blob/master/docs/_static/logo.png?raw=true\" width='600' align='center'></a>",
"_____no_output_____"
],
[
"## Introduction",
"_____no_output_____"
],
[
"[nwis](https://github.com/gantian127/nwis) package provides a set of functions that allows downloading of the National Water Information System datasets for data visualization and analysis. nwis package also includes a Basic Model Interface ([BMI](https://bmi.readthedocs.io/en/latest/)). \n\n[pymt_nwis](https://github.com/gantian127/pymt_nwis) package uses the BMI of nwis to convert it into a reusable, plug-and-play data component for [PyMT](https://pymt.readthedocs.io/en/latest/?badge=latest) modeling framework. This allows the National Water Information System datasets to be easily coupled with other datasets or models that expose a BMI.",
"_____no_output_____"
],
[
"**To install pymt_nwis, use the following command:**",
"_____no_output_____"
]
],
[
[
"! pip install pymt_nwis",
"_____no_output_____"
]
],
[
[
"## Coding Example",
"_____no_output_____"
],
[
"Import nwis class and instantiate it. A configuration file (yaml file) is required to provide the parameter settings for data download. An example config_file.yaml is provided in the same folder with this Jupyter Notebook file. For more details of the parameters specified in the config.yaml file, please check with the link [here](https://nwis.readthedocs.io/en/latest/?badge=latest#parameter-settings).",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nimport numpy as np\nimport cftime\n\n\nfrom pymt.models import Nwis\n\n# initiate a data component\ndata_comp = Nwis()\ndata_comp.initialize('config_file.yaml')",
"_____no_output_____"
]
],
[
[
"Use variable related methods to check the variable information of the dataset. There are multiple variables and we will check the detailed info of the \"discharge\" variable.",
"_____no_output_____"
]
],
[
[
"# get variable info\nvar_names = data_comp.output_var_names\nprint('All variable names: {}'.format(var_names))\n\nvar_name = 'discharge'\nvar_unit = data_comp.var_units(var_name)\nvar_location = data_comp.var_location(var_name)\nvar_type = data_comp.var_type(var_name)\nvar_grid = data_comp.var_grid(var_name)\n\nprint('variable_name: {} \\nvar_unit: {} \\nvar_location: {} \\nvar_type: {} \\nvar_grid: {}'.format(\n var_name, var_unit, var_location, var_type, var_grid))",
"All variable names: ('water temperature', 'discharge', 'gage height')\nvariable_name: discharge \nvar_unit: cubic feet per second \nvar_location: node \nvar_type: float64 \nvar_grid: 0\n"
]
],
[
[
"Use time related methods to check the time information of the dataset. Please note that the time values are stored in a format which follows [CF convention](http://cfconventions.org/Data/cf-conventions/cf-conventions-1.8/cf-conventions.pdf).",
"_____no_output_____"
]
],
[
[
"# get time info\nstart_time = data_comp.start_time\nend_time = data_comp.end_time\ntime_step = data_comp.time_step\ntime_units = data_comp.time_units\ntime_steps = int((end_time - start_time)/time_step) + 1\n\nprint('start_time: {} \\nend_time: {} \\ntime_step: {} \\ntime_units: {} \\ntime_steps: {}'.format(\n start_time, end_time, time_step, time_units, time_steps))",
"start_time: 1577836800.0 \nend_time: 1579046400.0 \ntime_step: 86400 \ntime_units: seconds since 1970-01-01 00:00:00 UTC \ntime_steps: 15\n"
]
],
[
[
"Loop through each time step to get the discharge and time values. stream_array stores the discharge values. cftime_array stores the numerical time values. time_array stores the corresponding Python datetime objects. get_value( ) method returns the flow forecast value at each time step. update( ) method updates the current time step of the data component.",
"_____no_output_____"
]
],
[
[
"# get variable data\nstream_array = np.empty(time_steps)\ncftime_array = np.empty(time_steps)\n\nfor i in range(0, time_steps):\n stream_array[i] = data_comp.get_value(var_name)\n cftime_array[i] = data_comp.time\n data_comp.update()\n\ntime_array = cftime.num2date(cftime_array, time_units, only_use_cftime_datetimes=False, only_use_python_datetimes=True )\n",
"_____no_output_____"
]
],
[
[
"Now let's make a plot of the discharge data. ",
"_____no_output_____"
]
],
[
[
"# plot data\nplt.figure(figsize=(9,5))\nplt.plot(time_array, stream_array)\nplt.xlabel('Year 2017')\nplt.ylabel('{} ({})'.format(var_name, var_unit))\nplt.title('Discharge Observation at USGS Gage 03339000')",
"_____no_output_____"
]
],
[
[
"Complete the example by finalizing the component.",
"_____no_output_____"
]
],
[
[
"data_comp.finalize()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbf9e87d95f3b2fedac28424a9cb06d55e98fe04
| 14,221 |
ipynb
|
Jupyter Notebook
|
DamageMapper.ipynb
|
acse-cxd20/Armageddon
|
b3d4d9d0ea90a5213aed2f72e9e86be428de56bf
|
[
"MIT"
] | null | null | null |
DamageMapper.ipynb
|
acse-cxd20/Armageddon
|
b3d4d9d0ea90a5213aed2f72e9e86be428de56bf
|
[
"MIT"
] | null | null | null |
DamageMapper.ipynb
|
acse-cxd20/Armageddon
|
b3d4d9d0ea90a5213aed2f72e9e86be428de56bf
|
[
"MIT"
] | null | null | null | 58.044898 | 643 | 0.670698 |
[
[
[
"# Supplementary information for damage mapper tool",
"_____no_output_____"
],
[
"Development of the damage mapper tool can be broken down into three parts:\n\n1. A function `damage_zones` to calculate the coordinates of the surface zero location and the airblast damage radii\n2. A function to plot the blast zones on a map\n3. Functions to locate the postcodes (or postcode sectors) within the blast zones `get_postcodes_by_radius` and look up the population in these postcodes `get_population_of_postcodes`.\n\nFor the extension task you will need to develop additional functions.",
"_____no_output_____"
],
[
"## Airblast damage\n\nThe rapid deposition of energy in the atmosphere is analogous to an explosion and so the environmental consequences of the airburst can be estimated using empirical data from atmospheric explosion experiments [(Glasstone and Dolan, 1977)](https://www.dtra.mil/Portals/61/Documents/NTPR/4-Rad_Exp_Rpts/36_The_Effects_of_Nuclear_Weapons.pdf).\n\nThe main cause of damage close to the impact site is a strong (pressure) blastwave in the air, known as the **airblast**. Empirical data suggest that the pressure in this wave $p$ (in Pa) (above ambient, also known as overpressure), as a function of explosion energy $E_k$ (in kilotons of TNT equivalent), burst altitude $z_b$ (in m) and horizontal range $r$ (in m), is given by:\n\n\\begin{equation*}\np(r) = 3.14 \\times 10^{11} \\left(\\frac{r^2 + z_b^2}{E_k^{2/3}}\\right)^{-1.3} + 1.8 \\times 10^{7} \\left(\\frac{r^2 + z_b^2}{E_k^{2/3}}\\right)^{-0.565}\n\\end{equation*}\n\nTo solve this equation using gradient methods:\n\n\\begin{equation*}\n\\frac{dp(r)}{dr} = -1.3 \\times 3.14 \\times 10^{11} \\left(\\frac{r^2 + z_b^2}{E_k^{2/3}}\\right)^{-2.3} \\frac{2r}{E_k^{2/3}}- 0.565 \\times 1.8 \\times 10^{7} \\left(\\frac{r^2 + z_b^2}{E_k^{2/3}}\\right)^{-1.565} \\frac{2r}{E_k^{2/3}}\n\\end{equation*}\n\nFor airbursts, we will take the total kinetic energy lost by the asteroid at the burst altitude as the burst energy $E_k$. For low-altitude airbursts or cratering events, we will define $E_k$ as the **larger** of the total kinetic energy lost by the asteroid at the burst altitude or the residual kinetic energy of the asteroid when it hits the ground.\n\nNote that the burst altitude $z_b$ is the vertical distance from the ground to the point of the airburst and the range $r$ is the (great circle) distance along the surface from the \"surface zero point,\" which is the point on the surface that is closest to the point of the airburst (i.e., directly below).\n\nThe following threshold pressures can then be used to define different degrees of damage.\n\n| Damage Level | Description | Pressure (kPa) |\n|:-------------:|:---------------:|:--------------:|\n| 1 | ~10% glass windows shatter | 1.0 |\n| 2 | ~90% glass windows shatter | 3.5 |\n| 3 | Wood frame buildings collapse | 27 |\n| 4 | Multistory brick buildings collapse | 43 |\n\n<p>\n<div align=\"center\">Table 1: Pressure thresholds (in kPa) for airblast damage</div>\n\nAccording to the equations that we will use in this work, an asteoroid of approximately 7-m radius is required to generate overpressures on the ground exceeding 1 kPa, and an asteoroid of approximately 35-m radius is required to generate overpressures on the ground exceeding 43 kPa.",
"_____no_output_____"
],
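[
"As a quick sanity check, the overpressure formula above transcribes directly into Python (a sketch; `r` and `zb` in metres, `Ek` in kilotons of TNT equivalent, result in Pa):\n\n```python\ndef overpressure(r, zb, Ek):\n    # scaled distance term shared by both empirical contributions\n    s = (r**2 + zb**2) / Ek**(2.0/3.0)\n    return 3.14e11 * s**-1.3 + 1.8e7 * s**-0.565\n```",
"_____no_output_____"
],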
[
"## Notes on distance, bearing and position\n\nTo determine the surface zero location (the point on Earth's surface that is closest to the point of airburst) a useful set of spherical geometric formulae relate the bearing, $\\beta$ (also known as forward azimuth) to take to get from one point to another along a great circle,\n\n$$\\tan \\beta = \\frac {\\cos \\varphi_2\\sin (\\lambda_2-\\lambda_1)}{\\cos\\varphi_1\\sin\\varphi_2-\\sin\\varphi_1\\cos\\varphi_2\\cos(\\lambda_2-\\lambda_1)},$$\n\nas well as the related problem of the final destination given a surface distance and initial bearing:\n\n$$\\sin \\varphi_2 = \\sin \\varphi_1\\cos \\left(\\frac{r}{R_p}\\right) +\\cos \\varphi_1\\sin\\left(\\frac{r}{R_p}\\right)\\cos \\beta,$$\n\n$$ \\tan(\\lambda_2-\\lambda_1) = \\frac{\\sin\\beta\\sin\\left(\\frac{r}{R_p}\\right)\\cos\\varphi_1}{\\cos\\left(\\frac{r}{R_p}\\right)-\\sin\\varphi_1\\sin\\varphi_2}.$$\n\nThese formulae can all be derived from the spherical form of the [sine and cosine laws](https://en.wikipedia.org/wiki/Spherical_trigonometry#Cosine_rules_and_sine_rules) using relevant third points.",
"_____no_output_____"
],
[
"## Postcode locations\n\nFor those of you unfamiliar with UK postcodes, this [link](https://www.getthedata.com/postcode) might be helpful. Each postcode comprises of two strings of alpha-numeric characters that identify the geographic division of the UK. The first one or two letters of the first part of the postcode (before the number) identify the postcode **area** (e.g., WC); the whole of the first part of the postcode identifies the postcode **district**; the first part of the postcode, plus the first number of the second part of the postcode identifies the postcode **sector**. In this project, we will use the full postcode and the postcode sector.\n\n<img src=\"images/postcode_map.png\" width=\"640\">",
"_____no_output_____"
],
[
"The geographic data supplied by running the `download_data.py` script consists of two files. The larger file is `full_postcodes.csv`, which contains a list of current UK postcodes, along with a government-assigned code designating the local administrative area and information information on the average (mean) longitude and latitude of the addresses comprising the unit, using the international WGS 84 geodetic datum as supported by modern GPS.",
"_____no_output_____"
]
],
[
[
"import pandas as pd\npostcodes = pd.read_csv('./armageddon/resources/full_postcodes.csv')\npostcodes.head()",
"_____no_output_____"
]
],
[
[
"The smaller file is `population_by_postcode_sector.csv`, which contains 2011 census data arranged by postcode sector. The important columns for this project are the postcode sector (\"geography\") and the total population (\"All usual residents\"), although you are welcome to use other data in your tool if you wish.",
"_____no_output_____"
]
],
[
[
"census = pd.read_csv('./armageddon/resources/population_by_postcode_sector.csv')\ncensus.head()",
"_____no_output_____"
]
],
[
[
"## Notes on longitude, latitude and distance\n\nGiven a pair of points by longitude and latitude, converting this into a distance between them can be a surprisingly involved calculation, involving a successively improving model of the shape of the Earth (the geoid). At the lowest reasonable level of approximation, in which the Earth is considered spherical, points at the same longitude satisfy a formula\n$$|\\varphi_1 -\\varphi_2| = \\frac{r}{R_p}$$\nwhere the $\\varphi$s are the latitudes (in radians), $r$ the surface distance between the points and $R_p$ the radius of the earth. As long as $r$ and $R_p$ are in the same units, the choice doesn't matter, but metres are usually to be preferred For points at the same latitude, a similar formula applies, \n$$|\\lambda_1 -\\lambda_2| = \\frac{r}{R_p\\cos\\varphi},$$\nwhere the $\\lambda$s are the longitudes and the $\\varphi$ is the common latitude. In the general case a number of different formulas exist. [Among the more popular](https://en.wikipedia.org/wiki/Great-circle_distance) are the Haversine formula\n$$\\frac{r}{R_p} = 2\\arcsin\\sqrt{\\sin^2 \\frac{|\\varphi_1-\\varphi_2|}{2}+\\cos\\varphi_1\\cos\\varphi_2\\sin^2\\frac{|\\lambda_1-\\lambda_2|}{2}},$$\nthe spherical Vincenty formula\n$$\\frac{r}{R_p}=\\arctan\\frac{\\sqrt{(\\cos\\varphi_2\\sin|\\lambda_1-\\lambda_2|)^2+(\\cos\\varphi_1\\sin\\varphi_2-\\sin\\varphi_1\\cos\\varphi_2\\cos|\\lambda_1-\\lambda_2|)^2}}{\\sin\\varphi_1 \\sin\\varphi_2+\\cos\\varphi_1\\cos\\varphi_2\\cos|\\lambda_1-\\lambda_2|},$$\nand the law of spherical cosines,\n$$\\frac{r}{R_p}=\\arccos\\left(\\sin\\varphi_1\\sin\\varphi_2+\\cos\\varphi_1\\cos\\varphi_2\\cos|\\lambda_1-\\lambda_2|\\right).$$\nAt short distances linearizations such as Pythagoras can also be used. \n\nWhich formulae to choose is a balance between the cost of calculation and the accuracy of the result, which also depends on the specifics of the implementation. For example the two argument (also called `arctan2`) inverse tangent function should be preferred when needed (and available). In general the cheaper formulas have fewer trignometric function evaluations and square root calculations.\n\nFor this project, you should assume a spherical Earth and use one of the above approximations, but you may be interested to know that at the next level of approximation, the Earth is considered as an oblate spheriod (i.e. flattened sphere) and the full, iterative version of [Vincenty's formulae](https://en.wikipedia.org/wiki/Vincenty%27s_formulae) can be used. Further improvement includes local effects and acknowledges the implications of land elevation, but that sits well outside the scope of this exercise.",
"_____no_output_____"
],
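[
"The Haversine formula above also transcribes directly (a sketch; latitudes and longitudes in radians, and `Rp` assumed here to be a mean Earth radius of about 6371 km):\n\n```python\nimport numpy as np\n\ndef haversine(lat1, lon1, lat2, lon2, Rp=6.371e6):\n    # half-angle form of the great-circle distance on a sphere of radius Rp\n    a = (np.sin(abs(lat1 - lat2) / 2)**2\n         + np.cos(lat1) * np.cos(lat2) * np.sin(abs(lon1 - lon2) / 2)**2)\n    return 2 * Rp * np.arcsin(np.sqrt(a))\n```",
"_____no_output_____"
],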
[
"## Extended functionality\n\nAdditional credit will be given if your damage mapper function demonstrates the following extended capabilities:\n\n* The ability to present the software output on a map. The graphics should be designed to be appropriate for use in emergency response and evacuation planning.\n* The ability to perform a simple uncertainty analysis that takes as input a small uncertainty on each input parameter and calculates a risk for each affected UK postcode (sector).",
"_____no_output_____"
],
[
"### Plotting on a map\n\nAs one possible approach, we have provided a function to plot a circle on a map using the `folium` package. You can use `folium` and expand on this function or you may prefer to use a different package. Please check with us that the mapping package you wish to use is permissible before you start.",
"_____no_output_____"
]
],
[
[
"import folium\nimport armageddon\narmageddon.plot_circle(53., 0, 2000.) #Plots a circle of radius 2000 m at the lat, lon: 53., 0.",
"_____no_output_____"
]
],
[
[
"### Uncertainty analysis\n\nFor this second extension exercise, a separate function `impact_risk` should be written that takes an additional set of inputs, describing the standard deviation of each input parameter, as well as the nominal input parameters. The uncertainty in each input parameter can be assumed to follow a gaussian distribution centered on the nominal values. The standard deviations for the parameters can be taken as:\n\n* Entry latitude 0.025$^\\circ$\n* Entry longitude: 0.025$^\\circ$\n* Entry bearing: 0.5$^\\circ$\n* Meteoroid radius: 1 m\n* Meteoroid speed: 1000 m/s\n* Meteoroid density: 500 kg/m$^3$\n* Meteoroid strength: 50\\%\n* Meteoroid trajectory angle: 1$^\\circ$\n\nFor the second extension task, risk will be defined as the probability that the postcode sector (or postcode) is within a specified damage zone times the affected population. This function should therefore take as an input the overpressure used in the risk calculation and a flag to indicate whether risk should be calculated at the postcode or postcode sector level. For scoring, we will use damage level 3 (wooden buildings collapse) and postcode sectors.\n\nYour risk calculator should sample the model parameter space $n$ times, where $n$ is an input parameter, but the sampling method is up to you. The probability that a postcode (or sector) is within a specified damage level is defined as the number of times the postcode (sector) is within the specified damage level divided by $n$. \n\nThe risk calculator should output a Pandas dataframe with two columns: postcode (unit or sector) and risk.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbf9ed5dd50461297fce3f089831d6be64d33964
| 1,685 |
ipynb
|
Jupyter Notebook
|
day4.ipynb
|
baicha12/snow-fox
|
7c72ec193acd8c1b4280420c7755697e059eef41
|
[
"Apache-2.0"
] | null | null | null |
day4.ipynb
|
baicha12/snow-fox
|
7c72ec193acd8c1b4280420c7755697e059eef41
|
[
"Apache-2.0"
] | null | null | null |
day4.ipynb
|
baicha12/snow-fox
|
7c72ec193acd8c1b4280420c7755697e059eef41
|
[
"Apache-2.0"
] | null | null | null | 25.923077 | 85 | 0.537685 |
[
[
[
"import requests\nfrom lxml import etree\nfile = 'C:\\\\Users\\\\lenovo\\\\Desktop\\\\day04\\wenben.txt'\nopen_file = open(file,mode='a',encoding='utf8')\nimport requests\nfrom lxml import etree\nresponse = requests.get('http://www.17k.com/list/2784023.html')\n#print(response)\nresponse.encoding = 'utf8'\nhtml = response.text\ntree = etree.HTML(html)\na_xpath=tree.xpath('//div[@class=\"Main List\"]/dl[@class=\"Volume\"]/dd/a')\n#print(a_xpath)\nstart_url = 'http://www.17k.com'\nfor i in a_xpath:\n url =i.xpath('./@href')[0] \n all_url = start_url+url\n response2 = requests.get(all_url)\n response2.encoding = 'utf8'\n html2=response2.text \n tree2 = etree.HTML(html2) \n p_xpath=tree2.xpath('//*[@id=\"readArea\"]/div[1]/div[2]/text()') \n open_file.write(str(p_xpath)) \nopen_file.close() \n",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code"
]
] |
cbf9f09dc2ef6c7c5194159a09f730e8064b7336
| 25,073 |
ipynb
|
Jupyter Notebook
|
04-Bedingungen.ipynb
|
matheharry/Robo-Python-Course
|
51d625f5c6088dba9a81aeeca013af5abfb31542
|
[
"MIT"
] | null | null | null |
04-Bedingungen.ipynb
|
matheharry/Robo-Python-Course
|
51d625f5c6088dba9a81aeeca013af5abfb31542
|
[
"MIT"
] | null | null | null |
04-Bedingungen.ipynb
|
matheharry/Robo-Python-Course
|
51d625f5c6088dba9a81aeeca013af5abfb31542
|
[
"MIT"
] | 1 |
2021-04-17T15:44:09.000Z
|
2021-04-17T15:44:09.000Z
| 25,073 | 25,073 | 0.693854 |
[
[
[
"<a href=\"https://www.matheharry.de/\">\n <img src=\"https://www.matheharry.de/wp-content/uploads/2020/12/cropped-MatheHarry-logos-banner.jpg\" width=\"300\" align=\"center\"></a>\n\n\n---",
"_____no_output_____"
],
[
"# Bedingungen (conditions) und Ablaufsteuerung in Python",
"_____no_output_____"
],
[
"**Willkommen!** In diesem Notebook lernst du die Bedingungsanweisungen in Python kennen. Am Ende dieser Einheit wirst du wissen, wie man die Bedingungsanweisungen in Python verwendet, einschlieรlich der Operatoren und Verzweigungen.",
"_____no_output_____"
],
[
"## Vergleichsoperatoren",
"_____no_output_____"
],
[
"Vergleichsoperationen vergleichen Werte oder Ausdrรผcke miteinander und erzeugen, basierend auf einer Bedingung, einen Bool-Wert. Beim Vergleich zweier Werte kannst du folgende Operatoren verwenden:\n\n* gleich: **==**\n* ungleich: **!=** \n* grรถรer als: **>** \n* kleiner als: **<** \n* grรถรer gleich: **>=** \n* kleiner gleich: **<=** ",
"_____no_output_____"
],
[
"### Ist Gleich: ==",
"_____no_output_____"
],
[
"Wir weisen `a` einen Wert von 5 zu. Benutze den Gleichheitsoperator, der mit zwei Gleichheitszeichen **==** angegeben wird, um festzustellen, ob zwei Werte gleich sind.\n\nDer folgende Fall vergleicht die Variable `a` mit 6.",
"_____no_output_____"
]
],
[
[
"# Bedingung Gleich\n\na = 5 # Ein Gleichheitszeichen: ein Wert wird zugewiesen\na == 6 # Zwei Gleichheitszeichen: zwei Ausdrรผcke/Werte werden miteinander verglichen",
"_____no_output_____"
]
],
[
[
"Das Ergebnis ist **False**, da 5 nicht gleich 6 ist.",
"_____no_output_____"
],
[
"### Grรถรer > oder Kleiner <",
"_____no_output_____"
],
[
"Betrachte den folgenden Vergleich: `i > 5`.\n\n* Wenn der Wert des linken Operanden, in diesem Fall die Variable **i**, grรถรer ist als der Wert des rechten Operanden, in diesem Fall 5, dann ist die Aussage **True**.\n* Andernfalls ist die Aussage **False**. \n* Wรคre **i** = 6, wรคre die Aussage **True**, weil 6 grรถรer als 5 ist.",
"_____no_output_____"
]
],
[
[
"# Grรถรer als Zeichen\n\ni = 6\ni > 5",
"_____no_output_____"
]
],
[
[
"Wenn `i = 2` ist, ist die folgende Aussage falsch, da 2 nicht grรถรer als 5 ist:",
"_____no_output_____"
]
],
[
[
"# Grรถรer als Zeichen\n\ni = 2\ni > 5",
"_____no_output_____"
]
],
[
[
"Wir wollen jetzt einige Werte fรผr `i` in der Grafik anzeigen, wobei fรผr Werte grรถรer als 5 der Hintergrund grรผn und fรผr die anderen rot sein soll. Der grรผne Bereich stellt dar, wo die obige Bedingung **True** ist, der rote, wo die Aussage **False** ist. \n\nWenn der Wert von `i = 2` ist, erhalten wir **False**, da die 2 in den roten Bereich fรคllt. Wenn der Wert fรผr `i = 6` ist, erhalten wir entsprechend **True**, da die Bedingung in den grรผnen Bereich fรคllt. ",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsGreater.gif\" width=\"650\" />",
"_____no_output_____"
],
[
"### Ist Ungleich: !=",
"_____no_output_____"
],
[
"Die Ungleichheitsprรผfung verwendet ein Ausrufezeichen vor dem Gleichheitszeichen, und wenn zwei Operanden ungleich sind, dann wird die Bedingung **True**. \n\nDie folgende Bedingung ergibt beispielsweise **True**, solange der Wert von `i` nicht gleich 6 ist:",
"_____no_output_____"
]
],
[
[
"# Ungleichheitszeichen\n\ni = 2\ni != 6",
"_____no_output_____"
]
],
[
[
"Wenn `i` gleich 6 ist, gibt die Ungleichheitsaussage **False** zurรผck. ",
"_____no_output_____"
]
],
[
[
"# Ungleichheitszeichen\n\ni = 6\ni != 6",
"_____no_output_____"
]
],
[
[
"Betrachten wir die untenstehende Zahlenreihe. Wenn die Bedingung **True** ist, werden die entsprechenden Zahlen grรผn markiert und fรผr den Fall, dass die Bedingung **False** ist, wird die entsprechende Zahl rot markiert. \n\n* Wenn wir `i` gleich 2 einstellen, ist das Ergebnis **True**, da 2 im grรผnen Bereich liegt. \n* Wenn wir `i` gleich 6 setzen, erhalten wir als Ergebnis **False**, da die Bedingung in den roten Bereich fรคllt.",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsIneq.gif\" width=\"650\" />",
"_____no_output_____"
],
[
"### Strings vergleichen",
"_____no_output_____"
],
[
"Wir kรถnnen die gleichen Methoden auf Strings anwenden. Wenn du z.B. einen Gleichheitsoperator fรผr zwei unterschiedliche Strings verwendest erhalten wir **False** da die Strings ja nicht gleich sind.",
"_____no_output_____"
]
],
[
[
"# Benutze das Gleichheitszeichen, um die Strings zu vergleichen.\n\n\"ACDC\" == \"Michael Jackson\"",
"_____no_output_____"
]
],
[
[
"Wenn wir den Ungleichheitsoperator verwenden, wird die Ausgabe **True** sein, da die Strings nicht gleich sind.",
"_____no_output_____"
]
],
[
[
"# Verwenden des Ungleichheitszeichens zum Vergleichen der Strings\n\n\"ACDC\" != \"Michael Jackson\"",
"_____no_output_____"
]
],
[
[
"Man kann sogar Buchstaben nach ihrer Reihenfolge im Alphabet vergleichen, da jedes Zeichen fรผr den Computer einer Zahl entspricht (*ASCII-Code*). **A** ist z.B. 101, und der Wert fรผr **B** ist 102, deshalb gilt:",
"_____no_output_____"
]
],
[
[
"# Zeichen vergleichen\n\n'B' > 'A'",
"_____no_output_____"
]
],
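[
[
"# Added illustration: ord() reveals the numeric character codes behind the comparison above\nprint(ord('A'))  # 65\nprint(ord('B'))  # 66\nprint(ord('a'))  # 97 -- lowercase letters use different codes than uppercase",
"_____no_output_____"
]
],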
[
[
"Wenn es mehrere Buchstaben gibt, hat der erste Buchstabe Vorrang bei der Reihenfolge:",
"_____no_output_____"
]
],
[
[
"# Zeichen vergleichen\n\n'BA' > 'AB'",
"_____no_output_____"
]
],
[
[
"**Hinweis**: Groรbuchstaben haben einen anderen ASCII-Code als Kleinbuchstaben, d.h. der Vergleich zwischen den Buchstaben in Python ist abhรคngig von der Groร-/Kleinschreibung.",
"_____no_output_____"
],
[
"## Verzweigungen mittels If-Else",
"_____no_output_____"
],
[
"Eine Verzweigung ermรถglicht es uns, unterschiedliche Anweisungen fรผr verschiedene Eingangsgrรถรen auszufรผhren.\n\n### if-Anweisung\n\nEs ist hilfreich, sich eine **if-Anweisung** als einen abgeschlossenen Raum vorzustellen. Wenn die Anweisung **True** ist, kรถnnen wir den Raum betreten und dein Programm wird einige dort definierte Anweisungen ausfรผhren, aber wenn die Anweisung **False** ist, wird das Programm diese Anweisungen ignorieren.\n",
"_____no_output_____"
],
[
"Nehmen wir zum Beispiel ein blaues Rechteck, das ein ACDC-Konzert darstellen soll. Wenn eine Person รคlter als 18 Jahre ist, kann sie am ACDC-Konzert teilnehmen. Wenn sie 18 oder jรผnger als 18 Jahre ist, kann sie nicht an dem Konzert teilnehmen.\n\nBenutze eine der zuvor erlernten Bedingungen und lasse sie in einer **if-Anweisung** รผberprรผfen. \n\nDas geht ganz einfach mit einer Zeile die mit dem Wort `if` beginnt, gefolgt von einer beliebigen Bedingung und einem Doppelpunkt am Ende:\n\n if Bedingung:\n Anweisungen\n ...\n weiter im Programm\n \nDie zu erledigenden Anweisungen beginnen unter dieser Bedingung in einer neuen Zeile mit einer Einrรผckung. \n\nDie Codezeilen nach dem Doppelpunkt und mit einer Einrรผckung werden nur ausgefรผhrt, wenn die **if-Anweisung** gleich **True** ist. Die Anweisungen enden, wenn die Codezeile keinen Einzug mehr hat.\n\nDa die Codezeile `print(\"Du kannst eintreten\")` einen Einzug hat wird sie nur ausgefรผhrt, wenn die Variable `alter` grรถรer als 18 ist und damit die Bedingung `true`ist.\n\nDie Zeile `print(\"weiter geht's\")` wird jedoch nicht durch die if-Anweisung beeinflusst und wird in jedem Fall ausgefรผhrt.",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine If-Anweisung\n\nalter = 19\n\nif alter > 18: # Bedingung, die wahr oder falsch sein kann.\n \n print(\"Du kannst eintreten\" ) #innerhalb eines Einzugs steht die auszufรผhrende Anweisung, fรผr den Fall dass die Bedingung wahr ist.\n\nprint(\"weiter geht's\") #Die Anweisungen nach der if-Anweisung werden unabhรคngig davon ausgefรผhrt, ob die Bedingung wahr oder falsch ist.",
"_____no_output_____"
]
],
[
[
"Versuche, die Variable `alter` auch auf andere Werte zu setzen",
"_____no_output_____"
],
[
"Es ist hilfreich, das folgende Diagramm zu verwenden, um den Prozess zu veranschaulichen.\n\nAuf der linken Seite sehen wir, was passiert, wenn die Bedingung **True** ist. Die Person geht in das AC-DC-Konzert, welches dem Code in dem gerade ausgefรผhrten Einzug entspricht, und danach geht's normal weiter. \n\nAuf der rechten Seite sehen wir, was passiert, wenn die Bedingung **False** ist; der Person wird kein Zugang gewรคhrt, und sie macht also ohne Konzert weiter. In diesem Fall wird das Codesegment im Einzug nicht ausgefรผhrt, aber der Rest der Anweisungen sehr wohl. ",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsIf.gif\" width=\"650\" />",
"_____no_output_____"
],
[
"### else-Anweisung",
"_____no_output_____"
],
[
"Die Anweisung `else` fรผhrt einen Codeblock aus, wenn keine der Bedingungen vor dieser `else`-Anweisung **True** ist. \n\nLaร uns noch einmal unser AC-DC-Konzert betrachten. Wenn der Benutzer 17 Jahre alt ist, kann er zwar nicht zum AC-DC-Konzert gehen, aber er ist alt genug, um ein Meat Loaf Konzert zu besuchen.\n\nDie Syntax der `else`-Anweisung ist รคhnlich wie die Syntax der `if`-Anweisung, also `else:`. \n\n if Bedingung:\n Anweisungen\n ...\n else:\n andere Anweisungen\n ...\n weiter im Programm\n\n\nBeachte, dass es keine Bedingung fรผr `else` gibt, da diese Anweisungen ja immer ausgefรผhrt werden sollen wenn die if-Bedingung nicht `true` ist.\n\nVersuche, die Werte von `alter` zu รคndern, um zu sehen, was passiert: ",
"_____no_output_____"
]
],
[
[
"# Else Anweisung Beispiel\n\nalter = 18\n# alter = 19\n\nif alter > 18:\n print(\"Du kannst eintreten\" )\nelse:\n print(\"schau dir Meat Loaf an\" )\n \nprint(\"weiter geht's\")",
"_____no_output_____"
]
],
[
[
"Der Ablauf wird im Folgenden demonstriert, wobei alle Mรถglichkeiten auf jeder Seite des Bildes dargestellt sind. \n\nAuf der linken Seite ist der Fall dargestellt, dass jemand 17 Jahre alt ist, wir setzen die Variable `alter` also auf 17, und das entspricht einer Person, die das Meat Loaf Konzert besucht. \n\nDer rechte Teil zeigt, was passiert, wenn die Person รผber 18 Jahre alt ist. In diesem Fall ist sie 19 Jahre alt, und sie darf zum AC-DC Konzert.",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsElse.gif\" width=\"650\" />",
"_____no_output_____"
],
[
"### elif-Anweisung",
"_____no_output_____"
],
[
"Die `elif` Anweisung, kurz fรผr **else if**, erlaubt es uns, zusรคtzliche Bedingungen zu รผberprรผfen, wenn die Bedingungen vor ihr **False** sind.\n\nWenn die Bedingung fรผr die `elif`-Anweisung **True** ist, wird dann diese Anweisung ausgefรผhrt. \n\nDenken wir uns ein Konzertbeispiel, wo die Person, wenn sie genau 18 Jahre alt ist, zum Pink Floyd-Konzert geht, anstatt am AC-DC- oder Meat-Loaf-Konzert teilzunehmen. \n\nDie Person im Alter von 18 Jahren betritt das Areal, und da sie nicht รคlter als 18 Jahre ist, kann sie AC-DC nicht sehen, aber sie geht zu Pink Floyd. Nachdem sie Pink Floyd gesehen hat, geht es weiter. \n\nDie Syntax der `elif` Anweisung ist nicht neu, da wir lediglich fรผr die `if`-Abfrage das `if` durch `elif` ersetzen mรผssen.\n\n if Bedingung:\n Anweisungen\n ...\n elif Bedingung:\n Anweisungen\n ...\n else:\n andere Anweisungen\n ...\n weiter im Programm\n",
"_____no_output_____"
],
[
"รndere wieder die Werte fรผr `alter` und schau dir an was geschieht.",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Elif-Anweisung\n\nalter = 18\n\nif alter > 18:\n print(\"Du kannst eintreten\" )\nelif alter == 18:\n print(\"schau dir Pink Floyd an\")\nelse:\n print(\"schau dir Meat Loaf an\" )\n \nprint(\"weiter geht's\")",
"_____no_output_____"
]
],
[
[
"Die drei Mรถglichkeiten sind in der Abbildung unten dargestellt. Der Bereich ganz links zeigt, was passiert, wenn man weniger als 18 Jahre alt ist. Die mittlere Region zeigt den Ablauf wenn man genau 18 Jahre alt ist. Der ganz rechte Bereich zeigt das fรผr รผber 18 Jahre.",
"_____no_output_____"
],
[
"<img src =\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsElif.gif\" width=\"650\" />",
"_____no_output_____"
],
[
"## Erklรคrung mit anderem Beispiel",
"_____no_output_____"
],
[
"Sieh dir den folgenden Code an:",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Bedingungsanweisung\n\nalbum_year = 1983\n#album_year = 1970\n\nif album_year > 1980:\n print(\"Das Erscheinungsjahr ist grรถรer als 1980\")\n \nprint('mach irgendetwas ...')",
"_____no_output_____"
]
],
[
[
"Verรคndere den Wert von `album_year` auf andere Werte - du wirst sehen, dass sich das Ergebnis รคndert!",
"_____no_output_____"
],
[
"Beachte, dass der Code im obigen **eingezogenen** Block nur ausgefรผhrt wird, wenn die Ergebnisse **True** sind. ",
"_____no_output_____"
],
[
"Wie zuvor kรถnnen wir einen `else` Block zum `if` Block hinzufรผgen. Der Code im Block `else` wird nur ausgefรผhrt, wenn das Ergebnis **False** ist.\n\n\n**Syntax:** \n\n if (Bedingung):\n # mach etwas\n else:\n # mach etwas anderes",
"_____no_output_____"
],
[
"Wenn die Bedingung in der `if`-Anweisung **False** ist, wird die Anweisung nach dem Block `else` ausgefรผhrt. Dies ist in der Grafik dargestellt: ",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsLogicMap.png\" width=\"650\" />",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Bedingungsanweisung\n\n#album_year = 1983\nalbum_year = 1970\n\nif album_year > 1980:\n print(\"Das Erscheinungsjahr ist grรถรer als 1980\")\nelse:\n print(\"kleiner als 1980\")\n\nprint('mach irgendetwas ...')",
"_____no_output_____"
]
],
[
[
"รndere den Wert von `album_year` in andere Werte - du wirst sehen, dass sich das Ergebnis dementsprechend รคndert!",
"_____no_output_____"
],
[
"## Logik-Operatoren",
"_____no_output_____"
],
[
"\nManchmal muss man mehr als eine Bedingung auf einmal รผberprรผfen. Beispielsweise kannst du รผberprรผfen, ob eine Bedingung und eine andere Bedingung **True** ist. Logische Operatoren ermรถglichen es dir, Bedingungen zu kombinieren oder zu รคndern.\n* `and`\n* `or` \n* `not` \n\nDiese Operatoren werden fรผr zwei Variablen A und B anhand der folgenden Wahrheitstabellen zusammengefasst: ",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsTable.png\" width=\"650\" />",
"_____no_output_____"
],
[
"* Die Anweisung `and` ist nur **True**, wenn beide Bedingungen erfรผllt sind.\n* Die Anweisung `or` ist wahr, wenn eine Bedingung **True** ist.\n* Die Anweisung `not` gibt den jeweils entgegengesetzten Wahrheitswert aus.",
"_____no_output_____"
],
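[
"A compact sketch (added for illustration) evaluating all three operators on sample values:\n\n```python\nA, B = True, False\nprint(A and B)  # False -- both operands must be True\nprint(A or B)   # True  -- one True operand is enough\nprint(not A)    # False -- negation flips the truth value\n```",
"_____no_output_____"
],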
[
"Schauen wir uns an, wie wir feststellen kรถnnen, ob ein Album nach 1979 (1979 ist nicht enthalten) und vor 1990 (1990 ist nicht enthalten) verรถffentlicht wurde. Die Zeitrรคume zwischen 1980 und 1989 erfรผllen diese Bedingung. Dies wird in der folgenden Grafik veranschaulicht. Das Grรผn der Zeilen <strong>a</strong> und <strong>b</strong> reprรคsentiert Zeitrรคume, in denen die Aussage **True** ist. Das Grรผn auf der Linie <strong>c</strong> stellt dar, wo beide Bedingungen **True** sind, das entspricht dem รberlappungsbereich der oberen grรผnen Bereiche. \n\n",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsEgOne.png\" width=\"650\" />",
"_____no_output_____"
],
[
"Der Codeblock zur Ausfรผhrung dieser รberprรผfung lautet:",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Bedingungsanweisung\n\nalbum_year = 1980\n\nif(album_year > 1979) and (album_year < 1990):\n print (\"Das Erscheinungsjahr liegt zwischen 1980 und 1989\")\n \nprint(\"\")\nprint(\"mach etwas ...\")",
"_____no_output_____"
]
],
[
[
"Um festzustellen, ob ein Album vor 1980 (? - 1979) oder nach 1989 (1990 - ?) verรถffentlicht wurde, kann eine **or** Anweisung verwendet werden. Zeitrรคume vor 1980 (? - 1979) oder nach 1989 (1990 - ?) erfรผllen diese Bedingung. Dies wird in der folgenden Abbildung dargestellt, die Farbe Grรผn in <strong>a</strong> und <strong>b</strong> reprรคsentiert Zeitrรคume, in denen die Aussage wahr ist. Die Farbe Grรผn in **c** stellt dar, wo mindestens eine der Bedingungen wahr ist. \n",
"_____no_output_____"
],
[
"<img src=\"https://s3-api.us-geo.objectstorage.softlayer.net/cf-courses-data/CognitiveClass/PY0101EN/Chapter%203/Images/CondsEgTwo.png\" width=\"650\" />",
"_____no_output_____"
],
[
"Der Codeblock zur Durchfรผhrung dieser รberprรผfung lautet:",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Bedingungsanweisung\n\nalbum_year = 1990\n\nif(album_year < 1980) or (album_year > 1989):\n print (\"Das Album ist nicht in den 1980'ern erschienen\")\nelse:\n print(\"Das Album erschien in den 1980'ern\")",
"_____no_output_____"
]
],
[
[
"Die Anweisung `not` prรผft, ob die Anweisung falsch ist:",
"_____no_output_____"
]
],
[
[
"# Beispiel fรผr eine Bedingungsanweisung\n\nalbum_year = 1983\n\nif not (album_year == '1984'):\n print (\"Das Album ist nicht 1984 erschienen\")",
"_____no_output_____"
]
],
[
[
"<hr>",
"_____no_output_____"
],
[
"# รbungen zu Bedingungen und Verzweigungen",
"_____no_output_____"
],
[
"## รbung 1",
"_____no_output_____"
],
[
"Schreibe eine if-Anweisung, um festzustellen, ob ein Album eine Bewertung von mehr als 8 hat, und prรผfe das anhand der Bewertung fรผr das Album **\"Back in Black\"**, das eine Bewertung von 8.5 hat. Wenn die Aussage wahr ist, soll \"Dieses Album ist der Hammer!\" ausgegeben werden.",
"_____no_output_____"
]
],
[
[
"# Gib deinen Code unten ein und drรผcke Shift+Enter, um ihn auszufรผhren.\n\n",
"_____no_output_____"
]
],
[
[
"Doppelklicke __hier__ um die Lรถsung anzuzeigen. <!-- Antwort: \nbewertung = 8.5\nif bewertung > 8:\n print (\"Dieses Album ist der Hammer!\")\n -->",
"_____no_output_____"
],
[
"<hr>",
"_____no_output_____"
],
[
"## รbung 2",
"_____no_output_____"
],
[
"Schreibe eine if-else-Anweisung, die Folgendes ausfรผhrt. \n\n* Wenn die Bewertung grรถรer als acht ist, soll \"Dieses Album ist der Hammer!\" ausgegeben werden. \n* Wenn die Bewertung kleiner oder gleich 8 ist, gib \"Dieses Album ist OK.\" aus.",
"_____no_output_____"
]
],
[
[
"# Gib deinen Code unten ein und drรผcke Shift+Enter, um ihn auszufรผhren.\n\n",
"_____no_output_____"
]
],
[
[
"Doppelklicke __hier__ um die Lรถsung anzuzeigen. \n\n<!-- Antwort: \nrating = 8.5\nif rating > 8:\n print (\"Dieses Album ist der Hammer!\")\nelse:\n print (\"Dieses Album ist OK.\")\n-->",
"_____no_output_____"
],
[
"<hr>",
"_____no_output_____"
],
[
"## รbung 3",
"_____no_output_____"
],
[
"Schreibe eine if-Anweisung, um festzustellen, ob ein Album vor 1980 oder in einem der folgenden Jahre herauskam: 1991 oder 1993.\n\nWenn die Bedingung erfรผllt ist, soll das das Jahr ausgegeben werden, in dem das Album herauskam.",
"_____no_output_____"
]
],
[
[
"# Gib deinen Code unten ein und drรผcke Shift+Enter, um ihn auszufรผhren.\n\n",
"_____no_output_____"
]
],
[
[
"Doppelklicke __hier__ um die Lรถsung anzuzeigen. \n\n<!-- Antwort: \nalbum_year = 1979\n\nif album_year < 1980 or album_year == 1991 or album_year == 1993:\n print (album_year)\n-->",
"_____no_output_____"
],
[
"\n---\n\n>> Zurรผck zu [03-Turtle](03-Turtle.ipynb) --- Weiter zu [05-Schleifen](05-Schleifen.ipynb)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
cbf9fafe15b5bd2ec9b408e4b241e27f157d8ee5
| 108,635 |
ipynb
|
Jupyter Notebook
|
examples/PyCaret 2 Fugue Integration.ipynb
|
daikikatsuragawa/pycaret
|
a2436a1485b325e7f99f57f99f8049177f26a8c9
|
[
"MIT"
] | 1 |
2022-03-13T03:41:00.000Z
|
2022-03-13T03:41:00.000Z
|
examples/PyCaret 2 Fugue Integration.ipynb
|
daikikatsuragawa/pycaret
|
a2436a1485b325e7f99f57f99f8049177f26a8c9
|
[
"MIT"
] | null | null | null |
examples/PyCaret 2 Fugue Integration.ipynb
|
daikikatsuragawa/pycaret
|
a2436a1485b325e7f99f57f99f8049177f26a8c9
|
[
"MIT"
] | 1 |
2022-03-15T13:56:16.000Z
|
2022-03-15T13:56:16.000Z
| 48.737102 | 672 | 0.507083 |
[
[
[
"# PyCaret Fugue Integration\n\n[Fugue](https://github.com/fugue-project/fugue) is a low-code unified interface for different computing frameworks such as Spark, Dask and Pandas. PyCaret is using Fugue to support distributed computing scenarios.\n\n## Hello World\n\n### Classification\n\nLet's start with the most standard example, the code is exactly the same as the local version, there is no magic.",
"_____no_output_____"
]
],
[
[
"from pycaret.datasets import get_data\nfrom pycaret.classification import *\n\nsetup(data=get_data(\"juice\"), target = 'Purchase', n_jobs=1)\n\ntest_models = models().index.tolist()[:5]",
"_____no_output_____"
]
],
[
[
"`compare_model` is also exactly the same if you don't want to use a distributed system",
"_____no_output_____"
]
],
[
[
"compare_models(include=test_models, n_select=2)",
"_____no_output_____"
]
],
[
[
"Now let's make it distributed, as a toy case, on dask. The only thing changed is an additional parameter `parallel_backend`",
"_____no_output_____"
]
],
[
[
"from pycaret.parallel import FugueBackend\n\ncompare_models(include=test_models, n_select=2, parallel=FugueBackend(\"dask\"))",
"_____no_output_____"
]
],
[
[
"In order to use Spark as the execution engine, you must have access to a Spark cluster, and you must have a `SparkSession`, let's initialize a local Spark session",
"_____no_output_____"
]
],
[
[
"from pyspark.sql import SparkSession\n\nspark = SparkSession.builder.getOrCreate()",
"_____no_output_____"
]
],
[
[
"Now just change `parallel_backend` to this session object, you make it run on Spark. You must understand this is a toy case. In the real situation, you need to have a SparkSession pointing to a real Spark cluster to enjoy the power of Spark",
"_____no_output_____"
]
],
[
[
"compare_models(include=test_models, n_select=2, parallel=FugueBackend(spark))",
" \r"
]
],
[
[
"In the end, you can `pull` to get the metrics table",
"_____no_output_____"
]
],
[
[
"pull()",
"_____no_output_____"
]
],
[
[
"### Regression\n\nIt's follows the same pattern as classification.",
"_____no_output_____"
]
],
[
[
"from pycaret.datasets import get_data\nfrom pycaret.regression import *\n\nsetup(data=get_data(\"insurance\"), target = 'charges', n_jobs=1)\n\ntest_models = models().index.tolist()[:5]",
"_____no_output_____"
]
],
[
[
"`compare_model` is also exactly the same if you don't want to use a distributed system",
"_____no_output_____"
]
],
[
[
"compare_models(include=test_models, n_select=2)",
"_____no_output_____"
]
],
[
[
"Now let's make it distributed, as a toy case, on dask. The only thing changed is an additional parameter `parallel_backend`",
"_____no_output_____"
]
],
[
[
"from pycaret.parallel import FugueBackend\n\ncompare_models(include=test_models, n_select=2, parallel=FugueBackend(\"dask\"))",
"_____no_output_____"
]
],
[
[
"In order to use Spark as the execution engine, you must have access to a Spark cluster, and you must have a `SparkSession`, let's initialize a local Spark session",
"_____no_output_____"
]
],
[
[
"from pyspark.sql import SparkSession\n\nspark = SparkSession.builder.getOrCreate()",
"_____no_output_____"
]
],
[
[
"Now just change `parallel_backend` to this session object, you make it run on Spark. You must understand this is a toy case. In the real situation, you need to have a SparkSession pointing to a real Spark cluster to enjoy the power of Spark",
"_____no_output_____"
]
],
[
[
"compare_models(include=test_models, n_select=2, parallel=FugueBackend(spark))",
" \r"
]
],
[
[
"In the end, you can `pull` to get the metrics table",
"_____no_output_____"
]
],
[
[
"pull()",
"_____no_output_____"
]
],
[
[
"As you see, the results from the distributed versions can be different from your local versions. In the next section, we will show how to make them identical.\n\n## A more practical case\n\nThe above examples are pure toys, to make things work perfectly in a distributed system you must be careful about a few things\n\n### Use a lambda instead of a dataframe in setup\n\nIf you directly provide a dataframe in `setup`, this dataset will need to be sent to all worker nodes. If the dataframe is 1G, you have 100 workers, then it is possible your dirver machine will need to send out up to 100G data (depending on specific framework's implementation), then this data transfer becomes a bottleneck itself. Instead, if you provide a lambda function, it doesn't change the local compute scenario, but the driver will only send the function reference to workers, and each worker will be responsible to load the data by themselves, so there is no heavy traffic on the driver side.\n\n### Be deterministic\n\nYou should always use `session_id` to make the distributed compute deterministic, otherwise, for the exactly same logic you could get drastically different selection for each run.\n\n### Set n_jobs\n\nIt is important to be explicit on n_jobs when you want to run something distributedly, so it will not overuse the local/remote resources. This can also avoid resrouce contention, and make the compute faster.",
"_____no_output_____"
]
],
[
[
"from pycaret.classification import *\n\nsetup(data=lambda: get_data(\"juice\", verbose=False, profile=False), target = 'Purchase', session_id=0, n_jobs=1);",
"_____no_output_____"
]
],
[
[
"### Set the appropriate batch_size\n\n`batch_size` parameter helps adjust between load balence and overhead. For each batch, setup will be called only once. So\n\n| Choice |Load Balance|Overhead|Best Scenario|\n|---|---|---|---|\n|Smaller batch size|Better|Worse|`training time >> data loading time` or `models ~= workers`|\n|Larger batch size|Worse|Better|`training time << data loading time` or `models >> workers`|\n\nThe default value is set to `1`, meaning we want the best load balance.\n\n### Display progress\n\nIn development, you can enable visual effect by `display_remote=True`, but meanwhile you must also enable [Fugue Callback](https://fugue-tutorials.readthedocs.io/tutorials/advanced/rpc.html) so that the driver can monitor worker progress. But it is recommended to turn off display in production.",
"_____no_output_____"
]
],
[
[
"fconf = {\n \"fugue.rpc.server\": \"fugue.rpc.flask.FlaskRPCServer\", # keep this value\n \"fugue.rpc.flask_server.host\": \"0.0.0.0\", # the driver ip address workers can access\n \"fugue.rpc.flask_server.port\": \"3333\", # the open port on the dirver\n \"fugue.rpc.flask_server.timeout\": \"2 sec\", # the timeout for worker to talk to driver\n}\n\nbe = FugueBackend(\"dask\", fconf, display_remote=True, batch_size=3, top_only=False)\ncompare_models(n_select=2, parallel=be)",
"_____no_output_____"
]
],
[
[
"## Notes\n\n### Spark settings\n\nIt is highly recommended to have only 1 worker on each Spark executor, so the worker can fully utilize all cpus (set `spark.task.cpus`). Also when you do this you should explicitly set `n_jobs` in `setup` to the number of cpus of each executor.\n\n```python\nexecutor_cores = 4\n\nspark = SparkSession.builder.config(\"spark.task.cpus\", executor_cores).config(\"spark.executor.cores\", executor_cores).getOrCreate()\n\nsetup(data=get_data(\"juice\", verbose=False, profile=False), target = 'Purchase', session_id=0, n_jobs=executor_cores)\n\ncompare_models(n_select=2, parallel=FugueBackend(spark))\n```\n\n### Databricks\n\nOn Databricks, `spark` is the magic variable representing a SparkSession. But there is no difference to use. You do the exactly same thing as before:\n\n```python\ncompare_models(parallel=FugueBackend(spark))\n```\n\nBut Databricks, the visualization is difficult, so it may be a good idea to do two things:\n\n* Set `verbose` to False in `setup`\n* Set `display_remote` to False in `FugueBackend`\n\n### Dask\n\nDask has fake distributed modes such as the default (multi-thread) and multi-process modes. The default mode will just work fine (but they are actually running sequentially), and multi-process doesn't work for PyCaret for now because it messes up with PyCaret's global variables. On the other hand, any Spark execution mode will just work fine.\n\n### Local Parallelization\n\nFor practical use where you try non-trivial data and models, local parallelization (The eaiest way is to use local Dask as backend as shown above) normally doesn't have performance advantage. Because it's very easy to overload the CPUS on training, increasing the contention of resources. The value of local parallelization is to verify the code and give you confidence that the distributed environment will provide the expected result with much shorter time.\n\n### How to develop \n\nDistributed systems are powerful but you must follow some good practices to use them:\n\n1. **From small to large:** initially, you must start with a small set of data, for example in `compare_model` limit the models you want to try to a small number of cheap models, and when you verify they work, you can change to a larger model collection.\n2. **From local to distributed:** you should follow this sequence: verify small data locally then verify small data distributedly and then verify large data distributedly. The current design makes the transition seamless. You can do these sequentially: `parallel=None` -> `parallel=FugueBackend()` -> `parallel=FugueBackend(spark)`. In the second step, you can replace with a local SparkSession or local dask.",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbfa0f0199a83d9cac307c600f88b4e3f331067e
| 7,207 |
ipynb
|
Jupyter Notebook
|
notebooks/margin_model_evaluation2.ipynb
|
JiajunBao/neural-dimension-reduction
|
1846bc1ae1ed4bcaf962357fbf6ce57e6f0869a3
|
[
"RSA-MD"
] | null | null | null |
notebooks/margin_model_evaluation2.ipynb
|
JiajunBao/neural-dimension-reduction
|
1846bc1ae1ed4bcaf962357fbf6ce57e6f0869a3
|
[
"RSA-MD"
] | null | null | null |
notebooks/margin_model_evaluation2.ipynb
|
JiajunBao/neural-dimension-reduction
|
1846bc1ae1ed4bcaf962357fbf6ce57e6f0869a3
|
[
"RSA-MD"
] | null | null | null | 21.577844 | 280 | 0.497156 |
[
[
[
"import torch\nfrom src.models.level_kv_div import binaryTrainer, utils, network\nfrom torch.utils.data import DataLoader\n\nfrom sklearn.metrics import classification_report\n\nimport torch",
"_____no_output_____"
],
[
"test_dataset = binaryTrainer.LargeSparseDataset('../data/processed/sample/dev.csv', 100, balanced=True, random_neg=True)\n# test_dataset = binaryTrainer.get_dataset('../data/processed/sample/train.csv', '../data/processed/sample/train.level.grading')\ntest_dataloader = DataLoader(test_dataset, shuffle=False, batch_size=1024, pin_memory=True)",
"_____no_output_____"
],
[
"# torch.save(test_dataset, '../data/train.largeSparseDataset.pt')",
"_____no_output_____"
],
[
"model = torch.load('margin_model.pth.tar')",
"_____no_output_____"
],
[
"test_loss, (test_acc, test_pred, test_gold, test_dist) = binaryTrainer.val_one_epoch(test_dataloader, model.to('cuda'), 'cuda')",
"_____no_output_____"
],
[
"test_acc",
"_____no_output_____"
],
[
"test_dist.shape",
"_____no_output_____"
],
[
"test_dist[test_gold == 0.]",
"_____no_output_____"
],
[
"test_dist[test_gold == -1.]",
"_____no_output_____"
],
[
"test_dist[test_gold == 1]",
"_____no_output_____"
],
[
"len(test_pred[test_gold == 0.])",
"_____no_output_____"
],
[
"len(test_pred)",
"_____no_output_____"
],
[
"# test_pred[test_pred == 0.] = 1.\n# test_gold[test_gold == 0.] = 1.\n\n# test_gold[test_pred != test_gold]\n\n# len(test_gold[test_pred != test_gold])\n\n# 1 - len(test_gold[test_pred != test_gold]) / len(test_pred)",
"_____no_output_____"
],
[
"# for x, y, l in test_dataset:\n# if l == 0:\n# print(((x - y) ** 2).sum())",
"_____no_output_____"
],
[
"print(classification_report(test_gold.numpy(), test_pred.numpy()))",
" precision recall f1-score support\n\n -1 1.00 0.87 0.93 51636\n 0 0.00 0.00 0.00 15630\n 1 0.75 1.00 0.86 67266\n\n accuracy 0.83 134532\n macro avg 0.58 0.62 0.60 134532\nweighted avg 0.76 0.83 0.79 134532\n\n"
],
[
"len(test_pred[test_gold == 0.])",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfa1db35903fa7f67ea53bf7fa1716541ef2107
| 2,813 |
ipynb
|
Jupyter Notebook
|
source/getting_started/index.ipynb
|
zhaofeng-shu33/pymoo-doc
|
34a9809b1e6c0da79fde342d55708cc047d325cb
|
[
"Apache-2.0"
] | 2 |
2021-09-11T06:43:49.000Z
|
2021-11-10T13:36:09.000Z
|
source/getting_started/index.ipynb
|
zhaofeng-shu33/pymoo-doc
|
34a9809b1e6c0da79fde342d55708cc047d325cb
|
[
"Apache-2.0"
] | 3 |
2021-09-21T14:04:47.000Z
|
2022-03-07T13:46:09.000Z
|
source/getting_started/index.ipynb
|
zhaofeng-shu33/pymoo-doc
|
34a9809b1e6c0da79fde342d55708cc047d325cb
|
[
"Apache-2.0"
] | 3 |
2021-10-09T02:47:26.000Z
|
2022-02-10T07:02:37.000Z
| 26.790476 | 464 | 0.591539 |
[
[
[
".. meta::\n :description: A guide which introduces the most important steps to get started with pymoo, an open-source multi-objective optimization framework in Python.",
"_____no_output_____"
],
[
".. meta::\n :keywords: Multi-objective Optimization, Python, Evolutionary Computation, Optimization Test Problem, Hypervolume",
"_____no_output_____"
]
],
[
[
".. _nb_getting_started:",
"_____no_output_____"
]
],
[
[
"# Getting Started",
"_____no_output_____"
]
],
[
[
".. toctree::\n :maxdepth: 1\n :hidden:\n\n preface.ipynb\n part_1.ipynb\n part_2.ipynb\n part_3.ipynb\n part_4.ipynb\n part_5.ipynb\n source_code.ipynb\n ",
"_____no_output_____"
]
],
[
[
"Learning a new framework, in general, can be rather challenging. Thus, this getting started guide aims to make the first steps with pymoo as simple as possible by demonstrating its capabilities on an example. This guide covers the essential steps when starting with multi-objective optimization and shall be helpful to solve your own optimization problems. Some basic understanding of optimization and knowledge of Python and NumPy are expected to follow.\n\n\nThis guide is structured as follows:",
"_____no_output_____"
]
],
[
[
".. admonition:: Overview\n :class: myOwnStyle\n\n - `Preface <preface.ipynb>`_: Basics and Challenges\n - `Part I <part_1.ipynb>`_: A Constrained Bi-objective Optimization Problem\n - `Part II <part_2.ipynb>`_: Find a Solution Set using Multi-objective Optimization\n - `Part III <part_3.ipynb>`_: Multi-Criteria Decision Making\n - `Part IV <part_4.ipynb>`_: Analysis of Convergence\n - `Part V <part_5.ipynb>`_: Some more useful Information",
"_____no_output_____"
]
]
] |
[
"markdown",
"raw",
"markdown",
"raw",
"markdown",
"raw"
] |
[
[
"markdown",
"markdown"
],
[
"raw"
],
[
"markdown"
],
[
"raw"
],
[
"markdown"
],
[
"raw"
]
] |
cbfa2d6cf84c77229e44f22c6fadc9b85b388a4b
| 12,954 |
ipynb
|
Jupyter Notebook
|
NLP/4_Embeddings/Interpreting Embeddings.ipynb
|
yash1996/DL_curated_intuition
|
ae6d966bea93a0ece50dd7a2aa316ccde8c6d6b2
|
[
"CC0-1.0"
] | 33 |
2020-05-10T13:13:27.000Z
|
2021-04-22T08:40:55.000Z
|
NLP/4_Embeddings/Interpreting Embeddings.ipynb
|
yash1996/DL_curated_intuition
|
ae6d966bea93a0ece50dd7a2aa316ccde8c6d6b2
|
[
"CC0-1.0"
] | 9 |
2020-09-26T00:39:54.000Z
|
2022-03-12T00:14:11.000Z
|
NLP/4_Embeddings/Interpreting Embeddings.ipynb
|
nikhilpradeep/deep
|
d55562100fe3804e55ea1cf1637a669da69baec3
|
[
"CC0-1.0"
] | 18 |
2020-06-07T12:58:21.000Z
|
2022-02-21T17:18:25.000Z
| 30.916468 | 303 | 0.557511 |
[
[
[
"We use Embeddings to represent text into a numerical form. Either into a one-hot encoding format called sparse vector or a fixed Dense representation called Dense Vector.\n\nEvery Word gets it meaning from the words it is surrounded by, So when we train our embeddings we want word with similar meaning or words used in similar context to be together.\n\nFor Example:- \n1. Words like Aeroplane, chopper, Helicopter, Drone should be very close to each other because they share the same feature, they are flying object.\n\n2. Words like Man and Women should be exact opposite to each other.\n\n3. Sentences like \"Coders are boring people.\" and \"Programmers are boring.\" the word `coders` and `programmers` are used in similar context so they should be close to each other.\n\nWord Embeddings are nothing but vectors in a vector space. And using some vector calculation we can easily find \n1. Synonyms or similar words\n2. Finding Analogies\n3. Can be used as spell check (if trained on a large corpus)\n4. Pretty Much Anything which you can do with vectors.\n",
"_____no_output_____"
]
],
[
[
"import torchtext\nimport numpy as np\nimport torch",
"_____no_output_____"
],
[
"glove = torchtext.vocab.GloVe(name = '6B', dim = 100)\n\nprint(f'There are {len(glove.itos)} words in the vocabulary')",
"There are 400000 words in the vocabulary\n"
],
[
"glove.itos[:10]",
"_____no_output_____"
],
[
"glove.stoi[\"cat\"]",
"_____no_output_____"
],
[
"def get_embedding(word):\n return glove.vectors[glove.stoi[word]]",
"_____no_output_____"
],
[
"get_embedding(\"cat\")",
"_____no_output_____"
]
],
[
[
"# Similar Context\n\nTo find words similar to input words. We have to first take the vector representation of all words and compute the eucledian distance of the input word with respect to all words and choose the n closest words by sorting the distance ascending order.\n",
"_____no_output_____"
]
],
[
[
"def get_closest_word(word,n=10):\n input_vector = get_embedding(word).numpy() if isinstance(word,str) else word.numpy()\n distance = np.linalg.norm(input_vector-glove.vectors.numpy(),axis=1)\n sort_dis = np.argsort(distance)[:n]\n return list(zip(np.array(glove.itos)[sort_dis] , distance[sort_dis]))",
"_____no_output_____"
],
[
"get_closest_word(\"sad\",n=10)",
"_____no_output_____"
],
[
"def get_similarity_angle(word1,word2):\n word1 = get_embedding(word1).view(1,-1)\n word2 = get_embedding(word2).view(1,-1)\n simi = torch.nn.CosineSimilarity(dim=1)(word1,word2).numpy() \n return simi,np.rad2deg(np.arccos(simi))\n",
"_____no_output_____"
],
[
"get_similarity_angle(\"sad\",\"awful\")",
"_____no_output_____"
]
],
[
[
"# Analogies",
"_____no_output_____"
]
],
[
[
"def analogy( word1, word2, word3, n=5):\n \n #get vectors for each word\n word1_vector = get_embedding(word1)\n word2_vector = get_embedding(word2)\n word3_vector = get_embedding(word3)\n \n #calculate analogy vector\n analogy_vector = word2_vector - word1_vector + word3_vector\n \n# #find closest words to analogy vector\n candidate_words = get_closest_word( analogy_vector, n=n+3)\n \n #filter out words already in analogy\n candidate_words = [(word, dist) for (word, dist) in candidate_words \n if word not in [word1, word2, word3]][:n]\n \n print(f'{word1} is to {word2} as {word3} is to...')\n \n return candidate_words",
"_____no_output_____"
],
[
"analogy('man', 'king', 'woman')",
"man is to king as woman is to...\n"
]
],
[
[
"This is the canonical example which shows off this property of word embeddings. So why does it work? Why does the vector of 'woman' added to the vector of 'king' minus the vector of 'man' give us 'queen'?\n\nIf we think about it, the vector calculated from 'king' minus 'man' gives us a \"royalty vector\". This is the vector associated with traveling from a man to his royal counterpart, a king. If we add this \"royality vector\" to 'woman', this should travel to her royal equivalent, which is a queen!",
"_____no_output_____"
]
],
[
[
"analogy('india', 'delhi', 'australia')",
"india is to delhi as australia is to...\n"
],
[
"get_closest_word(\"reliable\")",
"_____no_output_____"
]
],
[
[
"# Case Studies\n1. https://forums.fast.ai/t/nlp-any-libraries-dictionaries-out-there-for-fixing-common-spelling-errors/16411\n\n2. Multilingual and Cross-lingual analysis: If you work on works in translation, or on the influence of writers who write in one language on those who write in another language, word vectors can valuable ways to study these kinds of cross-lingual relationships algorithmically.\n[Case Study: Using word vectors to study endangered languages](https://raw.githubusercontent.com/YaleDHLab/lab-workshops/master/word-vectors/papers/coeckelbergs.pdf)\n\n3. Studying Language Change over Time: If you want to study the way the meaning of a word has changed over time, word vectors provide an exceptional method for this kind of study.\n[Case Study: Using word vectors to analyze the changing meaning of the word \"gay\" in the twentieth century.](https://nlp.stanford.edu/projects/histwords/)\n\n4. Analyzing Historical Concept Formation: If you want to analyze the ways writers in a given historical period understood particular concepts like \"honor\" and \"chivalry\", then word vectors can provide excellent opportunities to uncover these hidden associations.\n[Case Study: Using word vectors to study the ways eighteenth-century authors organized moral abstractions](https://raw.githubusercontent.com/YaleDHLab/lab-workshops/master/word-vectors/papers/heuser.pdf)\n\n5. Uncovering Text Reuse: If you want to study text reuse or literary imitation (either within one language or across multiple languages), word vectors can provide excellent tools for identifying similar passages of text.\n[Case Study: Using word vectors to uncover cross-lingual text reuse in eighteenth-century writing](https://douglasduhaime.com/posts/crosslingual-plagiarism-detection.html)\n",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
cbfa40f3ff350e5b93c3c5c8cdbac76689f77ef9
| 1,994 |
ipynb
|
Jupyter Notebook
|
live_video_save.ipynb
|
Siddhant231096/Computer-Vision
|
16fa302ee79d345a74551a798808b0153366baaf
|
[
"Apache-2.0"
] | null | null | null |
live_video_save.ipynb
|
Siddhant231096/Computer-Vision
|
16fa302ee79d345a74551a798808b0153366baaf
|
[
"Apache-2.0"
] | null | null | null |
live_video_save.ipynb
|
Siddhant231096/Computer-Vision
|
16fa302ee79d345a74551a798808b0153366baaf
|
[
"Apache-2.0"
] | null | null | null | 18.635514 | 77 | 0.496489 |
[
[
[
"import cv2\n",
"_____no_output_____"
],
[
"cap=cv2.VideoCapture(0)\n",
"_____no_output_____"
],
[
"[i for i in dir(cv2) if 'Video' in i]",
"_____no_output_____"
],
[
"video_plugin=cv2.VideoWriter_fourcc(*'XVID')",
"_____no_output_____"
],
[
"output=cv2.VideoWriter('G:/Movies/self.avi',video_plugin,100,(640,480))",
"_____no_output_____"
],
[
"while cap.isOpened():\n status,data=cap.read()\n cv2.imshow('live',data)\n output.write(data)\n if cv2.waitKey(25) & 0xff==ord('q'):\n break\ncv2.destroyAllWindows()\noutput.release()\ncap.release()\n ",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfa5e311b5bc0e27baaf771097042fb042bc8f6
| 16,346 |
ipynb
|
Jupyter Notebook
|
GroversAlgorithm/GroversAlgorithm.ipynb
|
samik-saha/QuantumKatas
|
b189fd7cb6b10734d67c42ae38e2c9cd73011274
|
[
"MIT"
] | 1 |
2021-09-25T08:37:23.000Z
|
2021-09-25T08:37:23.000Z
|
GroversAlgorithm/GroversAlgorithm.ipynb
|
samik-saha/QuantumKatas
|
b189fd7cb6b10734d67c42ae38e2c9cd73011274
|
[
"MIT"
] | null | null | null |
GroversAlgorithm/GroversAlgorithm.ipynb
|
samik-saha/QuantumKatas
|
b189fd7cb6b10734d67c42ae38e2c9cd73011274
|
[
"MIT"
] | null | null | null | 34.778723 | 325 | 0.588768 |
[
[
[
"empty"
]
]
] |
[
"empty"
] |
[
[
"empty"
]
] |
cbfa602fed7f5d6f063bb73fed47378e2eb60189
| 18,451 |
ipynb
|
Jupyter Notebook
|
notebooks/object2vec_document_embedding.ipynb
|
Hironsan/wiki-article-dataset
|
84f988574d64d5b4815cc6d0edef5ad2f6637e10
|
[
"MIT"
] | 9 |
2019-05-07T17:06:34.000Z
|
2021-08-28T03:12:51.000Z
|
notebooks/object2vec_document_embedding.ipynb
|
Hironsan/wiki-article-dataset
|
84f988574d64d5b4815cc6d0edef5ad2f6637e10
|
[
"MIT"
] | null | null | null |
notebooks/object2vec_document_embedding.ipynb
|
Hironsan/wiki-article-dataset
|
84f988574d64d5b4815cc6d0edef5ad2f6637e10
|
[
"MIT"
] | 1 |
2019-05-27T02:26:15.000Z
|
2019-05-27T02:26:15.000Z
| 37.425963 | 1,336 | 0.618611 |
[
[
[
"# Document Embedding with Amazon SageMaker Object2Vec",
"_____no_output_____"
],
[
"1. [Introduction](#Introduction)\n2. [Background](#Background)\n 1. [Embedding documents using Object2Vec](#Embedding-documents-using-Object2Vec)\n3. [Download and preprocess Wikipedia data](#Download-and-preprocess-Wikipedia-data)\n 1. [Install and load dependencies](#Install-and-load-dependencies)\n 2. [Build vocabulary and tokenize datasets](#Build-vocabulary-and-tokenize-datasets)\n 3. [Upload preprocessed data to S3](#Upload-preprocessed-data-to-S3)\n4. [Define SageMaker session, Object2Vec image, S3 input and output paths](#Define-SageMaker-session,-Object2Vec-image,-S3-input-and-output-paths)\n5. [Train and deploy doc2vec](#Train-and-deploy-doc2vec)\n 1. [Learning performance boost with new features](#Learning-performance-boost-with-new-features)\n 2. [Training speedup with sparse gradient update](#Training-speedup-with-sparse-gradient-update)\n6. [Apply learned embeddings to document retrieval task](#Apply-learned-embeddings-to-document-retrieval-task)\n 1. [Comparison with the StarSpace algorithm](#Comparison-with-the-StarSpace-algorithm)",
"_____no_output_____"
],
[
"## Introduction",
"_____no_output_____"
],
[
"In this notebook, we introduce four new features to Object2Vec, a general-purpose neural embedding algorithm: negative sampling, sparse gradient update, weight-sharing, and comparator operator customization. The new features together broaden the applicability of Object2Vec, improve its training speed and accuracy, and provide users with greater flexibility. See [Introduction to the Amazon SageMaker Object2Vec](https://aws.amazon.com/blogs/machine-learning/introduction-to-amazon-sagemaker-object2vec/) if you arenโt already familiar with Object2Vec.\n\nWe demonstrate how these new features extend the applicability of Object2Vec to a new Document Embedding use-case: A customer has a large collection of documents. Instead of storing these documents in its raw format or as sparse bag-of-words vectors, to achieve training efficiency in the various downstream tasks, she would like to instead embed all documents in a common low-dimensional space, so that the semantic distance between these documents are preserved.",
"_____no_output_____"
],
[
"## Background",
"_____no_output_____"
],
[
"Object2Vec is a highly customizable multi-purpose algorithm that can learn embeddings of pairs of objects. The embeddings are learned such that it preserves their pairwise similarities in the original space.\n\n- Similarity is user-defined: users need to provide the algorithm with pairs of objects that they define as similar (1) or dissimilar (0); alternatively, the users can define similarity in a continuous sense (provide a real-valued similarity score).\n\n- The learned embeddings can be used to efficiently compute nearest neighbors of objects, as well as to visualize natural clusters of related objects in the embedding space. In addition, the embeddings can also be used as features of the corresponding objects in downstream supervised tasks such as classification or regression.",
"_____no_output_____"
],
[
"### Embedding documents using Object2Vec",
"_____no_output_____"
],
[
"We demonstrate how, with the new features, Object2Vec can be used to embed a large collection of documents into vectors in the same latent space.\n\nSimilar to the widely used Word2Vec algorithm for word embedding, a natural approach to document embedding is to preprocess documents as (sentence, context) pairs, where the sentence and its matching context come from the same document. The matching context is the entire document with the given sentence removed. The idea is to embed both sentence and context into a low dimensional space such that their mutual similarity is maximized, since they belong to the same document and therefore should be semantically related. The learned encoder for the context can then be used to encode new documents into the same embedding space. In order to train the encoders for sentences and documents, we also need negative (sentence, context) pairs so that the model can learn to discriminate between semantically similar and dissimilar pairs. It is easy to generate such negatives by pairing sentences with documents that they do not belong to. Since there are many more negative pairs than positives in naturally occurring data, we typically resort to random sampling techniques to achieve a balance between positive and negative pairs in the training data. The figure below shows pictorially how the positive pairs and negative pairs are generated from unlabeled data for the purpose of learning embeddings for documents (and sentences).",
"_____no_output_____"
],
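[
"As a minimal illustrative sketch (an addition, not the algorithm's actual internals; the function and variable names here are hypothetical), a negative (sentence, context) pair can be formed by pairing a sentence with the context of a different document:\n\n```python\nimport random\n\ndef make_negative_pair(sent_tokens, other_contexts):\n    # Pair the sentence with a context drawn from a different document (label 0)\n    return {'in0': sent_tokens, 'in1': random.choice(other_contexts), 'label': 0}\n```",
"_____no_output_____"
],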
[
"We show how Object2Vec with the new *negative sampling feature* can be applied to the document embedding use-case. In addition, we show how the other new features, namely, *weight-sharing*, *customization of comparator operator*, and *sparse gradient update*, together enhance the algorithm's performance and user-experience in and beyond this use-case. Sections [Learning performance boost with new features](#Learning-performance-boost-with-new-features) and [Training speedup with sparse gradient update](#Training-speedup-with-sparse-gradient-update) in this notebook provide a detailed introduction to the new features.",
"_____no_output_____"
],
[
"## Download and preprocess Wikipedia data",
"_____no_output_____"
],
[
"Please be aware of the following requirements about the acknowledgment, copyright and availability, cited from the [data source description page](https://github.com/facebookresearch/StarSpace/blob/master/LICENSE.md).\n\n> Permission is hereby granted, free of charge, to any person obtaining a copy of this software and associated documentation files (the \"Software\"), to deal in the Software without restriction, including without limitation the rights to use, copy, modify, merge, publish, distribute, sublicense, and/or sell copies of the Software, and to permit persons to whom the Software is furnished to do so, subject to the following conditions: The above copyright notice and this permission notice shall be included in all copies or substantial portions of the Software.",
"_____no_output_____"
]
],
[
[
"%%bash\n\nDATANAME=\"wikipedia\"\nDATADIR=\"/tmp/wiki\"\n\nmkdir -p \"${DATADIR}\"\n\nif [ ! -f \"${DATADIR}/${DATANAME}_train250k.txt\" ]\nthen\n echo \"Downloading wikipedia data\"\n wget --quiet -c \"https://s3-ap-northeast-1.amazonaws.com/dev.tech-sketch.jp/chakki/public/ja.wikipedia_250k.zip\" -O \"${DATADIR}/${DATANAME}_train.zip\"\n unzip \"${DATADIR}/${DATANAME}_train.zip\" -d \"${DATADIR}\"\nfi\n",
"_____no_output_____"
],
[
"datadir = '/tmp/wiki'",
"_____no_output_____"
],
[
"!ls /tmp/wiki",
"_____no_output_____"
]
],
[
[
"### Install and load dependencies",
"_____no_output_____"
]
],
[
[
"!pip install keras tensorflow",
"_____no_output_____"
],
[
"import json\nimport os\nimport random\nfrom itertools import chain\nfrom keras.preprocessing.text import Tokenizer\nfrom sklearn.preprocessing import normalize\n\n## sagemaker api\nimport sagemaker, boto3\nfrom sagemaker.session import s3_input\nfrom sagemaker.predictor import json_serializer, json_deserializer",
"_____no_output_____"
]
],
[
[
"### Build vocabulary and tokenize datasets",
"_____no_output_____"
]
],
[
[
"def load_articles(filepath):\n with open(filepath) as f:\n for line in f:\n yield map(str.split, line.strip().split('\\t'))\n\n\ndef split_sents(article):\n return [sent.split(' ') for sent in article.split('\\t')]\n\n\ndef build_vocab(sents):\n print('Build start...')\n tok = Tokenizer(oov_token='<UNK>', filters='')\n tok.fit_on_texts(sents)\n print('Build end...')\n return tok\n\n\ndef generate_positive_pairs_from_single_article(sents, tokenizer):\n sents = list(sents)\n idx = random.randrange(0, len(sents))\n center = sents.pop(idx)\n wrapper_tokens = tokenizer.texts_to_sequences(sents)\n sent_tokens = tokenizer.texts_to_sequences([center])\n wrapper_tokens = list(chain(*wrapper_tokens))\n sent_tokens = list(chain(*sent_tokens))\n yield {'in0': sent_tokens, 'in1': wrapper_tokens, 'label': 1}\n\n\ndef generate_positive_pairs_from_single_file(sents_per_article, tokenizer):\n iter_list = [generate_positive_pairs_from_single_article(sents, tokenizer)\n for sents in sents_per_article\n ]\n return chain.from_iterable(iter_list)\n",
"_____no_output_____"
],
[
"filepath = os.path.join(datadir, 'ja.wikipedia_250k.txt')\nsents_per_article = load_articles(filepath)\nsents = chain(*sents_per_article)\ntokenizer = build_vocab(sents)\n\n# save\ndatadir = '.'\ntrain_prefix = 'train250k'\nfname = \"wikipedia_{}.txt\".format(train_prefix)\noutfname = os.path.join(datadir, '{}_tokenized.jsonl'.format(train_prefix))\nwith open(outfname, 'w') as f:\n sents_per_article = load_articles(filepath)\n for sample in generate_positive_pairs_from_single_file(sents_per_article, tokenizer):\n f.write('{}\\n'.format(json.dumps(sample)))",
"_____no_output_____"
],
[
"# Shuffle training data\n!shuf {outfname} > {train_prefix}_tokenized_shuf.jsonl",
"_____no_output_____"
]
],
[
[
"### Upload preprocessed data to S3",
"_____no_output_____"
]
],
[
[
"TRAIN_DATA=\"train250k_tokenized_shuf.jsonl\"\n\n# NOTE: define your s3 bucket and key here\nS3_BUCKET = 'YOUR_BUCKET'\nS3_KEY = 'object2vec-doc2vec'\n\n",
"_____no_output_____"
],
[
"%%bash -s \"$TRAIN_DATA\" \"$S3_BUCKET\" \"$S3_KEY\"\n\naws s3 cp \"$1\" s3://$2/$3/input/train/",
"_____no_output_____"
]
],
[
[
"## Define Sagemaker session, Object2Vec image, S3 input and output paths",
"_____no_output_____"
]
],
[
[
"from sagemaker import get_execution_role\nfrom sagemaker.amazon.amazon_estimator import get_image_uri\n\n\nregion = boto3.Session().region_name\nprint(\"Your notebook is running on region '{}'\".format(region))\n\nsess = sagemaker.Session()\n\n \nrole = get_execution_role()\nprint(\"Your IAM role: '{}'\".format(role))\n\ncontainer = get_image_uri(region, 'object2vec')\nprint(\"The image uri used is '{}'\".format(container))\n\nprint(\"Using s3 buceket: {} and key prefix: {}\".format(S3_BUCKET, S3_KEY))",
"_____no_output_____"
],
[
"## define input channels\n\ns3_input_path = os.path.join('s3://', S3_BUCKET, S3_KEY, 'input')\n\ns3_train = s3_input(os.path.join(s3_input_path, 'train', TRAIN_DATA), \n distribution='ShardedByS3Key', content_type='application/jsonlines')",
"_____no_output_____"
],
[
"## define output path\noutput_path = os.path.join('s3://', S3_BUCKET, S3_KEY, 'models')",
"_____no_output_____"
]
],
[
[
"## Train and deploy doc2vec",
"_____no_output_____"
],
[
"We combine four new features into our training of Object2Vec:\n\n- Negative sampling: With the new `negative_sampling_rate` hyperparameter, users of Object2Vec only need to provide positively labeled data pairs, and the algorithm automatically samples for negative data internally during training.\n\n- Weight-sharing of embedding layer: The new `tied_token_embedding_weight` hyperparameter gives user the flexibility to share the embedding weights for both encoders, and it improves the performance of the algorithm in this use-case\n\n- The new `comparator_list` hyperparameter gives users the flexibility to mix-and-match different operators so that they can tune the algorithm towards optimal performance for their applications.",
"_____no_output_____"
]
],
[
[
"# Define training hyperparameters\n\nhyperparameters = {\n \"_kvstore\": \"device\",\n \"_num_gpus\": 'auto',\n \"_num_kv_servers\": \"auto\",\n \"bucket_width\": 0,\n \"dropout\": 0.4,\n \"early_stopping_patience\": 2,\n \"early_stopping_tolerance\": 0.01,\n \"enc0_layers\": \"auto\",\n \"enc0_max_seq_len\": 50,\n \"enc0_network\": \"pooled_embedding\",\n \"enc0_pretrained_embedding_file\": \"\",\n \"enc0_token_embedding_dim\": 300,\n \"enc0_vocab_size\": len(tokenizer.word_index) + 1,\n \"enc1_network\": \"enc0\",\n \"enc_dim\": 300,\n \"epochs\": 20,\n \"learning_rate\": 0.01,\n \"mini_batch_size\": 512,\n \"mlp_activation\": \"relu\",\n \"mlp_dim\": 512,\n \"mlp_layers\": 2,\n \"num_classes\": 2,\n \"optimizer\": \"adam\",\n \"output_layer\": \"softmax\",\n \"weight_decay\": 0\n}\n\n\nhyperparameters['negative_sampling_rate'] = 3\nhyperparameters['tied_token_embedding_weight'] = \"true\"\nhyperparameters['comparator_list'] = \"hadamard\"\nhyperparameters['token_embedding_storage_type'] = 'row_sparse'\n\n \n# get estimator\ndoc2vec = sagemaker.estimator.Estimator(container,\n role, \n train_instance_count=1, \n train_instance_type='ml.p2.xlarge',\n output_path=output_path,\n sagemaker_session=sess)\n\n",
"_____no_output_____"
],
[
"# set hyperparameters\ndoc2vec.set_hyperparameters(**hyperparameters)\n\n# fit estimator with data\ndoc2vec.fit({'train': s3_train})\n#doc2vec.fit({'train': s3_train, 'validation':s3_valid, 'test':s3_test})",
"_____no_output_____"
],
[
"# deploy model\n\ndoc2vec_model = doc2vec.create_model(\n serializer=json_serializer,\n deserializer=json_deserializer,\n content_type='application/json')\n\npredictor = doc2vec_model.deploy(initial_instance_count=1, instance_type='ml.m4.xlarge')",
"_____no_output_____"
],
[
"sent = 'ไปๆฅ ใฎ ๆผ้ฃ ใฏ ใใฉใ ใ ใฃ ใ'\nsent_tokens = tokenizer.texts_to_sequences([sent])\npayload = {'instances': [{'in0': sent_tokens[0]}]}\nresult = predictor.predict(payload)\nprint(result)",
"_____no_output_____"
],
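[
"## hedged sketch: the classification response is expected to look like\n## {'predictions': [{'scores': [p_class_0, p_class_1]}]} (the exact shape may\n## vary by SDK version); pull out the most likely class\nscores = result['predictions'][0]['scores']\nprint('predicted class:', scores.index(max(scores)))",
"_____no_output_____"
],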
[
"predictor.delete_endpoint()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
]
] |
cbfa6596ea8f941a90f51c02b1a496ffb6551fd2
| 72,603 |
ipynb
|
Jupyter Notebook
|
Statistics/Week1.ipynb
|
umar-khayam/Data-Science
|
f86092a7144a696b3b81701795a50c5cd505fae3
|
[
"MIT"
] | null | null | null |
Statistics/Week1.ipynb
|
umar-khayam/Data-Science
|
f86092a7144a696b3b81701795a50c5cd505fae3
|
[
"MIT"
] | null | null | null |
Statistics/Week1.ipynb
|
umar-khayam/Data-Science
|
f86092a7144a696b3b81701795a50c5cd505fae3
|
[
"MIT"
] | null | null | null | 40.245565 | 8,088 | 0.5313 |
[
[
[
"<center>\n <img src=\"https://cf-courses-data.s3.us.cloud-object-storage.appdomain.cloud/IBMDeveloperSkillsNetwork-PY0220EN-SkillsNetwork/labs/project/Images/IDSNlogo.png\" width=\"300\" alt=\"cognitiveclass.ai logo\" />\n</center>\n",
"_____no_output_____"
],
[
"# Descriptive Statistics\n",
"_____no_output_____"
],
[
"Estimated time needed: **30** minutes\n",
"_____no_output_____"
],
[
"In this lab, you'll go over some hands-on exercises using Python.\n",
"_____no_output_____"
],
[
"## Objectives\n",
"_____no_output_____"
],
[
"* Import Libraries\n* Read in Data\n* Lab exercises and questions\n",
"_____no_output_____"
],
[
"***\n",
"_____no_output_____"
],
[
"## Import Libraries\n",
"_____no_output_____"
],
[
"All Libraries required for this lab are listed below. The libraries pre-installed on Skills Network Labs are commented. If you run this notebook in a different environment, e.g. your desktop, you may need to uncomment and install certain libraries.\n",
"_____no_output_____"
]
],
[
[
"#! mamba install pandas==1.3.3/\n#! mamba install numpy=1.21.2\n#! mamba install matplotlib=3.4.3-y",
"_____no_output_____"
]
],
[
[
"Import the libraries we need for the lab\n",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as pyplot",
"_____no_output_____"
]
],
[
[
"Read in the csv file from the URL using the request library\n",
"_____no_output_____"
]
],
[
[
"ratings_url = 'https://cf-courses-data.s3.us.cloud-object-storage.appdomain.cloud/IBMDeveloperSkillsNetwork-ST0151EN-SkillsNetwork/labs/teachingratings.csv'\nratings_df=pd.read_csv(ratings_url)\n",
"_____no_output_____"
]
],
[
[
"## Data Description\n\n| Variable | Description |\n| ----------- | ---------------------------------------------------------------------------------------------------------------------------------------------------- |\n| minority | Does the instructor belong to a minority (non-Caucasian) group? |\n| age | The professor's age |\n| gender | Indicating whether the instructor was male or female. |\n| credits | Is the course a single-credit elective? |\n| beauty | Rating of the instructor's physical appearance by a panel of six students averaged across the six panelists and standardized to have a mean of zero. |\n| eval | Course overall teaching evaluation score, on a scale of 1 (very unsatisfactory) to 5 (excellent). |\n| division | Is the course an upper or lower division course? |\n| native | Is the instructor a native English speaker? |\n| tenure | Is the instructor on a tenure track? |\n| students | Number of students that participated in the evaluation. |\n| allstudents | Number of students enrolled in the course. |\n| prof | Indicating instructor identifier. |\n",
"_____no_output_____"
],
[
"## Display information about the dataset\n\n1. Structure of the dataframe\n2. Describe the dataset\n3. Number of rows and columns\n",
"_____no_output_____"
],
[
"print out the first five rows of the data\n",
"_____no_output_____"
]
],
[
[
"ratings_df.head()",
"_____no_output_____"
]
],
[
[
"get information about each variable\n",
"_____no_output_____"
]
],
[
[
"ratings_df.info()",
"<class 'pandas.core.frame.DataFrame'>\nRangeIndex: 463 entries, 0 to 462\nData columns (total 19 columns):\n # Column Non-Null Count Dtype \n--- ------ -------------- ----- \n 0 minority 463 non-null object \n 1 age 463 non-null int64 \n 2 gender 463 non-null object \n 3 credits 463 non-null object \n 4 beauty 463 non-null float64\n 5 eval 463 non-null float64\n 6 division 463 non-null object \n 7 native 463 non-null object \n 8 tenure 463 non-null object \n 9 students 463 non-null int64 \n 10 allstudents 463 non-null int64 \n 11 prof 463 non-null int64 \n 12 PrimaryLast 463 non-null int64 \n 13 vismin 463 non-null int64 \n 14 female 463 non-null int64 \n 15 single_credit 463 non-null int64 \n 16 upper_division 463 non-null int64 \n 17 English_speaker 463 non-null int64 \n 18 tenured_prof 463 non-null int64 \ndtypes: float64(2), int64(11), object(6)\nmemory usage: 68.9+ KB\n"
]
],
[
[
"get the number of rows and columns - prints as (number of rows, number of columns)\n",
"_____no_output_____"
]
],
[
[
"ratings_df.shape",
"_____no_output_____"
]
],
[
[
"## Lab Exercises\n",
"_____no_output_____"
],
[
"### Can you identify whether the teachers' Rating data is a time series or cross-sectional?\n",
"_____no_output_____"
],
[
"Print out the first ten rows of the data\n\n1. Does it have a date or time variable? - No - it is not a time series dataset\n2. Does it observe more than one teacher being rated? - Yes - it is cross-sectional dataset\n\n> The dataset is a Cross-sectional\n",
"_____no_output_____"
]
],
[
[
"ratings_df.head(10)",
"_____no_output_____"
]
],
[
[
"### Find the mean, median, minimum, and maximum values for students\n",
"_____no_output_____"
],
[
"Find Mean value for students\n",
"_____no_output_____"
]
],
[
[
"ratings_df['students'].mean()",
"_____no_output_____"
]
],
[
[
"Find the Median value for students\n",
"_____no_output_____"
]
],
[
[
"ratings_df['students'].median()",
"_____no_output_____"
]
],
[
[
"Find the Minimum value for students\n",
"_____no_output_____"
]
],
[
[
"ratings_df['students'].min()",
"_____no_output_____"
]
],
[
[
"Find the Maximum value for students\n",
"_____no_output_____"
]
],
[
[
"ratings_df['students'].max()",
"_____no_output_____"
]
],
[
[
"### Produce a descriptive statistics table\n",
"_____no_output_____"
]
],
[
[
"ratings_df.describe()",
"_____no_output_____"
]
],
[
[
"### Create a histogram of the beauty variable and briefly comment on the distribution of data\n",
"_____no_output_____"
],
[
"using the <code>matplotlib</code> library, create a histogram\n",
"_____no_output_____"
]
],
[
[
"pyplot.hist(ratings_df['beauty'])",
"_____no_output_____"
]
],
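[
[
"## a hedged numeric check of the skewness claim below (not in the original lab)\nratings_df['beauty'].skew()",
"_____no_output_____"
]
],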
[
[
"here are few conclusions from the histogram\nmost of the data for beauty is around the -0.5 and 0\nthe distribution is skewed to the right\ntherefore looking at the data we can say the mean is close to 0\n",
"_____no_output_____"
],
[
"### Does average beauty score differ by gender? Produce the means and standard deviations for both male and female instructors.\n",
"_____no_output_____"
],
[
"Use a group by gender to view the mean scores of the beauty we can say that beauty scores differ by gender as the mean beauty score for women is higher than men\n",
"_____no_output_____"
]
],
[
[
"ratings_df.groupby('gender').agg({'beauty':['mean', 'std', 'var']}).reset_index()",
"_____no_output_____"
]
],
[
[
"### Calculate the percentage of males and females that are tenured professors. Will you say that tenure status differ by gender?\n",
"_____no_output_____"
],
[
"First groupby to get the total sum\n",
"_____no_output_____"
]
],
[
[
"tenure_count = ratings_df[ratings_df.tenure == 'yes'].groupby('gender').agg({'tenure': 'count'}).reset_index()\ntenure_count",
"_____no_output_____"
]
],
[
[
"Find the percentage\n",
"_____no_output_____"
]
],
[
[
"tenure_count['percentage'] = 100 * tenure_count.tenure/tenure_count.tenure.sum()\ntenure_count",
"_____no_output_____"
]
],
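[
[
"## an alternative hedged sketch (not in the original lab): pd.crosstab with\n## normalize='columns' yields the same per-gender percentages among tenured\n## professors in a single call\npd.crosstab(ratings_df.gender, ratings_df.tenure, normalize='columns') * 100",
"_____no_output_____"
]
],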
[
[
"## Practice Questions\n",
"_____no_output_____"
],
[
"### Question 1: Calculate the percentage of visible minorities are tenure professors. Will you say that tenure status differed if teacher was a visible minority?\n",
"_____no_output_____"
]
],
[
[
"## insert code here\nminorities_count = ratings_df.groupby('minority').agg({'tenure': 'count'}).reset_index()\nminorities_count['percentage'] = 100 * minorities_count.tenure/minorities_count.tenure.sum()\nminorities_count",
"_____no_output_____"
]
],
[
[
"Double-click **here** for the solution.\n\n<!-- The answer is below:\n### we can use a groupby function for this\n## first groupby to get the total sum\ntenure_count = ratings_df.groupby('minority').agg({'tenure': 'count'}).reset_index()\n# Find the percentage\ntenure_count['percentage'] = 100 * tenure_count.tenure/tenure_count.tenure.sum()\n##print to see\ntenure_count\n-->\n",
"_____no_output_____"
],
[
"### Question 2: Does average age differ by tenure? Produce the means and standard deviations for both tenured and untenured professors.\n",
"_____no_output_____"
]
],
[
[
"## insert code here\nratings_df.groupby('tenure').agg({'beauty':['mean','std']}).reset_index()",
"_____no_output_____"
]
],
[
[
"Double-click **here** for the solution.\n\n<!-- The answer is below:\n## group by tenureship and find the mean and standard deviation for each group\nratings_df.groupby('tenure').agg({'age':['mean', 'std']}).reset_index()\n-->\n",
"_____no_output_____"
],
[
"### Question 3: Create a histogram for the age variable.\n",
"_____no_output_____"
]
],
[
[
"## insert code here\npyplot.hist(ratings_df['age'])",
"_____no_output_____"
]
],
[
[
"Double-click **here** for the solution.\n\n<!-- The answer is below:\npyplot.hist(ratings_df['age'])\n-->\n",
"_____no_output_____"
],
[
"### Question 4: Create a bar plot for the gender variable.\n",
"_____no_output_____"
]
],
[
[
"## insert code here\npyplot.bar(ratings_df.gender.unique(),ratings_df.gender.value_counts(),color=['pink','blue'])\npyplot.xlabel('Gender')\npyplot.ylabel('Count')\npyplot.title('Gender distribution bar plot')",
"_____no_output_____"
]
],
[
[
"Double-click **here** for the solution.\n\n<!-- The answer is below:\npyplot.bar(ratings_df.gender.unique(),ratings_df.gender.value_counts(),color=['pink','blue'])\npyplot.xlabel('Gender')\npyplot.ylabel('Count')\npyplot.title('Gender distribution bar plot')\n-->\n",
"_____no_output_____"
],
[
 Note:Bar plot">
"> Note: Bar plots can be rendered vertically or horizontally. Try replacing **pyplot.bar** with **pyplot.barh** in the cell above and see the difference; a minimal sketch follows below.\n",
"_____no_output_____"
],
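[
"A minimal horizontal sketch (illustrative; it mirrors the vertical version above):\n\n```python\npyplot.barh(ratings_df.gender.unique(), ratings_df.gender.value_counts(), color=['pink','blue'])\npyplot.ylabel('Gender')\npyplot.xlabel('Count')\npyplot.title('Gender distribution bar plot (horizontal)')\n```\n",
"_____no_output_____"
],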
[
"### Question 5: What is the Median evaluation score for tenured Professors?\n",
"_____no_output_____"
]
],
[
[
"## insert code here\nratings_df[ratings_df['tenure'] == 'yes']['eval'].median()",
"_____no_output_____"
]
],
[
[
"Double-click **here** for the solution.\n\n<!-- The answer is below:\n## you can index just tenured professors and find their median evaluation scores\nratings_df[ratings_df['tenure'] == 'yes']['eval'].median()\n-->\n",
"_____no_output_____"
],
[
"## Authors\n",
"_____no_output_____"
],
[
"[Aije Egwaikhide](https://www.linkedin.com/in/aije-egwaikhide/?utm_medium=Exinfluencer&utm_source=Exinfluencer&utm_content=000026UJ&utm_term=10006555&utm_id=NA-SkillsNetwork-Channel-SkillsNetworkCoursesIBMDeveloperSkillsNetworkST0151ENSkillsNetwork20531532-2022-01-01) is a Data Scientist at IBM who holds a degree in Economics and Statistics from the University of Manitoba and a Post-grad in Business Analytics from St. Lawrence College, Kingston. She is a current employee of IBM where she started as a Junior Data Scientist at the Global Business Services (GBS) in 2018. Her main role was making meaning out of data for their Oil and Gas clients through basic statistics and advanced Machine Learning algorithms. The highlight of her time in GBS was creating a customized end-to-end Machine learning and Statistics solution on optimizing operations in the Oil and Gas wells. She moved to the Cognitive Systems Group as a Senior Data Scientist where she will be providing the team with actionable insights using Data Science techniques and further improve processes through building machine learning solutions. She recently joined the IBM Developer Skills Network group where she brings her real-world experience to the courses she creates.\n",
"_____no_output_____"
],
[
"## Change Log\n",
"_____no_output_____"
],
[
"| Date (YYYY-MM-DD) | Version | Changed By | Change Description |\n| ----------------- | ------- | --------------- | -------------------------------------- |\n| 2020-08-14 | 0.1 | Aije Egwaikhide | Created the initial version of the lab |\n| 2022-05-10 | 0.2 | Lakshmi Holla | Added exercise for Bar plot |\n",
"_____no_output_____"
],
[
"Copyright ยฉ 2020 IBM Corporation. This notebook and its source code are released under the terms of the [MIT License](https://cognitiveclass.ai/mit-license/?utm_medium=Exinfluencer&utm_source=Exinfluencer&utm_content=000026UJ&utm_term=10006555&utm_id=NA-SkillsNetwork-Channel-SkillsNetworkCoursesIBMDeveloperSkillsNetworkST0151ENSkillsNetwork20531532-2022-01-01).\n",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
cbfa7c050c66f60585a61389393cd529add5ed39
| 43,498 |
ipynb
|
Jupyter Notebook
|
book-R/distribuicoes.ipynb
|
rlo1977/datascience
|
e8e7e32acf0dfeaa0157a28e635e6d6c7a800e53
|
[
"Apache-2.0"
] | 29 |
2017-11-02T11:05:28.000Z
|
2022-03-01T13:52:48.000Z
|
book-R/distribuicoes.ipynb
|
AhirtonLopes/datascience
|
d1a5349ddb18a6ec3d7c9cf3c7d00550c76e91ed
|
[
"Apache-2.0"
] | 2 |
2018-03-19T21:08:48.000Z
|
2020-05-22T09:40:35.000Z
|
book-R/distribuicoes.ipynb
|
AhirtonLopes/datascience
|
d1a5349ddb18a6ec3d7c9cf3c7d00550c76e91ed
|
[
"Apache-2.0"
] | 18 |
2018-01-03T22:06:44.000Z
|
2022-02-25T17:47:47.000Z
| 337.193798 | 15,574 | 0.925698 |
[
[
[
"# Distribuiรงรตes",
"_____no_output_____"
]
],
[
[
"compras <- c(1,1,1,3,3,5,5,6,6,6,6,7,8,8,9,9,9,9,10,10,10,11,13,14,14,15,15,15,15)",
"_____no_output_____"
],
[
"# Histograma padrรฃo:\nhist(compras)\n",
"_____no_output_____"
],
[
"# Histograma com classes aproximadamente de igual amplitude: \nhist(compras, breaks = c(0,3,6,7,10,13,15), freq = TRUE)",
"Warning message in plot.histogram(r, freq = freq1, col = col, border = border, angle = angle, :\nโthe AREAS in the plot are wrong -- rather use 'freq = FALSE'โ"
]
],
[
[
"## Distribuiรงรฃo binomial",
"_____no_output_____"
]
],
[
[
"x <- dbinom(0:100,size=100,prob=0.5)\nbarplot(x)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfa811f04dfab4f231e0bb797cc37e13f1363a5
| 10,307 |
ipynb
|
Jupyter Notebook
|
16. Adv Python File Handling/.ipynb_checkpoints/09. Working with CSV files and CSV Module-checkpoint.ipynb
|
penanrajput/PythonCourseContent
|
074a4af9c83a8a6b9b4608ce341ed96d1bd2e999
|
[
"MIT"
] | null | null | null |
16. Adv Python File Handling/.ipynb_checkpoints/09. Working with CSV files and CSV Module-checkpoint.ipynb
|
penanrajput/PythonCourseContent
|
074a4af9c83a8a6b9b4608ce341ed96d1bd2e999
|
[
"MIT"
] | null | null | null |
16. Adv Python File Handling/.ipynb_checkpoints/09. Working with CSV files and CSV Module-checkpoint.ipynb
|
penanrajput/PythonCourseContent
|
074a4af9c83a8a6b9b4608ce341ed96d1bd2e999
|
[
"MIT"
] | 1 |
2020-12-19T19:29:17.000Z
|
2020-12-19T19:29:17.000Z
| 29.962209 | 563 | 0.567964 |
[
[
[
"## Working with CSV files and CSV Module",
"_____no_output_____"
],
[
"[1. What is a CSV file?](#section1) \n[2. CSV Sample File.](#section2) \n[3. Python CSV Module](#section3) \n[4. CSV Module Functions](#section4) \n[5. Reading CSV Files](#section5) \n[6. Reading as a Dictionary](#section6) \n[7. Writing to CSV Files](#section7)",
"_____no_output_____"
],
[
"<a id=\"section1\"></a>\n**1. What is a CSV file** \n \nA CSV file is a type of plain text file that uses specific structuring to arrange tabular data. CSV is a common format for data interchange as it's compact, simple and general. Many online services allow its users to export tabular data from the website into a CSV file. Files of CSV will open into Excel, and nearly all databases have a tool to allow import from CSV file. The standard format is defined by rows and columns data. Moreover, each row is terminated by a newline to begin the next row. Also within the row, each column is separated by a comma.",
"_____no_output_____"
],
[
"<a id=\"section2\"></a>\n**2. CSV Sample File.** \n \nData in the form of tables is also called CSV (comma separated values) - literally \"comma-separated values.\" This is a text format intended for the presentation of tabular data. Each line of the file is one line of the table. The values of individual columns are separated by a separator symbol - a comma (,), a semicolon (;) or another symbol. CSV can be easily read and processed by Python.",
"_____no_output_____"
]
],
[
[
"f = open(\"data.csv\")\nprint(f.read())",
"รฏยปยฟProgramming language,Designed by,Appeared,Extension\nPython,Guido van Rossum,1991,.py\nJava,James Gosling,1995,.java\nC++,Bjarne Stroustrup,1983,.cpp\n\n"
]
],
[
[
"<a id=\"section3\"></a>\n**3. Python CSV Module** \n \nPython provides a CSV module to handle CSV files. To read/write data, you need to loop through rows of the CSV. You need to use the split method to get data from specified columns.",
"_____no_output_____"
],
[
"<a id=\"section4\"></a>\n**4. CSV Module Functions** \n\nIn CSV module documentation you can find following functions: \n\ncsv.field_size_limit โ return maximum field size \ncsv.get_dialect โ get the dialect which is associated with the name \ncsv.list_dialects โ show all registered dialects \ncsv.reader โ read data from a csv file \ncsv.register_dialect - associate dialect with name \ncsv.writer โ write data to a csv file \ncsv.unregister_dialect - delete the dialect associated with the name the dialect registry \ncsv.QUOTE_ALL - Quote everything, regardless of type. \ncsv.QUOTE_MINIMAL - Quote fields with special characters \ncsv.QUOTE_NONNUMERIC - Quote all fields that aren't numbers value \ncsv.QUOTE_NONE โ Don't quote anything in output ",
"_____no_output_____"
],
[
"<a id=\"section5\"></a>\n**5. How to Read a CSV File** \n\nTo read data from CSV files, you must use the reader function to generate a reader object. \n \nThe reader function is developed to take each row of the file and make a list of all columns. Then, you have to choose the column you want the variable data for. \n \nIt sounds a lot more intricate than it is. Let's take a look at this example, and we will find out that working with csv file isn't so hard. ",
"_____no_output_____"
]
],
[
[
"import csv\n# f = open(\"data.csv\")\nwith open(\"data.csv\") as f:\n data = csv.reader(f)\n for row in data:\n print(row)",
"['รฏยปยฟProgramming language', 'Designed by', 'Appeared', 'Extension']\n['Python', 'Guido van Rossum', '1991', '.py']\n['Java', 'James Gosling', '1995', '.java']\n['C++', 'Bjarne Stroustrup', '1983', '.cpp']\n"
]
],
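[
[
"## hedged sketch (not in the original lesson): the '\\ufeff' prefix on the first\n## header above is a UTF-8 byte-order mark (BOM); opening the file with\n## encoding='utf-8-sig' strips it\nimport csv\nwith open(\"data.csv\", encoding='utf-8-sig') as f:\n    print(next(csv.reader(f)))",
"_____no_output_____"
]
],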
[
[
"<a id=\"section6\"></a>\n**6. How to Read a CSV as a Dictionary** \n\n You can also you use DictReader to read CSV files. The results are interpreted as a dictionary where the header row is the key, and other rows are values.",
"_____no_output_____"
]
],
[
[
"import csv\nfile = csv.DictReader(open(\"data.csv\"))\nfor row in file:\n print(row)",
"OrderedDict([('รฏยปยฟProgramming language', 'Python'), ('Designed by', 'Guido van Rossum'), ('Appeared', '1991'), ('Extension', '.py')])\nOrderedDict([('รฏยปยฟProgramming language', 'Java'), ('Designed by', 'James Gosling'), ('Appeared', '1995'), ('Extension', '.java')])\nOrderedDict([('รฏยปยฟProgramming language', 'C++'), ('Designed by', 'Bjarne Stroustrup'), ('Appeared', '1983'), ('Extension', '.cpp')])\n"
]
],
[
[
"<a id=\"section7\"></a>\n**7. How to write CSV File** \n \nWhen you have a set of data that you would like to store in a CSV file you have to use writer() function. To iterate the data over the rows(lines), you have to use the writerow() function. \n \nConsider the following example. We write data into a file \"writeData.csv\" where the delimiter is an apostrophe.",
"_____no_output_____"
]
],
[
[
"#import necessary modules\nimport csv\n\nwith open('writeData.csv', mode='w') as file:\n writer = csv.writer(file, delimiter=',', quotechar='\"', quoting=csv.QUOTE_MINIMAL)\n\n #way to write to csv file\n writer.writerow(['Programming language', 'Designed by', 'Appeared', 'Extension'])\n writer.writerow(['Python', 'Guido van Rossum', '1991', '.py'])\n writer.writerow(['Java', 'James Gosling', '1995', '.java'])\n writer.writerow(['C++', 'Bjarne Stroustrup', '1985', '.cpp'])",
"_____no_output_____"
],
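[
"## a related hedged sketch (not in the original lesson): csv.DictWriter writes\n## dictionaries, mirroring DictReader from section 6\nimport csv\n\nwith open('writeDictData.csv', mode='w', newline='') as file:\n    fieldnames = ['Programming language', 'Designed by', 'Appeared', 'Extension']\n    writer = csv.DictWriter(file, fieldnames=fieldnames)\n    writer.writeheader()\n    writer.writerow({'Programming language': 'Python', 'Designed by': 'Guido van Rossum', 'Appeared': '1991', 'Extension': '.py'})",
"_____no_output_____"
],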
[
"f = open('writeData.csv')\ndata = f.readlines()\nfor item in data:\n print(item, end=\" \")",
"Programming language,Designed by,Appeared,Extension\n \n Python,Guido van Rossum,1991,.py\n \n Java,James Gosling,1995,.java\n \n C++,Bjarne Stroustrup,1985,.cpp\n \n "
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbfa9b948a6713fbf5ff06905ca023410f544b24
| 740,768 |
ipynb
|
Jupyter Notebook
|
content/labs/lab05/notebook/lab05_lyh.ipynb
|
luyueheng/2020-CS109B
|
e14149833574cbdef63303cd89c03e486f6aa43b
|
[
"MIT"
] | null | null | null |
content/labs/lab05/notebook/lab05_lyh.ipynb
|
luyueheng/2020-CS109B
|
e14149833574cbdef63303cd89c03e486f6aa43b
|
[
"MIT"
] | null | null | null |
content/labs/lab05/notebook/lab05_lyh.ipynb
|
luyueheng/2020-CS109B
|
e14149833574cbdef63303cd89c03e486f6aa43b
|
[
"MIT"
] | null | null | null | 462.690818 | 112,252 | 0.937337 |
[
[
[
"# <img style=\"float: left; padding-right: 10px; width: 45px\" src=\"https://raw.githubusercontent.com/Harvard-IACS/2018-CS109A/master/content/styles/iacs.png\"> CS-109B Introduction to Data Science\n## Lab 5: Convolutional Neural Networks\n\n**Harvard University**<br>\n**Spring 2020**<br>\n**Instructors:** Mark Glickman, Pavlos Protopapas, and Chris Tanner<br>\n**Lab Instructors:** Chris Tanner and Eleni Angelaki Kaxiras<br>\n**Content:** Eleni Angelaki Kaxiras, Pavlos Protopapas, Patrick Ohiomoba, and David Sondak\n\n---",
"_____no_output_____"
]
],
[
[
"# RUN THIS CELL TO PROPERLY HIGHLIGHT THE EXERCISES\nimport requests\nfrom IPython.core.display import HTML\nstyles = requests.get(\"https://raw.githubusercontent.com/Harvard-IACS/2019-CS109B/master/content/styles/cs109.css\").text\nHTML(styles)",
"_____no_output_____"
]
],
[
[
"## Learning Goals\n\nIn this lab we will look at Convolutional Neural Networks (CNNs), and their building blocks.\n\nBy the end of this lab, you should:\n\n- have a good undertanding on how images, a common type of data for a CNN, are represented in the computer and how to think of them as arrays of numbers. \n- be familiar with preprocessing images with `tf.keras` and `scipy`.\n- know how to put together the building blocks used in CNNs - such as convolutional layers and pooling layers - in `tensorflow.keras` with an example. \n- run your first CNN.",
"_____no_output_____"
]
],
[
[
"import matplotlib.pyplot as plt\nplt.rcParams[\"figure.figsize\"] = (5,5)\n\nimport numpy as np\nfrom scipy.optimize import minimize\nfrom sklearn.utils import shuffle\n%matplotlib inline",
"_____no_output_____"
],
[
"from tensorflow.keras.models import Sequential, Model\nfrom tensorflow.keras.layers import Dense, Dropout, Flatten, Activation, Input\nfrom tensorflow.keras.layers import Conv2D, Conv1D, MaxPooling2D, MaxPooling1D,\\\n GlobalAveragePooling1D, GlobalMaxPooling1D\nfrom tensorflow.keras.optimizers import Adam, SGD, RMSprop\nfrom tensorflow.keras.utils import to_categorical\nfrom tensorflow.keras.metrics import AUC, Precision, Recall, FalsePositives, FalseNegatives, \\\n TruePositives, TrueNegatives\nfrom tensorflow.keras.regularizers import l2",
"_____no_output_____"
],
[
"from __future__ import absolute_import, division, print_function, unicode_literals\n\n# TensorFlow and tf.keras\nimport tensorflow as tf\n\ntf.keras.backend.clear_session() # For easy reset of notebook state.\n\nprint(tf.__version__) # You should see a > 2.0.0 here!\n",
"2.0.0\n"
]
],
[
[
"## Part 0: Running on SEAS JupyterHub\n\n**PLEASE READ**: [Instructions for Using SEAS JupyterHub](https://canvas.harvard.edu/courses/65462/pages/instructions-for-using-seas-jupyterhub?module_item_id=638544)\n\nSEAS and FAS are providing you with a platform in AWS to use for the class (accessible from the 'Jupyter' menu link in Canvas). These are AWS p2 instances with a GPU, 10GB of disk space, and 61 GB of RAM, for faster training for your networks. Most of the libraries such as keras, tensorflow, pandas, etc. are pre-installed. If a library is missing you may install it via the Terminal.\n\n**NOTE : The AWS platform is funded by SEAS and FAS for the purposes of the class. It is not running against your individual credit.**\n\n**NOTE NOTE NOTE: You are not allowed to use it for purposes not related to this course.**\n\n**Help us keep this service: Make sure you stop your instance as soon as you do not need it.**\n\n\n*source:CS231n Stanford: Google Cloud Tutorial*",
"_____no_output_____"
],
[
"## Part 1: Parts of a Convolutional Neural Net\n\nWe can have \n- 1D CNNs which are useful for time-series or 1-Dimensional data, \n- 2D CNNs used for 2-Dimensional data such as images, and also \n- 3-D CNNs used for video.\n\n### a. Convolutional Layers.\n\nConvolutional layers are comprised of **filters** and **feature maps**. The filters are essentially the **neurons** of the layer. They have the weights and produce the input for the next layer. The feature map is the output of one filter applied to the previous layer. \n\nConvolutions operate over 3D tensors, called feature maps, with two spatial axes (height and width) as well as a depth axis (also called the channels axis). For an RGB image, the dimension of the depth axis is 3, because the image has three color channels: red, green, and blue. For a black-and-white picture, like the MNIST digits, the depth is 1 (levels of gray). The convolution operation extracts patches from its input feature map and applies the same transformation to all of these patches, producing an output feature map. This output feature map is still a 3D tensor: it has a width and a height. Its depth can be arbitrary, because the output depth is a parameter of the layer, and the different channels in that depth axis no longer stand for specific colors as in RGB input; rather, they stand for filters. Filters encode specific aspects of the input data: at a high level, a single filter could encode the concept โpresence of a face in the input,โ for instance.\n\nIn the MNIST example that we will see, the first convolution layer takes a feature map of size (28, 28, 1) and outputs a feature map of size (26, 26, 32): it computes 32 filters over its input. Each of these 32 output channels contains a 26ร26 grid of values, which is a response map of the filter over the input, indicating the response of that filter pattern at different locations in the input. \n\nConvolutions are defined by two key parameters:\n- Size of the patches extracted from the inputs. These are typically 3ร3 or 5ร5 \n- The number of filters computed by the convolution. \n\n**Padding**: One of \"valid\", \"causal\" or \"same\" (case-insensitive). \"valid\" means \"no padding\". \"same\" results in padding the input such that the output has the same length as the original input. \"causal\" results in causal (dilated) convolutions,",
"_____no_output_____"
],
[
"#### 1D Convolutional Network\n\nIn `tf.keras` see [1D convolutional layers](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Conv1D)\n\n\n\n*image source: Deep Learning with Python by Franรงois Chollet*",
"_____no_output_____"
],
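[
"A one-line sketch (illustrative, not part of the original lab) of a 1D convolution over a length-100 univariate series:\n\n```python\nConv1D(filters=16, kernel_size=3, padding='same', activation='relu', input_shape=(100, 1))\n```",
"_____no_output_____"
],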
[
"#### 2D Convolutional Network\n\nIn `tf.keras` see [2D convolutional layers](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Conv2D)\n\n",
"_____no_output_____"
],
[
"**keras.layers.Conv2D** (filters, kernel_size, strides=(1, 1), padding='valid', activation=None, use_bias=True, \n kernel_initializer='glorot_uniform', data_format='channels_last', \n bias_initializer='zeros')",
"_____no_output_____"
],
[
"### b. Pooling Layers.\n\nPooling layers are also comprised of filters and feature maps. Let's say the pooling layer has a 2x2 receptive field and a stride of 2. This stride results in feature maps that are one half the size of the input feature maps. We can use a max() operation for each receptive field. \n\nIn `tf.keras` see [2D pooling layers](https://www.tensorflow.org/api_docs/python/tf/keras/layers/MaxPool2D)\n\n**keras.layers.MaxPooling2D**(pool_size=(2, 2), strides=None, padding='valid', data_format=None)\n\n",
"_____no_output_____"
],
[
"### c. Dropout Layers.\n\nDropout consists in randomly setting a fraction rate of input units to 0 at each update during training time, which helps prevent overfitting. \n\nIn `tf.keras` see [Dropout layers](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dropout)\n\ntf.keras.layers.Dropout(rate, seed=None)\n\nrate: float between 0 and 1. Fraction of the input units to drop.<br>\nseed: A Python integer to use as random seed.\n\nReferences\n\n[Dropout: A Simple Way to Prevent Neural Networks from Overfitting](http://www.jmlr.org/papers/volume15/srivastava14a/srivastava14a.pdf)",
"_____no_output_____"
],
[
"### d. Fully Connected Layers.\n\nA fully connected layer flattens the square feature map into a vector. Then we can use a sigmoid or softmax activation function to output probabilities of classes. \n\nIn `tf.keras` see [Fully Connected layers](https://www.tensorflow.org/api_docs/python/tf/keras/layers/Dense)\n\n**keras.layers.Dense**(units, activation=None, use_bias=True, \n kernel_initializer='glorot_uniform', bias_initializer='zeros')",
"_____no_output_____"
],
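[
"Below is a minimal, illustrative sketch (not part of the original lab) showing how the four building blocks above fit together in `tf.keras`; the layer sizes are arbitrary:\n\n```python\nfrom tensorflow.keras.models import Sequential\nfrom tensorflow.keras.layers import Conv2D, MaxPooling2D, Dropout, Flatten, Dense\n\nsketch = Sequential([\n    Conv2D(16, (3, 3), activation='relu', padding='same', input_shape=(64, 64, 3)),  # convolutional layer\n    MaxPooling2D((2, 2)),            # pooling layer\n    Dropout(0.25),                   # dropout layer\n    Flatten(),\n    Dense(10, activation='softmax')  # fully connected output layer\n])\n```",
"_____no_output_____"
],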
[
"## Part 2: Preprocessing the data",
"_____no_output_____"
]
],
[
[
"img = plt.imread('../images/cat.1700.jpg')\nheight, width, channels = img.shape\nprint(f'PHOTO: height = {height}, width = {width}, number of channels = {channels}, \\\nimage datatype = {img.dtype}')",
"PHOTO: height = 252, width = 261, number of channels = 4, image datatype = uint8\n"
],
[
"img.shape",
"_____no_output_____"
],
[
"# let's look at the image\nimgplot = plt.imshow(img)",
"_____no_output_____"
]
],
[
[
"#### Visualizing the different channels",
"_____no_output_____"
]
],
[
[
"colors = [plt.cm.Reds, plt.cm.Greens, plt.cm.Blues, plt.cm.Greys]\nsubplots = np.arange(221,224)\nfor i in range(3):\n plt.subplot(subplots[i])\n plt.imshow(img[:,:,i], cmap=colors[i])\nplt.subplot(224)\nplt.imshow(img) \nplt.show()",
"_____no_output_____"
]
],
[
[
"If you want to learn more: [Image Processing with Python and Scipy](http://prancer.physics.louisville.edu/astrowiki/index.php/Image_processing_with_Python_and_SciPy)",
"_____no_output_____"
],
[
"## Part 3: Putting the Parts together to make a small ConvNet Model\n\nLet's put all the parts together to make a convnet for classifying our good old MNIST digits.",
"_____no_output_____"
]
],
[
[
"# Load data and preprocess\n(train_images, train_labels), (test_images, test_labels) = tf.keras.datasets.mnist.load_data(\n path='mnist.npz') # load MNIST data\ntrain_images.shape",
"Downloading data from https://storage.googleapis.com/tensorflow/tf-keras-datasets/mnist.npz\n11493376/11490434 [==============================] - 1s 0us/step\n"
]
],
[
[
"**Notice:** These photos do not have a third dimention channel because they are B&W.",
"_____no_output_____"
]
],
[
[
"train_images.max(), train_images.min()",
"_____no_output_____"
]
],
[
[
"**reshape data to 3 dimensions for keras**\n\n**convert int to float**",
"_____no_output_____"
]
],
[
[
"train_images = train_images.reshape((60000, 28, 28, 1)) # Reshape to get third dimension\ntest_images = test_images.reshape((10000, 28, 28, 1)) \n\ntrain_images = train_images.astype('float32') / 255 # Normalize between 0 and 1\ntest_images = test_images.astype('float32') / 255 \n\n# Convert labels to categorical data \ntrain_labels = to_categorical(train_labels)\ntest_labels = to_categorical(test_labels)",
"_____no_output_____"
]
],
[
[
"**`input_shape` shouldn't be hard coded. width, height, number of filter?**",
"_____no_output_____"
]
],
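[
[
"## hedged sketch addressing the note above: derive the shapes from the data\n## instead of hard-coding them (the variable names are illustrative)\ninput_shape = train_images.shape[1:]   # (28, 28, 1)\nnum_classes = train_labels.shape[1]    # 10\nprint(input_shape, num_classes)",
"_____no_output_____"
]
],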
[
[
"mnist_cnn_model = Sequential() # Create sequential model\n\n# Add network layers\nmnist_cnn_model.add(Conv2D(32, (3, 3), activation='relu', input_shape=(28, 28, 1)))\nmnist_cnn_model.add(MaxPooling2D((2, 2)))\nmnist_cnn_model.add(Conv2D(64, (3, 3), activation='relu')) \nmnist_cnn_model.add(MaxPooling2D((2, 2)))\nmnist_cnn_model.add(Conv2D(64, (3, 3), activation='relu'))",
"_____no_output_____"
]
],
[
[
"The next step is to feed the last output tensor (of shape (3, 3, 64)) into a densely connected classifier network like those youโre already familiar with: a stack of Dense layers. These classifiers process vectors, which are 1D, whereas the output of the last conv layer is a 3D tensor. First we have to flatten the 3D outputs to 1D, and then add a few Dense layers on top.",
"_____no_output_____"
]
],
[
[
"mnist_cnn_model.add(Flatten())\nmnist_cnn_model.add(Dense(32, activation='relu')) # before we make decision, how much we squash the layers\nmnist_cnn_model.add(Dense(10, activation='softmax')) # number of classes, shouldn't be hard coded as well\nmnist_cnn_model.summary()",
"Model: \"sequential\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d (Conv2D) (None, 26, 26, 32) 320 \n_________________________________________________________________\nmax_pooling2d (MaxPooling2D) (None, 13, 13, 32) 0 \n_________________________________________________________________\nconv2d_1 (Conv2D) (None, 11, 11, 64) 18496 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 5, 5, 64) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 3, 3, 64) 36928 \n_________________________________________________________________\nflatten (Flatten) (None, 576) 0 \n_________________________________________________________________\ndense (Dense) (None, 32) 18464 \n_________________________________________________________________\ndense_1 (Dense) (None, 10) 330 \n=================================================================\nTotal params: 74,538\nTrainable params: 74,538\nNon-trainable params: 0\n_________________________________________________________________\n"
]
],
[
[
"**(26, 26) after convolution, 32 filters**",
"_____no_output_____"
],
[
"<div class=\"Question\"><b>Question</b> Why are we using cross-entropy here?</div>",
"_____no_output_____"
]
],
[
[
"loss = tf.keras.losses.categorical_crossentropy\n\noptimizer = Adam(lr=0.001)\n\n#optimizer = RMSprop(lr=1e-2)\n# see https://www.tensorflow.org/api_docs/python/tf/keras/metrics\nmetrics = ['accuracy'] \n\n# Compile model\nmnist_cnn_model.compile(optimizer=optimizer,\n loss=loss,\n metrics=metrics)",
"_____no_output_____"
]
],
[
[
"<div class=\"discussion\"><b>Discussion</b> How can we choose the batch size?</div>",
"_____no_output_____"
]
],
[
[
"%%time \n\n# Fit the model \nverbose, epochs, batch_size = 1, 10, 64 # try a different num epochs and batch size : 30, 16 \nhistory = mnist_cnn_model.fit(train_images, train_labels, \n epochs=epochs, \n batch_size=batch_size, \n verbose=verbose,\n validation_split=0.2,\n # validation_data=(X_val, y_val) # IF you have val data\n shuffle=True)",
"Train on 48000 samples, validate on 12000 samples\nEpoch 1/10\n48000/48000 [==============================] - 17s 362us/sample - loss: 0.2466 - accuracy: 0.9252 - val_loss: 0.1024 - val_accuracy: 0.9690\nEpoch 2/10\n48000/48000 [==============================] - 18s 366us/sample - loss: 0.0652 - accuracy: 0.9804 - val_loss: 0.0536 - val_accuracy: 0.9845\nEpoch 3/10\n48000/48000 [==============================] - 18s 372us/sample - loss: 0.0459 - accuracy: 0.9857 - val_loss: 0.0453 - val_accuracy: 0.9860\nEpoch 4/10\n48000/48000 [==============================] - 18s 383us/sample - loss: 0.0354 - accuracy: 0.9886 - val_loss: 0.0433 - val_accuracy: 0.9861\nEpoch 5/10\n48000/48000 [==============================] - 18s 377us/sample - loss: 0.0295 - accuracy: 0.9906 - val_loss: 0.0514 - val_accuracy: 0.9857\nEpoch 6/10\n48000/48000 [==============================] - 18s 380us/sample - loss: 0.0248 - accuracy: 0.9917 - val_loss: 0.0432 - val_accuracy: 0.9886\nEpoch 7/10\n48000/48000 [==============================] - 18s 372us/sample - loss: 0.0187 - accuracy: 0.9942 - val_loss: 0.0427 - val_accuracy: 0.9883\nEpoch 8/10\n48000/48000 [==============================] - 18s 374us/sample - loss: 0.0170 - accuracy: 0.9950 - val_loss: 0.0452 - val_accuracy: 0.9877\nEpoch 9/10\n48000/48000 [==============================] - 18s 369us/sample - loss: 0.0133 - accuracy: 0.9959 - val_loss: 0.0404 - val_accuracy: 0.9890\nEpoch 10/10\n48000/48000 [==============================] - 18s 372us/sample - loss: 0.0111 - accuracy: 0.9962 - val_loss: 0.0467 - val_accuracy: 0.9887\nCPU times: user 10min, sys: 4min 6s, total: 14min 7s\nWall time: 2min 58s\n"
],
[
"print(history.history.keys())\nprint(history.history['val_accuracy'][-1])\nplt.plot(history.history['accuracy'])\nplt.plot(history.history['val_accuracy'])\nplt.title('model accuracy')\nplt.ylabel('accuracy')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\n# summarize history for loss\nplt.plot(history.history['loss'])\nplt.plot(history.history['val_loss'])\nplt.title('model loss')\nplt.ylabel('loss')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\n#plt.savefig('../images/batch8.png')",
"dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])\n0.98875\n"
],
[
"mnist_cnn_model.metrics_names",
"_____no_output_____"
],
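[
"# Hedged sketch (not in the original lab): eyeball a few test predictions\n# before computing the aggregate score.\npreds = mnist_cnn_model.predict(test_images[:5])\nprint('predicted:', preds.argmax(axis=1))\nprint('actual:   ', test_labels[:5].argmax(axis=1))",
"_____no_output_____"
],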
[
"# Evaluate the model on the test data:\nscore = mnist_cnn_model.evaluate(test_images, test_labels, \n batch_size=batch_size, \n verbose=0, callbacks=None)\n#print(\"%s: %.2f%%\" % (mnist_cnn_model.metrics_names[1], score[1]*100))\ntest_acc = mnist_cnn_model.evaluate(test_images, test_labels)\ntest_acc",
"10000/10000 [==============================] - 1s 83us/sample - loss: 0.0392 - accuracy: 0.9898\n"
]
],
[
[
"<div class=\"discussion\"><b>Discussion</b> Compare validation accuracy and test accuracy? Comment on whether we have overfitting.</div>",
"_____no_output_____"
],
[
"### Data Preprocessing : Meet the `ImageDataGenerator` class in `keras` \n\n\n[(keras ImageGenerator documentation)](https://keras.io/preprocessing/image/)",
"_____no_output_____"
],
[
"The MNIST and other pre-loaded dataset are formatted in a way that is almost ready for feeding into the model. What about plain images? They should be formatted into appropriately preprocessed floating-point tensors before being fed into the network.\n\nThe Dogs vs. Cats dataset that youโll use isnโt packaged with Keras. It was made available by Kaggle as part of a computer-vision competition in late 2013, back when convnets werenโt mainstream. The data has been downloaded for you from https://www.kaggle.com/c/dogs-vs-cats/data The pictures are medium-resolution color JPEGs. ",
"_____no_output_____"
]
],
[
[
"# TODO: set your base dir to your correct local location\nbase_dir = '../data/cats_and_dogs_small'\n\nimport os, shutil\n\n# Set up directory information\n\ntrain_dir = os.path.join(base_dir, 'train')\nvalidation_dir = os.path.join(base_dir, 'validation')\ntest_dir = os.path.join(base_dir, 'test')\n\ntrain_cats_dir = os.path.join(train_dir, 'cats')\ntrain_dogs_dir = os.path.join(train_dir, 'dogs')\n\nvalidation_cats_dir = os.path.join(validation_dir, 'cats')\nvalidation_dogs_dir = os.path.join(validation_dir, 'dogs')\n\ntest_cats_dir = os.path.join(test_dir, 'cats')\ntest_dogs_dir = os.path.join(test_dir, 'dogs')\n\nprint('total training cat images:', len(os.listdir(train_cats_dir))) \nprint('total training dog images:', len(os.listdir(train_dogs_dir))) \nprint('total validation cat images:', len(os.listdir(validation_cats_dir)))\nprint('total validation dog images:', len(os.listdir(validation_dogs_dir)))\nprint('total test cat images:', len(os.listdir(test_cats_dir))) \nprint('total test dog images:', len(os.listdir(test_dogs_dir))) ",
"total training cat images: 1000\ntotal training dog images: 1000\ntotal validation cat images: 500\ntotal validation dog images: 500\ntotal test cat images: 500\ntotal test dog images: 500\n"
]
],
[
[
"So you do indeed have 2,000 training images, 1,000 validation images, and 1,000 test images. Each split contains the same number of samples from each class: this is a balanced binary-classification problem, which means classification accuracy will be an appropriate measure of success.",
"_____no_output_____"
],
[
"<div class=\"discussion\"><b>Discussion</b> Should you always do your own splitting of the data How about shuffling? Does it always make sense?</div>",
"_____no_output_____"
]
],
[
[
"img_path = '../data/cats_and_dogs_small/train/cats/cat.70.jpg'\n\n# We preprocess the image into a 4D tensor\nfrom keras.preprocessing import image\nimport numpy as np\n\nimg = image.load_img(img_path, target_size=(150, 150))\nimg_tensor = image.img_to_array(img)\nimg_tensor = np.expand_dims(img_tensor, axis=0)\n# Remember that the model was trained on inputs\n# that were preprocessed in the following way:\nimg_tensor /= 255.\n\n# Its shape is (1, 150, 150, 3)\nprint(img_tensor.shape)",
"(1, 150, 150, 3)\n"
],
[
"plt.imshow(img_tensor[0])\nplt.show()",
"_____no_output_____"
]
],
[
[
"Why do we need an extra dimension here?",
"_____no_output_____"
],
[
"#### Building the network",
"_____no_output_____"
]
],
[
[
"model = Sequential()\nmodel.add(Conv2D(32, (3, 3), activation='relu',\n input_shape=(150, 150, 3)))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(64, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(128, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(128, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Flatten())\n\nmodel.add(Dense(128, activation='relu'))\nmodel.add(Dense(1, activation='sigmoid'))\nmodel.summary()",
"Model: \"sequential_1\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d_3 (Conv2D) (None, 148, 148, 32) 896 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 74, 74, 32) 0 \n_________________________________________________________________\nconv2d_4 (Conv2D) (None, 72, 72, 64) 18496 \n_________________________________________________________________\nmax_pooling2d_3 (MaxPooling2 (None, 36, 36, 64) 0 \n_________________________________________________________________\nconv2d_5 (Conv2D) (None, 34, 34, 128) 73856 \n_________________________________________________________________\nmax_pooling2d_4 (MaxPooling2 (None, 17, 17, 128) 0 \n_________________________________________________________________\nconv2d_6 (Conv2D) (None, 15, 15, 128) 147584 \n_________________________________________________________________\nmax_pooling2d_5 (MaxPooling2 (None, 7, 7, 128) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 6272) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 128) 802944 \n_________________________________________________________________\ndense_3 (Dense) (None, 1) 129 \n=================================================================\nTotal params: 1,043,905\nTrainable params: 1,043,905\nNon-trainable params: 0\n_________________________________________________________________\n"
]
],
[
[
"For the compilation step, youโll go with the RMSprop optimizer. Because you ended the network with a single sigmoid unit, youโll use binary crossentropy as the loss.",
"_____no_output_____"
]
],
[
[
"loss = tf.keras.losses.binary_crossentropy\n#optimizer = Adam(lr=0.001)\n\noptimizer = RMSprop(lr=1e-2)\n\nmetrics = ['accuracy'] \n\n# Compile model\nmodel.compile(optimizer=optimizer,\n loss=loss,\n metrics=metrics)",
"_____no_output_____"
]
],
[
[
"The steps for getting it into the network are roughly as follows:\n\n1. Read the picture files.\n2. Convert the JPEG content to RGB grids of pixels. \n3. Convert these into floating-point tensors.\n4. Rescale the pixel values (between 0 and 255) to the [0, 1] interval (as you know, neural networks prefer to deal with small input values).\n\nIt may seem a bit daunting, but fortunately Keras has utilities to take care of these steps automatically with the class `ImageDataGenerator`, which lets you quickly set up Python generators that can automatically turn image files on disk into batches of preprocessed tensors. This is what youโll use here.",
"_____no_output_____"
]
],
[
[
"from keras.preprocessing.image import ImageDataGenerator\n\ntrain_datagen = ImageDataGenerator(rescale=1./255)\ntest_datagen = ImageDataGenerator(rescale=1./255)\n\ntrain_generator = train_datagen.flow_from_directory(\n train_dir,\n target_size=(150, 150),\n batch_size=20,\n class_mode='binary')\n\nvalidation_generator = test_datagen.flow_from_directory(\n validation_dir,\n target_size=(150, 150),\n batch_size=20,\n class_mode='binary')",
"Found 2000 images belonging to 2 classes.\nFound 1000 images belonging to 2 classes.\n"
]
],
[
[
"Letโs look at the output of one of these generators: it yields batches of 150ร150 RGB images (shape (20, 150, 150, 3)) and binary labels (shape (20,)). There are 20 samples in each batch (the batch size). Note that the generator yields these batches indefinitely: it loops endlessly over the images in the target folder. For this reason, you need to break the iteration loop at some point:",
"_____no_output_____"
]
],
[
[
"for data_batch, labels_batch in train_generator:\n print('data batch shape:', data_batch.shape)\n print('labels batch shape:', labels_batch.shape)\n break",
"data batch shape: (20, 150, 150, 3)\nlabels batch shape: (20,)\n"
]
],
[
[
"Letโs fit the model to the data using the generator. You do so using the `.fit_generator` method, the equivalent of `.fit` for data generators like this one. It expects as its first argument a Python generator that will yield batches of inputs and targets indefinitely, like this one does. \n\nBecause the data is being generated endlessly, the Keras model needs to know how many samples to draw from the generator before declaring an epoch over. This is the role of the `steps_per_epoch` argument: after having drawn steps_per_epoch batches from the generatorโthat is, after having run for steps_per_epoch gradient descent steps - the fitting process will go to the next epoch. In this case, batches are 20 samples, so it will take 100 batches until you see your target of 2,000 samples.\n\nWhen using fit_generator, you can pass a validation_data argument, much as with the fit method. Itโs important to note that this argument is allowed to be a data generator, but it could also be a tuple of Numpy arrays. If you pass a generator as validation_data, then this generator is expected to yield batches of validation data endlessly; thus you should also specify the validation_steps argument, which tells the process how many batches to draw from the validation generator for evaluation",
"_____no_output_____"
]
],
[
[
"%%time \n# Fit the model <--- always a good idea to time it \nverbose, epochs, batch_size, steps_per_epoch = 1, 5, 64, 100\n\nhistory = model.fit_generator(\n train_generator,\n steps_per_epoch=steps_per_epoch,\n epochs=5, # TODO: should be 100\n validation_data=validation_generator,\n validation_steps=50)\n\n\n# Itโs good practice to always save your models after training.\nmodel.save('cats_and_dogs_small_1.h5')",
"WARNING:tensorflow:From <timed exec>:9: Model.fit_generator (from tensorflow.python.keras.engine.training) is deprecated and will be removed in a future version.\nInstructions for updating:\nPlease use Model.fit, which supports generators.\nWARNING:tensorflow:sample_weight modes were coerced from\n ...\n to \n ['...']\nWARNING:tensorflow:sample_weight modes were coerced from\n ...\n to \n ['...']\nTrain for 100 steps, validate for 50 steps\nEpoch 1/5\n100/100 [==============================] - 41s 411ms/step - loss: 149.8046 - accuracy: 0.4885 - val_loss: 0.6931 - val_accuracy: 0.5000\nEpoch 2/5\n100/100 [==============================] - 38s 382ms/step - loss: 0.6939 - accuracy: 0.4960 - val_loss: 0.6932 - val_accuracy: 0.5000\nEpoch 3/5\n100/100 [==============================] - 38s 376ms/step - loss: 0.6937 - accuracy: 0.4830 - val_loss: 0.6931 - val_accuracy: 0.5000\nEpoch 4/5\n100/100 [==============================] - 37s 371ms/step - loss: 0.6933 - accuracy: 0.5010 - val_loss: 0.6935 - val_accuracy: 0.5000\nEpoch 5/5\n100/100 [==============================] - 39s 392ms/step - loss: 0.6937 - accuracy: 0.4940 - val_loss: 0.6932 - val_accuracy: 0.5000\nCPU times: user 13min 32s, sys: 3min 34s, total: 17min 7s\nWall time: 3min 13s\n"
]
],
[
[
"Letโs plot the loss and accuracy of the model over the training and validation data during training:",
"_____no_output_____"
]
],
[
[
"print(history.history.keys())\nprint(history.history['val_accuracy'][-1])\nplt.plot(history.history['accuracy'])\nplt.plot(history.history['val_accuracy'])\nplt.title('model accuracy')\nplt.ylabel('accuracy')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\n# summarize history for loss\nplt.plot(history.history['loss'])\nplt.plot(history.history['val_loss'])\nplt.title('model loss')\nplt.ylabel('loss')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\nplt.savefig('../images/batch8.png')",
"dict_keys(['loss', 'accuracy', 'val_loss', 'val_accuracy'])\n0.5\n"
]
],
[
[
"Let's try data augmentation",
"_____no_output_____"
]
],
[
[
"datagen = ImageDataGenerator(\n rotation_range=40,\n width_shift_range=0.2,\n height_shift_range=0.2,\n shear_range=0.2,\n zoom_range=0.2,\n horizontal_flip=True,\n fill_mode='nearest')",
"_____no_output_____"
]
],
[
[
"These are just a few of the options available (for more, see the Keras documentation). \nLetโs quickly go over this code:\n\n- rotation_range is a value in degrees (0โ180), a range within which to randomly rotate pictures.\n- width_shift and height_shift are ranges (as a fraction of total width or height) within which to randomly translate pictures vertically or horizontally.\n- shear_range is for randomly applying shearing transformations.\n- zoom_range is for randomly zooming inside pictures.\n- horizontal_flip is for randomly flipping half the images horizontallyโrelevant when there are no assumptions of - horizontal asymmetry (for example, real-world pictures).\n- fill_mode is the strategy used for filling in newly created pixels, which can appear after a rotation or a width/height shift. \n\nLetโs look at the augmented images",
"_____no_output_____"
]
],
[
[
"from keras.preprocessing import image\nfnames = [os.path.join(train_dogs_dir, fname) for\n fname in os.listdir(train_dogs_dir)]\nimg_path = fnames[3] # Chooses one image to augment\nimg = image.load_img(img_path, target_size=(150, 150))\n# Reads the image and resizes it\nx = image.img_to_array(img) # Converts it to a Numpy array with shape (150, 150, 3) \nx = x.reshape((1,) + x.shape) # Reshapes it to (1, 150, 150, 3)\ni=0\nfor batch in datagen.flow(x, batch_size=1):\n plt.figure(i)\n imgplot = plt.imshow(image.array_to_img(batch[0]))\n i += 1\n if i % 4 == 0:\n break\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"If you train a new network using this data-augmentation configuration, the network will never see the same input twice. But the inputs it sees are still heavily intercorrelated, because they come from a small number of original imagesโyou canโt produce new information, you can only remix existing information. As such, this may not be enough to completely get rid of overfitting. To further fight overfitting, youโll also add a **Dropout** layer to your model right before the densely connected classifier.",
"_____no_output_____"
]
],
[
[
"model = Sequential()\nmodel.add(Conv2D(32, (3, 3), activation='relu',\n input_shape=(150, 150, 3)))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(64, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(128, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Conv2D(128, (3, 3), activation='relu'))\nmodel.add(MaxPooling2D((2, 2)))\nmodel.add(Flatten())\nmodel.add(Dropout(0.5))\nmodel.add(Dense(512, activation='relu'))\nmodel.add(Dense(1, activation='sigmoid'))\nmodel.summary()",
"Model: \"sequential_2\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d_7 (Conv2D) (None, 148, 148, 32) 896 \n_________________________________________________________________\nmax_pooling2d_6 (MaxPooling2 (None, 74, 74, 32) 0 \n_________________________________________________________________\nconv2d_8 (Conv2D) (None, 72, 72, 64) 18496 \n_________________________________________________________________\nmax_pooling2d_7 (MaxPooling2 (None, 36, 36, 64) 0 \n_________________________________________________________________\nconv2d_9 (Conv2D) (None, 34, 34, 128) 73856 \n_________________________________________________________________\nmax_pooling2d_8 (MaxPooling2 (None, 17, 17, 128) 0 \n_________________________________________________________________\nconv2d_10 (Conv2D) (None, 15, 15, 128) 147584 \n_________________________________________________________________\nmax_pooling2d_9 (MaxPooling2 (None, 7, 7, 128) 0 \n_________________________________________________________________\nflatten_2 (Flatten) (None, 6272) 0 \n_________________________________________________________________\ndropout (Dropout) (None, 6272) 0 \n_________________________________________________________________\ndense_4 (Dense) (None, 512) 3211776 \n_________________________________________________________________\ndense_5 (Dense) (None, 1) 513 \n=================================================================\nTotal params: 3,453,121\nTrainable params: 3,453,121\nNon-trainable params: 0\n_________________________________________________________________\n"
],
[
"loss = tf.keras.losses.binary_crossentropy\noptimizer = RMSprop(lr=1e-4)\nmetrics = ['acc', 'accuracy'] \n\n# Compile model\nmodel.compile(loss=loss,\n optimizer=optimizer,\n metrics=metrics)",
"_____no_output_____"
],
[
"# Letโs train the network using data augmentation and dropout.\ntrain_datagen = ImageDataGenerator(\n rescale=1./255,\n rotation_range=40,\n width_shift_range=0.2,\n height_shift_range=0.2,\n shear_range=0.2,\n zoom_range=0.2,\n horizontal_flip=True,)\n\ntest_datagen = ImageDataGenerator(rescale=1./255)\n\n# Note that the validation data shouldnโt be augmented!\ntrain_generator = train_datagen.flow_from_directory(\n train_dir,\n target_size=(150, 150),\n batch_size=32,\n class_mode='binary')\n\nvalidation_generator = test_datagen.flow_from_directory(\n validation_dir,\n target_size=(150, 150),\n batch_size=32,\n class_mode='binary')\n\nhistory = model.fit_generator(\n train_generator,\n steps_per_epoch=100,\n epochs=5, # TODO: should be 100\n validation_data=validation_generator,\n validation_steps=50)\n\n# save model if needed\nmodel.save('cats_and_dogs_small_2.h5')",
"Found 2000 images belonging to 2 classes.\nFound 1000 images belonging to 2 classes.\nWARNING:tensorflow:sample_weight modes were coerced from\n ...\n to \n ['...']\nWARNING:tensorflow:sample_weight modes were coerced from\n ...\n to \n ['...']\nTrain for 100 steps, validate for 50 steps\nEpoch 1/5\n 63/100 [=================>............] - ETA: 22s - loss: 0.6958 - acc: 0.5100 - accuracy: 0.5100WARNING:tensorflow:Your input ran out of data; interrupting training. Make sure that your dataset or generator can generate at least `steps_per_epoch * epochs` batches (in this case, 500 batches). You may need to use the repeat() function when building your dataset.\n"
]
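,
[
"# A minimal sketch (not from the original notebook) of sizing steps_per_epoch so the\n# iterator is not exhausted mid-epoch -- the cause of the warning in the output above.\n# train_generator/validation_generator are Keras Sequences from the previous cell,\n# so len() reports the number of available batches per epoch.\nsteps = len(train_generator)           # ceil(2000 / 32) = 63\nval_steps = len(validation_generator)  # ceil(1000 / 32) = 32\nprint(steps, val_steps)  # pass these as steps_per_epoch / validation_steps",
"_____no_output_____"
]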
],
[
[
"And letโs plot the results again. Thanks to data augmentation and dropout, youโre no longer overfitting: the training curves are closely tracking the validation curves. You now reach an accuracy of 82%, a 15% relative improvement over the non-regularized model. (Note: these numbers are for 100 epochs..)",
"_____no_output_____"
]
],
[
[
"print(history.history.keys())\nprint(history.history['val_accuracy'][-1])\nplt.plot(history.history['accuracy'])\nplt.plot(history.history['val_accuracy'])\nplt.title('Accuracy with data augmentation')\nplt.ylabel('accuracy')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\n# summarize history for loss\nplt.plot(history.history['loss'])\nplt.plot(history.history['val_loss'])\nplt.title('Loss with data augmentation')\nplt.ylabel('loss')\nplt.xlabel('epoch')\nplt.legend(['train', 'val'], loc='upper left')\nplt.show()\n#plt.savefig('../images/batch8.png')",
"dict_keys(['val_loss', 'val_acc', 'val_accuracy', 'loss', 'acc', 'accuracy'])\n0.6040608882904053\n"
]
],
[
[
"By using regularization techniques even further, and by tuning the networkโs parameters (such as the number of filters per convolution layer, or the number of layers in the network), you may be able to get an even better accuracy, likely up to 86% or 87%. But it would prove difficult to go any higher just by training your own convnet from scratch, because you have so little data to work with. As a next step to improve your accuracy on this problem, youโll have to use a pretrained model.",
"_____no_output_____"
]
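,
[
"A minimal sketch of what that next step could look like (illustrative only; it assumes `tensorflow.keras.applications` is available and is not part of the original notebook):\n\n```python\nfrom tensorflow.keras.applications import VGG16\n\n# Reuse convolutional features learned on ImageNet as a frozen feature extractor.\nconv_base = VGG16(weights='imagenet', include_top=False,\n                  input_shape=(150, 150, 3))\nconv_base.trainable = False  # freeze the pretrained weights before adding a new classifier on top\n```",
"_____no_output_____"
]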
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbfaa30301252b60973642950a2f386bec2d63b6
| 1,922 |
ipynb
|
Jupyter Notebook
|
req1_load_nicely.ipynb
|
sundial-pointcloud-geometry/explore
|
f2aca1b787a5bcc8a187fefdf5048451fca1b2e1
|
[
"MIT"
] | null | null | null |
req1_load_nicely.ipynb
|
sundial-pointcloud-geometry/explore
|
f2aca1b787a5bcc8a187fefdf5048451fca1b2e1
|
[
"MIT"
] | null | null | null |
req1_load_nicely.ipynb
|
sundial-pointcloud-geometry/explore
|
f2aca1b787a5bcc8a187fefdf5048451fca1b2e1
|
[
"MIT"
] | null | null | null | 16.151261 | 94 | 0.48231 |
[
[
[
"from explore import *",
"_____no_output_____"
],
[
"plydata = load_example_cube()",
"_____no_output_____"
],
[
"plydata",
"_____no_output_____"
],
[
"import pandas as pd",
"_____no_output_____"
],
[
"xyz = dict(x=plydata['vertex']['x'], y=plydata['vertex']['y'], z=plydata['vertex']['z'])",
"_____no_output_____"
],
[
"pddf = pd.DataFrame(xyz)",
"_____no_output_____"
],
[
"pddf",
"_____no_output_____"
],
[
"pddf['x'].values",
"_____no_output_____"
],
[
"import vaex as vx",
"_____no_output_____"
],
[
"vx.from_pandas(pddf, copy_index=False)",
"_____no_output_____"
],
[
"vx.from_dict(xyz)",
"_____no_output_____"
]
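,
[
"# A possible next step (sketch only; it assumes the vaex expression API and is not\n# part of the original notebook): quick summary statistics on the vertex cloud.\ndf = vx.from_dict(xyz)\nprint(df.count())      # number of vertices\nprint(df.mean(df.x))   # mean x coordinate\nprint(df.minmax(df.z)) # z extent of the cube",
"_____no_output_____"
]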
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfaa5518d3c128f6112321f924292b98d4a7678
| 13,058 |
ipynb
|
Jupyter Notebook
|
instructor/15-healpix_instructor.ipynb
|
aperiosoftware/esac-astropy
|
766a6bf72bf7cbb5ec805498ea8a38b251da6b61
|
[
"CC-BY-4.0"
] | 1 |
2019-10-24T12:26:20.000Z
|
2019-10-24T12:26:20.000Z
|
instructor/15-healpix_instructor.ipynb
|
aperiosoftware/esac-astropy
|
766a6bf72bf7cbb5ec805498ea8a38b251da6b61
|
[
"CC-BY-4.0"
] | null | null | null |
instructor/15-healpix_instructor.ipynb
|
aperiosoftware/esac-astropy
|
766a6bf72bf7cbb5ec805498ea8a38b251da6b61
|
[
"CC-BY-4.0"
] | null | null | null | 28.386957 | 351 | 0.591285 |
[
[
[
"# Working with HEALPix data\n\n[HEALPix](https://healpix.jpl.nasa.gov/) (Hierarchical Equal Area isoLatitude Pixelisation) is an algorithm that is often used to store data from all-sky surveys.\n\nThere are several tools in the Astropy ecosystem for working with HEALPix data, depending on what you need to do:\n\n* The [astropy-healpix](https://astropy-healpix.readthedocs.io/en/latest/index.html) coordinated package is a BSD-licensed implementation of HEALPix which focuses on being able to convert celestial coordinates to HEALPix indices and vice-versa, as well as providing a few other low-level functions.\n\n* The [reproject](https://reproject.readthedocs.io/en/stable/) coordinated package (which we've already looked at) includes functions for converting from/to HEALPix maps.\n\n* The [HiPS](https://hips.readthedocs.io/en/latest/) affiliated package implements suport for the [HiPS](http://aladin.u-strasbg.fr/hips/) scheme for storing data that is based on HEALPix.\n\nIn this tutorial, we will take a look at the two first one of these, but we encourage you to learn more about HiPS too!",
"_____no_output_____"
],
[
"\n<section class=\"objectives panel panel-warning\">\n<div class=\"panel-heading\">\n<h2><span class=\"fa fa-certificate\"></span> Objectives</h2>\n</div>\n\n\n<div class=\"panel-body\">\n\n<ul>\n<li>Convert between celestial coordinates and HEALPix indices</li>\n<li>Find the boundaries of HEALPix pixels</li>\n<li>Find healpix pixels close to a position</li>\n<li>Reproject a HEALPix map to a standard projection</li>\n</ul>\n\n</div>\n\n</section>\n",
"_____no_output_____"
],
[
"## Documentation\n\nThis notebook only shows a subset of the functionality in astropy-healpix and reproject. For more information about the features presented below as well as other available features, you can read the\n[astropy-healpix](https://astropy-healpix.readthedocs.io/en/latest/index.html) and the [reproject](https://reproject.readthedocs.io/en/stable/) documentation.",
"_____no_output_____"
]
],
[
[
"%matplotlib inline\nimport matplotlib.pyplot as plt\nplt.rc('image', origin='lower')\nplt.rc('figure', figsize=(10, 6))",
"_____no_output_____"
]
],
[
[
"## Data\n\nFor this tutorial, we will be using a downsampled version of the Planck HFI 857Ghz map which is stored as a HEALPix map ([data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits](data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits)).",
"_____no_output_____"
],
[
"## Using astropy-healpix\n\nTo start off, we can open the HEALPix file (which is a FITS file) with astropy.io.fits:",
"_____no_output_____"
]
],
[
[
"from astropy.io import fits\nhdulist = fits.open('data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits')\nhdulist.info()",
"_____no_output_____"
]
],
[
[
"The HEALPix map values are stored in HDU 1. This HDU also contains useful header information that helps us understand how to interpret the HEALPix values:",
"_____no_output_____"
]
],
[
[
"hdulist[1].header['NSIDE']",
"_____no_output_____"
],
[
"hdulist[1].header['ORDERING']",
"_____no_output_____"
],
[
"hdulist[1].header['COORDSYS']",
"_____no_output_____"
]
],
[
[
"With this information we can now construct a ``HEALPix`` object:",
"_____no_output_____"
]
],
[
[
"from astropy_healpix import HEALPix\nfrom astropy.coordinates import Galactic",
"_____no_output_____"
],
[
"hp = HEALPix(nside=hdulist[1].header['NSIDE'],\n order=hdulist[1].header['ORDERING'],\n frame=Galactic()) ",
"_____no_output_____"
]
],
[
[
"We can then use this object to manipulate the HEALPix map. To start off, we can find out what the coordinates of specific pixels are:",
"_____no_output_____"
]
],
[
[
"hp.healpix_to_skycoord([13322, 2231, 66432])",
"_____no_output_____"
]
],
[
[
"and vice-versa:",
"_____no_output_____"
]
],
[
[
"from astropy.coordinates import SkyCoord\nhp.skycoord_to_healpix(SkyCoord.from_name('M31'))",
"_____no_output_____"
]
],
[
[
"You can also find out what the boundaries of a pixel are:",
"_____no_output_____"
]
],
[
[
"edge = hp.boundaries_skycoord(649476, step=100)\nedge",
"_____no_output_____"
]
],
[
[
"The ``step`` argument controls how many points to sample along the edge of the pixel. The result should be a polygon:",
"_____no_output_____"
]
],
[
[
"plt.plot(edge[0].l.deg, edge[0].b.deg)",
"_____no_output_____"
]
],
[
[
"You can find all HEALPix pixels within a certain radius of a known position:",
"_____no_output_____"
]
],
[
[
"from astropy import units as u\nhp.cone_search_skycoord(SkyCoord.from_name('M31'), radius=1 * u.deg)",
"_____no_output_____"
]
],
[
[
"And finally you can interpolate the map at specific coordinates:",
"_____no_output_____"
]
],
[
[
"hp.interpolate_bilinear_skycoord(SkyCoord.from_name('M31'), hdulist[1].data['I_STOKES'])",
"_____no_output_____"
]
],
[
[
"\n<section class=\"challenge panel panel-success\">\n<div class=\"panel-heading\">\n<h2><span class=\"fa fa-pencil\"></span> Challenge</h2>\n</div>\n\n\n<div class=\"panel-body\">\n\n<ol>\n<li>Find the mean value of I_STOKES within 2 degrees of M42</li>\n<li>Use astropy.coordinates to check that all the pixels returned by the cone search are indeed within 2 degrees of M42 (if not, why not? Hint: check the documentation of <a href=\"https://astropy-healpix.readthedocs.io/en/latest/api/astropy_healpix.HEALPix.html#astropy_healpix.HEALPix.cone_search_skycoord\">cone_search_skycoord()</a>)</li>\n</ol>\n\n</div>\n\n</section>\n",
"_____no_output_____"
]
],
[
[
"#1\nimport numpy as np\nM42 = SkyCoord.from_name('M42')\nm42_pixels = hp.cone_search_skycoord(M42, radius=2 * u.deg)\nprint(np.mean(hdulist[1].data['I_STOKES'][m42_pixels]))",
"_____no_output_____"
],
[
"#2\nm42_cone_search_coords = hp.healpix_to_skycoord(m42_pixels)\nseparation = m42_cone_search_coords.separation(M42).degree\n_ = plt.hist(separation, bins=50)",
"_____no_output_____"
]
],
[
[
"## Using reproject for HEALPix data\n\nThe reproject package is useful for HEALPix data to convert a HEALPix map to a regular projection, and vice-versa. For example, let's define a simple all-sky Plate-Caree WCS:",
"_____no_output_____"
]
],
[
[
"from astropy.wcs import WCS\nwcs = WCS(naxis=2)\nwcs.wcs.ctype = 'GLON-CAR', 'GLAT-CAR'\nwcs.wcs.crval = 0, 0\nwcs.wcs.crpix = 180.5, 90.5\nwcs.wcs.cdelt = -1, 1",
"_____no_output_____"
]
],
[
[
"We can now use [reproject_from_healpix](https://reproject.readthedocs.io/en/stable/api/reproject.reproject_from_healpix.html#reproject.reproject_from_healpix) to convert the HEALPix map to this header:",
"_____no_output_____"
]
],
[
[
"from reproject import reproject_from_healpix",
"_____no_output_____"
],
[
"array, footprint = reproject_from_healpix('data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits',\n wcs, shape_out=(180, 360))",
"_____no_output_____"
],
[
"plt.imshow(array, vmax=100)",
"_____no_output_____"
]
],
[
[
"You can also use [reproject_to_healpix](https://reproject.readthedocs.io/en/stable/api/reproject.reproject_to_healpix.html#reproject.reproject_to_healpix) to convert a regular map to a HEALPix array.",
"_____no_output_____"
],
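[
"For reference, a minimal sketch of that reverse direction (illustrative only; it reuses `array` and `wcs` from above, and the `nside` value is arbitrary -- check the reproject docs linked above for the full signature):\n\n```python\nfrom reproject import reproject_to_healpix\n\n# Convert the regular (array, wcs) map into a Galactic HEALPix array.\nhealpix_data, footprint = reproject_to_healpix((array, wcs), 'galactic', nside=64)\n```",
"_____no_output_____"
],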
[
"\n<section class=\"challenge panel panel-success\">\n<div class=\"panel-heading\">\n<h2><span class=\"fa fa-pencil\"></span> Challenge</h2>\n</div>\n\n\n<div class=\"panel-body\">\n\n<ol>\n<li>Reproject the HFI HEALPix map to the projection of the GAIA point source density map as well as the IRAS map that we used in previous tutorials.</li>\n<li>Visualize the results using WCSAxes and optionally the image normalization options.</li>\n</ol>\n\n</div>\n\n</section>\n",
"_____no_output_____"
]
],
[
[
"#1\nheader_gaia = fits.getheader('data/LMCDensFits1k.fits')\nheader_irsa = fits.getheader('data/ISSA_100_LMC.fits')\n\narray_gaia, _ = reproject_from_healpix('data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits',\n header_gaia)\narray_irsa, _ = reproject_from_healpix('data/HFI_SkyMap_857_2048_R1.10_nominal_ZodiCorrected_lowres.fits',\n header_irsa)",
"_____no_output_____"
],
[
"#2\nfrom astropy.visualization import simple_norm\nax = plt.subplot(projection=WCS(header_gaia))\nim =ax.imshow(array_gaia, cmap='plasma',\n norm=simple_norm(array_gaia, stretch='sqrt', percent=99.5))\nplt.colorbar(im)\nax.grid()\nax.set_xlabel('Galactic Longitude')\nax.set_ylabel('Galactic Latitude')",
"_____no_output_____"
]
],
[
[
"<center><i>This notebook was written by <a href=\"https://aperiosoftware.com/\">Aperio Software Ltd.</a> © 2019, and is licensed under a <a href=\"https://creativecommons.org/licenses/by/4.0/\">Creative Commons Attribution 4.0 International License (CC BY 4.0)</a></i></center>\n\n",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
]
] |
cbfaacf18eea41a3d598d46daa9da66bc2ec88d7
| 5,135 |
ipynb
|
Jupyter Notebook
|
notebooks/Datasets.ipynb
|
kevintheduu/bioinf-python
|
ed1209f52ae81b6fff4f115f5859b93d099cb8fe
|
[
"Apache-2.0"
] | 218 |
2015-07-06T09:26:19.000Z
|
2022-03-17T23:16:09.000Z
|
notebooks/Datasets.ipynb
|
kipkurui/bioinf-python
|
f2e05df2d7baa54f9fbbc246b2526ee57d8b451c
|
[
"Apache-2.0"
] | 9 |
2015-05-08T18:03:25.000Z
|
2019-09-27T20:24:12.000Z
|
notebooks/Datasets.ipynb
|
kipkurui/bioinf-python
|
f2e05df2d7baa54f9fbbc246b2526ee57d8b451c
|
[
"Apache-2.0"
] | 136 |
2015-05-08T17:58:48.000Z
|
2022-03-17T23:16:10.000Z
| 31.697531 | 196 | 0.627653 |
[
[
[
"# Datasets for the book\n\nHere we provide links to the datasets used in the book.\n\nImportant Notes:\n\n1. Note that these datasets are provided on external servers by third parties\n2. Due to security issues with github you will have to cut and paste FTP links (they are not provided as clickable URLs)",
"_____no_output_____"
],
[
"# Python and the Surrounding Software Ecology\n\n### Interfacing with R via rpy2\n\n* sequence.index\nPlease FTP from this URL(cut and paste)\n\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/historical_data/former_toplevel/sequence.index",
"_____no_output_____"
],
[
"# Next-generation Sequencing (NGS)\n\n## Working with modern sequence formats\n* SRR003265.filt.fastq.gz\nPlease FTP from this URL (cut and paste)\n\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/phase3/data/NA18489/sequence_read/SRR003265.filt.fastq.gz\n\n## Working with BAM files\n* NA18490_20_exome.bam\nPlease FTP from this URL (cut and paste)\n\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/phase3/data/NA18489/exome_alignment/NA18489.chrom20.ILLUMINA.bwa.YRI.exome.20121211.bam\n\n* NA18490_20_exome.bam.bai\nPlease FTP from this URL (cut and paste)\n\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/phase3/data/NA18489/exome_alignment/NA18489.chrom20.ILLUMINA.bwa.YRI.exome.20121211.bam.bai\n\n## Analyzing data in Variant Call Format (VCF)\n\n* tabix link:\nftp://ftp-trace.ncbi.nih.gov/1000genomes/ftp/release/20130502/supporting/vcf_with_sample_level_annotation/ALL.chr22.phase3_shapeit2_mvncall_integrated_v5_extra_anno.20130502.genotypes.vcf.gz",
"_____no_output_____"
],
[
"# Genomics\n\n### Working with high-quality reference genomes\n\n* [falciparum.fasta](http://plasmodb.org/common/downloads/release-9.3/Pfalciparum3D7/fasta/data/PlasmoDB-9.3_Pfalciparum3D7_Genome.fasta)\n\n### Dealing with low low-quality genome references\n\n\n* gambiae.fa.gz\nPlease FTP from this URL (cut and paste)\nftp://ftp.vectorbase.org/public_data/organism_data/agambiae/Genome/agambiae.CHROMOSOMES-PEST.AgamP3.fa.gz\n\n* [atroparvus.fa.gz](https://www.vectorbase.org/download/anopheles-atroparvus-ebroscaffoldsaatre1fagz)\n\n\n### Traversing genome annotations\n\n* [gambiae.gff3.gz](http://www.vectorbase.org/download/anopheles-gambiae-pestbasefeaturesagamp42gff3gz)\n",
"_____no_output_____"
],
[
"# PopGen\n\n### Managing datasets with PLINK\n\n* [hapmap.map.bz2](http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/hapmap3/plink_format/draft_2/hapmap3_r2_b36_fwd.consensus.qc.poly.map.bz2)\n* [hapmap.ped.bz2](http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/hapmap3/plink_format/draft_2/hapmap3_r2_b36_fwd.consensus.qc.poly.ped.bz2)\n* [relationships.txt](http://hapmap.ncbi.nlm.nih.gov/downloads/genotypes/hapmap3/plink_format/draft_2/relationships_w_pops_121708.txt)",
"_____no_output_____"
],
[
"# PDB\n\n### Parsing mmCIF files with Biopython\n\n* [1TUP.cif](http://www.rcsb.org/pdb/download/downloadFile.do?fileFormat=cif&compression=NO&structureId=1TUP)",
"_____no_output_____"
],
[
"# Python for Big genomics datasets\n\n### Setting the stage for high-performance computing\n\nThese are the exact same files as _Managing datasets with PLINK_ above\n\n### Programing with lazyness\n* SRR003265_1.filt.fastq.gz Please ftp from this URL (cut and paste):\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/phase3/data/NA18489/sequence_read/SRR003265_1.filt.fastq.gz\n\n\n* SRR003265_2.filt.fastq.gz Please ftp from this URL (cut and paste):\nftp://ftp.1000genomes.ebi.ac.uk/vol1/ftp/phase3/data/NA18489/sequence_read/SRR003265_2.filt.fastq.gz",
"_____no_output_____"
]
]
] |
[
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
cbfaada60cc1e3d32c7f42c67454f7a416525012
| 10,357 |
ipynb
|
Jupyter Notebook
|
wienerschnitzelgemeinschaft/src/Russ/resnet32q.ipynb
|
guitarmind/HPA-competition-solutions
|
547d53aaca148fdb5f4585526ad7364dfa47967d
|
[
"MIT"
] | null | null | null |
wienerschnitzelgemeinschaft/src/Russ/resnet32q.ipynb
|
guitarmind/HPA-competition-solutions
|
547d53aaca148fdb5f4585526ad7364dfa47967d
|
[
"MIT"
] | null | null | null |
wienerschnitzelgemeinschaft/src/Russ/resnet32q.ipynb
|
guitarmind/HPA-competition-solutions
|
547d53aaca148fdb5f4585526ad7364dfa47967d
|
[
"MIT"
] | null | null | null | 38.790262 | 125 | 0.459496 |
[
[
[
"import torch\nimport torch.nn as nn\nfrom torch.autograd import Variable",
"_____no_output_____"
],
[
"def conv3x3(in_, out):\n return nn.Conv2d(in_, out, 3, padding=1)\n\n\nclass ConvRelu(nn.Module):\n def __init__(self, in_, out):\n super().__init__()\n self.conv = conv3x3(in_, out)\n self.activation = nn.ReLU(inplace=True)\n\n def forward(self, x):\n x = self.conv(x)\n x = self.activation(x)\n return x\n\n\nclass NoOperation(nn.Module):\n def forward(self, x):\n return x\n\n\nclass DecoderBlock(nn.Module):\n def __init__(self, in_channels, middle_channels, out_channels):\n super().__init__()\n\n self.block = nn.Sequential(\n ConvRelu(in_channels, middle_channels),\n nn.ConvTranspose2d(middle_channels, out_channels, kernel_size=3, stride=2, padding=1, output_padding=1),\n nn.ReLU(inplace=True)\n )\n\n def forward(self, x):\n return self.block(x)\n\n\nclass DecoderBlockV2(nn.Module):\n def __init__(self, in_channels, middle_channels, out_channels, is_deconv=True,\n output_padding=0):\n super(DecoderBlockV2, self).__init__()\n self.in_channels = in_channels\n\n if is_deconv:\n \"\"\"\n Paramaters for Deconvolution were chosen to avoid artifacts, following\n link https://distill.pub/2016/deconv-checkerboard/\n \"\"\"\n\n self.block = nn.Sequential(\n ConvRelu(in_channels, middle_channels),\n nn.ConvTranspose2d(middle_channels, out_channels, kernel_size=4, stride=2,\n padding=1, output_padding=output_padding),\n nn.ReLU(inplace=True)\n )\n else:\n self.block = nn.Sequential(\n nn.Upsample(scale_factor=2, mode='bilinear'),\n ConvRelu(in_channels, middle_channels),\n ConvRelu(middle_channels, out_channels),\n )\n\n def forward(self, x):\n return self.block(x)\n\nclass Interpolate(nn.Module):\n def __init__(self, mode='nearest', scale_factor=2,\n align_corners=False, output_padding=0):\n super(Interpolate, self).__init__()\n self.interp = nn.functional.interpolate\n self.mode = mode\n self.scale_factor = scale_factor\n self.align_corners = align_corners\n self.pad = output_padding\n \n def forward(self, x):\n if self.mode in ['linear','bilinear','trilinear']:\n x = self.interp(x, mode=self.mode,\n scale_factor=self.scale_factor,\n align_corners=self.align_corners)\n else:\n x = self.interp(x, mode=self.mode,\n scale_factor=self.scale_factor)\n \n if self.pad > 0:\n x = nn.ZeroPad2d((0, self.pad, 0, self.pad))(x)\n return x\n\nclass DecoderBlockV3(nn.Module):\n def __init__(self, in_channels, middle_channels, out_channels,\n is_deconv=True, output_padding=0):\n super(DecoderBlockV3, self).__init__()\n self.in_channels = in_channels\n\n if is_deconv:\n \"\"\"\n Paramaters for Deconvolution were chosen to avoid artifacts, following\n link https://distill.pub/2016/deconv-checkerboard/\n \"\"\"\n\n self.block = nn.Sequential(\n nn.ConvTranspose2d(in_channels, middle_channels, kernel_size=4, stride=2,\n padding=1, output_padding=output_padding),\n ConvRelu(middle_channels, out_channels),\n )\n else:\n self.block = nn.Sequential(\n Interpolate(mode='nearest', scale_factor=2,\n output_padding=output_padding),\n # nn.Upsample(scale_factor=2, mode='bilinear'),\n ConvRelu(in_channels, middle_channels),\n ConvRelu(middle_channels, out_channels),\n )\n\n def forward(self, x):\n return self.block(x)\n \n\nclass AdaptiveConcatPool2d(nn.Module):\n def __init__(self, sz=None):\n super().__init__()\n sz = sz or (1,1)\n self.ap = nn.AdaptiveAvgPool2d(sz)\n self.mp = nn.AdaptiveMaxPool2d(sz)\n def forward(self, x): return torch.cat([self.mp(x), self.ap(x)], 1)\n\nclass Resnet(nn.Module):\n\n def __init__(self, num_classes, num_filters=32, \n pretrained=True, is_deconv=False):\n super().__init__()\n 
self.num_classes = num_classes\n \n# self.conv4to3 = nn.Conv2d(4, 3, 1)\n \n# self.encoder = pretrainedmodels.__dict__['se_resnext50_32x4d'](num_classes=1000,\n# pretrained='imagenet') \n \n # code removes final layer\n# layers = resnet34()\n layers = list(resnet34().children())[:-2]\n \n# # replace first convolutional layer by 4->64 while keeping corresponding weights\n# # and initializing new weights with zeros\n# # https://www.kaggle.com/iafoss/pretrained-resnet34-with-rgby-0-448-public-lb/notebook\n# w = layers[0].weight\n# layers[0] = nn.Conv2d(4,64,kernel_size=(7,7),stride=(2,2),padding=(3, 3),\n# bias=False)\n# layers[0].weight = torch.nn.Parameter(torch.cat((w,torch.zeros(64,1,7,7)),\n# dim=1))\n \n# layers += [AdaptiveConcatPool2d()]\n self.encoder = nn.Sequential(*layers)\n\n self.map_logits = nn.Conv2d(512, num_classes, kernel_size=(3,3), \n stride=(1,1), padding=1)\n\n# self.encoder = nn.Sequential(*list(self.encoder.children())[:-1])\n\n# self.pool = nn.MaxPool2d(2, 2)\n# self.convp = nn.Conv2d(1056, 512, 3)\n\n# self.csize = 1024 * 1 * 1\n# self.bn1 = nn.BatchNorm1d(1024)\n# self.do1 = nn.Dropout(p=0.5)\n# self.lin1 = nn.Linear(1024, 512)\n# self.act1 = nn.ReLU()\n# self.bn2 = nn.BatchNorm1d(512)\n# self.do2 = nn.Dropout(0.5)\n# self.lin2 = nn.Linear(512, num_classes)\n \n def forward(self, x):\n \n # set to True for debugging\n print_sizes = False\n if print_sizes: \n print('')\n print('x',x.shape)\n \n # print layer dictionary\n # print(self.encoder.features)\n \n# x = self.conv4to3(x)\n \n# m = self.encoder._modules\n# layer_names = list(m.keys())\n# mx = {}\n# for i,f in enumerate(m):\n# x = m[f](x)\n# mx[layer_names[i]] = x\n# if print_sizes:\n# if isinstance(x,tuple):\n# print(i,layer_names[i],x[0].size(),x[1].size())\n# else:\n# print(i,layer_names[i],x.size())\n# if layer_names[i]=='avg_pool': break\n \n x = self.encoder(x)\n if print_sizes: print('encoder',x.shape)\n\n x = self.map_logits(x)\n if print_sizes: print('map_logits',x.shape)\n\n# x = x.view(-1, self.csize)\n# if print_sizes: print('view',x.size())\n\n# x = self.bn1(x)\n# x = self.do1(x)\n# if print_sizes: print('do1',x.size())\n \n# x = self.lin1(x)\n# if print_sizes: print('lin1',x.size())\n# x = self.act1(x)\n# x = self.bn2(x)\n# x = self.do2(x) \n# x = self.lin2(x)\n# if print_sizes: print('lin2',x.shape)\n\n return x\n ",
"_____no_output_____"
]
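,
[
"# A hedged usage sketch (not in the original notebook): instantiate the model and\n# check the per-class logit map produced by the fully convolutional head.\n# num_classes=28 is illustrative (the HPA challenge has 28 labels).\nmodel = Resnet(num_classes=28)\nwith torch.no_grad():\n    out = model(torch.randn(2, 3, 256, 256))\nprint(out.shape)  # resnet34 downsamples by 32, so expect torch.Size([2, 28, 8, 8])",
"_____no_output_____"
]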
]
] |
[
"code"
] |
[
[
"code",
"code"
]
] |
cbfaaeb0f73c7df6646ff01eaba7a4fb8f188ca6
| 42,440 |
ipynb
|
Jupyter Notebook
|
01 Neural Networks and Deep Learning/Assigment/C1W2/Python Basics With Numpy/Python+Basics+With+Numpy.ipynb
|
polarisZhao/Coursera-Deep-Learning
|
5959f9cc8867f00bc6521d4b770f539344506105
|
[
"MIT"
] | 6 |
2017-09-19T19:14:42.000Z
|
2020-02-13T18:44:19.000Z
|
01 Neural Networks and Deep Learning/Assigment/C1W2/Python Basics With Numpy/Python+Basics+With+Numpy.ipynb
|
polarisZhao/Coursera-Deep-Learning
|
5959f9cc8867f00bc6521d4b770f539344506105
|
[
"MIT"
] | null | null | null |
01 Neural Networks and Deep Learning/Assigment/C1W2/Python Basics With Numpy/Python+Basics+With+Numpy.ipynb
|
polarisZhao/Coursera-Deep-Learning
|
5959f9cc8867f00bc6521d4b770f539344506105
|
[
"MIT"
] | 1 |
2018-11-08T02:19:36.000Z
|
2018-11-08T02:19:36.000Z
| 34.786885 | 2,276 | 0.513219 |
[
[
[
"# Table of Contents\n <p><div class=\"lev1 toc-item\"><a href=\"#Python-Basics-with-Numpy-(optional-assignment)\" data-toc-modified-id=\"Python-Basics-with-Numpy-(optional-assignment)-1\"><span class=\"toc-item-num\">1 </span>Python Basics with Numpy (optional assignment)</a></div><div class=\"lev2 toc-item\"><a href=\"#About-iPython-Notebooks\" data-toc-modified-id=\"About-iPython-Notebooks-11\"><span class=\"toc-item-num\">1.1 </span>About iPython Notebooks</a></div><div class=\"lev2 toc-item\"><a href=\"#1---Building-basic-functions-with-numpy\" data-toc-modified-id=\"1---Building-basic-functions-with-numpy-12\"><span class=\"toc-item-num\">1.2 </span>1 - Building basic functions with numpy</a></div><div class=\"lev3 toc-item\"><a href=\"#1.1---sigmoid-function,-np.exp()\" data-toc-modified-id=\"1.1---sigmoid-function,-np.exp()-121\"><span class=\"toc-item-num\">1.2.1 </span>1.1 - sigmoid function, np.exp()</a></div><div class=\"lev3 toc-item\"><a href=\"#1.2---Sigmoid-gradient\" data-toc-modified-id=\"1.2---Sigmoid-gradient-122\"><span class=\"toc-item-num\">1.2.2 </span>1.2 - Sigmoid gradient</a></div><div class=\"lev3 toc-item\"><a href=\"#1.3---Reshaping-arrays\" data-toc-modified-id=\"1.3---Reshaping-arrays-123\"><span class=\"toc-item-num\">1.2.3 </span>1.3 - Reshaping arrays</a></div><div class=\"lev3 toc-item\"><a href=\"#1.4---Normalizing-rows\" data-toc-modified-id=\"1.4---Normalizing-rows-124\"><span class=\"toc-item-num\">1.2.4 </span>1.4 - Normalizing rows</a></div><div class=\"lev3 toc-item\"><a href=\"#1.5---Broadcasting-and-the-softmax-function\" data-toc-modified-id=\"1.5---Broadcasting-and-the-softmax-function-125\"><span class=\"toc-item-num\">1.2.5 </span>1.5 - Broadcasting and the softmax function</a></div><div class=\"lev2 toc-item\"><a href=\"#2)-Vectorization\" data-toc-modified-id=\"2)-Vectorization-13\"><span class=\"toc-item-num\">1.3 </span>2) Vectorization</a></div><div class=\"lev3 toc-item\"><a href=\"#2.1-Implement-the-L1-and-L2-loss-functions\" data-toc-modified-id=\"2.1-Implement-the-L1-and-L2-loss-functions-131\"><span class=\"toc-item-num\">1.3.1 </span>2.1 Implement the L1 and L2 loss functions</a></div>",
"_____no_output_____"
],
[
"# Python Basics with Numpy (optional assignment)\n\nWelcome to your first assignment. This exercise gives you a brief introduction to Python. Even if you've used Python before, this will help familiarize you with functions we'll need. \n\n**Instructions:**\n- You will be using Python 3.\n- Avoid using for-loops and while-loops, unless you are explicitly told to do so.\n- Do not modify the (# GRADED FUNCTION [function name]) comment in some cells. Your work would not be graded if you change this. Each cell containing that comment should only contain one function.\n- After coding your function, run the cell right below it to check if your result is correct.\n\n**After this assignment you will:**\n- Be able to use iPython Notebooks\n- Be able to use numpy functions and numpy matrix/vector operations\n- Understand the concept of \"broadcasting\"\n- Be able to vectorize code\n\nLet's get started!",
"_____no_output_____"
],
[
"## About iPython Notebooks ##\n\niPython Notebooks are interactive coding environments embedded in a webpage. You will be using iPython notebooks in this class. You only need to write code between the ### START CODE HERE ### and ### END CODE HERE ### comments. After writing your code, you can run the cell by either pressing \"SHIFT\"+\"ENTER\" or by clicking on \"Run Cell\" (denoted by a play symbol) in the upper bar of the notebook. \n\nWe will often specify \"(โ X lines of code)\" in the comments to tell you about how much code you need to write. It is just a rough estimate, so don't feel bad if your code is longer or shorter.\n\n**Exercise**: Set test to `\"Hello World\"` in the cell below to print \"Hello World\" and run the two cells below.",
"_____no_output_____"
]
],
[
[
"### START CODE HERE ### (โ 1 line of code)\ntest = \"Hello World\"\n### END CODE HERE ###",
"_____no_output_____"
],
[
"print (\"test: \" + test)",
"test: Hello World\n"
]
],
[
[
"**Expected output**:\ntest: Hello World",
"_____no_output_____"
],
[
"<font color='blue'>\n**What you need to remember**:\n- Run your cells using SHIFT+ENTER (or \"Run cell\")\n- Write code in the designated areas using Python 3 only\n- Do not modify the code outside of the designated areas",
"_____no_output_____"
],
[
"## 1 - Building basic functions with numpy ##\n\nNumpy is the main package for scientific computing in Python. It is maintained by a large community (www.numpy.org). In this exercise you will learn several key numpy functions such as np.exp, np.log, and np.reshape. You will need to know how to use these functions for future assignments.\n\n### 1.1 - sigmoid function, np.exp() ###\n\nBefore using np.exp(), you will use math.exp() to implement the sigmoid function. You will then see why np.exp() is preferable to math.exp().\n\n**Exercise**: Build a function that returns the sigmoid of a real number x. Use math.exp(x) for the exponential function.\n\n**Reminder**:\n$sigmoid(x) = \\frac{1}{1+e^{-x}}$ is sometimes also known as the logistic function. It is a non-linear function used not only in Machine Learning (Logistic Regression), but also in Deep Learning.\n\n<img src=\"images/Sigmoid.png\" style=\"width:500px;height:228px;\">\n\nTo refer to a function belonging to a specific package you could call it using package_name.function(). Run the code below to see an example with math.exp().",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: basic_sigmoid\n\nimport math\n\ndef basic_sigmoid(x):\n \"\"\"\n Compute sigmoid of x.\n\n Arguments:\n x -- A scalar\n\n Return:\n s -- sigmoid(x)\n \"\"\"\n \n ### START CODE HERE ### (โ 1 line of code)\n s = 1 / (1 + math.exp(-x))\n ### END CODE HERE ###\n \n return s",
"_____no_output_____"
],
[
"basic_sigmoid(3)",
"_____no_output_____"
]
],
[
[
"**Expected Output**: \n<table style = \"width:40%\">\n <tr>\n <td>** basic_sigmoid(3) **</td> \n <td>0.9525741268224334 </td> \n </tr>\n\n</table>",
"_____no_output_____"
],
[
"Actually, we rarely use the \"math\" library in deep learning because the inputs of the functions are real numbers. In deep learning we mostly use matrices and vectors. This is why numpy is more useful. ",
"_____no_output_____"
]
],
[
[
"### One reason why we use \"numpy\" instead of \"math\" in Deep Learning ###\nx = [1, 2, 3]\n# basic_sigmoid(x) # you will see this give an error when you run it, because x is a vector.",
"_____no_output_____"
]
],
[
[
"In fact, if $ x = (x_1, x_2, ..., x_n)$ is a row vector then $np.exp(x)$ will apply the exponential function to every element of x. The output will thus be: $np.exp(x) = (e^{x_1}, e^{x_2}, ..., e^{x_n})$",
"_____no_output_____"
]
],
[
[
"import numpy as np\n\n# example of np.exp\nx = np.array([1, 2, 3])\nprint(np.exp(x)) # result is (exp(1), exp(2), exp(3))",
"[ 2.71828183 7.3890561 20.08553692]\n"
]
],
[
[
"Furthermore, if x is a vector, then a Python operation such as $s = x + 3$ or $s = \\frac{1}{x}$ will output s as a vector of the same size as x.",
"_____no_output_____"
]
],
[
[
"# example of vector operation\nx = np.array([1, 2, 3])\nprint (x + 3)",
"[4 5 6]\n"
]
],
[
[
"Any time you need more info on a numpy function, we encourage you to look at [the official documentation](https://docs.scipy.org/doc/numpy-1.10.1/reference/generated/numpy.exp.html). \n\nYou can also create a new cell in the notebook and write `np.exp?` (for example) to get quick access to the documentation.\n\n**Exercise**: Implement the sigmoid function using numpy. \n\n**Instructions**: x could now be either a real number, a vector, or a matrix. The data structures we use in numpy to represent these shapes (vectors, matrices...) are called numpy arrays. You don't need to know more for now.\n$$ \\text{For } x \\in \\mathbb{R}^n \\text{, } sigmoid(x) = sigmoid\\begin{pmatrix}\n x_1 \\\\\n x_2 \\\\\n ... \\\\\n x_n \\\\\n\\end{pmatrix} = \\begin{pmatrix}\n \\frac{1}{1+e^{-x_1}} \\\\\n \\frac{1}{1+e^{-x_2}} \\\\\n ... \\\\\n \\frac{1}{1+e^{-x_n}} \\\\\n\\end{pmatrix}\\tag{1} $$",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: sigmoid\n\nimport numpy as np # this means you can access numpy functions by writing np.function() instead of numpy.function()\n\ndef sigmoid(x):\n \"\"\"\n Compute the sigmoid of x\n\n Arguments:\n x -- A scalar or numpy array of any size\n\n Return:\n s -- sigmoid(x)\n \"\"\"\n \n ### START CODE HERE ### (โ 1 line of code)\n s = 1. / (1 + np.exp(-x))\n ### END CODE HERE ###\n \n return s",
"_____no_output_____"
],
[
"x = np.array([1, 2, 3])\nsigmoid(x)",
"_____no_output_____"
]
],
[
[
"**Expected Output**: \n<table>\n <tr> \n <td> **sigmoid([1,2,3])**</td> \n <td> array([ 0.73105858, 0.88079708, 0.95257413]) </td> \n </tr>\n</table> \n",
"_____no_output_____"
],
[
"### 1.2 - Sigmoid gradient\n\nAs you've seen in lecture, you will need to compute gradients to optimize loss functions using backpropagation. Let's code your first gradient function.\n\n**Exercise**: Implement the function sigmoid_grad() to compute the gradient of the sigmoid function with respect to its input x. The formula is: $$sigmoid\\_derivative(x) = \\sigma'(x) = \\sigma(x) (1 - \\sigma(x))\\tag{2}$$\nYou often code this function in two steps:\n1. Set s to be the sigmoid of x. You might find your sigmoid(x) function useful.\n2. Compute $\\sigma'(x) = s(1-s)$",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: sigmoid_derivative\n\ndef sigmoid_derivative(x):\n \"\"\"\n Compute the gradient (also called the slope or derivative) of the sigmoid function with respect to its input x.\n You can store the output of the sigmoid function into variables and then use it to calculate the gradient.\n \n Arguments:\n x -- A scalar or numpy array\n\n Return:\n ds -- Your computed gradient.\n \"\"\"\n \n ### START CODE HERE ### (โ 2 lines of code)\n s = sigmoid(x)\n ds = s * (1-s)\n ### END CODE HERE ###\n \n return ds",
"_____no_output_____"
],
[
"x = np.array([1, 2, 3])\nprint (\"sigmoid_derivative(x) = \" + str(sigmoid_derivative(x)))",
"sigmoid_derivative(x) = [ 0.19661193 0.10499359 0.04517666]\n"
]
],
[
[
"**Expected Output**: \n\n\n<table>\n <tr> \n <td> **sigmoid_derivative([1,2,3])**</td> \n <td> [ 0.19661193 0.10499359 0.04517666] </td> \n </tr>\n</table> \n\n",
"_____no_output_____"
],
[
"### 1.3 - Reshaping arrays ###\n\nTwo common numpy functions used in deep learning are [np.shape](https://docs.scipy.org/doc/numpy/reference/generated/numpy.ndarray.shape.html) and [np.reshape()](https://docs.scipy.org/doc/numpy/reference/generated/numpy.reshape.html). \n- X.shape is used to get the shape (dimension) of a matrix/vector X. \n- X.reshape(...) is used to reshape X into some other dimension. \n\nFor example, in computer science, an image is represented by a 3D array of shape $(length, height, depth = 3)$. However, when you read an image as the input of an algorithm you convert it to a vector of shape $(length*height*3, 1)$. In other words, you \"unroll\", or reshape, the 3D array into a 1D vector.\n\n<img src=\"images/image2vector_kiank.png\" style=\"width:500px;height:300;\">\n\n**Exercise**: Implement `image2vector()` that takes an input of shape (length, height, 3) and returns a vector of shape (length\\*height\\*3, 1). For example, if you would like to reshape an array v of shape (a, b, c) into a vector of shape (a*b,c) you would do:\n``` python\nv = v.reshape((v.shape[0]*v.shape[1], v.shape[2])) # v.shape[0] = a ; v.shape[1] = b ; v.shape[2] = c\n```\n- Please don't hardcode the dimensions of image as a constant. Instead look up the quantities you need with `image.shape[0]`, etc. ",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: image2vector\ndef image2vector(image):\n \"\"\"\n Argument:\n image -- a numpy array of shape (length, height, depth)\n \n Returns:\n v -- a vector of shape (length*height*depth, 1)\n \"\"\"\n \n ### START CODE HERE ### (โ 1 line of code)\n v = image.reshape(image.size, 1)\n ### END CODE HERE ###\n \n return v",
"_____no_output_____"
],
[
"# This is a 3 by 3 by 2 array, typically images will be (num_px_x, num_px_y,3) where 3 represents the RGB values\nimage = np.array([[[ 0.67826139, 0.29380381],\n [ 0.90714982, 0.52835647],\n [ 0.4215251 , 0.45017551]],\n\n [[ 0.92814219, 0.96677647],\n [ 0.85304703, 0.52351845],\n [ 0.19981397, 0.27417313]],\n\n [[ 0.60659855, 0.00533165],\n [ 0.10820313, 0.49978937],\n [ 0.34144279, 0.94630077]]])\n\nprint (\"image2vector(image) = \" + str(image2vector(image)))",
"image2vector(image) = [[ 0.67826139]\n [ 0.29380381]\n [ 0.90714982]\n [ 0.52835647]\n [ 0.4215251 ]\n [ 0.45017551]\n [ 0.92814219]\n [ 0.96677647]\n [ 0.85304703]\n [ 0.52351845]\n [ 0.19981397]\n [ 0.27417313]\n [ 0.60659855]\n [ 0.00533165]\n [ 0.10820313]\n [ 0.49978937]\n [ 0.34144279]\n [ 0.94630077]]\n"
]
],
[
[
"**Expected Output**: \n\n\n<table style=\"width:100%\">\n <tr> \n <td> **image2vector(image)** </td> \n <td> [[ 0.67826139]\n [ 0.29380381]\n [ 0.90714982]\n [ 0.52835647]\n [ 0.4215251 ]\n [ 0.45017551]\n [ 0.92814219]\n [ 0.96677647]\n [ 0.85304703]\n [ 0.52351845]\n [ 0.19981397]\n [ 0.27417313]\n [ 0.60659855]\n [ 0.00533165]\n [ 0.10820313]\n [ 0.49978937]\n [ 0.34144279]\n [ 0.94630077]]</td> \n </tr>\n \n \n</table>",
"_____no_output_____"
],
[
"### 1.4 - Normalizing rows\n\nAnother common technique we use in Machine Learning and Deep Learning is to normalize our data. It often leads to a better performance because gradient descent converges faster after normalization. Here, by normalization we mean changing x to $ \\frac{x}{\\| x\\|} $ (dividing each row vector of x by its norm).\n\nFor example, if $$x = \n\\begin{bmatrix}\n 0 & 3 & 4 \\\\\n 2 & 6 & 4 \\\\\n\\end{bmatrix}\\tag{3}$$ then $$\\| x\\| = np.linalg.norm(x, axis = 1, keepdims = True) = \\begin{bmatrix}\n 5 \\\\\n \\sqrt{56} \\\\\n\\end{bmatrix}\\tag{4} $$and $$ x\\_normalized = \\frac{x}{\\| x\\|} = \\begin{bmatrix}\n 0 & \\frac{3}{5} & \\frac{4}{5} \\\\\n \\frac{2}{\\sqrt{56}} & \\frac{6}{\\sqrt{56}} & \\frac{4}{\\sqrt{56}} \\\\\n\\end{bmatrix}\\tag{5}$$ Note that you can divide matrices of different sizes and it works fine: this is called broadcasting and you're going to learn about it in part 5.\n\n\n**Exercise**: Implement normalizeRows() to normalize the rows of a matrix. After applying this function to an input matrix x, each row of x should be a vector of unit length (meaning length 1).",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: normalizeRows\n\ndef normalizeRows(x):\n \"\"\"\n Implement a function that normalizes each row of the matrix x (to have unit length).\n \n Argument:\n x -- A numpy matrix of shape (n, m)\n \n Returns:\n x -- The normalized (by row) numpy matrix. You are allowed to modify x.\n \"\"\"\n \n ### START CODE HERE ### (โ 2 lines of code)\n # Compute x_norm as the norm 2 of x. Use np.linalg.norm(..., ord = 2, axis = ..., keepdims = True)\n x_norm = np.linalg.norm(x, ord=2, axis=1, keepdims=True)\n \n # Divide x by its norm.\n x = x / x_norm\n ### END CODE HERE ###\n\n return x",
"_____no_output_____"
],
[
"x = np.array([\n [0, 3, 4],\n [2, 6, 4]])\nprint(\"normalizeRows(x) = \" + str(normalizeRows(x)))",
"normalizeRows(x) = [[ 0. 0.6 0.8 ]\n [ 0.26726124 0.80178373 0.53452248]]\n"
]
],
[
[
"**Expected Output**: \n\n<table style=\"width:60%\">\n\n <tr> \n <td> **normalizeRows(x)** </td> \n <td> [[ 0. 0.6 0.8 ]\n [ 0.13736056 0.82416338 0.54944226]]</td> \n </tr>\n \n \n</table>",
"_____no_output_____"
],
[
"**Note**:\nIn normalizeRows(), you can try to print the shapes of x_norm and x, and then rerun the assessment. You'll find out that they have different shapes. This is normal given that x_norm takes the norm of each row of x. So x_norm has the same number of rows but only 1 column. So how did it work when you divided x by x_norm? This is called broadcasting and we'll talk about it now! ",
"_____no_output_____"
],
[
"### 1.5 - Broadcasting and the softmax function ####\nA very important concept to understand in numpy is \"broadcasting\". It is very useful for performing mathematical operations between arrays of different shapes. For the full details on broadcasting, you can read the official [broadcasting documentation](http://docs.scipy.org/doc/numpy/user/basics.broadcasting.html).",
"_____no_output_____"
],
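[
"For instance (a small illustrative example, not part of the graded code), dividing a (2, 3) matrix by a (2, 1) column broadcasts the column across each row:\n\n```python\nA = np.array([[1., 2., 3.],\n              [4., 5., 6.]])\nb = np.array([[1.],\n              [10.]])  # shape (2, 1) is stretched to (2, 3)\nprint(A / b)           # [[1.  2.  3. ]\n                       #  [0.4 0.5 0.6]]\n```",
"_____no_output_____"
],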
[
"**Exercise**: Implement a softmax function using numpy. You can think of softmax as a normalizing function used when your algorithm needs to classify two or more classes. You will learn more about softmax in the second course of this specialization.\n\n**Instructions**:\n- $ \\text{for } x \\in \\mathbb{R}^{1\\times n} \\text{, } softmax(x) = softmax(\\begin{bmatrix}\n x_1 &&\n x_2 &&\n ... &&\n x_n \n\\end{bmatrix}) = \\begin{bmatrix}\n \\frac{e^{x_1}}{\\sum_{j}e^{x_j}} &&\n \\frac{e^{x_2}}{\\sum_{j}e^{x_j}} &&\n ... &&\n \\frac{e^{x_n}}{\\sum_{j}e^{x_j}} \n\\end{bmatrix} $ \n\n- $\\text{for a matrix } x \\in \\mathbb{R}^{m \\times n} \\text{, $x_{ij}$ maps to the element in the $i^{th}$ row and $j^{th}$ column of $x$, thus we have: }$ $$softmax(x) = softmax\\begin{bmatrix}\n x_{11} & x_{12} & x_{13} & \\dots & x_{1n} \\\\\n x_{21} & x_{22} & x_{23} & \\dots & x_{2n} \\\\\n \\vdots & \\vdots & \\vdots & \\ddots & \\vdots \\\\\n x_{m1} & x_{m2} & x_{m3} & \\dots & x_{mn}\n\\end{bmatrix} = \\begin{bmatrix}\n \\frac{e^{x_{11}}}{\\sum_{j}e^{x_{1j}}} & \\frac{e^{x_{12}}}{\\sum_{j}e^{x_{1j}}} & \\frac{e^{x_{13}}}{\\sum_{j}e^{x_{1j}}} & \\dots & \\frac{e^{x_{1n}}}{\\sum_{j}e^{x_{1j}}} \\\\\n \\frac{e^{x_{21}}}{\\sum_{j}e^{x_{2j}}} & \\frac{e^{x_{22}}}{\\sum_{j}e^{x_{2j}}} & \\frac{e^{x_{23}}}{\\sum_{j}e^{x_{2j}}} & \\dots & \\frac{e^{x_{2n}}}{\\sum_{j}e^{x_{2j}}} \\\\\n \\vdots & \\vdots & \\vdots & \\ddots & \\vdots \\\\\n \\frac{e^{x_{m1}}}{\\sum_{j}e^{x_{mj}}} & \\frac{e^{x_{m2}}}{\\sum_{j}e^{x_{mj}}} & \\frac{e^{x_{m3}}}{\\sum_{j}e^{x_{mj}}} & \\dots & \\frac{e^{x_{mn}}}{\\sum_{j}e^{x_{mj}}}\n\\end{bmatrix} = \\begin{pmatrix}\n softmax\\text{(first row of x)} \\\\\n softmax\\text{(second row of x)} \\\\\n ... \\\\\n softmax\\text{(last row of x)} \\\\\n\\end{pmatrix} $$",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: softmax\n\ndef softmax(x):\n \"\"\"Calculates the softmax for each row of the input x.\n\n Your code should work for a row vector and also for matrices of shape (n, m).\n\n Argument:\n x -- A numpy matrix of shape (n,m)\n\n Returns:\n s -- A numpy matrix equal to the softmax of x, of shape (n,m)\n \"\"\"\n \n ### START CODE HERE ### (โ 3 lines of code)\n # Apply exp() element-wise to x. Use np.exp(...).\n x_exp = np.exp(x)\n\n # Create a vector x_sum that sums each row of x_exp. Use np.sum(..., axis = 1, keepdims = True).\n x_sum = np.sum(x_exp, axis=1, keepdims=True)\n \n # Compute softmax(x) by dividing x_exp by x_sum. It should automatically use numpy broadcasting.\n s = x_exp / x_sum\n\n ### END CODE HERE ###\n \n return s",
"_____no_output_____"
],
[
"x = np.array([\n [9, 2, 5, 0, 0],\n [7, 5, 0, 0 ,0]])\nprint(\"softmax(x) = \" + str(softmax(x)))",
"softmax(x) = [[ 9.80897665e-01 8.94462891e-04 1.79657674e-02 1.21052389e-04\n 1.21052389e-04]\n [ 8.78679856e-01 1.18916387e-01 8.01252314e-04 8.01252314e-04\n 8.01252314e-04]]\n"
]
],
[
[
"**Expected Output**:\n\n<table style=\"width:60%\">\n\n <tr> \n <td> **softmax(x)** </td> \n <td> [[ 9.80897665e-01 8.94462891e-04 1.79657674e-02 1.21052389e-04\n 1.21052389e-04]\n [ 8.78679856e-01 1.18916387e-01 8.01252314e-04 8.01252314e-04\n 8.01252314e-04]]</td> \n </tr>\n</table>\n",
"_____no_output_____"
],
[
"**Note**:\n- If you print the shapes of x_exp, x_sum and s above and rerun the assessment cell, you will see that x_sum is of shape (2,1) while x_exp and s are of shape (2,5). **x_exp/x_sum** works due to python broadcasting.\n\nCongratulations! You now have a pretty good understanding of python numpy and have implemented a few useful functions that you will be using in deep learning.",
"_____no_output_____"
],
[
"<font color='blue'>\n**What you need to remember:**\n- np.exp(x) works for any np.array x and applies the exponential function to every coordinate\n- the sigmoid function and its gradient\n- image2vector is commonly used in deep learning\n- np.reshape is widely used. In the future, you'll see that keeping your matrix/vector dimensions straight will go toward eliminating a lot of bugs. \n- numpy has efficient built-in functions\n- broadcasting is extremely useful",
"_____no_output_____"
],
[
"## 2) Vectorization",
"_____no_output_____"
],
[
"\nIn deep learning, you deal with very large datasets. Hence, a non-computationally-optimal function can become a huge bottleneck in your algorithm and can result in a model that takes ages to run. To make sure that your code is computationally efficient, you will use vectorization. For example, try to tell the difference between the following implementations of the dot/outer/elementwise product.",
"_____no_output_____"
]
],
[
[
"import time\n\nx1 = [9, 2, 5, 0, 0, 7, 5, 0, 0, 0, 9, 2, 5, 0, 0]\nx2 = [9, 2, 2, 9, 0, 9, 2, 5, 0, 0, 9, 2, 5, 0, 0]\n\n### CLASSIC DOT PRODUCT OF VECTORS IMPLEMENTATION ###\ntic = time.process_time()\ndot = 0\nfor i in range(len(x1)):\n dot+= x1[i]*x2[i]\ntoc = time.process_time()\nprint (\"dot = \" + str(dot) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### CLASSIC OUTER PRODUCT IMPLEMENTATION ###\ntic = time.process_time()\nouter = np.zeros((len(x1),len(x2))) # we create a len(x1)*len(x2) matrix with only zeros\nfor i in range(len(x1)):\n for j in range(len(x2)):\n outer[i,j] = x1[i]*x2[j]\ntoc = time.process_time()\nprint (\"outer = \" + str(outer) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### CLASSIC ELEMENTWISE IMPLEMENTATION ###\ntic = time.process_time()\nmul = np.zeros(len(x1))\nfor i in range(len(x1)):\n mul[i] = x1[i]*x2[i]\ntoc = time.process_time()\nprint (\"elementwise multiplication = \" + str(mul) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### CLASSIC GENERAL DOT PRODUCT IMPLEMENTATION ###\nW = np.random.rand(3,len(x1)) # Random 3*len(x1) numpy array\ntic = time.process_time()\ngdot = np.zeros(W.shape[0])\nfor i in range(W.shape[0]):\n for j in range(len(x1)):\n gdot[i] += W[i,j]*x1[j]\ntoc = time.process_time()\nprint (\"gdot = \" + str(gdot) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")",
"dot = 278\n ----- Computation time = 0.10399999999988196ms\nouter = [[ 81. 18. 18. 81. 0. 81. 18. 45. 0. 0. 81. 18. 45. 0.\n 0.]\n [ 18. 4. 4. 18. 0. 18. 4. 10. 0. 0. 18. 4. 10. 0.\n 0.]\n [ 45. 10. 10. 45. 0. 45. 10. 25. 0. 0. 45. 10. 25. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 63. 14. 14. 63. 0. 63. 14. 35. 0. 0. 63. 14. 35. 0.\n 0.]\n [ 45. 10. 10. 45. 0. 45. 10. 25. 0. 0. 45. 10. 25. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 81. 18. 18. 81. 0. 81. 18. 45. 0. 0. 81. 18. 45. 0.\n 0.]\n [ 18. 4. 4. 18. 0. 18. 4. 10. 0. 0. 18. 4. 10. 0.\n 0.]\n [ 45. 10. 10. 45. 0. 45. 10. 25. 0. 0. 45. 10. 25. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]\n [ 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0. 0.\n 0.]]\n ----- Computation time = 0.5980000000000985ms\nelementwise multiplication = [ 81. 4. 10. 0. 0. 63. 10. 0. 0. 0. 81. 4. 25. 0. 0.]\n ----- Computation time = 0.3109999999999502ms\ngdot = [ 33.11438492 20.4506171 26.83302859]\n ----- Computation time = 0.484000000000151ms\n"
],
[
"x1 = [9, 2, 5, 0, 0, 7, 5, 0, 0, 0, 9, 2, 5, 0, 0]\nx2 = [9, 2, 2, 9, 0, 9, 2, 5, 0, 0, 9, 2, 5, 0, 0]\n\n### VECTORIZED DOT PRODUCT OF VECTORS ###\ntic = time.process_time()\ndot = np.dot(x1,x2)\ntoc = time.process_time()\nprint (\"dot = \" + str(dot) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### VECTORIZED OUTER PRODUCT ###\ntic = time.process_time()\nouter = np.outer(x1,x2)\ntoc = time.process_time()\nprint (\"outer = \" + str(outer) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### VECTORIZED ELEMENTWISE MULTIPLICATION ###\ntic = time.process_time()\nmul = np.multiply(x1,x2)\ntoc = time.process_time()\nprint (\"elementwise multiplication = \" + str(mul) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")\n\n### VECTORIZED GENERAL DOT PRODUCT ###\ntic = time.process_time()\ndot = np.dot(W,x1)\ntoc = time.process_time()\nprint (\"gdot = \" + str(dot) + \"\\n ----- Computation time = \" + str(1000*(toc - tic)) + \"ms\")",
"dot = 278\n ----- Computation time = 0.1129999999998077ms\nouter = [[81 18 18 81 0 81 18 45 0 0 81 18 45 0 0]\n [18 4 4 18 0 18 4 10 0 0 18 4 10 0 0]\n [45 10 10 45 0 45 10 25 0 0 45 10 25 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [63 14 14 63 0 63 14 35 0 0 63 14 35 0 0]\n [45 10 10 45 0 45 10 25 0 0 45 10 25 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [81 18 18 81 0 81 18 45 0 0 81 18 45 0 0]\n [18 4 4 18 0 18 4 10 0 0 18 4 10 0 0]\n [45 10 10 45 0 45 10 25 0 0 45 10 25 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]\n [ 0 0 0 0 0 0 0 0 0 0 0 0 0 0 0]]\n ----- Computation time = 0.2740000000001075ms\nelementwise multiplication = [81 4 10 0 0 63 10 0 0 0 81 4 25 0 0]\n ----- Computation time = 0.07899999999994023ms\ngdot = [ 33.11438492 20.4506171 26.83302859]\n ----- Computation time = 14.542000000000055ms\n"
]
],
[
[
"As you may have noticed, the vectorized implementation is much cleaner and more efficient. For bigger vectors/matrices, the differences in running time become even bigger. \n\n**Note** that `np.dot()` performs a matrix-matrix or matrix-vector multiplication. This is different from `np.multiply()` and the `*` operator (which is equivalent to `.*` in Matlab/Octave), which performs an element-wise multiplication.",
"_____no_output_____"
],
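[
"A two-line illustration of that difference (illustrative only):\n\n```python\na = np.array([[1, 2], [3, 4]])\nprint(a * a)         # element-wise:  [[ 1  4] [ 9 16]]\nprint(np.dot(a, a))  # matrix product: [[ 7 10] [15 22]]\n```",
"_____no_output_____"
],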
[
"### 2.1 Implement the L1 and L2 loss functions\n\n**Exercise**: Implement the numpy vectorized version of the L1 loss. You may find the function abs(x) (absolute value of x) useful.\n\n**Reminder**:\n- The loss is used to evaluate the performance of your model. The bigger your loss is, the more different your predictions ($ \\hat{y} $) are from the true values ($y$). In deep learning, you use optimization algorithms like Gradient Descent to train your model and to minimize the cost.\n- L1 loss is defined as:\n$$\\begin{align*} & L_1(\\hat{y}, y) = \\sum_{i=0}^m|y^{(i)} - \\hat{y}^{(i)}| \\end{align*}\\tag{6}$$",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: L1\n\ndef L1(yhat, y):\n \"\"\"\n Arguments:\n yhat -- vector of size m (predicted labels)\n y -- vector of size m (true labels)\n \n Returns:\n loss -- the value of the L1 loss function defined above\n \"\"\"\n \n ### START CODE HERE ### (โ 1 line of code)\n loss = np.sum(np.abs((y - yhat)))\n ### END CODE HERE ###\n \n return loss",
"_____no_output_____"
],
[
"yhat = np.array([.9, 0.2, 0.1, .4, .9])\ny = np.array([1, 0, 0, 1, 1])\nprint(\"L1 = \" + str(L1(yhat,y)))",
"L1 = 1.1\n"
]
],
[
[
"**Expected Output**:\n\n<table style=\"width:20%\">\n\n <tr> \n <td> **L1** </td> \n <td> 1.1 </td> \n </tr>\n</table>\n",
"_____no_output_____"
],
[
"**Exercise**: Implement the numpy vectorized version of the L2 loss. There are several way of implementing the L2 loss but you may find the function np.dot() useful. As a reminder, if $x = [x_1, x_2, ..., x_n]$, then `np.dot(x,x)` = $\\sum_{j=0}^n x_j^{2}$. \n\n- L2 loss is defined as $$\\begin{align*} & L_2(\\hat{y},y) = \\sum_{i=0}^m(y^{(i)} - \\hat{y}^{(i)})^2 \\end{align*}\\tag{7}$$",
"_____no_output_____"
]
],
[
[
"# GRADED FUNCTION: L2\n\ndef L2(yhat, y):\n \"\"\"\n Arguments:\n yhat -- vector of size m (predicted labels)\n y -- vector of size m (true labels)\n \n Returns:\n loss -- the value of the L2 loss function defined above\n \"\"\"\n \n ### START CODE HERE ### (โ 1 line of code)\n loss = np.sum(np.square(yhat - y))\n ### END CODE HERE ###\n \n return loss",
"_____no_output_____"
],
[
"yhat = np.array([.9, 0.2, 0.1, .4, .9])\ny = np.array([1, 0, 0, 1, 1])\nprint(\"L2 = \" + str(L2(yhat,y)))",
"L2 = 0.43\n"
]
],
[
[
"**Expected Output**: \n<table style=\"width:20%\">\n <tr> \n <td> **L2** </td> \n <td> 0.43 </td> \n </tr>\n</table>",
"_____no_output_____"
],
[
"Congratulations on completing this assignment. We hope that this little warm-up exercise helps you in the future assignments, which will be more exciting and interesting!",
"_____no_output_____"
],
[
"<font color='blue'>\n**What to remember:**\n- Vectorization is very important in deep learning. It provides computational efficiency and clarity.\n- You have reviewed the L1 and L2 loss.\n- You are familiar with many numpy functions such as np.sum, np.dot, np.multiply, np.maximum, etc...",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
]
] |
cbfaaece380612e17afe0cf1fc3a6c2ac7269c54
| 437 |
ipynb
|
Jupyter Notebook
|
d2l-en/chapter_references/zreferences.ipynb
|
mru4913/Dive-into-Deep-Learning
|
bcd16ac602f011292bd1d5540ef3833cd3fd7c72
|
[
"MIT"
] | null | null | null |
d2l-en/chapter_references/zreferences.ipynb
|
mru4913/Dive-into-Deep-Learning
|
bcd16ac602f011292bd1d5540ef3833cd3fd7c72
|
[
"MIT"
] | 15 |
2019-10-10T13:01:15.000Z
|
2022-02-10T00:21:14.000Z
|
chapter_references/zreferences.ipynb
|
pedro-abundio-wang/d2l-numpy
|
59e3e536f81d355f10a99a4e936d2b3e68201f1d
|
[
"Apache-2.0"
] | null | null | null | 13.65625 | 32 | 0.398169 |
[
[
[
"```eval_rst\n\n.. only:: html\n\n References\n ==========\n\n```\n",
"_____no_output_____"
],
[
":bibliography:`../d2l.bib`",
"_____no_output_____"
]
]
] |
[
"markdown"
] |
[
[
"markdown",
"markdown"
]
] |
cbfab5a2d294575a6a42c4b4c1b0fec0db443f3b
| 1,436 |
ipynb
|
Jupyter Notebook
|
MastDashboard.ipynb
|
dr-rodriguez/MovingMast
|
ebdc9bdef0a1fe03faf0fd2ca72dd70fbb4be020
|
[
"MIT"
] | 2 |
2020-09-02T03:19:13.000Z
|
2020-09-03T19:28:38.000Z
|
MastDashboard.ipynb
|
dr-rodriguez/MovingMast
|
ebdc9bdef0a1fe03faf0fd2ca72dd70fbb4be020
|
[
"MIT"
] | 34 |
2020-09-03T12:00:39.000Z
|
2020-09-16T20:32:10.000Z
|
MastDashboard.ipynb
|
dr-rodriguez/MovingMast
|
ebdc9bdef0a1fe03faf0fd2ca72dd70fbb4be020
|
[
"MIT"
] | 7 |
2020-09-01T15:52:05.000Z
|
2020-09-03T20:07:56.000Z
| 21.757576 | 90 | 0.538301 |
[
[
[
"# Searching MAST for Moving Targets\n\nThis is the version meant to be run via a server. Call:\n```\npanel serve --show MastDashboard.ipynb\n```",
"_____no_output_____"
]
],
[
[
"import panel as pn\nfrom movingmast import viz\n\ncss = ['https://cdn.datatables.net/1.10.21/css/jquery.dataTables.min.css']\njs = {\n '$': 'https://code.jquery.com/jquery-3.4.1.slim.min.js',\n 'DataTable': 'https://cdn.datatables.net/1.10.21/js/jquery.dataTables.min.js'\n}\n\npn.extension(css_files=css, js_files=js)",
"_____no_output_____"
],
[
"dash = viz.MastQuery(data_tables=True)\ndash.panel(debug=False).servable()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code"
]
] |
cbfac5806c5edf1e962bc7788e31de9e472e342a
| 112,523 |
ipynb
|
Jupyter Notebook
|
ch5/image_classification_mnist.ipynb
|
PacktPublishing/Natural-Language-Processing-with-TensorFlow
|
54653384ff2d0bab356e3e0877bd86c3ba3a80a3
|
[
"MIT"
] | 259 |
2018-06-07T02:46:53.000Z
|
2022-03-29T03:31:26.000Z
|
ch5/image_classification_mnist.ipynb
|
Chunlinx/Natural-Language-Processing-with-TensorFlow
|
097b59a2f085379bf9a53b8285701cf3a0cb1d5e
|
[
"MIT"
] | 3 |
2018-07-21T01:41:01.000Z
|
2020-10-06T06:47:00.000Z
|
ch5/image_classification_mnist.ipynb
|
Chunlinx/Natural-Language-Processing-with-TensorFlow
|
097b59a2f085379bf9a53b8285701cf3a0cb1d5e
|
[
"MIT"
] | 170 |
2018-06-01T23:56:47.000Z
|
2022-03-30T06:27:22.000Z
| 129.634793 | 78,762 | 0.842059 |
[
[
[
"import tensorflow as tf\nfrom matplotlib import pylab\nfrom tensorflow.examples.tutorials.mnist import input_data\nimport numpy as np\n\n# Required for Data downaload and preparation\nimport struct\nimport gzip\nimport os\nfrom six.moves.urllib.request import urlretrieve",
"c:\\users\\thushan\\documents\\python_virtualenvs\\tensorflow_venv\\lib\\site-packages\\h5py\\__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\n"
]
],
[
[
"## Defining Hyperparameters\n\nHere we define the set of hyperparameters we're going to you in our example. These hyperparameters include `batch_size`, train dataset size (`n_train`), different layers in our CNN (`cnn_layer_ids`). You can find descriptions of each hyperparameter in comments.",
"_____no_output_____"
]
],
[
[
"batch_size = 100 # This is the typical batch size we've been using\nimage_size = 28 # This is the width/height of a single image\n\n# Number of color channels in an image. These are black and white images \nn_channels = 1 \n\n# Number of different digits we have images for (i.e. classes)\nn_classes = 10\n\nn_train = 55000 # Train dataset size\nn_valid = 5000 # Validation dataset size\nn_test = 10000 # Test dataset size\n\n# Layers in the CNN in the order from input to output\ncnn_layer_ids = ['conv1','pool1','conv2','pool2','fulcon1','softmax']\n\n# Hyperparameters of each layer (e.g. filter size of each convolution layer)\nlayer_hyperparameters = {'conv1':{'weight_shape':[3,3,n_channels,16],'stride':[1,1,1,1],'padding':'SAME'},\n 'pool1':{'kernel_shape':[1,3,3,1],'stride':[1,2,2,1],'padding':'SAME'},\n 'conv2':{'weight_shape':[3,3,16,32],'stride':[1,1,1,1],'padding':'SAME'},\n 'pool2':{'kernel_shape':[1,3,3,1],'stride':[1,2,2,1],'padding':'SAME'},\n 'fulcon1':{'weight_shape':[7*7*32,128]},\n 'softmax':{'weight_shape':[128,n_classes]}\n }",
"_____no_output_____"
]
],
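A quick sanity check (an illustrative sketch, not from the original notebook) for the `7*7*32` input size of `fulcon1`: each stride-2 max pooling with `'SAME'` padding halves the spatial size with ceiling division, so 28 → 14 → 7, and `conv2` outputs 32 channels.

```python
import math

size = 28                      # MNIST image height/width
for stride in (2, 2):          # pool1 and pool2, both stride 2, 'SAME' padding
    size = math.ceil(size / stride)
print(size, size * size * 32)  # 7 and 1568 == 7*7*32, the fulcon1 input size
```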
[
[
"## Defining Inputs and Outputs\nHere we define input and output placeholders required to process a batch of data. We will use the same placeholders for all training, validation and testing data as all of them are processed in same size batches.",
"_____no_output_____"
]
],
[
[
"# Inputs (Images) and Outputs (Labels) Placeholders\ntf_inputs = tf.placeholder(shape=[batch_size, image_size, image_size, n_channels],dtype=tf.float32,name='tf_mnist_images')\ntf_labels = tf.placeholder(shape=[batch_size, n_classes],dtype=tf.float32,name='tf_mnist_labels')",
"_____no_output_____"
]
],
[
[
"## Defining Model Parameters and Other Variables\nHere we define various TensorFlow variables required for the following computations. These includes a global step variable (to decay learning rate) and weights and biases of each layer of the CNN.",
"_____no_output_____"
]
],
[
[
"# Global step for decaying the learning rate\nglobal_step = tf.Variable(0,trainable=False)\n\n# Initializing the variables\nlayer_weights = {}\nlayer_biases = {}\n\nfor layer_id in cnn_layer_ids:\n if 'pool' not in layer_id:\n layer_weights[layer_id] = tf.Variable(initial_value=tf.random_normal(shape=layer_hyperparameters[layer_id]['weight_shape'],\n stddev=0.02,dtype=tf.float32),name=layer_id+'_weights')\n layer_biases[layer_id] = tf.Variable(initial_value=tf.random_normal(shape=[layer_hyperparameters[layer_id]['weight_shape'][-1]],\n stddev=0.01,dtype=tf.float32),name=layer_id+'_bias')\n \nprint('Variables initialized')",
"Variables initialized\n"
]
],
[
[
"## Defining Inference of the CNN\nHere we define the computations starting from input placeholder (`tf_inputs`) and then computing the hidden activations for each of the layers found in `cnn_layer_ids` (i.e. convolution/pooling and fulcon layers) and their respective parameters (`layer_hyperparamters`). At the final layer (`softmax`) we do not apply an activation function as for the rest of the layers, but obtain the unnormalized logit values without any activation function.",
"_____no_output_____"
]
],
[
[
"# Calculating Logits\n\nh = tf_inputs\nfor layer_id in cnn_layer_ids:\n if 'conv' in layer_id:\n # For each convolution layer, compute the output by using conv2d function\n # This operation results in a [batch_size, output_height, output_width, out_channels]\n # sized 4 dimensional tensor\n h = tf.nn.conv2d(h,layer_weights[layer_id],layer_hyperparameters[layer_id]['stride'], \n layer_hyperparameters[layer_id]['padding']) + layer_biases[layer_id]\n h = tf.nn.relu(h)\n\n elif 'pool' in layer_id:\n # For each pooling layer, compute the output by max pooling\n # This operation results in a [batch_size, output_height, output_width, out_channels]\n # sized 4 dimensional tensor\n h = tf.nn.max_pool(h, layer_hyperparameters[layer_id]['kernel_shape'],layer_hyperparameters[layer_id]['stride'],\n layer_hyperparameters[layer_id]['padding'])\n\n elif layer_id == 'fulcon1':\n # At the first fulcon layer we need to reshape the 4 dimensional output to a\n # 2 dimensional output to be processed by fully connected layers\n # Note this should only done once, before \n # computing the output of the first fulcon layer\n h = tf.reshape(h,[batch_size,-1])\n h = tf.matmul(h,layer_weights[layer_id]) + layer_biases[layer_id]\n h = tf.nn.relu(h)\n\n elif layer_id == 'softmax':\n # Note that here we do not perform the same reshaping we did for fulcon1\n # We only perform the matrix multiplication on previous output\n h = tf.matmul(h,layer_weights[layer_id]) + layer_biases[layer_id]\n\nprint('Calculated logits')\ntf_logits = h",
"Calculated logits\n"
]
],
[
[
"## Defining Loss\nWe use softmax cross entropy loss to optimize the parameters of the model.",
"_____no_output_____"
]
],
[
[
"# Calculating the softmax cross entropy loss with the computed logits and true labels (one hot encoded)\ntf_loss = tf.nn.softmax_cross_entropy_with_logits_v2(logits=tf_logits,labels=tf_labels)\nprint('Loss defined')",
"Loss defined\n"
]
],
[
[
"## Model Parameter Optimizer\nWe define an exponentially decaying learning rate and an optimizer to optimize the parameters.",
"_____no_output_____"
]
],
[
[
"# Optimization\n\n# Here we define the function to decay the learning rate exponentially. \n# Everytime the global step increases the learning rate decreases\ntf_learning_rate = tf.train.exponential_decay(learning_rate=0.001,global_step=global_step,decay_rate=0.5,decay_steps=1,staircase=True)\ntf_loss_minimize = tf.train.RMSPropOptimizer(learning_rate=tf_learning_rate, momentum=0.9).minimize(tf_loss)\nprint('Loss minimization defined')",
"Loss minimization defined\n"
]
],
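To make the schedule concrete, this small sketch (illustrative, not part of the original notebook) evaluates what `tf.train.exponential_decay` computes here: with `decay_steps=1` and `staircase=True`, the rate is `0.001 * 0.5 ** global_step`, so each increment of the global step halves the learning rate.

```python
for step in range(4):
    # lr = learning_rate * decay_rate ** (global_step // decay_steps)
    print(step, 0.001 * 0.5 ** step)
# 0 -> 0.001, 1 -> 0.0005, 2 -> 0.00025, 3 -> 0.000125
```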
[
[
"## Defining Predictions\nWe get the predictiosn out by applying a softmax activation to the logits. Additionally we define a global step increment function and will be increase every time the validation accuracy plateus.",
"_____no_output_____"
]
],
[
[
"tf_predictions = tf.nn.softmax(tf_logits)\nprint('Prediction defined')\n\ntf_tic_toc = tf.assign(global_step, global_step + 1)\n",
"Prediction defined\n"
]
],
[
[
"## Define Accuracy \nA simple function to calculate accuracy for a given set of labels and predictions.",
"_____no_output_____"
]
],
[
[
"def accuracy(predictions,labels):\n '''\n Accuracy of a given set of predictions of size (N x n_classes) and\n labels of size (N x n_classes)\n '''\n \n return np.sum(np.argmax(predictions,axis=1)==np.argmax(labels,axis=1))*100.0/labels.shape[0]",
"_____no_output_____"
]
],
[
[
"## Lolading Data\n\nHere we download (if needed) the MNIST dataset and, perform reshaping and normalization. Also we conver the labels to one hot encoded vectors.",
"_____no_output_____"
]
],
[
[
"def maybe_download(url, filename, expected_bytes, force=False):\n \"\"\"Download a file if not present, and make sure it's the right size.\"\"\"\n if force or not os.path.exists(filename):\n print('Attempting to download:', filename) \n filename, _ = urlretrieve(url + filename, filename)\n print('\\nDownload Complete!')\n statinfo = os.stat(filename)\n if statinfo.st_size == expected_bytes:\n print('Found and verified', filename)\n else:\n raise Exception(\n 'Failed to verify ' + filename + '. Can you get to it with a browser?')\n return filename\n\n\ndef read_mnist(fname_img, fname_lbl, one_hot=False):\n print('\\nReading files %s and %s'%(fname_img, fname_lbl))\n \n # Processing images\n with gzip.open(fname_img) as fimg: \n magic, num, rows, cols = struct.unpack(\">IIII\", fimg.read(16))\n print(num,rows,cols)\n img = (np.frombuffer(fimg.read(num*rows*cols), dtype=np.uint8).reshape(num, rows, cols,1)).astype(np.float32)\n print('(Images) Returned a tensor of shape ',img.shape)\n \n #img = (img - np.mean(img)) /np.std(img)\n img *= 1.0 / 255.0\n \n # Processing labels\n with gzip.open(fname_lbl) as flbl:\n # flbl.read(8) reads upto 8 bytes\n magic, num = struct.unpack(\">II\", flbl.read(8)) \n lbl = np.frombuffer(flbl.read(num), dtype=np.int8)\n if one_hot:\n one_hot_lbl = np.zeros(shape=(num,10),dtype=np.float32)\n one_hot_lbl[np.arange(num),lbl] = 1.0\n print('(Labels) Returned a tensor of shape: %s'%lbl.shape)\n print('Sample labels: ',lbl[:10])\n \n if not one_hot:\n return img, lbl\n else:\n return img, one_hot_lbl\n \n \n# Download data if needed\nurl = 'http://yann.lecun.com/exdb/mnist/'\n# training data\nmaybe_download(url,'train-images-idx3-ubyte.gz',9912422)\nmaybe_download(url,'train-labels-idx1-ubyte.gz',28881)\n# testing data\nmaybe_download(url,'t10k-images-idx3-ubyte.gz',1648877)\nmaybe_download(url,'t10k-labels-idx1-ubyte.gz',4542)\n\n# Read the training and testing data \ntrain_inputs, train_labels = read_mnist('train-images-idx3-ubyte.gz', 'train-labels-idx1-ubyte.gz',True)\ntest_inputs, test_labels = read_mnist('t10k-images-idx3-ubyte.gz', 't10k-labels-idx1-ubyte.gz',True)\n\nvalid_inputs, valid_labels = train_inputs[-n_valid:,:,:,:], train_labels[-n_valid:,:]\ntrain_inputs, train_labels = train_inputs[:-n_valid,:,:,:], train_labels[:-n_valid,:]\n\nprint('\\nTrain size: ', train_inputs.shape[0])\nprint('\\nValid size: ', valid_inputs.shape[0])\nprint('\\nTest size: ', test_inputs.shape[0])",
"Found and verified train-images-idx3-ubyte.gz\nFound and verified train-labels-idx1-ubyte.gz\nFound and verified t10k-images-idx3-ubyte.gz\nFound and verified t10k-labels-idx1-ubyte.gz\n\nReading files train-images-idx3-ubyte.gz and train-labels-idx1-ubyte.gz\n60000 28 28\n(Images) Returned a tensor of shape (60000, 28, 28, 1)\n(Labels) Returned a tensor of shape: 60000\nSample labels: [5 0 4 1 9 2 1 3 1 4]\n\nReading files t10k-images-idx3-ubyte.gz and t10k-labels-idx1-ubyte.gz\n10000 28 28\n(Images) Returned a tensor of shape (10000, 28, 28, 1)\n(Labels) Returned a tensor of shape: 10000\nSample labels: [7 2 1 0 4 1 4 9 5 9]\n\nTrain size: 55000\n\nValid size: 5000\n\nTest size: 10000\n"
]
],
[
[
"## Data Generators for MNIST\n\nHere we have the logic to iterate through each training, validation and testing datasets, in `batch_size` size strides.",
"_____no_output_____"
]
],
[
[
"train_index, valid_index, test_index = 0,0,0\n\ndef get_train_batch(images, labels, batch_size):\n global train_index\n batch = images[train_index:train_index+batch_size,:,:,:], labels[train_index:train_index+batch_size,:]\n train_index = (train_index + batch_size)%(images.shape[0] - batch_size)\n return batch\n\ndef get_valid_batch(images, labels, batch_size):\n global valid_index\n batch = images[valid_index:valid_index+batch_size,:,:,:], labels[valid_index:valid_index+batch_size,:]\n valid_index = (valid_index + batch_size)%(images.shape[0] - batch_size)\n return batch\n\ndef get_test_batch(images, labels, batch_size):\n global test_index\n batch = images[test_index:test_index+batch_size,:,:,:], labels[test_index:test_index+batch_size,:]\n test_index = (test_index + batch_size)%(images.shape[0] - batch_size)\n return batch",
"_____no_output_____"
]
],
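A small demonstration (with hypothetical sizes) of the wrap-around indexing the generators above use: the cursor advances by `batch_size` and wraps modulo `images.shape[0] - batch_size`, so every returned batch is full-sized.

```python
n_examples, batch_size = 10, 4
index = 0
for _ in range(5):
    # each batch is images[index:index+batch_size]
    print("batch covers rows", index, "to", index + batch_size - 1)
    index = (index + batch_size) % (n_examples - batch_size)
```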
[
[
"## Visualizing MNIST Results\nHere we define a function to collect correctly and incorrectly classified samples to visualize later. Visualizing such samples will help us to understand why the CNN incorrectly classified certain samples.",
"_____no_output_____"
]
],
[
[
"# Makes sure we only collect 10 samples for each \ncorrect_fill_index, incorrect_fill_index = 0,0\n\n# Visualization purposes\ncorrectly_predicted = np.empty(shape=(10,28,28,1),dtype=np.float32)\ncorrect_predictions = np.empty(shape=(10,n_classes),dtype=np.float32)\nincorrectly_predicted = np.empty(shape=(10,28,28,1),dtype=np.float32)\nincorrect_predictions = np.empty(shape=(10,n_classes),dtype=np.float32)\n\ndef collect_samples(test_batch_predictions,test_images, test_labels):\n global correctly_predicted, correct_predictions\n global incorrectly_predicted, incorrect_predictions\n global correct_fill_index, incorrect_fill_index\n \n correct_indices = np.where(np.argmax(test_batch_predictions,axis=1)==np.argmax(test_labels,axis=1))[0]\n incorrect_indices = np.where(np.argmax(test_batch_predictions,axis=1)!=np.argmax(test_labels,axis=1))[0]\n \n if correct_indices.size>0 and correct_fill_index<10:\n print('\\nCollecting Correctly Predicted Samples')\n chosen_index = np.random.choice(correct_indices)\n correctly_predicted[correct_fill_index,:,:,:]=test_images[chosen_index,:].reshape(1,image_size,image_size,n_channels)\n correct_predictions[correct_fill_index,:]=test_batch_predictions[chosen_index,:]\n correct_fill_index += 1\n\n if incorrect_indices.size>0 and incorrect_fill_index<10:\n print('Collecting InCorrectly Predicted Samples')\n chosen_index = np.random.choice(incorrect_indices)\n incorrectly_predicted[incorrect_fill_index,:,:,:]=test_images[chosen_index,:].reshape(1,image_size,image_size,n_channels)\n incorrect_predictions[incorrect_fill_index,:]=test_batch_predictions[chosen_index,:]\n incorrect_fill_index += 1",
"_____no_output_____"
]
],
[
[
"## Running MNIST Classification \nHere we train our CNN on MNIST data for `n_epochs` epochs. Each epoch we train the CNN with the full training dataset. Then we calculate the validation accuracy, according to which we decay the learning rate. Finally, each epoch we calculate the test accuracy which is computed using an independent test set. This code should run under 10 minutes if you run on a decent GPU and should reach to a test accuracy of about ~95%",
"_____no_output_____"
]
],
[
[
"# Parameters related to learning rate decay\n# counts how many times the validation accuracy has not increased consecutively for\nv_acc_not_increased_for = 0 \n# if the above count is above this value, decrease the learning rate\nv_acc_threshold = 3\n# currently recorded best validation accuracy\nmax_v_acc = 0.0\n\nconfig = tf.ConfigProto(allow_soft_placement=True)\n# Good practice to use this to avoid any surprising errors thrown by TensorFlow\nconfig.gpu_options.allow_growth = True \nconfig.gpu_options.per_process_gpu_memory_fraction = 0.9 # Making sure Tensorflow doesn't overflow the GPU\n\nn_epochs = 25 # Number of epochs the training runs for\n\nsession = tf.InteractiveSession(config=config)\n# Initialize all variables\ntf.global_variables_initializer().run()\n\n# Run training loop\n\nfor epoch in range(n_epochs):\n loss_per_epoch = []\n \n # Training phase. We train with all training data \n # processing one batch at a time\n for i in range(n_train//batch_size):\n # Get the next batch of MNIST dataset\n batch = get_train_batch(train_inputs, train_labels, batch_size) \n # Run TensorFlow opeartions\n l,_ = session.run([tf_loss,tf_loss_minimize],feed_dict={tf_inputs: batch[0].reshape(batch_size,image_size,image_size,n_channels),\n tf_labels: batch[1]})\n # Add the loss value to a list\n loss_per_epoch.append(l)\n print('Average loss in epoch %d: %.5f'%(epoch,np.mean(loss_per_epoch))) \n \n # Validation phase. We compute validation accuracy\n # processing one batch at a time\n valid_accuracy_per_epoch = []\n for i in range(n_valid//batch_size):\n # Get the next validation data batch\n vbatch_images,vbatch_labels = get_valid_batch(valid_inputs, valid_labels, batch_size)\n # Compute validation predictions\n valid_batch_predictions = session.run(\n tf_predictions,feed_dict={tf_inputs: vbatch_images}\n )\n # Compute and add the validation accuracy to a python list\n valid_accuracy_per_epoch.append(accuracy(valid_batch_predictions,vbatch_labels))\n \n # Compute and print average validation accuracy\n mean_v_acc = np.mean(valid_accuracy_per_epoch)\n print('\\tAverage Valid Accuracy in epoch %d: %.5f'%(epoch,np.mean(valid_accuracy_per_epoch)))\n \n # Learning rate decay logic\n if mean_v_acc > max_v_acc:\n max_v_acc = mean_v_acc\n else:\n v_acc_not_increased_for += 1\n \n # Time to decrease learning rate\n if v_acc_not_increased_for >= v_acc_threshold:\n print('\\nDecreasing Learning rate\\n')\n session.run(tf_tic_toc) # Increase global_step\n v_acc_not_increased_for = 0\n \n # Testing phase. We compute test accuracy\n # processing one batch at a time\n accuracy_per_epoch = []\n for i in range(n_test//batch_size):\n btest_images, btest_labels = get_test_batch(test_inputs, test_labels, batch_size)\n test_batch_predictions = session.run(tf_predictions,feed_dict={tf_inputs: btest_images})\n accuracy_per_epoch.append(accuracy(test_batch_predictions,btest_labels))\n \n # Collect samples for visualization only in the last epoch\n if epoch==n_epochs-1:\n collect_samples(test_batch_predictions, btest_images, btest_labels)\n \n print('\\tAverage Test Accuracy in epoch %d: %.5f\\n'%(epoch,np.mean(accuracy_per_epoch)))\nsession.close()",
"c:\\users\\thushan\\documents\\python_virtualenvs\\tensorflow_venv\\lib\\site-packages\\tensorflow\\python\\client\\session.py:1711: UserWarning: An interactive session is already active. This can cause out-of-memory errors in some cases. You must explicitly call `InteractiveSession.close()` to release resources held by the other session(s).\n warnings.warn('An interactive session is already active. This can '\n"
]
],
[
[
"## Visualizing Predictions\nLet us see how when our CNN did when it comes to predictions\n",
"_____no_output_____"
]
],
[
[
"# Defining the plot related settings\npylab.figure(figsize=(25,20)) # in inches\nwidth=0.5 # Width of a bar in the barchart\npadding = 0.05 # Padding between two bars\nlabels = list(range(0,10)) # Class labels\n\n# Defining X axis\nx_axis = np.arange(0,10)\n\n# We create 4 rows and 7 column set of subplots\n\n# We choose these to put the titles in\n# First row middle\npylab.subplot(4, 7, 4)\npylab.title('Correctly Classified Samples',fontsize=24)\n\n# Second row middle\npylab.subplot(4, 7,11)\npylab.title('Softmax Predictions for Correctly Classified Samples',fontsize=24)\n\n# For 7 steps\nfor sub_i in range(7):\n # Draw the top row (digit images)\n pylab.subplot(4, 7, sub_i + 1) \n pylab.imshow(np.squeeze(correctly_predicted[sub_i]),cmap='gray') \n pylab.axis('off')\n \n # Draw the second row (prediction bar chart)\n pylab.subplot(4, 7, 7 + sub_i + 1) \n pylab.bar(x_axis + padding, correct_predictions[sub_i], width)\n pylab.ylim([0.0,1.0]) \n pylab.xticks(x_axis, labels)\n\n# Set titles for the third and fourth rows\npylab.subplot(4, 7, 18)\npylab.title('Incorrectly Classified Samples',fontsize=26)\npylab.subplot(4, 7,25)\npylab.title('Softmax Predictions for Incorrectly Classified Samples',fontsize=24)\n\n# For 7 steps\nfor sub_i in range(7):\n # Draw the third row (incorrectly classified digit images)\n pylab.subplot(4, 7, 14 + sub_i + 1)\n pylab.imshow(np.squeeze(incorrectly_predicted[sub_i]),cmap='gray')\n pylab.axis('off')\n \n # Draw the fourth row (incorrect predictions bar chart)\n pylab.subplot(4, 7, 21 + sub_i + 1) \n pylab.bar(x_axis + padding, incorrect_predictions[sub_i], width)\n pylab.ylim([0.0,1.0])\n pylab.xticks(x_axis, labels)\n\n# Save the figure\npylab.savefig('mnist_results.png')\npylab.show()\n",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfad37039f507272b32d09be3cf0938bada3399
| 114,968 |
ipynb
|
Jupyter Notebook
|
notebooks/Exploratory data analysis.ipynb
|
tmcclintock/StreetViewNumbers
|
d06fdaae66c98d16b0907bc886c8708747bf301a
|
[
"MIT"
] | null | null | null |
notebooks/Exploratory data analysis.ipynb
|
tmcclintock/StreetViewNumbers
|
d06fdaae66c98d16b0907bc886c8708747bf301a
|
[
"MIT"
] | null | null | null |
notebooks/Exploratory data analysis.ipynb
|
tmcclintock/StreetViewNumbers
|
d06fdaae66c98d16b0907bc886c8708747bf301a
|
[
"MIT"
] | null | null | null | 614.802139 | 90,428 | 0.946429 |
[
[
[
"# Exporing the data\n\nHere I am going to explore the address data. I won't do any machine learning here, but I will do some analysis of the statistics of the digits.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport scipy as sp\nfrom scipy.io import loadmat\nimport pandas as pd\nimport matplotlib.pyplot as plt\n%matplotlib inline",
"_____no_output_____"
],
[
"#plt.rc(\"text\", usetex=True)\nplt.rc(\"font\", size=18, family=\"serif\")",
"_____no_output_____"
],
[
"input_path = \"../data/\"",
"_____no_output_____"
],
[
"#Load the training data and pull it apart into\ntraining_dataset = loadmat(input_path + \"train_32x32.mat\")\ntrain_X = training_dataset['X']\ntrain_y = np.squeeze(training_dataset['y'])\n#Do the same for the test sets\ntesting_dataset = loadmat(input_path + \"test_32x32.mat\")\ntest_X = testing_dataset['X']\ntest_y = np.squeeze(testing_dataset['y'])\n#Make dataframes and summarize\nprint(\"Training shapes: \", train_X.shape, train_y.shape)\nprint(\"Testing shapes: \", test_X.shape, test_y.shape)",
"Training shapes: (32, 32, 3, 73257) (73257,)\nTesting shapes: (32, 32, 3, 26032) (26032,)\n"
],
[
"train_X = np.moveaxis(train_X, -1, 0)\ntest_X = np.moveaxis(test_X, -1, 0)\nprint(\"Training shapes: \", train_X.shape, train_y.shape)\nprint(\"Testing shapes: \", test_X.shape, test_y.shape)",
"Training shapes: (73257, 32, 32, 3) (73257,)\nTesting shapes: (26032, 32, 32, 3) (26032,)\n"
],
[
"#Visualize some pictures\nrows = 3\ncolumns = 5\nfig, ax = plt.subplots(rows, columns, figsize=(11, 7))\nfor i, c in enumerate(np.random.choice(len(train_X), rows * columns)):\n ax[i%rows, i//rows].imshow(train_X[c,:,:,:])\n ax[i%rows, i//rows].axis(\"off\")\n ax[i%rows, i//rows].set_title(r\"Sample$_{%d}$: y=%d\"%(c, train_y[c]))",
"_____no_output_____"
],
[
"#Histogram the targets\nn, bins, patches = plt.hist(train_y, bins=range(1,12), align='left', color=\"b\", label=\"Train\")\nn, bins, patches = plt.hist(test_y, bins=range(1,12), align='left', color=\"r\", alpha=1, label=\"Test\")\nplt.axis([0,11,0,15000])\nplt.title(\"Label Frequency\")\nplt.ylabel(\"Count\")\nplt.xlabel(\"Label (index)\")\nplt.xticks(np.arange(1, 11))\nplt.xlim(.5, 10.5)\nplt.legend(frameon=False)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfad4c2a97f4ca3d361b998c3769aa9faa94b1c
| 196,734 |
ipynb
|
Jupyter Notebook
|
examples/SLAC Feb2018 Demo.ipynb
|
Russell-Jones-OxPhys/CCL
|
1cdc4ecb8ae6fb23806540b39799cc3317473e71
|
[
"BSD-3-Clause"
] | null | null | null |
examples/SLAC Feb2018 Demo.ipynb
|
Russell-Jones-OxPhys/CCL
|
1cdc4ecb8ae6fb23806540b39799cc3317473e71
|
[
"BSD-3-Clause"
] | null | null | null |
examples/SLAC Feb2018 Demo.ipynb
|
Russell-Jones-OxPhys/CCL
|
1cdc4ecb8ae6fb23806540b39799cc3317473e71
|
[
"BSD-3-Clause"
] | null | null | null | 290.59675 | 30,740 | 0.927887 |
[
[
[
"# CCL feature demo\n**SLAC 2018 DESC meeting**",
"_____no_output_____"
],
[
"In this demo, we use CCL to set up a cosmology and show how to get different quantities of interest.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport matplotlib.pyplot as plt\nimport pyccl as ccl",
"_____no_output_____"
]
],
[
[
"We start by setting up a cosmology object. This holds the cosmological parameters and metadata. The cosmology object is needed as input for many other functions.\n\nWe set three of these to demonstrate the different options which are available.",
"_____no_output_____"
]
],
[
[
"# Basic cosmology with mostly default parameters and calculating setting.\ncosmo = ccl.Cosmology(Omega_c=0.27, Omega_b=0.045, h=0.67, A_s=2.1e-9, n_s=0.96, \n Neff=3.046, Omega_k=0.)\n\n# Cosmology which incorporates baryonic correction terms in the power.\ncosmo_baryons = ccl.Cosmology(Omega_c=0.27, Omega_b=0.045, h=0.67, A_s=2.1e-9, n_s=0.96, \n Neff=3.046, Omega_k=0., baryons_power_spectrum='bcm', \n bcm_log10Mc=14.079181246047625, bcm_etab=0.5, bcm_ks=55.0)\n\n# Cosmology where the power spectrum will be computed with an emulator.\ncosmo_emu = ccl.Cosmology(Omega_c=0.27, Omega_b=0.05, h=0.67, sigma8=0.83, n_s=0.96, \n Neff=3.04, Omega_k=0., transfer_function='emulator', \n matter_power_spectrum=\"emu\")",
"_____no_output_____"
]
],
[
[
"## Background quantities",
"_____no_output_____"
],
[
"We can calculate a variety of background-type quantities. We set up a vector of scale factors at which to compute them.",
"_____no_output_____"
]
],
[
[
"z = np.linspace(0.0001, 5., 100)\na = 1. / (1.+z)",
"_____no_output_____"
]
],
[
[
"Compute ** distances **:",
"_____no_output_____"
]
],
[
[
"chi_rad = ccl.comoving_radial_distance(cosmo, a) \nchi_ang = ccl.comoving_angular_distance(cosmo,a)\nlum_dist = ccl.luminosity_distance(cosmo, a)\ndist_mod = ccl.distance_modulus(cosmo, a)\n\n\n# Plot the comoving radial distance as a function of redshift, as an example.\nplt.figure()\nplt.plot(z, chi_rad, 'k', linewidth=2)\nplt.xlabel('$z$', fontsize=20)\nplt.ylabel('Comoving distance, Mpc', fontsize=15)\nplt.tick_params(labelsize=13)\nplt.show()",
"_____no_output_____"
]
],
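As a hedged cross-check (a sketch using the standard flat ΛCDM integral, not a CCL call; radiation and neutrino contributions are ignored here, so small differences from the CCL values above are expected), the comoving radial distance is $\chi(z) = \frac{c}{H_0}\int_0^z \frac{dz'}{E(z')}$ with $E(z) = \sqrt{\Omega_m (1+z)^3 + \Omega_\Lambda}$:

```python
import numpy as np
from scipy.integrate import quad

c_km_s, h = 299792.458, 0.67
Om = 0.27 + 0.045                            # Omega_c + Omega_b from `cosmo`
E = lambda zp: np.sqrt(Om * (1 + zp)**3 + (1 - Om))

def chi_approx(z):
    integral, _ = quad(lambda zp: 1.0 / E(zp), 0, z)
    return (c_km_s / (100.0 * h)) * integral  # H0 = 100 h km/s/Mpc

print(chi_approx(1.0))  # in Mpc; roughly comparable to chi_rad at z = 1
```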
[
[
"Compute ** growth quantities ** :",
"_____no_output_____"
]
],
[
[
"D = ccl.growth_factor(cosmo, a)\nf = ccl.growth_rate(cosmo, a)\n\nplt.figure()\nplt.plot(z, D, 'k', linewidth=2, label='Growth factor')\nplt.plot(z, f, 'g', linewidth=2, label='Growth rate')\nplt.xlabel('$z$', fontsize=20)\nplt.tick_params(labelsize=13)\nplt.legend(loc='lower left')\nplt.show()",
"_____no_output_____"
]
],
[
[
"The ratio of the ** Hubble parameter ** at scale factor a to H0:",
"_____no_output_____"
]
],
[
[
"H_over_H0 = ccl.h_over_h0(cosmo, a)\n\nplt.figure()\nplt.plot(z, H_over_H0, 'k', linewidth=2)\nplt.xlabel('$z$', fontsize=20)\nplt.ylabel('$H / H_0$', fontsize=15)\nplt.tick_params(labelsize=13)\nplt.show()",
"_____no_output_____"
]
],
[
[
"For each component of the matter / energy budget, we can get $\\Omega_{\\rm x}(z)$, the ** fractional energy density ** at $z \\ne 0$.",
"_____no_output_____"
]
],
[
[
"OmM_z = ccl.omega_x(cosmo, a, 'matter')\nOmL_z = ccl.omega_x(cosmo, a, 'dark_energy')\nOmR_z = ccl.omega_x(cosmo, a, 'radiation')\nOmK_z = ccl.omega_x(cosmo, a, 'curvature')\nOmNuRel_z = ccl.omega_x(cosmo, a, 'neutrinos_rel')\nOmNuMass_z = ccl.omega_x(cosmo, a, 'neutrinos_massive')",
"_____no_output_____"
],
[
"plt.figure()\nplt.plot(z, OmM_z, 'k', linewidth=2, label='$\\Omega_{\\\\rm M}(z)$')\nplt.plot(z, OmL_z, 'g', linewidth=2, label='$\\Omega_{\\Lambda}(z)$')\nplt.plot(z, OmR_z, 'b', linewidth=2, label='$\\Omega_{\\\\rm R}(z)$')\nplt.plot(z, OmNuRel_z, 'm', linewidth=2, label='$\\Omega_{\\\\nu}^{\\\\rm rel}(z)$')\nplt.xlabel('$z$',fontsize=20)\nplt.ylabel('$\\Omega_{\\\\rm x}(z)$', fontsize= 20)\nplt.tick_params(labelsize=13)\nplt.legend(loc='upper right')\nplt.show()",
"_____no_output_____"
]
],
[
[
"## Matter power spectra and related quantities",
"_____no_output_____"
],
[
"To compute the matter power spectrum, we define a vector of k values, and use the same z values as above.",
"_____no_output_____"
]
],
[
[
"k = np.logspace(-3, 2, 100)",
"_____no_output_____"
]
],
[
[
"The first power spectrum call for a given cosmology will take a few seconds to run, because we are computing $P(k)$ with CLASS and initializing splines. Further calls will be much quicker because they just access the precomputed splined values.",
"_____no_output_____"
]
],
[
[
"z_Pk = 0.2\na_Pk = 1. / (1.+z_Pk)\n\nPk_lin = ccl.linear_matter_power(cosmo, k, a_Pk)\nPk_nonlin = ccl.nonlin_matter_power(cosmo, k, a_Pk)\nPk_baryon = ccl.nonlin_matter_power(cosmo_baryons, k, a_Pk)\nPk_emu = ccl.nonlin_matter_power(cosmo_emu, k, a_Pk)",
"_____no_output_____"
],
[
"plt.figure()\nplt.loglog(k, Pk_lin, 'k', linewidth=2, label='Linear')\nplt.loglog(k, Pk_nonlin, 'g', linewidth=2, label='Non-linear (halofit)')\nplt.loglog(k, Pk_baryon, 'm', linewidth=2, linestyle=':', label='With baryonic correction')\nplt.loglog(k, Pk_emu, 'b', linewidth=2, linestyle = '--', label='CosmicEmu')\nplt.xlabel('$k, \\\\frac{1}{\\\\rm Mpc}$', fontsize=20)\nplt.ylabel('$P(k), {\\\\rm Mpc^3}$', fontsize=20)\nplt.xlim(0.001, 50)\nplt.ylim(0.01, 10**6)\nplt.tick_params(labelsize=13)\nplt.legend(loc='lower left')\nplt.show()",
"_____no_output_____"
]
],
[
[
"We can also compute $\\sigma_{\\rm R}$, the RMS variance in a top-hat of radius R Mpc, as well as the special case of $\\sigma_{8}$.",
"_____no_output_____"
]
],
[
[
"R = np.linspace(5, 20, 15)\n\nsigmaR = ccl.sigmaR(cosmo, R)\nsigma8 = ccl.sigma8(cosmo)\n\nprint(\"sigma8 =\", sigma8)",
"('sigma8 =', 0.8404211632589388)\n"
]
],
[
[
"## $C_\\ell$ spectra",
"_____no_output_____"
],
[
"We can compute $C_\\ell$ for galaxy counts, galaxy lensing, and CMB lensing, for autocorrelations or any cross-correlation.",
"_____no_output_____"
],
[
"The first step to getting $C_\\ell$'s involving galaxy counts or lensing is to define a photo-z probability function and a galaxy redshift distribution. CCL allows you to flexibly design your own photo-z function, but fo the purposes of demonstration we use the included Gaussian function. ",
"_____no_output_____"
]
],
[
[
"z_pz = np.linspace(0.3, 3., 3) # Define the edges of the photo-z bins.\npz = ccl.PhotoZGaussian(sigma_z0=0.05)",
"_____no_output_____"
]
],
[
[
"We get the galaxy redshift distribution for each tomographic bin, for galaxy counts and galaxy lenisng.",
"_____no_output_____"
]
],
[
[
"dNdz_nc = [ccl.dNdz_tomog(z=z, dNdz_type='nc', zmin=z_pz[zi], zmax=z_pz[zi+1], pz_func=pz)\n for zi in range(0, len(z_pz)-1)]\ndNdz_len = [ccl.dNdz_tomog(z=z, dNdz_type='wl_fid', zmin=z_pz[zi], zmax=z_pz[zi+1], pz_func=pz)\n for zi in range(0, len(z_pz)-1)]",
"_____no_output_____"
]
],
[
[
"Let's assume a toy linear galaxy bias for our galaxy-count tracer.",
"_____no_output_____"
]
],
[
[
"bias = 2.*np.ones(len(z))",
"_____no_output_____"
]
],
[
[
"We can now set up tracer objects for CMB lensing and for each tomographic bin of galaxy counts and galaxy lensing. ",
"_____no_output_____"
]
],
[
[
"gal_counts = ([ccl.NumberCountsTracer(cosmo, has_rsd=False,\n dndz=(z, dNdz_nc[zi]), bias=(z, bias)) for zi in range(0, len(z_pz)-1)])\n\ngal_lens = ([ccl.WeakLensingTracer(cosmo, dndz=(z, dNdz_len[zi])) for zi in range(0, len(z_pz)-1)])\n\ncmb_lens = [ccl.CMBLensingTracer(cosmo, z_source=1089.)]\n\nall_tracers = gal_counts + gal_lens + cmb_lens",
"_____no_output_____"
]
],
[
[
"With these tracer objects, we can now get $C_\\ell$'s.",
"_____no_output_____"
]
],
[
[
"ell = np.linspace(1, 2000, 2000)\n\nn_tracer = len(all_tracers)\n\nc_ells = ([[ccl.angular_cl(cosmo, all_tracers[ni], all_tracers[nj], ell) \n for ni in range(0, n_tracer)] for nj in range(0, n_tracer)])",
"_____no_output_____"
]
],
[
[
"We can plot a couple of examples",
"_____no_output_____"
]
],
[
[
"plt.figure()\nplt.loglog(ell, c_ells[0][0], 'k', linewidth=2, label='gg bin 1 auto')\nplt.loglog(ell, c_ells[0][3], 'g', linewidth=2, label='g1 x src2')\nplt.loglog(ell, c_ells[4][4], 'm', linewidth=2, label='CMB lensing auto')\nplt.xlabel('$\\ell$', fontsize=20)\nplt.ylabel('$C_\\ell$', fontsize=20)\nplt.xlim(1, 1000)\nplt.tick_params(labelsize=13)\nplt.legend(loc='lower left')\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Correlation functions",
"_____no_output_____"
],
[
"From the $C_\\ell$s, we can then get correlatoin functions. Let's do an example of each type.",
"_____no_output_____"
]
],
[
[
"theta_deg = np.logspace(-1, np.log10(5.), 20) # Theta is in degrees\n\nxi_plus = ccl.correlation(cosmo, ell, c_ells[2][2], theta_deg, corr_type='L+', method='FFTLog')\nxi_minus = ccl.correlation(cosmo, ell, c_ells[2][2], theta_deg, corr_type='L-', method='FFTLog')\nxi_gg = ccl.correlation(cosmo, ell, c_ells[0][0], theta_deg, corr_type='GG', method='FFTLog')",
"_____no_output_____"
],
[
"plt.figure()\nplt.loglog(theta_deg, xi_plus, '+k', label='+')\nplt.loglog(theta_deg, xi_minus, 'ob', label='-')\nplt.xlabel('$\\\\theta$, deg', fontsize=20)\nplt.ylabel('$\\\\xi_{+ / -}$', fontsize=20)\nplt.xlim(0.1, 5)\nplt.ylim(10**(-7), 10**(-4))\nplt.tick_params(labelsize=13)\nplt.legend(loc='lower left')\nplt.show()\n\nplt.figure()\nplt.loglog(theta_deg, xi_gg, 'mo', linewidth=2)\nplt.xlabel('$\\\\theta$, deg', fontsize=20)\nplt.ylabel('$\\\\xi_{gg}$', fontsize=20)\nplt.xlim(0.1, 5)\nplt.ylim(4*10**(-5), 0.05)\nplt.tick_params(labelsize=13)\nplt.show()",
"_____no_output_____"
]
],
[
[
"# Halo Mass Function & Halo Bias",
"_____no_output_____"
],
[
"We can compute the halo bias and halo mass function from Tinker et al. ",
"_____no_output_____"
]
],
[
[
"halo_mass = np.logspace(10, 16, 200)\n\nhmf = ccl.massfunc(cosmo, halo_mass, a=1., overdensity=200)",
"_____no_output_____"
],
[
"plt.figure()\nplt.loglog(halo_mass, hmf, 'k', linewidth=2)\nplt.xlabel('Halo mass, $M_\\odot$', fontsize=20)\nplt.ylabel('$\\\\frac{dn}{dlog_{10}M}$', fontsize=20)\nplt.tick_params(labelsize=13)\nplt.show()",
"_____no_output_____"
],
[
"halo_bias = ccl.halo_bias(cosmo, halo_mass, a=1., overdensity=200)\n\nplt.figure()\nplt.loglog(halo_mass, halo_bias, 'k', linewidth=2)\nplt.xlabel('Halo mass, $M_\\odot$', fontsize=20)\nplt.ylabel('$b_h$', fontsize=20)\nplt.tick_params(labelsize=13)\nplt.show()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
]
] |
cbfad62ff91ae7e9fbb1aef15558d7e077c38fea
| 45,909 |
ipynb
|
Jupyter Notebook
|
A2/NeuralNet/NeuralNetA.ipynb
|
ankurshaswat/COL341
|
c967742d08ee0fdf3bf29d45c1b4429d6cf28371
|
[
"MIT"
] | 1 |
2019-08-25T21:41:47.000Z
|
2019-08-25T21:41:47.000Z
|
A2/NeuralNet/NeuralNetA.ipynb
|
ankurshaswat/COL341
|
c967742d08ee0fdf3bf29d45c1b4429d6cf28371
|
[
"MIT"
] | null | null | null |
A2/NeuralNet/NeuralNetA.ipynb
|
ankurshaswat/COL341
|
c967742d08ee0fdf3bf29d45c1b4429d6cf28371
|
[
"MIT"
] | 1 |
2021-09-03T16:05:08.000Z
|
2021-09-03T16:05:08.000Z
| 59.85528 | 108 | 0.592215 |
[
[
[
"import numpy as np\nimport random\nimport sys\nfrom scipy.special import expit as sigmoid\n\ntraining_data_path = sys.argv[1]\ntesting_data_path = sys.argv[2]\noutput_path = sys.argv[3]\nbatch_size = int(sys.argv[4])\nn0 = float(sys.argv[5])\nactivation = sys.argv[6]\nhidden_layers_sizes = []\nfor i in range(7,len(sys.argv)):\n hidden_layers_sizes.append(int(sys.argv[i]))\n\n# training_data_path = \"../data/devnagri_train.csv\"\n# testing_data_path = \"../data/devnagri_test_public.csv\"\n# output_path = \"../data/nn/a/cs1160328.txt\"\n# batch_size = 512\n# n0 = 0.01\n# activation = 'sigmoid'\n# hidden_layers_sizes = [100]",
"_____no_output_____"
],
[
"def relu(x):\n return (x>0) * x\n\ndef tanh(x):\n return np.tanh(x)\n\ndef reluPrime(x):\n return (x>0)+0\n\ndef tanhPrime(x):\n return 1 - np.power(x,2)\n\ndef sigmoidPrime(x):\n return x * (1 - x)\n\ndef exp_normalize(x):\n b = np.amax(x,axis=1,keepdims = True)\n y = np.exp(x - b)\n return y / y.sum(axis=1,keepdims=True)",
"_____no_output_____"
],
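A short usage sketch (illustrative; it assumes the definitions and the `numpy` import above are in scope) of `exp_normalize`: it is a numerically stable row-wise softmax, since subtracting the row maximum before exponentiating avoids overflow without changing the result.

```python
x = np.array([[1000., 1001., 1002.]])
# np.exp(x) alone would overflow; exp_normalize stays finite:
print(exp_normalize(x))  # [[0.09003057 0.24472847 0.66524096]]
```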
[
"class NeuralNetwork:\n \n def __init__(self,input_size,output_size,hidden_layers_sizes, activation):\n self.weights = []\n self.biases = []\n \n if(activation == 'relu'):\n self.activation = relu\n self.activationPrime = reluPrime\n elif(activation == 'tanh'):\n self.activation = tanh\n self.activationPrime = tanhPrime\n else:\n self.activation = sigmoid\n self.activationPrime = sigmoidPrime\n \n self.input_size = input_size\n self.output_size = output_size\n self.hiddent_layers_sizes = hidden_layers_sizes\n \n prev_layer_count = input_size\n \n for i in range(len(hidden_layers_sizes) + 1):\n if i==len(hidden_layers_sizes):\n self.weights.append(np.random.rand(prev_layer_count, output_size)/100)\n self.biases.append(np.random.rand(1, output_size)/100) \n else:\n hidden_layer_count = hidden_layers_sizes[i]\n self.weights.append(np.random.rand(prev_layer_count, hidden_layer_count)/100)\n self.biases.append(np.random.rand(1, hidden_layer_count)/100)\n prev_layer_count = hidden_layer_count\n \n def train(self,inpX,inpY,batch_size,n0,max_iterations):\n max_examples = inpX.shape[0]\n max_possible_iterations = int(0.5 + max_examples / batch_size)\n num_hidden_layers = len(self.weights) - 1\n \n count = 0\n \n lr = n0\n totLoss = 0\n prevAvgLoss = sys.float_info.max\n epoch = 0\n \n for n in range(max_iterations):\n # Forming Mini Batches\n i_eff = n%max_possible_iterations\n \n # Updating Learning Rate\n if (i_eff == 0 and n!=0):\n avgLoss = totLoss/max_possible_iterations\n \n if(np.absolute(avgLoss - prevAvgLoss) < 0.0001 * prevAvgLoss):\n stopCount += 1\n if stopCount > 1:\n break\n else:\n stopCount = 0\n if(avgLoss >= prevAvgLoss):\n count += 1\n lr = n0 / np.sqrt(count+1)\n print(\"Epoch = \",epoch,\" Average Loss = \",avgLoss,\" New Learning Rate = \",lr)\n epoch += 1\n prevAvgLoss = avgLoss\n totLoss = 0\n \n outputs = []\n \n if i_eff != max_possible_iterations - 1:\n X = inpX[i_eff*batch_size: (i_eff+1)*batch_size]\n Y = inpY[i_eff*batch_size: (i_eff+1)*batch_size]\n else:\n X = inpX[i_eff*batch_size:]\n Y = inpY[i_eff*batch_size:]\n \n # Neural Network Forward Propagation\n outputs.append(X)\n prev_layer_output = X\n for i in range(num_hidden_layers + 1):\n weight = self.weights[i]\n bias = self.biases[i]\n if i == num_hidden_layers:\n prev_layer_output = sigmoid(prev_layer_output.dot(weight) + bias)\n else:\n prev_layer_output = self.activation(prev_layer_output.dot(weight) + bias)\n outputs.append(prev_layer_output)\n \n # Backpropagation\n dWs = []\n dbs = []\n \n y_onehot = np.zeros((Y.shape[0],self.output_size))\n y_onehot[range(Y.shape[0]),Y] = 1\n \n for i in range(num_hidden_layers + 1,0,-1):\n if i == num_hidden_layers + 1:\n delta = (outputs[i] - y_onehot).dot(2/Y.shape[0]) * sigmoidPrime(outputs[i])\n else:\n delta = delta.dot(self.weights[i].T) * self.activationPrime(outputs[i])\n dW = (outputs[i-1].T).dot(delta)\n dWs.append(dW)\n dbs.append(np.sum(delta,axis=0,keepdims=True))\n\n if (n%100 == 0):\n loss_ = np.sum(np.power(outputs[-1] - y_onehot,2) )/Y.shape[0]\n labels_ = np.argmax(outputs[-1],axis = 1)\n accuracy_ = 100 * np.sum(labels_ == Y)/Y.shape[0]\n print(\"Iteration \",n,\"\\tLoss = \",loss_,\"\\tAccuracy = \",accuracy_,\"%\")\n \n dWs.reverse()\n dbs.reverse()\n\n # Gradient Descent Parameter Update\n for i in range(len(dWs)):\n self.weights[i] += dWs[i].dot(-1 * lr)\n self.biases[i] += dbs[i].dot(-1 * lr)\n\n loss = np.sum(np.power(outputs[-1] - y_onehot,2) )/Y.shape[0]\n totLoss += loss\n \n def predict(self,X):\n return self.forward_run(X)\n \n def 
forward_run(self,X):\n prev_layer_output = X\n num_hidden_layers = len(self.weights) - 1\n for i in range(num_hidden_layers + 1):\n weight = self.weights[i]\n bias = self.biases[i]\n if i == num_hidden_layers:\n probabilities = sigmoid(prev_layer_output.dot(weight) + bias)\n labels = np.argmax(probabilities,axis = 1)\n return labels\n else:\n prev_layer_output = self.activation(prev_layer_output.dot(weight) + bias)",
"_____no_output_____"
],
[
"def load_data(path,avg,std):\n if avg is None:\n input_data = np.loadtxt(open(path, \"rb\"), delimiter=\",\")\n Y = input_data[:,0].copy()\n X = input_data[:,1:].copy()\n avg = np.average(X,axis=0)\n X = X - avg\n std = np.std(X,axis=0)\n std[(std == 0)] = 1\n X = X / std\n return X,Y,avg,std\n else:\n input_data = np.loadtxt(open(path, \"rb\"), delimiter=\",\")\n X = input_data[:,1:].copy()\n X = (X - avg)/std\n return X",
"_____no_output_____"
],
[
"inpX,Y,avg,std = load_data(training_data_path,None,None)",
"_____no_output_____"
],
[
"X = inpX.copy()\n\ninput_size = X.shape[1]\noutput_size = int(np.amax(Y))+1\nnum_examples = X.shape[0]\nmax_iterations = int(40*(num_examples/batch_size))\nif(max_iterations < 25000):\n max_iterations = 25000\nnetwork = NeuralNetwork(input_size,output_size,hidden_layers_sizes,activation)\nnetwork.train(X,Y.astype(int),batch_size,n0,max_iterations)",
"Iteration 0 \tLoss = 32.14825318333342 \tAccuracy = 2.34375 %\nIteration 100 \tLoss = 32.53796718342765 \tAccuracy = 1.3671875 %\nEpoch = 0 Average Loss = 32.03512367346666 New Learning Rate = 0.01\nIteration 200 \tLoss = 30.906488228664266 \tAccuracy = 1.5625 %\nIteration 300 \tLoss = 30.2690481252686 \tAccuracy = 3.125 %\nEpoch = 1 Average Loss = 30.96630908246529 New Learning Rate = 0.01\nIteration 400 \tLoss = 29.26547757561292 \tAccuracy = 1.5625 %\nEpoch = 2 Average Loss = 30.000979333473232 New Learning Rate = 0.01\nIteration 500 \tLoss = 29.876958827903483 \tAccuracy = 1.171875 %\nIteration 600 \tLoss = 29.36768140903809 \tAccuracy = 1.5625 %\nEpoch = 3 Average Loss = 29.157723298814044 New Learning Rate = 0.01\nIteration 700 \tLoss = 27.47424872461346 \tAccuracy = 1.953125 %\nEpoch = 4 Average Loss = 28.439324679264715 New Learning Rate = 0.01\nIteration 800 \tLoss = 27.121728266458874 \tAccuracy = 1.953125 %\nIteration 900 \tLoss = 27.557161457289208 \tAccuracy = 2.34375 %\nEpoch = 5 Average Loss = 27.83712512664878 New Learning Rate = 0.01\nIteration 1000 \tLoss = 25.817963369761834 \tAccuracy = 2.34375 %\nEpoch = 6 Average Loss = 27.336395094281322 New Learning Rate = 0.01\nIteration 1100 \tLoss = 25.486986369423438 \tAccuracy = 1.5625 %\nIteration 1200 \tLoss = 27.463130236777587 \tAccuracy = 2.34375 %\nEpoch = 7 Average Loss = 26.920671370499317 New Learning Rate = 0.01\nIteration 1300 \tLoss = 26.03908607517313 \tAccuracy = 2.5390625 %\nEpoch = 8 Average Loss = 26.57439702979938 New Learning Rate = 0.01\nIteration 1400 \tLoss = 26.893592301228413 \tAccuracy = 2.5390625 %\nIteration 1500 \tLoss = 26.43278081589881 \tAccuracy = 2.5390625 %\nEpoch = 9 Average Loss = 26.284118987445755 New Learning Rate = 0.01\nIteration 1600 \tLoss = 25.544100864963333 \tAccuracy = 1.953125 %\nEpoch = 10 Average Loss = 26.038769769835064 New Learning Rate = 0.01\nIteration 1700 \tLoss = 25.61198693656795 \tAccuracy = 3.3203125 %\nIteration 1800 \tLoss = 26.3818960502181 \tAccuracy = 0.9765625 %\nEpoch = 11 Average Loss = 25.829489766784945 New Learning Rate = 0.01\nIteration 1900 \tLoss = 26.901203360626475 \tAccuracy = 1.953125 %\nEpoch = 12 Average Loss = 25.649284053422083 New Learning Rate = 0.01\nIteration 2000 \tLoss = 24.53980436133301 \tAccuracy = 1.7578125 %\nIteration 2100 \tLoss = 23.843319123999756 \tAccuracy = 2.734375 %\nEpoch = 13 Average Loss = 25.492658994568288 New Learning Rate = 0.01\nIteration 2200 \tLoss = 25.8863609174252 \tAccuracy = 2.5390625 %\nEpoch = 14 Average Loss = 25.355298867521043 New Learning Rate = 0.01\nIteration 2300 \tLoss = 24.065898784514463 \tAccuracy = 1.7578125 %\nIteration 2400 \tLoss = 25.805139013452816 \tAccuracy = 2.5390625 %\nEpoch = 15 Average Loss = 25.233802350727018 New Learning Rate = 0.01\nIteration 2500 \tLoss = 23.77890282338452 \tAccuracy = 1.953125 %\nIteration 2600 \tLoss = 27.888625254876324 \tAccuracy = 1.5957446808510638 %\nEpoch = 16 Average Loss = 25.125475843780162 New Learning Rate = 0.01\nIteration 2700 \tLoss = 25.494016799224696 \tAccuracy = 1.953125 %\nEpoch = 17 Average Loss = 25.02817330530103 New Learning Rate = 0.01\nIteration 2800 \tLoss = 27.004200858956494 \tAccuracy = 2.1484375 %\nIteration 2900 \tLoss = 24.264428348973166 \tAccuracy = 1.7578125 %\nEpoch = 18 Average Loss = 24.940173312527353 New Learning Rate = 0.01\nIteration 3000 \tLoss = 25.62323586049407 \tAccuracy = 2.34375 %\nEpoch = 19 Average Loss = 24.860085482935702 New Learning Rate = 0.01\nIteration 3100 \tLoss = 25.550435546626222 \tAccuracy = 
1.953125 %\nIteration 3200 \tLoss = 23.09845499171248 \tAccuracy = 2.734375 %\nEpoch = 20 Average Loss = 24.786779328826988 New Learning Rate = 0.01\nIteration 3300 \tLoss = 24.671318712336237 \tAccuracy = 1.7578125 %\nEpoch = 21 Average Loss = 24.719329756068877 New Learning Rate = 0.01\nIteration 3400 \tLoss = 24.452649779028043 \tAccuracy = 3.125 %\nIteration 3500 \tLoss = 25.67436035693956 \tAccuracy = 2.734375 %\nEpoch = 22 Average Loss = 24.656974952760756 New Learning Rate = 0.01\nIteration 3600 \tLoss = 23.899886822575844 \tAccuracy = 1.953125 %\nEpoch = 23 Average Loss = 24.5990836527612 New Learning Rate = 0.01\nIteration 3700 \tLoss = 26.258948468054644 \tAccuracy = 2.734375 %\nIteration 3800 \tLoss = 25.146781125738784 \tAccuracy = 3.7109375 %\nEpoch = 24 Average Loss = 24.545129359455288 New Learning Rate = 0.01\nIteration 3900 \tLoss = 24.912618375799447 \tAccuracy = 3.125 %\nEpoch = 25 Average Loss = 24.49466966994934 New Learning Rate = 0.01\nIteration 4000 \tLoss = 23.780087637832008 \tAccuracy = 2.34375 %\nIteration 4100 \tLoss = 25.99748269943364 \tAccuracy = 3.3203125 %\nEpoch = 26 Average Loss = 24.447329571933313 New Learning Rate = 0.01\nIteration 4200 \tLoss = 23.837113483839758 \tAccuracy = 1.171875 %\nEpoch = 27 Average Loss = 24.40278812716953 New Learning Rate = 0.01\nIteration 4300 \tLoss = 25.11901580856763 \tAccuracy = 2.5390625 %\nIteration 4400 \tLoss = 23.65661661784118 \tAccuracy = 3.125 %\nEpoch = 28 Average Loss = 24.36076814490779 New Learning Rate = 0.01\nIteration 4500 \tLoss = 23.24731282160872 \tAccuracy = 0.78125 %\nEpoch = 29 Average Loss = 24.321028417200782 New Learning Rate = 0.01\nIteration 4600 \tLoss = 23.371967426190295 \tAccuracy = 1.7578125 %\nIteration 4700 \tLoss = 24.351669571470016 \tAccuracy = 1.171875 %\nEpoch = 30 Average Loss = 24.283357957730324 New Learning Rate = 0.01\nIteration 4800 \tLoss = 24.172509929022013 \tAccuracy = 2.734375 %\nEpoch = 31 Average Loss = 24.247571556131753 New Learning Rate = 0.01\nIteration 4900 \tLoss = 24.72647398881061 \tAccuracy = 2.1484375 %\nIteration 5000 \tLoss = 25.553361061442086 \tAccuracy = 1.953125 %\nEpoch = 32 Average Loss = 24.21350600628024 New Learning Rate = 0.01\nIteration 5100 \tLoss = 24.465847275816394 \tAccuracy = 1.953125 %\nIteration 5200 \tLoss = 24.638232074635297 \tAccuracy = 2.5390625 %\nEpoch = 33 Average Loss = 24.18101669527766 New Learning Rate = 0.01\nIteration 5300 \tLoss = 24.69816543115806 \tAccuracy = 1.5625 %\nEpoch = 34 Average Loss = 24.149974635038472 New Learning Rate = 0.01\nIteration 5400 \tLoss = 22.86277733718009 \tAccuracy = 2.34375 %\nIteration 5500 \tLoss = 23.13754057720742 \tAccuracy = 2.34375 %\nEpoch = 35 Average Loss = 24.12026411872298 New Learning Rate = 0.01\nIteration 5600 \tLoss = 23.59862906054122 \tAccuracy = 2.34375 %\nEpoch = 36 Average Loss = 24.091780972538036 New Learning Rate = 0.01\nIteration 5700 \tLoss = 23.001092149536024 \tAccuracy = 2.34375 %\nIteration 5800 \tLoss = 24.58641245619397 \tAccuracy = 1.7578125 %\nEpoch = 37 Average Loss = 24.064431180369493 New Learning Rate = 0.01\nIteration 5900 \tLoss = 24.351491023612606 \tAccuracy = 2.1484375 %\nEpoch = 38 Average Loss = 24.038129687548103 New Learning Rate = 0.01\nIteration 6000 \tLoss = 24.53319050573929 \tAccuracy = 2.34375 %\nIteration 6100 \tLoss = 24.746443047486643 \tAccuracy = 1.5625 %\nEpoch = 39 Average Loss = 24.012799338217015 New Learning Rate = 0.01\nIteration 6200 \tLoss = 24.896243303271596 \tAccuracy = 3.515625 %\nEpoch = 40 Average Loss = 23.988369989842464 
New Learning Rate = 0.01\nIteration 6300 \tLoss = 23.127095100606823 \tAccuracy = 3.90625 %\nIteration 6400 \tLoss = 23.532609802785412 \tAccuracy = 3.3203125 %\nEpoch = 41 Average Loss = 23.964777823252263 New Learning Rate = 0.01\nIteration 6500 \tLoss = 23.68114799693165 \tAccuracy = 2.734375 %\nEpoch = 42 Average Loss = 23.941964794070632 New Learning Rate = 0.01\nIteration 6600 \tLoss = 25.21018933978104 \tAccuracy = 2.5390625 %\nIteration 6700 \tLoss = 22.63769857079599 \tAccuracy = 4.296875 %\nEpoch = 43 Average Loss = 23.919878140983844 New Learning Rate = 0.01\nIteration 6800 \tLoss = 25.048608449332228 \tAccuracy = 1.953125 %\nEpoch = 44 Average Loss = 23.898469905225234 New Learning Rate = 0.01\nIteration 6900 \tLoss = 22.925834742799136 \tAccuracy = 1.171875 %\nIteration 7000 \tLoss = 23.63907402501232 \tAccuracy = 1.7578125 %\nEpoch = 45 Average Loss = 23.877696478815256 New Learning Rate = 0.01\n"
],
[
"predictions = network.predict(X.copy())\nprint(\"Accuraccy on Training Data = \",100 * np.sum(predictions == Y)/Y.shape[0])\n# print(\"Average of predictions on Training Data = \",np.average(predictions))",
"Accuraccy on Training Data = 82.48337595907928\n"
],
[
"testX = load_data(testing_data_path,avg,std)",
"_____no_output_____"
],
[
"predictions = network.predict(testX)\nnp.savetxt(output_path,predictions,fmt=\"%i\")",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfad64ae1b11a7f1bb5cbbb76e0ccb9ded26e2f
| 11,427 |
ipynb
|
Jupyter Notebook
|
samples/pyESASky-Catalogue.ipynb
|
pierfra-ro/pyesasky
|
a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f
|
[
"BSD-3-Clause"
] | 13 |
2019-05-30T19:57:37.000Z
|
2021-09-10T09:43:49.000Z
|
samples/pyESASky-Catalogue.ipynb
|
pierfra-ro/pyesasky
|
a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f
|
[
"BSD-3-Clause"
] | 21 |
2019-06-21T18:55:25.000Z
|
2022-02-27T14:48:13.000Z
|
samples/pyESASky-Catalogue.ipynb
|
pierfra-ro/pyesasky
|
a9342efcaa5cca088ed9a5afa2c98d3e9aa4bd0f
|
[
"BSD-3-Clause"
] | 8 |
2019-05-30T12:20:48.000Z
|
2022-03-04T04:01:20.000Z
| 28.856061 | 202 | 0.550626 |
[
[
[
"from pyesasky import ESASkyWidget\nfrom pyesasky import Catalogue\nfrom pyesasky import CatalogueDescriptor\nfrom pyesasky import MetadataDescriptor\nfrom pyesasky import MetadataType\nfrom pyesasky import CooFrame",
"_____no_output_____"
],
[
"# instantiating pyESASky instance\nesasky = ESASkyWidget()",
"_____no_output_____"
],
[
"# loading pyESASky instance\nesasky",
"_____no_output_____"
],
[
"# Go to the Cosmos field in ESASky (as resolved by SIMBAD):\nesasky.goToTargetName('Cosmos Field')",
"_____no_output_____"
],
[
"#####################################################\n# EX.1 creating a user defined catalogue on the fly #\n#####################################################\ncatalogue = Catalogue('test catalogue name', CooFrame.FRAME_J2000, '#ee2345', 10)",
"_____no_output_____"
],
[
"# adding sources to the catalogue\ncatalogue.addSource('source name A', '150.44963', '2.24640', 1, [{\"name\":\"Flux 1\", \"value\":\"10.5\", \"type\":\"STRING\" },{\"name\":\"Flux 2\", \"value\":\"1.7\", \"type\":\"STRING\" }])\ncatalogue.addSource('source name B', '150.54963', '2.34640', 2, [{\"name\":\"Flux 1\", \"value\":\"11.5\", \"type\":\"STRING\" },{\"name\":\"Flux 2\", \"value\":\"2.7\", \"type\":\"STRING\" }])\ncatalogue.addSource('source name c', '150.34963', '2.44640', 3, [{\"name\":\"Flux 1\", \"value\":\"12.5\", \"type\":\"STRING\" },{\"name\":\"Flux 2\", \"value\":\"0.7\", \"type\":\"STRING\" }])",
"_____no_output_____"
],
[
"# overlay catalogue in pyESASky\nesasky.overlayCatalogueWithDetails(catalogue)",
"_____no_output_____"
],
[
"############################################\n# EX.2 importing a catalogue from CSV file #\n############################################\n# CatalogueDescriptor('<catName>', '<HTMLcolor>', <lineWidth>, '<idColumn>', '<nameColumn>', '<RAColumn>', '<DecColumn>', Metadata)\n# where:\n# - <catName> : name of the catalogue that will be used in pyESASky as label\n# - <HTMLcolor> : HTML color. It could be a \"Color name\", \"Hex color code\" or \"RGB color code\"\n# - <lineWidth> : width used to draw sources. From 1 to 10\n# - <idColumn> : name of the column containing a unique identifier for sources if any. None if not applicable\n# - <nameColumn> : name of the column with the name of the source\n# - <RAColumn> : name of the RA column in degrees\n# - <DecColumn> : name of the Dec column in degrees\n# - Metadata : list of pyesasky.pyesasky.MetadataDescriptor in case it has been defined. [] otherwise.\ncatalogueDesc =CatalogueDescriptor('my test', 'yellow', 5, 'id', 'name', 'ra', 'dec', [])",
"_____no_output_____"
],
[
"# parse, import and overlay a catalogue from a CSV\nesasky.overlayCatalogueFromCSV('./testcat', ',', catalogueDesc, 'J2000')",
"_____no_output_____"
],
[
"###################################################################\n# EX.3 importing a catalogue from AstropyTable using Gaia archive #\n###################################################################\nfrom astroquery.gaia import Gaia",
"_____no_output_____"
],
[
"job = Gaia.launch_job(\"select top 10\\\n ra, dec, source_id, designation, ref_epoch,ra_dec_corr,astrometric_n_obs_al,matched_observations,duplicated_source,phot_variable_flag \\\n from gaiadr2.gaia_source order by source_id\", verbose=True)\n",
"_____no_output_____"
],
[
"myGaiaData = job.get_results()\nprint(myGaiaData)",
"_____no_output_____"
],
[
"job.get_data()",
"_____no_output_____"
],
[
"# overlayCatalogueFromAstropyTable('<catName>', '<cooFrame>', <HTMLcolor>, '<(astropy.table)>', '<RAColumn>', '<DecColumn>', '<nameColumn>')\n# where:\n# - <catName> : name of the catalogue that will be used in pyESASky as label\n# - <HTMLcolor> : HTML color. It could be a \"Color name\", \"Hex color code\" or \"RGB color code\"\n# - <lineWidth> : width used to draw sources. From 1 to 10\n# - <idColumn> : name of the column containing a unique identifier for sources if any. None if not applicable\n# - <nameColumn> : name of the column with the name of the source\n# - <RAColumn> : name of the RA column in degrees\n# - <DecColumn> : name of the Dec column in degrees\n\nesasky.overlayCatalogueFromAstropyTable('Gaia DR2', 'J2000', '#a343ff', 5, myGaiaData, '','','')",
"_____no_output_____"
],
[
"# Import the VizieR Astroquery module\nfrom astroquery.vizier import Vizier",
"_____no_output_____"
],
[
"# Search for 'The XMM-Newton survey of the COSMOS field (Brusa+, 2010)':\ncatalog_list = Vizier.find_catalogs('Brusa+, 2010')\nprint({k:v.description for k,v in catalog_list.items()})",
"_____no_output_____"
],
[
"# Get the above list of catalogues:\nVizier.ROW_LIMIT = -1\ncatalogs = Vizier.get_catalogs(catalog_list.keys())\nprint(catalogs)",
"_____no_output_____"
],
[
"# Access one table: \nBrusa = catalogs['J/ApJ/716/348/table2']\nprint(Brusa)",
"_____no_output_____"
],
[
"# Visualise the table in ESASky:\nesasky.overlayCatalogueFromAstropyTable('Brusa', CooFrame.FRAME_J2000, '#00ff00', 5, Brusa, 'RAJ2000','DEJ2000','Name')",
"_____no_output_____"
],
[
"# Go to the LMC in ESASky (as resolved by SIMBAD):\nesasky.goToTargetName('LMC')",
"_____no_output_____"
],
[
"# Search for 'The HIZOA-S survey':\ncatalog_list2 = Vizier.find_catalogs('HIZOA-S survey 2016') #HIZOA-S survey 2016\nprint({k:v.description for k,v in catalog_list2.items()})",
"_____no_output_____"
],
[
"# Get the above list of catalogues:\nVizier.ROW_LIMIT = -1\n# Vizier.COLUMN_LIMIT = 20 Can't find the way to get all the columns rather than just the default columns. Going to try the TAP+ module\ncatalog = Vizier.get_catalogs(catalog_list2.keys())\nprint(catalog)",
"_____no_output_____"
],
[
"# Access the catalogue table: \nHIZOA = catalog['J/AJ/151/52/table2'] #\nprint(HIZOA)",
"_____no_output_____"
],
[
"# Visualise the table in ESASky:\n###### NOTE: NOT PLOTTING GALACTIC COORDS CORRECTLY\nesasky.overlayCatalogueFromAstropyTable('HIZOA', CooFrame.FRAME_GALACTIC, '#0000ff', 7, HIZOA, 'GLON','GLAT','HIZOA')",
"_____no_output_____"
],
[
"# TRYING THE SAME BUT USING THE TAP/TAP+ ASTROQUERY MODULE:\n# Import the TAP/TAP+ Astroquery module\nfrom astroquery.utils.tap.core import TapPlus",
"_____no_output_____"
],
[
"vizier = TapPlus(url=\"http://tapvizier.u-strasbg.fr/TAPVizieR/tap\")",
"_____no_output_____"
],
[
"tables = vizier.load_tables(only_names=True)\nfor table in (tables):\n print(table.get_qualified_name())",
"_____no_output_____"
],
[
"#ONLY TAP+ compatible, so doesn't seem to work\ntable = vizier.load_table('viz7.\"J/AJ/128/16/table2\"')\nfor column in (table.get_columns()):\n print(column.get_name())",
"_____no_output_____"
],
[
"# This works in TOPCAT to download the whole table: SELECT * FROM \"J/AJ/128/16/table2\"\n# This also works in TOPCAT : SELECT * FROM viz7.\"J/AJ/128/16/table2\"\n\njob = vizier.launch_job(\"SELECT * FROM \"'viz7.\"J/AJ/128/16/table2\"'\"\")\n#This also works:\n#job = vizier.launch_job(\"SELECT * FROM \"+str('viz7.\"J/AJ/128/16/table2\"')+\"\")\nprint(job)",
"_____no_output_____"
],
[
"Koribalski = job.get_results()\nprint(Koribalski['HIPASS', 'RAJ2000', 'DEJ2000'])",
"_____no_output_____"
],
[
"# Visualise the table in ESASky:\nesasky.overlayCatalogueFromAstropyTable('Koribalski', CooFrame.FRAME_J2000, '#ff0000', 6, Koribalski, 'RAJ2000','DEJ2000','HIPASS')",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
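The pyESASky notebook above loads './testcat' without showing its contents. As a hedged illustration only — the file itself is not part of this record — a CSV matching the `CatalogueDescriptor('my test', 'yellow', 5, 'id', 'name', 'ra', 'dec', [])` used there might look like this (the rows are invented; only the column names follow from the descriptor):

```python
# Hypothetical ./testcat contents for overlayCatalogueFromCSV (comma-separated)
csv_text = """id,name,ra,dec
1,source A,150.44963,2.24640
2,source B,150.54963,2.34640
3,source C,150.34963,2.44640
"""
with open("./testcat", "w") as f:
    f.write(csv_text)
```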
cbfaf2558354a04d2216840ad785e28b3aea7fc4
| 9,693 |
ipynb
|
Jupyter Notebook
|
Linear Regression.ipynb
|
siddhantpathakk/MLDA-DLW-Hackathon-2021
|
8be876ffa32654c52235d41322b392d8037abd57
|
[
"MIT"
] | null | null | null |
Linear Regression.ipynb
|
siddhantpathakk/MLDA-DLW-Hackathon-2021
|
8be876ffa32654c52235d41322b392d8037abd57
|
[
"MIT"
] | null | null | null |
Linear Regression.ipynb
|
siddhantpathakk/MLDA-DLW-Hackathon-2021
|
8be876ffa32654c52235d41322b392d8037abd57
|
[
"MIT"
] | null | null | null | 27.615385 | 107 | 0.388837 |
[
[
[
"from sklearn.linear_model import LinearRegression\nfrom sklearn.model_selection import train_test_split\nfrom sklearn.metrics import mean_squared_error",
"_____no_output_____"
],
[
"import pandas as pd",
"_____no_output_____"
],
[
"from sklearn.metrics import mean_squared_error\nfrom sklearn.metrics import mean_absolute_error",
"_____no_output_____"
],
[
"df=pd.read_csv(\"Clean_DF.csv\")",
"_____no_output_____"
],
[
"df.drop(columns=[\"Unnamed: 0\"],inplace=True)",
"_____no_output_____"
],
[
"df.head()",
"_____no_output_____"
],
[
"predictors = [\"Open\", \"High\", \"Low\",\"Close\",\"Total Volume of Tweets\",\"Compound_Score\"]\n\ny = pd.DataFrame(df[\"Weighted_Price\"])\nX = pd.DataFrame(df[predictors])\n\nX_train, X_test, y_train, y_test = train_test_split(X, y, test_size = 0.25)",
"_____no_output_____"
],
[
"linreg = LinearRegression() \nlinreg.fit(X_train, y_train) ",
"_____no_output_____"
],
[
"y_train_pred = linreg.predict(X_train)\ny_test_pred = linreg.predict(X_test)",
"_____no_output_____"
],
[
"errors = mean_squared_error(y_test, y_test_pred, squared = False)\nprint(\"RMSE = \"+errors)",
"_____no_output_____"
],
[
"errors2 = mean_absolute_error(y_test, y_test_pred)\nprint(\"MAE = \"+errors2)",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
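The metric cells in the Linear Regression notebook above return floats from scikit-learn, which is why they must be printed as separate arguments rather than concatenated with a string. A self-contained sketch of the same evaluation pattern on synthetic data (all names here are illustrative, not from the notebook; `squared=False` is used as in the notebook to obtain RMSE directly):

```python
import numpy as np
from sklearn.linear_model import LinearRegression
from sklearn.metrics import mean_squared_error, mean_absolute_error

rng = np.random.default_rng(0)
X = rng.normal(size=(100, 3))
y = X @ np.array([1.0, -2.0, 0.5]) + rng.normal(scale=0.1, size=100)

model = LinearRegression().fit(X, y)
pred = model.predict(X)

rmse = mean_squared_error(y, pred, squared=False)  # RMSE = sqrt(MSE)
mae = mean_absolute_error(y, pred)
print(f"RMSE = {rmse:.4f}, MAE = {mae:.4f}")
```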
cbfafeab9a9976550784c5d9b660de5341908936
| 25,977 |
ipynb
|
Jupyter Notebook
|
rasotools/ax/sonde_template.ipynb
|
MBlaschek/rasotools
|
a8b954518a1e39b554f850aac0f5bd8fa1f23dc6
|
[
"MIT"
] | 1 |
2019-10-06T22:26:43.000Z
|
2019-10-06T22:26:43.000Z
|
rasotools/ax/sonde_template.ipynb
|
MBlaschek/rasotools
|
a8b954518a1e39b554f850aac0f5bd8fa1f23dc6
|
[
"MIT"
] | null | null | null |
rasotools/ax/sonde_template.ipynb
|
MBlaschek/rasotools
|
a8b954518a1e39b554f850aac0f5bd8fa1f23dc6
|
[
"MIT"
] | 1 |
2020-04-19T13:47:52.000Z
|
2020-04-19T13:47:52.000Z
| 25.951049 | 161 | 0.48285 |
[
[
[
"%pylab\n%matplotlib inline",
"_____no_output_____"
],
[
"%run pdev notebook",
"_____no_output_____"
]
],
[
[
"# Radiosonde SONDE",
"_____no_output_____"
]
],
[
[
"ident = \"SONDE\"\nplt.rcParams['figure.figsize'] = [12.0, 6.0]\nplt.rcParams['lines.linewidth'] = 2\nplt.rcParams['font.size'] = 15\nyplevs = np.array([10,100,200,300,400,500,700,925])*100\nsave = True",
"_____no_output_____"
],
[
"!mkdir -p figures",
"_____no_output_____"
],
[
"rt.load_config()\nrt.config",
"_____no_output_____"
],
[
"isonde = rt.cls.Radiosonde(ident)",
"_____no_output_____"
],
[
"#\n# All the data available\n#\nisonde.list_store()",
"_____no_output_____"
]
],
[
[
"## Load Data",
"_____no_output_____"
]
],
[
[
"# close=False -> stay on disk, \n# =True -> load to memory\nclose = False",
"_____no_output_____"
]
],
[
[
"### ERA5",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('ERA5', filename='ERA5_*.nc', cfunits=True, close=close, verbose=1)\nif False:\n isonde.add('ERA5_meta', filename='*_ERA5_station.nc', cfunits=True, close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### ERA Interim",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('ERAI', filename='ERAI_*.nc', cfunits=True, close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### IGRA v2",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('IGRAv2', cfunits=True, close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### Upper Air Database (UADB)",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('UADB', cfunits=True, close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### JRA-55",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('JRA55', close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### CERA-20C",
"_____no_output_____"
]
],
[
[
"if False:\n isonde.add('CERA20C', close=close, verbose=1)",
"_____no_output_____"
]
],
[
[
"### Standardized Combined Data",
"_____no_output_____"
]
],
[
[
"idata = None\n#\n# ERA5\n#\nif isonde.in_store('dataE5JC'):\n isonde.add('dataE5JC', verbose=1)\n idata = isonde.data.dataE5JC\n#\n# ERA Interim\n#\nif isonde.in_store('dataEIJC') and idata is None:\n isonde.add('dataEIJC', verbose=1)\n idata = isonde.data.dataEIJC\n#\n# IGRA \n#\nif isonde.in_store('dataIE5JC') and idata is None:\n isonde.add('dataIE5JC', verbose=1)\n idata = isonde.data.dataIE5JC",
"_____no_output_____"
]
],
[
[
"### Experiment Data",
"_____no_output_____"
]
],
[
[
"isonde.list_store(pattern='exp')",
"_____no_output_____"
],
[
"ivar = 'dpd'\nversion = 'v1'\nisrc = 'mars5'\nires = 'era5'\nexpdata = None\n#\n# ERA5\n#\nif isonde.in_store('exp{}{}_{}_{}.nc'.format(ivar,version,isrc,ires)):\n isonde.add('exp{}{}_{}_{}'.format(ivar,version,isrc,ires), verbose=1)\n expdata = isonde.data['exp{}{}_{}_{}'.format(ivar,version,isrc,ires)]\n#\n# ERA Interim\n#\nif expdata is None:\n isrc = 'marsi'\n ires = 'erai'\n if isonde.in_store('exp{}{}_{}_{}.nc'.format(ivar,version,isrc,ires)):\n isonde.add('exp{}{}_{}_{}'.format(ivar,version,isrc,ires), verbose=1)\n expdata = isonde.data['exp{}{}_{}_{}'.format(ivar,version,isrc,ires)]\n#\n# JRA55\n#\nif expdata is None:\n isrc = 'mars5'\n ires = 'jra55'\n if isonde.in_store('exp{}{}_{}_{}.nc'.format(ivar,version,isrc,ires)):\n isonde.add('exp{}{}_{}_{}'.format(ivar,version,isrc,ires), verbose=1)\n expdata = isonde.data['exp{}{}_{}_{}'.format(ivar,version,isrc,ires)]",
"_____no_output_____"
],
[
"if idata is None:\n print(\"No data ?\")\n exit()",
"_____no_output_____"
],
[
"#\n# Some definitions\n#\ntimes = [0, 12]\nstart = '1979'\nende = '2019'\nperiod = slice(start, ende)\nperiod_str = \"%s-%s\" % (start, ende)\n#\n# Subset to only that period\n#\nidata = idata.sel(time=period, hour=times)",
"_____no_output_____"
]
],
[
[
"## Station Map",
"_____no_output_____"
]
],
[
[
"rt.plot.map.station_class(isonde, states=True, rivers=True, land=True, lakes=True)\nif save:\n savefig('figures/%s_station.png' % ident)",
"_____no_output_____"
]
],
[
[
"# Data Availability",
"_____no_output_____"
]
],
[
[
"dpdvars = []\ntvars = []\nfor jvar in list(idata.data_vars):\n if 'dpd_' in jvar:\n if not any([i in jvar for i in ['err','_fg_','snht']]):\n dpdvars.append(jvar)\n if 't_' in jvar:\n if not any([i in jvar for i in ['err','_fg_','snht']]):\n tvars.append(jvar)\nprint(dpdvars)\nprint(tvars)",
"_____no_output_____"
]
],
[
[
"## Dewpoint depression",
"_____no_output_____"
]
],
[
[
"counts = idata.reset_coords()[dpdvars].count('time').sum('hour').to_dataframe()\ncounts.index /= 100.\ncounts.plot()\nxticks(yplevs/100)\ngrid()\ntitle(\"%s Counts %s\" % (ident, period_str))\nylabel(\"Total counts [1]\")\nif save:\n savefig('figures/%s_dpd_counts.png' % ident)",
"_____no_output_____"
]
],
[
[
"## Temperature",
"_____no_output_____"
]
],
[
[
"counts = idata.reset_coords()[tvars].count('time').sum('hour').to_dataframe()\ncounts.index /= 100.\ncounts.plot()\nxticks(yplevs/100)\ngrid()\ntitle(\"%s Counts %s\" % (ident, period_str))\nylabel(\"Total counts [1]\")\nif save:\n savefig('figures/%s_t_counts.png' % ident)",
"_____no_output_____"
]
],
[
[
"## Annual",
"_____no_output_____"
]
],
[
[
"counts = idata.reset_coords()[dpdvars].count('plev').resample(time='A').sum().to_dataframe()\nn = len(idata.hour.values)\nf, ax = subplots(n,1, sharex=True)\nax[0].set_title(\"%s Annual counts %s\" % (ident, period_str))\nfor i,ihour in enumerate(idata.hour.values):\n counts.xs(ihour, level=0).plot(grid=True, ax=ax[i], legend=True if i==0 else False)\n ax[i].set_ylabel(\"%02d Z\" % (ihour))\nax[i].set_xlabel('Years')\ntight_layout()\nif save:\n savefig('figures/%s_dpd_ancounts.png' % (ident))",
"_____no_output_____"
],
[
"counts = idata.reset_coords()[tvars].count('plev').resample(time='A').sum().to_dataframe()\nn = len(idata.hour.values)\nf, ax = subplots(n,1, sharex=True)\nax[0].set_title(\"%s Annual counts %s\" % (ident, period_str))\nfor i,ihour in enumerate(idata.hour.values):\n counts.xs(ihour, level=0).plot(grid=True, ax=ax[i], legend=True if i==0 else False)\n ax[i].set_ylabel(\"%02d Z\" % (ihour))\nax[i].set_xlabel('Years')\ntight_layout()\nif save:\n savefig('figures/%s_t_ancounts.png' % (ident))",
"_____no_output_____"
]
],
[
[
"# Dewpoint depression",
"_____no_output_____"
]
],
[
[
"obs = 'dpd_{}'.format(isrc)\nhdim = 'hour'\nfor ihour in idata[hdim].values:\n rt.plot.time.var(idata[obs].sel(**{hdim:ihour}), dim='time', lev='plev', \n title='%s %s Radiosonde at %02d Z' % (ident, obs, ihour))",
"_____no_output_____"
]
],
[
[
"# Temperature",
"_____no_output_____"
]
],
[
[
"obs = 't_{}'.format(isrc)\nhdim = 'hour'\nfor ihour in idata[hdim].values:\n rt.plot.time.var(idata[obs].sel(**{hdim:ihour}), dim='time', lev='plev', \n title='%s %s Radiosonde at %02d Z' % (ident, obs, ihour))",
"_____no_output_____"
]
],
[
[
"# Comparison with Reanalysis",
"_____no_output_____"
]
],
[
[
"dim = 'time'\nhdim = 'hour'\nlev = 'plev'\nivar = 'dpd'\nobs = '{}_{}'.format(ivar, isrc)\nplotvars = []\n#\n# Select Variables\n#\nfor jvar in list(idata.data_vars):\n if '_' in jvar:\n iname = jvar.split('_')[1]\n if jvar == \"%s_%s\" %(ivar, iname):\n plotvars += [jvar]\n\nprint(plotvars)\n#\n# Select Level\n#\nipres=10000\n#\n# Plot\n#\nylims = (np.round(idata[obs].min()), np.round(idata[obs].max()))\nfor i,j in idata[plotvars].groupby(hdim):\n m = j.sel(**{lev:ipres}).resample(**{dim:'M'}).mean(dim)\n f, ax = plt.subplots(figsize=(16,4))\n for jvar in plotvars:\n rt.plot.time.var(m[jvar], ax=ax, dim=dim, label=jvar.replace(ivar+'_',''))\n ax.set_ylabel(\"%s [%s]\" % (ivar, idata[jvar].attrs['units']))\n ax.set_xlabel('Time [M]')\n ax.set_title('%s %s Comparison %s %02dZ at %d hPa' %(ident, ivar, period_str, i, ipres/100))\n ax.legend(ncol=len(plotvars))\n ax.set_ylim(ylims)\n tight_layout()\n if save:\n savefig('figures/%s_%s_comparison_%04d_%02dZ.png' % (ident, ivar, ipres/100, i))",
"_____no_output_____"
]
],
[
[
"## Departures",
"_____no_output_____"
]
],
[
[
"dim = 'time'\nhdim = 'hour'\nlev = 'plev'\nivar = 'dpd'\nobs = '{}_{}'.format(ivar, isrc)\nplotvars = []\n#\n# Select Variables\n#\nfor jvar in list(idata.data_vars):\n if '_' in jvar:\n iname = jvar.split('_')[1]\n if jvar == \"%s_%s\" %(ivar, iname):\n plotvars += [jvar]\n\nprint(plotvars)\n#\n# Select Level\n#\nipres=30000\n#\n# Plot\n#\nylims = (-10,10) # Manual\nfor i,j in idata[plotvars].groupby(hdim):\n m = j.sel(**{lev:ipres}).resample(**{dim:'M'}).mean(dim)\n f, ax = plt.subplots(figsize=(16,4))\n for jvar in plotvars:\n if jvar == obs:\n continue\n rt.plot.time.var(m[obs] - m[jvar], ax=ax, dim=dim, label=jvar.replace(ivar+'_',''))\n ax.set_ylabel(\"%s [%s]\" % (ivar, idata[jvar].attrs['units']))\n ax.set_xlabel('Time [M]')\n ax.set_title('%s Departures %s (OBS-BG) %s %02dZ at %d hPa' %(ident, ivar, period_str, i, ipres/100))\n ax.legend(ncol=len(plotvars))\n ax.set_ylim(ylims)\n tight_layout()\n if save:\n savefig('figures/%s_%s_dep_%04d_%02dZ.png' % (ident, ivar, ipres/100, i))",
"_____no_output_____"
]
],
[
[
"# Adjustment Process",
"_____no_output_____"
]
],
[
[
"if expdata is None:\n #\n # Make Experiments \n #\n expdata = idata.copy()\nelse:\n expdata = expdata.sel(**{dim: period})",
"_____no_output_____"
]
],
[
[
"## SNHT",
"_____no_output_____"
]
],
[
[
"dim = 'time'\nhdim = 'hour'\nivar = 'dpd'\nobs = '{}_{}'.format(ivar, isrc)\nres = '{}_{}'.format(ivar, ires)\n#\n# Execute SNHT ?\n#\nif not '{}_snht'.format(obs) in expdata.data_vars:\n #\n # Calculate SNHT values with Parameters (window and missing)\n #\n expdata = rt.bp.snht(expdata, var=obs, dep=res, dim=dim, \n window=1460, \n missing=600, \n verbose=1)\n #\n # Apply Threshold (threshold) and detect Peaks\n # allowed distances between peaks (dist)\n # minimum requires significant levels (min_levels)\n #\n expdata = expdata.groupby(hdim).apply(rt.bp.apply_threshold,\n threshold=50,\n dist=730,\n min_levels=3,\n var=obs + '_snht',\n dim=dim)",
"_____no_output_____"
],
[
"#\n# Plot SNHT\n#\nfor i,j in expdata.groupby(hdim):\n ax = rt.plot.time.threshold(j[obs + '_snht'], dim=dim, lev=lev, logy=False, \n title=\" %s SNHT %s at %02dZ\" % (ident, period_str, i), \n figsize=(12,4), \n yticklabels=yplevs)\n rt.plot.time.breakpoints(j[obs + '_snht_breaks'], ax=ax, startend=True)\n tight_layout()\n if save:\n savefig('figures/%s_%s_snht_%s_%02dZ.png' % (ident, obs, ires, i))",
"_____no_output_____"
]
],
[
[
"## Breakpoints",
"_____no_output_____"
]
],
[
[
"#\n# Give Breakpoint Information \n#\nfor i,j in expdata.groupby(hdim):\n _=rt.bp.get_breakpoints(j[obs + '_snht_breaks'], dim=dim, verbose=1)",
"_____no_output_____"
]
],
[
[
"## Adjustments",
"_____no_output_____"
]
],
[
[
"dim = 'time'\nhdim = 'hour'\nivar = 'dpd'\nobs = '{}_{}'.format(ivar, isrc)\nres = '{}_{}'.format(ivar, ires)\n\n# plotvars = [i for i in expdata.data_vars if '_dep' in i]\nadjvars = \"{obs},{obs}_m,{obs}_q,{obs}_qa\".format(obs=obs)\nadjvars = adjvars.split(',')\nprint(adjvars)",
"_____no_output_____"
],
[
"missing = False\nfor jvar in adjvars:\n if jvar not in expdata.data_vars:\n missing = True\n",
"_____no_output_____"
]
],
[
[
"### Run standard adjustment process",
"_____no_output_____"
]
],
[
[
"if missing:\n from detect import run_standard\n expdata = run_standard(idata, obs, res, meanadj=True, qadj=True, qqadj=True, verbose=1)",
"_____no_output_____"
]
],
[
[
"## Breakpoint Stats",
"_____no_output_____"
]
],
[
[
"ipres=85000\n#\n# MEAN ADJ\n#\nbins = np.round(np.nanpercentile(np.ravel(expdata[obs].sel(**{lev:ipres}).values), [1,99]))\nbins = np.arange(bins[0]-2,bins[1]+2,1)\nfor i,j in expdata.groupby(hdim):\n rt.plot.breakpoints_histograms(j.sel(**{lev:ipres}), \n obs, '{}_m'.format(obs), '{}_snht_breaks'.format(obs),\n figsize=(18,8), \n other_var=res,\n bins=bins);\n if save:\n savefig('figures/%s_bhist_m_%s_%02dZ_%04dhPa.png' % (ident, ivar, i, ipres/100))",
"_____no_output_____"
],
[
"ipres=85000\n#\n# QUANTIL ADJ\n#\nbins = np.round(np.nanpercentile(np.ravel(expdata[obs].sel(**{lev:ipres}).values), [1,99]))\nbins = np.arange(bins[0]-2,bins[1]+2,1)\nfor i,j in expdata.groupby(hdim):\n rt.plot.breakpoints_histograms(j.sel(**{lev:ipres}), \n obs, '{}_q'.format(obs), '{}_snht_breaks'.format(obs),\n figsize=(18,8), \n other_var=res,\n bins=bins);\n if save:\n savefig('figures/%s_bhist_q_%s_%02dZ_%04dhPa.png' % (ident, ivar, i, ipres/100))",
"_____no_output_____"
],
[
"ipres=85000\n#\n# QUANTIL ADJ\n#\nbins = np.round(np.nanpercentile(np.ravel(expdata[obs].sel(**{lev:ipres}).values), [1,99]))\nbins = np.arange(bins[0]-2,bins[1]+2,1)\nfor i,j in expdata.groupby(hdim):\n rt.plot.breakpoints_histograms(j.sel(**{lev:ipres}), \n obs, '{}_qa'.format(obs), '{}_snht_breaks'.format(obs),\n figsize=(18,8), \n other_var=res,\n bins=bins);\n if save:\n savefig('figures/%s_bhist_qa_%s_%02dZ_%04dhPa.png' % (ident, ivar, i, ipres/100))",
"_____no_output_____"
]
],
[
[
"## Adjustment methods",
"_____no_output_____"
]
],
[
[
"bvar = '{}_snht_breaks'.format(obs)\n#\n# Select Level\n#\nipres=30000\n#\n# Plot\n#\nylims = np.round(np.nanpercentile(np.ravel(expdata[obs].sel(**{lev:ipres}).rolling(**{dim:30, 'center':True, 'min_periods':10}).mean().values), [1,99]))\nylims += [-2,2]\nfor i,j in expdata[adjvars].groupby(hdim):\n m = j.sel(**{lev:ipres}).rolling(**{dim:30, 'center':True, 'min_periods':10}).mean()\n f, ax = plt.subplots(figsize=(16,4))\n for jvar in adjvars:\n rt.plot.time.var(m[jvar], ax=ax, dim=dim, label=jvar[-1:].upper() if jvar != obs else ivar, ls='-' if jvar == obs else '--')\n if bvar in expdata.data_vars:\n rt.plot.time.breakpoints(expdata[bvar].sel(**{hdim:i}), ax=ax, color='k', lw=2, ls='--') \n ax.set_ylabel(\"%s [%s]\" % (ivar, expdata[jvar].attrs['units']))\n ax.set_xlabel('Time [M]')\n ax.set_title('%s Adjustments %s %s %02dZ at %d hPa' %(ident, ivar, period_str, i, ipres/100))\n ax.legend(ncol=len(plotvars))\n ax.set_ylim(ylims)\n tight_layout()\n if save:\n savefig('figures/%s_%s_adj_%04d_%02dZ.png' % (ident, ivar, ipres/100, i)) \n ",
"_____no_output_____"
]
],
[
[
"# Analysis",
"_____no_output_____"
]
],
[
[
"#\n# Monthly Means\n#\nvariables = list(unique(dpdvars + tvars + adjvars))\nfor jvar in variables[:]:\n if jvar not in expdata.data_vars:\n variables.remove(jvar)\nprint(variables)\nmdata = expdata[variables].resample(**{dim:'M'}).mean(keep_attrs=True)",
"_____no_output_____"
]
],
[
[
"## Trends",
"_____no_output_____"
]
],
[
[
"trends = rt.met.time.trend(mdata, period=period, dim=dim, only_slopes=True)\nwith xr.set_options(keep_attrs=True):\n trends = trends*3650. # Trends per Decade\nfor jvar in trends.data_vars:\n trends[jvar].attrs['units'] = trends[jvar].attrs['units'].replace('day','decade')",
"_____no_output_____"
],
[
"xlims = (np.round(trends.min().to_array().min()), np.round(trends.max().to_array().max()))\nn = mdata[hdim].size\nf,ax = rt.plot.init_fig_horizontal(n=n, ratios=tuple([2]*n), sharey=True)\nfor i, ihour in enumerate(trends[hdim].values):\n for jvar in variables:\n rt.plot.profile.var(trends[jvar].sel(**{hdim:ihour}), ax=ax[i], label=jvar[-1:].upper() if jvar != obs else ivar)\n ax[i].set_title('%02d' % ihour)\n ax[i].set_xlim(xlims)\n ax[i].set_xlabel(\"%s [%s]\" % (mdata[obs].attrs['standard_name'], trends[jvar].attrs['units']))\n \nf.suptitle('%s %s Trends %s' % (ident, ivar.upper(), period_str))\nif save:\n savefig('figures/%s_trends_%s.png' % (ident, ivar))",
"_____no_output_____"
]
],
[
[
"## Statistics",
"_____no_output_____"
]
],
[
[
"from detect import skills_table",
"_____no_output_____"
],
[
"for jvar in mdata.data_vars:\n if jvar == obs or jvar == res:\n continue\n _ , ytable = skills_table(mdata[obs], mdata[res], mdata[jvar])\n print(\"#\"*50)\n print(ident, obs, res, jvar)\n print(ytable)\n print(\"#\"*50)",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
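The sonde template above flags breakpoints by thresholding `rt.bp.snht` output at 50 over a 1460-sample window, but the statistic itself lives in rasotools and is not shown here. For orientation, a minimal sketch of the classic Standard Normal Homogeneity Test statistic (Alexandersson, 1986) on a one-dimensional series — an illustration of the idea, not the rasotools implementation:

```python
import numpy as np

def snht_statistic(series):
    """T(k) = k*z1**2 + (n-k)*z2**2 on the standardized series."""
    x = np.asarray(series, dtype=float)
    n = x.size
    z = (x - np.nanmean(x)) / np.nanstd(x)
    t = np.full(n, np.nan)
    for k in range(1, n):
        z1 = np.nanmean(z[:k])   # mean of the standardized series before the candidate break
        z2 = np.nanmean(z[k:])   # mean after the candidate break
        t[k] = k * z1**2 + (n - k) * z2**2
    return t  # peaks above the threshold (50 in the notebook) mark candidate breakpoints
```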
cbfb000addb6d5fae0d221781e478853b90799d3
| 2,741 |
ipynb
|
Jupyter Notebook
|
hydrometer.ipynb
|
adam-funk/pytilt
|
3890179b618759831932a7f3e99bc5ba48166919
|
[
"MIT"
] | null | null | null |
hydrometer.ipynb
|
adam-funk/pytilt
|
3890179b618759831932a7f3e99bc5ba48166919
|
[
"MIT"
] | null | null | null |
hydrometer.ipynb
|
adam-funk/pytilt
|
3890179b618759831932a7f3e99bc5ba48166919
|
[
"MIT"
] | null | null | null | 20.303704 | 104 | 0.485589 |
[
[
[
"from collections import defaultdict\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport numpy as np",
"_____no_output_____"
],
[
"# ignore NaN\n\ndef meanr(x):\n return round(np.nanmean(x), 1)\n\ndef medianr(x):\n return round(np.nanmedian(x), 1)",
"_____no_output_____"
],
[
"data = pd.read_csv('data/hydrometer.csv', names=['color', 'epoch', 'iso', 'sg', 'c', 'f', 'n'],\n index_col='epoch')\ndata['time'] = pd.to_datetime(data['iso'])\ndata['date'] = data['time'].dt.date\ndata['c'] = round(data['c'], 1)\n\ndate_data = data.groupby('date').agg({'sg':['min', meanr, medianr, 'max'], \n 'c':['min', meanr, medianr,'max']})",
"_____no_output_____"
],
[
"data.info()",
"_____no_output_____"
],
[
"plt.rcParams[\"figure.figsize\"] = (15,6)\nplt.grid()\nplt.plot(data['time'], data['sg'])",
"_____no_output_____"
],
[
"plt.grid()\nplt.plot(data['time'], data['c'])",
"_____no_output_____"
],
[
"date_data",
"_____no_output_____"
],
[
"plt.grid()\nplt.plot(date_data.index, date_data['sg'])",
"_____no_output_____"
],
[
"plt.grid()\nplt.plot(date_data.index, date_data['c'])",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
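The hydrometer notebook aggregates through a derived 'date' column and groupby; an equivalent formulation (a sketch, assuming the 'time' column parsed cleanly above) uses a datetime index and `resample`, reusing the NaN-aware helpers defined in the notebook:

```python
# Alternative daily aggregation via resample (same statistics as the groupby version)
daily = (data.set_index("time")[["sg", "c"]]
             .resample("D")
             .agg(["min", meanr, medianr, "max"]))
print(daily.head())
```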
cbfb0d87e1202b38c9e9a1d519622957ec3526a3
| 30,649 |
ipynb
|
Jupyter Notebook
|
Pyramide.ipynb
|
bassschlussel/pyramiden
|
b67d89641f95d2e2501ad0d8f8849d4bf15093a7
|
[
"Apache-2.0"
] | null | null | null |
Pyramide.ipynb
|
bassschlussel/pyramiden
|
b67d89641f95d2e2501ad0d8f8849d4bf15093a7
|
[
"Apache-2.0"
] | null | null | null |
Pyramide.ipynb
|
bassschlussel/pyramiden
|
b67d89641f95d2e2501ad0d8f8849d4bf15093a7
|
[
"Apache-2.0"
] | null | null | null | 45.676602 | 187 | 0.184802 |
[
[
[
"# Rechenpyramiden\n\nDie Zellen werden ausgefรผhrt mit gelichzeitigem drรผcken von 'Shift'+'Enter' </br>\nFรผhre als erstes die Zelle unten aus damit der Pyramidengenerator parat ist. </br>\n\nDie Funktion Pyramide() erzeugt die Pyramide und die Lรถsung: </br>\n\n`Pyramide(7)` </br>\n=> erzeugt eine Pyramide mit 7 zufรคlligen Basiszahlen wobei die grรถsste Basiszahl 2 ist. </br>\n\n`Pyramide(9,1)` </br>\n=> erzeugt eine Pyramide mit 9 zufรคlligen Basiszahlen und sowohl Addition als auch Subtraktion in der Pyramide. </br>\n\n`Pyramide(9,7)` </br>\n=> erzeugt eine Pyramide die weniger Zahlen zeigt und damit schwieriger zu lรถsen ist. </br>\n\n`Pyramide([1,2,1,2,3,2,1,2,0,3,2,3])` </br>\n=> erzeugt eine Pyramide mit den Basiszahlen: 1, 2, 1, 2, 3, 2, 1, 2, 0, 3, 2, 3 und aussschliesslich Addition. </br>\n\n`Pyramide([11,2])` </br>\n=> erzeugt eine Pyramide mit 11 zufรคlligen Basiszahlen wobei die grรถsste Basiszahl 2 sein darf. </br>\n\n`Pyramide([Hรถhe,MaxBasis],Schwierigkeit)` </br>\n=> erzeugt eine Pyramide mit `Hรถhe` zufรคlligen Basiszahlen wobei die grรถsste Basiszahl `MaxBasis` sein darf und die `Schwierigkeit` zwischen 0 und `MaxBasis` liegen sollte. </br>\n\n#### Es gibt keine Garantie dass jede Pyramide lรถsbar ist. Insbesondere wenn die Schwierigkeit nahe bei MaxBasis liegt.\n#### Alle Eingaben sind ganze positive Zahlen\n",
"_____no_output_____"
]
],
[
[
"# Diese Zelle mit `Shift`+`Enter` als erstes ausfรผhren\n\nfrom random import randint as randd\ndef cre_pyramid(numbers):\n if type(numbers) is list:\n if len(numbers) == 1:\n n = numbers[0]+1\n nmax = 2\n numbers = [randd(0,nmax-1) for ii in range(n)]\n elif len(numbers) == 2:\n n = numbers[0]+1\n nmax = numbers[1]\n numbers = [randd(0,nmax-1) for ii in range(n)]\n else:\n n = len(numbers)\n nmax = max(numbers)+1\n elif type(numbers) is int:\n n = numbers+1\n nmax = 2 \n numbers = [randd(0,nmax-1) for ii in range(n)]\n else:\n return None\n \n pyramid = [numbers]\n for ii in range(n-1):\n pyramid.append([numbers[iii]+numbers[iii+1] for iii in range(n-ii-1)])\n numbers = pyramid[-1]\n return pyramid\n\ndef cancalc(holes,level,index):\n # check calc from below\n if level > 0 and index < len(holes[level-1]) and bool(holes[level-1][index]*holes[level-1][index+1]):\n return True\n #check left\n elif index > 0 and level < len(holes)-1 and bool(holes[level][index-1]*holes[level+1][index-1]):\n return True\n elif level < len(holes)-1 and index < len(holes[level])-1 and bool(holes[level][index+1]*holes[level+1][index]):\n return True\n else:\n return False\n \ndef pri_solution(py):\n center = round((len(str(py[0]))+len(py[0])-1)/2)+1\n n = len(py)\n for ii in range(n):\n nwidth =len(str(py[n-ii-1]))+len(py[n-ii-1])+1\n start = center - round(nwidth/2-0.5)\n print(start*' '+nwidth*'-')\n ss = '| '\n for jj in range(len(py[n-ii-1])):\n ss += str(py[n-ii-1][jj]) +' | '\n print(start*' '+ss)\n\ndef rxy(holes):\n level = randd(0,len(holes)-1)\n index = randd(0,len(holes[level])-1)\n return level,index\n \ndef pri_pyramid(py,sol = False, hardness = 0):\n holes = [[1 for ii in jj] for jj in py]\n n = len(py)\n if not sol:\n if hardness > 0:\n for ii in range(800):\n level,index=rxy(holes)\n if cancalc(holes,level,index):\n holes[level][index] = 0\n for hard in range(hardness):\n for level in [ss+hard*2 for ss in range(n-hard*2)]:\n if bool(randd(0,1)):\n for index in range(len(py[level])):\n if cancalc(holes,level,len(py[level])-index-1):\n holes[level][len(py[level])-index-1] = 0\n else:\n for index in range(len(py[level])):\n if cancalc(holes,level,index):\n holes[level][index] = 0\n for level in [ss+hard for ss in range(n-hard)]:\n for index in range(len(py[level])-1):\n if bool(holes[level][index]*holes[level][index]):\n if level < n and holes[level+1][index] == 0:\n holes[level+1][index] = 1\n holes[level][index] = 0\n for level in [ss+hard for ss in range(n-hard)]:\n if bool(randd(0,1)):\n for index in range(len(py[level])):\n if cancalc(holes,level,len(py[level])-index-1):\n holes[level][len(py[level])-index-1] = 0\n else:\n for index in range(len(py[level])):\n if cancalc(holes,level,index):\n holes[level][index] = 0\n else:\n holes[1:] = [[0 for ii in jj] for jj in py[1:]]\n \n center = len(py[0])*3+1\n ll = 0\n for ii in range(n):\n ss = '| '\n for jj in range(len(py[n-ii-1])):\n if holes[n-ii-1][jj] == 1:\n sadd = str(py[n-ii-1][jj])\n if len(sadd) == 2:\n sadd = ' '+sadd\n elif len(sadd) == 1:\n sadd = ' '+sadd+' '\n else:\n sadd = max(len(str(py[n-ii-1][jj])),3)*' '\n ss += sadd +' | '\n dw = ll+3-len(ss)\n for oo in range(dw):\n pos = round((oo+1)*len(ss)/(dw+1))\n while ss[pos-1].isdigit() and ss[pos].isdigit():\n pos += -1\n ss = ss[:pos]+' '+ss[pos:]\n nwidth =len(ss)-1\n start = round(center - nwidth/2+0.5)\n print(start*' '+nwidth*'-')\n print(start*' '+ss)\n ll = nwidth\n print(start*' '+nwidth*'-')\n\ndef Pyramide(py, hardness = 0):\n pyramid = cre_pyramid(py)\n print()\n print()\n 
print()\n pri_pyramid(pyramid,False,hardness) \n print()\n print()\n pri_pyramid(pyramid,True,hardness) \n print()\n print()\n",
"_____no_output_____"
],
[
"Pyramide(7)",
"\n\n\n -------\n | | \n -------------\n | | | \n -------------------\n | | | | \n -------------------------\n | | | | | \n -------------------------------\n | | | | | | \n -------------------------------------\n | | | | | | | \n -------------------------------------------\n | | | | | | | | \n -------------------------------------------------\n | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | \n -------------------------------------------------\n\n\n -------\n | 78 | \n -------------\n | 42 | 36 | \n -------------------\n | 22 | 20 | 16 | \n -------------------------\n | 11 | 11 | 9 | 7 | \n -------------------------------\n | 5 | 6 | 5 | 4 | 3 | \n -------------------------------------\n | 2 | 3 | 3 | 2 | 2 | 1 | \n -------------------------------------------\n | 1 | 1 | 2 | 1 | 1 | 1 | 0 | \n -------------------------------------------------\n | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 0 | \n -------------------------------------------------\n\n\n"
],
[
"Pyramide(9,1)",
"\n\n\n -------\n | 335 | \n -------------\n | | 164 | \n -------------------\n | 86 | | 79 | \n -------------------------\n | | 43 | | 37 | \n -------------------------------\n | 22 | | | 20 | | \n -------------------------------------\n | 12 | | | | | 8 | \n -------------------------------------------\n | | | 5 | 6 | 5 | | 4 | \n -------------------------------------------------\n | 4 | 3 | | 3 | | 2 | 2 | | \n -------------------------------------------------------\n | 2 | | | | | 1 | | | 1 | \n -------------------------------------------------------------\n | | | 1 | | | | | | | 1 | \n -------------------------------------------------------------\n\n\n -------\n | 335 | \n -------------\n | 171 | 164 | \n -------------------\n | 86 | 85 | 79 | \n -------------------------\n | 43 | 43 | 42 | 37 | \n -------------------------------\n | 22 | 21 | 22 | 20 | 17 | \n -------------------------------------\n | 12 | 10 | 11 | 11 | 9 | 8 | \n -------------------------------------------\n | 7 | 5 | 5 | 6 | 5 | 4 | 4 | \n -------------------------------------------------\n | 4 | 3 | 2 | 3 | 3 | 2 | 2 | 2 | \n -------------------------------------------------------\n | 2 | 2 | 1 | 1 | 2 | 1 | 1 | 1 | 1 | \n -------------------------------------------------------------\n | 1 | 1 | 1 | 0 | 1 | 1 | 0 | 1 | 0 | 1 | \n -------------------------------------------------------------\n\n\n"
],
[
"Pyramide(9,7)",
"\n\n\n -------\n | 131 | \n -------------\n | 45 | | \n -------------------\n | 16 | | 57 | \n -------------------------\n | | | | 36 | \n -------------------------------\n | | | | 15 | | \n -------------------------------------\n | | 1 | | 5 | | 11 | \n -------------------------------------------\n | | | | | | 6 | | \n -------------------------------------------------\n | | | | | | | | 2 | \n -------------------------------------------------------\n | | | | | | | | 1 | | \n -------------------------------------------------------------\n | | | | 0 | | 0 | | | | 1 | \n -------------------------------------------------------------\n\n\n -------\n | 131 | \n -------------\n | 45 | 86 | \n -------------------\n | 16 | 29 | 57 | \n -------------------------\n | 8 | 8 | 21 | 36 | \n -------------------------------\n | 6 | 2 | 6 | 15 | 21 | \n -------------------------------------\n | 5 | 1 | 1 | 5 | 10 | 11 | \n -------------------------------------------\n | 4 | 1 | 0 | 1 | 4 | 6 | 5 | \n -------------------------------------------------\n | 3 | 1 | 0 | 0 | 1 | 3 | 3 | 2 | \n -------------------------------------------------------\n | 2 | 1 | 0 | 0 | 0 | 1 | 2 | 1 | 1 | \n -------------------------------------------------------------\n | 1 | 1 | 0 | 0 | 0 | 0 | 1 | 1 | 0 | 1 | \n -------------------------------------------------------------\n\n\n"
],
[
"Pyramide([1,2,1,2,3,2,1,2,0,3,2,3])",
"\n\n\n --------\n | | \n ---------------\n | | | \n --------------------\n | | | | \n -------------------------\n | | | | | \n -------------------------------\n | | | | | | \n -------------------------------------\n | | | | | | | \n -------------------------------------------\n | | | | | | | | \n -------------------------------------------------\n | | | | | | | | | \n -------------------------------------------------------\n | | | | | | | | | | \n -------------------------------------------------------------\n | | | | | | | | | | | \n -------------------------------------------------------------------\n | | | | | | | | | | | | \n -------------------------------------------------------------------------\n | 1 | 2 | 1 | 2 | 3 | 2 | 1 | 2 | 0 | 3 | 2 | 3 | \n -------------------------------------------------------------------------\n\n\n --------\n | 3634 | \n ---------------\n | 1922 | 1712 | \n --------------------\n | 1012 | 910 | 802 | \n -------------------------\n | 523 | 489 | 421 | 381 | \n -------------------------------\n | 262 | 261 | 228 | 193 | 188 | \n -------------------------------------\n | 126 | 136 | 125 | 103 | 90 | 98 | \n -------------------------------------------\n | 58 | 68 | 68 | 57 | 46 | 44 | 54 | \n -------------------------------------------------\n | 26 | 32 | 36 | 32 | 25 | 21 | 23 | 31 | \n -------------------------------------------------------\n | 12 | 14 | 18 | 18 | 14 | 11 | 10 | 13 | 18 | \n -------------------------------------------------------------\n | 6 | 6 | 8 | 10 | 8 | 6 | 5 | 5 | 8 | 10 | \n -------------------------------------------------------------------\n | 3 | 3 | 3 | 5 | 5 | 3 | 3 | 2 | 3 | 5 | 5 | \n -------------------------------------------------------------------------\n | 1 | 2 | 1 | 2 | 3 | 2 | 1 | 2 | 0 | 3 | 2 | 3 | \n -------------------------------------------------------------------------\n\n\n"
],
[
"Pyramide([1,2,1,2,3,2,1,2,0,3,2,3],1)",
"\n\n\n --------\n | 3634 | \n ---------------\n | 1922 | | \n --------------------\n | | | 802 | \n -------------------------\n | 523 | 489 | 421 | | \n -------------------------------\n | 262 | | | | 188 | \n -------------------------------------\n | | | 125 | 103 | | 98 | \n -------------------------------------------\n | 58 | 68 | | 57 | | | 54 | \n -------------------------------------------------\n | | | 36 | | | | | 31 | \n -------------------------------------------------------\n | 12 | | | | 14 | | | 13 | | \n -------------------------------------------------------------\n | | 6 | 8 | 10 | 8 | | | | 8 | | \n -------------------------------------------------------------------\n | 3 | | 3 | | | | 3 | 2 | | | 5 | \n -------------------------------------------------------------------------\n | | | | 2 | | | | | | | | 3 | \n -------------------------------------------------------------------------\n\n\n --------\n | 3634 | \n ---------------\n | 1922 | 1712 | \n --------------------\n | 1012 | 910 | 802 | \n -------------------------\n | 523 | 489 | 421 | 381 | \n -------------------------------\n | 262 | 261 | 228 | 193 | 188 | \n -------------------------------------\n | 126 | 136 | 125 | 103 | 90 | 98 | \n -------------------------------------------\n | 58 | 68 | 68 | 57 | 46 | 44 | 54 | \n -------------------------------------------------\n | 26 | 32 | 36 | 32 | 25 | 21 | 23 | 31 | \n -------------------------------------------------------\n | 12 | 14 | 18 | 18 | 14 | 11 | 10 | 13 | 18 | \n -------------------------------------------------------------\n | 6 | 6 | 8 | 10 | 8 | 6 | 5 | 5 | 8 | 10 | \n -------------------------------------------------------------------\n | 3 | 3 | 3 | 5 | 5 | 3 | 3 | 2 | 3 | 5 | 5 | \n -------------------------------------------------------------------------\n | 1 | 2 | 1 | 2 | 3 | 2 | 1 | 2 | 0 | 3 | 2 | 3 | \n -------------------------------------------------------------------------\n\n\n"
],
[
"Pyramide([1,2,1,2,3,2,1,2,0,3,2,3],8)",
"\n\n\n --------\n | 3634 | \n ---------------\n | 1922 | | \n --------------------\n | | | 802 | \n -------------------------\n | | | | 381 | \n -------------------------------\n | | | | 193 | | \n -------------------------------------\n | | | | | | 98 | \n -------------------------------------------\n | | | | | | | 54 | \n -------------------------------------------------\n | | | | | | | | 31 | \n -------------------------------------------------------\n | | | | | | | | 13 | | \n -------------------------------------------------------------\n | | | | | | | | | | | \n -------------------------------------------------------------------\n | 3 | | | 5 | | | | | 3 | | 5 | \n -------------------------------------------------------------------------\n | 1 | | | 2 | | | 1 | | | | 2 | | \n -------------------------------------------------------------------------\n\n\n --------\n | 3634 | \n ---------------\n | 1922 | 1712 | \n --------------------\n | 1012 | 910 | 802 | \n -------------------------\n | 523 | 489 | 421 | 381 | \n -------------------------------\n | 262 | 261 | 228 | 193 | 188 | \n -------------------------------------\n | 126 | 136 | 125 | 103 | 90 | 98 | \n -------------------------------------------\n | 58 | 68 | 68 | 57 | 46 | 44 | 54 | \n -------------------------------------------------\n | 26 | 32 | 36 | 32 | 25 | 21 | 23 | 31 | \n -------------------------------------------------------\n | 12 | 14 | 18 | 18 | 14 | 11 | 10 | 13 | 18 | \n -------------------------------------------------------------\n | 6 | 6 | 8 | 10 | 8 | 6 | 5 | 5 | 8 | 10 | \n -------------------------------------------------------------------\n | 3 | 3 | 3 | 5 | 5 | 3 | 3 | 2 | 3 | 5 | 5 | \n -------------------------------------------------------------------------\n | 1 | 2 | 1 | 2 | 3 | 2 | 1 | 2 | 0 | 3 | 2 | 3 | \n -------------------------------------------------------------------------\n\n\n"
],
[
"Pyramide([9,3],3)",
"\n\n\n -------\n | 473 | \n -------------\n | | 242 | \n -------------------\n | | 115 | | \n -------------------------\n | | | | 68 | \n -------------------------------\n | 32 | | 28 | | | \n -------------------------------------\n | | | | 14 | | 20 | \n -------------------------------------------\n | | | | | | 10 | | \n -------------------------------------------------\n | | | | | | | | | \n -------------------------------------------------------\n | | | 1 | | | | 3 | | 1 | \n -------------------------------------------------------------\n | | | | | | | | | | 0 | \n -------------------------------------------------------------\n\n\n -------\n | 473 | \n -------------\n | 231 | 242 | \n -------------------\n | 116 | 115 | 127 | \n -------------------------\n | 60 | 56 | 59 | 68 | \n -------------------------------\n | 32 | 28 | 28 | 31 | 37 | \n -------------------------------------\n | 18 | 14 | 14 | 14 | 17 | 20 | \n -------------------------------------------\n | 11 | 7 | 7 | 7 | 7 | 10 | 10 | \n -------------------------------------------------\n | 7 | 4 | 3 | 4 | 3 | 4 | 6 | 4 | \n -------------------------------------------------------\n | 4 | 3 | 1 | 2 | 2 | 1 | 3 | 3 | 1 | \n -------------------------------------------------------------\n | 2 | 2 | 1 | 0 | 2 | 0 | 1 | 2 | 1 | 0 | \n -------------------------------------------------------------\n\n\n"
]
]
] |
[
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
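The pyramid generator above builds each row bottom-up: every cell is the sum of the two cells beneath it. A stripped-down sketch of just that construction step (mirroring `cre_pyramid`, without the random base numbers or the hole-punching logic):

```python
def build_pyramid(base):
    """Return rows from the base upward; each entry sums its two children."""
    rows = [list(base)]
    while len(rows[-1]) > 1:
        prev = rows[-1]
        rows.append([prev[i] + prev[i + 1] for i in range(len(prev) - 1)])
    return rows

# build_pyramid([1, 0, 1, 1, 0, 1, 0, 0])[-1] == [78],
# matching the solved Pyramide(7) example shown above.
```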
cbfb1330a6efe8eb87a8bb291e7c18901cdbcc5a
| 2,951 |
ipynb
|
Jupyter Notebook
|
docs/tutorials/plotting/pairplot.ipynb
|
PerifanosPrometheus/graspologic
|
f874b9a19b4f1c117a09ca703310291a9e152334
|
[
"MIT"
] | null | null | null |
docs/tutorials/plotting/pairplot.ipynb
|
PerifanosPrometheus/graspologic
|
f874b9a19b4f1c117a09ca703310291a9e152334
|
[
"MIT"
] | null | null | null |
docs/tutorials/plotting/pairplot.ipynb
|
PerifanosPrometheus/graspologic
|
f874b9a19b4f1c117a09ca703310291a9e152334
|
[
"MIT"
] | null | null | null | 23.054688 | 138 | 0.534056 |
[
[
[
"# Pairplot: Visualizing High Dimensional Data\n\nThis example provides how to visualize high dimensional data using the pairplot.",
"_____no_output_____"
]
],
[
[
"import graspologic\n\nimport numpy as np\n%matplotlib inline",
"_____no_output_____"
]
],
[
[
"## Simulate a binary graph using stochastic block model\nThe 3-block model is defined as below:\n\n\\begin{align*}\nn &= [50, 50, 50]\\\\\nP &= \n\\begin{bmatrix}0.5 & 0.1 & 0.05 \\\\\n0.1 & 0.4 & 0.15 \\\\\n0.05 & 0.15 & 0.3\n\\end{bmatrix}\n\\end{align*}\n\nThus, the first 50 vertices belong to block 1, the second 50 vertices belong to block 2, and the last 50 vertices belong to block 3.",
"_____no_output_____"
]
],
[
[
"from graspologic.simulations import sbm\n\nn_communities = [50, 50, 50]\np = [[0.5, 0.1, 0.05], \n [0.1, 0.4, 0.15], \n [0.05, 0.15, 0.3],]\n\nnp.random.seed(2)\nA = sbm(n_communities, p)",
"_____no_output_____"
]
],
[
[
"## Embed using adjacency spectral embedding to obtain lower dimensional representation of the graph\n\nThe embedding dimension is automatically chosen. It should embed to 3 dimensions.",
"_____no_output_____"
]
],
[
[
"from graspologic.embed import AdjacencySpectralEmbed\n\nase = AdjacencySpectralEmbed()\nX = ase.fit_transform(A)\n\nprint(X.shape)",
"_____no_output_____"
]
],
[
[
"## Use pairplot to plot the embedded data\n\nFirst we generate labels that correspond to blocks. We pass the labels along with the data for pair plot.",
"_____no_output_____"
]
],
[
[
"from graspologic.plot import pairplot\n\nlabels = ['Block 1'] * 50 + ['Block 2'] * 50 + ['Block 3'] * 50\n\nplot = pairplot(X, labels)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
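As a quick sanity check on the SBM simulation in the pairplot notebook (a sketch, not part of the original), the empirical edge density of each block pair should approximate the corresponding entry of P; same-block densities come out slightly low because the simulated graph has no self-loops:

```python
import numpy as np

labels_int = np.repeat([0, 1, 2], 50)  # block membership used in the notebook
for i in range(3):
    for j in range(3):
        sub = A[np.ix_(labels_int == i, labels_int == j)]
        print(f"block ({i},{j}): empirical density {sub.mean():.3f} vs p = {p[i][j]}")
```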
cbfb18ffc0691494426615e61a56241b83caa63a
| 65,288 |
ipynb
|
Jupyter Notebook
|
projects/causal_scene_generation/causal_model/game_characters/GameCharacter_ImageClassification.ipynb
|
afurmanov/causalML
|
d98b9fd8fd111cf032fdde72a76b295e4660e26a
|
[
"MIT"
] | 354 |
2018-12-21T15:20:21.000Z
|
2021-01-02T14:48:51.000Z
|
projects/causal_scene_generation/causal_model/game_characters/GameCharacter_ImageClassification.ipynb
|
afurmanov/causalML
|
d98b9fd8fd111cf032fdde72a76b295e4660e26a
|
[
"MIT"
] | 5 |
2021-04-15T20:38:12.000Z
|
2022-03-12T00:52:29.000Z
|
projects/causal_scene_generation/causal_model/game_characters/GameCharacter_ImageClassification.ipynb
|
afurmanov/causalML
|
d98b9fd8fd111cf032fdde72a76b295e4660e26a
|
[
"MIT"
] | 112 |
2019-05-21T22:10:43.000Z
|
2020-12-29T05:52:07.000Z
| 73.688488 | 34,182 | 0.7413 |
[
[
[
"<a href=\"https://colab.research.google.com/github/mancunian1792/causal_scene_generation/blob/master/causal_model/game_characters/GameCharacter_ImageClassification.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"from google.colab import drive\n\ndrive.mount(\"/content/gdrive\", force_remount=True)\n\n",
"Go to this URL in a browser: https://accounts.google.com/o/oauth2/auth?client_id=947318989803-6bn6qk8qdgf4n4g3pfee6491hc0brc4i.apps.googleusercontent.com&redirect_uri=urn%3aietf%3awg%3aoauth%3a2.0%3aoob&response_type=code&scope=email%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdocs.test%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive%20https%3a%2f%2fwww.googleapis.com%2fauth%2fdrive.photos.readonly%20https%3a%2f%2fwww.googleapis.com%2fauth%2fpeopleapi.readonly\n\nEnter your authorization code:\nยทยทยทยทยทยทยทยทยทยท\nMounted at /content/gdrive\n"
],
[
"import keras\nfrom keras.models import Sequential\nfrom keras.layers import Dense, Dropout, Flatten\nfrom keras.layers import Conv2D, MaxPooling2D\nfrom keras.utils import to_categorical\nfrom keras.preprocessing import image\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nfrom sklearn.model_selection import train_test_split\nfrom keras.utils import to_categorical\nfrom tqdm import tqdm\nfrom skimage.transform import rotate\nfrom skimage.util import random_noise\nfrom skimage.filters import gaussian",
"Using TensorFlow backend.\n"
],
[
"root_path = 'gdrive/My Drive/causal_scene_generation/game_characters/'",
"_____no_output_____"
],
[
"train_path = root_path + 'train/'\ntest_path = root_path + 'test/'\n\ntrain_images = train_path + 'images/'\ntest_images = test_path + 'images/'\ntrain_csv = train_path + 'train.csv'\ntest_csv = test_path + 'test.csv'",
"_____no_output_____"
],
[
"def preprocess(imgPath, filePath):\n images = []\n # Transform each image in the imgPath and add it to the input array\n data = pd.read_csv(filePath)\n\n for imgFile in tqdm(data[\"filename\"]):\n imgFullPath = imgPath + imgFile + \".png\"\n img = image.load_img(imgFullPath, target_size=(400,400,3), grayscale=False)\n img = image.img_to_array(img)\n img = img/255\n images.append(img)\n features = np.array(images)\n\n # Get the labels for each \n target = data.drop([\"filename\"], axis=1)\n return features, target\n",
"_____no_output_____"
],
[
"def augmentData(features, target):\n augmented_features = []\n augmented_target = []\n for idx in tqdm(range(features.shape[0])):\n augmented_features.append(features[idx])\n augmented_features.append(rotate(features[idx], angle=45, mode = 'wrap'))\n augmented_features.append(np.fliplr(features[idx]))\n augmented_features.append(np.flipud(features[idx]))\n augmented_features.append(random_noise(features[idx],var=0.2**2))\n for i in range(5):\n augmented_target.append(target.iloc[idx, :])\n return np.asarray(augmented_features), pd.DataFrame(augmented_target, columns= target.columns)",
"_____no_output_____"
],
[
"x_train, y_train = preprocess(train_images, train_csv)",
"100%|โโโโโโโโโโ| 345/345 [01:21<00:00, 4.25it/s]\n"
],
[
"x_train_augment, y_train_augment = augmentData(x_train, y_train)\ndel x_train, y_train",
"100%|โโโโโโโโโโ| 345/345 [00:18<00:00, 19.13it/s]\n"
],
[
"x_test, y_test = preprocess(test_images, test_csv)",
"100%|โโโโโโโโโโ| 87/87 [00:17<00:00, 5.05it/s]\n"
],
[
"x_test, x_validate, y_test, y_validate = train_test_split(x_test, y_test, random_state = 3000, test_size = 0.2)",
"_____no_output_____"
],
[
"plt.imshow(x_validate[2])",
"_____no_output_____"
],
[
"# Size of vector is 64 * 64 * 3 -> resize ((64 *64*3), 1)\n# (/255 )\n\n# Convert to grayscale.-> \n\n# The output shape \nop_shape = y_train_augment.shape[1]",
"_____no_output_____"
],
[
"model = Sequential()\nmodel.add(Conv2D(filters=16, kernel_size=(5, 5), activation=\"relu\", input_shape=(400,400,3)))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=32, kernel_size=(10, 10), activation='relu'))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=64, kernel_size=(10, 10), activation=\"relu\"))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=64, kernel_size=(5, 5), activation='relu'))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Flatten())\nmodel.add(Dense(128, activation='relu'))\nmodel.add(Dropout(0.5))\nmodel.add(Dense(64, activation='relu'))\nmodel.add(Dropout(0.5))\nmodel.add(Dense(op_shape, activation='sigmoid'))",
"_____no_output_____"
],
[
"model.summary()",
"Model: \"sequential_1\"\n_________________________________________________________________\nLayer (type) Output Shape Param # \n=================================================================\nconv2d_1 (Conv2D) (None, 396, 396, 16) 1216 \n_________________________________________________________________\nmax_pooling2d_1 (MaxPooling2 (None, 198, 198, 16) 0 \n_________________________________________________________________\ndropout_1 (Dropout) (None, 198, 198, 16) 0 \n_________________________________________________________________\nconv2d_2 (Conv2D) (None, 189, 189, 32) 51232 \n_________________________________________________________________\nmax_pooling2d_2 (MaxPooling2 (None, 94, 94, 32) 0 \n_________________________________________________________________\ndropout_2 (Dropout) (None, 94, 94, 32) 0 \n_________________________________________________________________\nconv2d_3 (Conv2D) (None, 85, 85, 64) 204864 \n_________________________________________________________________\nmax_pooling2d_3 (MaxPooling2 (None, 42, 42, 64) 0 \n_________________________________________________________________\ndropout_3 (Dropout) (None, 42, 42, 64) 0 \n_________________________________________________________________\nconv2d_4 (Conv2D) (None, 38, 38, 64) 102464 \n_________________________________________________________________\nmax_pooling2d_4 (MaxPooling2 (None, 19, 19, 64) 0 \n_________________________________________________________________\ndropout_4 (Dropout) (None, 19, 19, 64) 0 \n_________________________________________________________________\nflatten_1 (Flatten) (None, 23104) 0 \n_________________________________________________________________\ndense_1 (Dense) (None, 128) 2957440 \n_________________________________________________________________\ndropout_5 (Dropout) (None, 128) 0 \n_________________________________________________________________\ndense_2 (Dense) (None, 64) 8256 \n_________________________________________________________________\ndropout_6 (Dropout) (None, 64) 0 \n_________________________________________________________________\ndense_3 (Dense) (None, 23) 1495 \n=================================================================\nTotal params: 3,326,967\nTrainable params: 3,326,967\nNon-trainable params: 0\n_________________________________________________________________\n"
],
[
"model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])",
"_____no_output_____"
],
[
"model.fit(x_train_augment, y_train_augment, epochs=10, validation_data=(x_test, y_test), batch_size=64)",
"Train on 1725 samples, validate on 69 samples\nEpoch 1/10\n1725/1725 [==============================] - 1190s 690ms/step - loss: 0.5421 - accuracy: 0.7570 - val_loss: 0.5140 - val_accuracy: 0.8210\nEpoch 2/10\n1725/1725 [==============================] - 1184s 687ms/step - loss: 0.5091 - accuracy: 0.7773 - val_loss: 0.4775 - val_accuracy: 0.8242\nEpoch 3/10\n1725/1725 [==============================] - 1187s 688ms/step - loss: 0.4734 - accuracy: 0.7948 - val_loss: 0.4355 - val_accuracy: 0.8299\nEpoch 4/10\n1725/1725 [==============================] - 1184s 686ms/step - loss: 0.4450 - accuracy: 0.8013 - val_loss: 0.3910 - val_accuracy: 0.8368\nEpoch 5/10\n1725/1725 [==============================] - 1184s 686ms/step - loss: 0.4258 - accuracy: 0.8045 - val_loss: 0.3662 - val_accuracy: 0.8355\nEpoch 6/10\n1725/1725 [==============================] - 1186s 688ms/step - loss: 0.4101 - accuracy: 0.8098 - val_loss: 0.3813 - val_accuracy: 0.8431\nEpoch 7/10\n1725/1725 [==============================] - 1200s 696ms/step - loss: 0.3947 - accuracy: 0.8168 - val_loss: 0.3558 - val_accuracy: 0.8488\nEpoch 8/10\n1725/1725 [==============================] - 1190s 690ms/step - loss: 0.3853 - accuracy: 0.8188 - val_loss: 0.3426 - val_accuracy: 0.8494\nEpoch 9/10\n1725/1725 [==============================] - 1186s 688ms/step - loss: 0.3756 - accuracy: 0.8231 - val_loss: 0.3270 - val_accuracy: 0.8557\nEpoch 10/10\n1725/1725 [==============================] - 1188s 689ms/step - loss: 0.3648 - accuracy: 0.8287 - val_loss: 0.3264 - val_accuracy: 0.8563\n"
],
[
"model.save(root_path+\"model-both-images.hdf5\")",
"_____no_output_____"
],
[
"prediction = model.predict(x_validate)",
"_____no_output_____"
],
[
"prediction[0]",
"_____no_output_____"
],
[
"del x_train_augment, y_train_augment, x_test, y_test",
"_____no_output_____"
]
],
[
[
"### Attempt 2 - Image Classification\nThis time, i am splitting the images and modify the labels. The image classification will try to predict the entity (actor/reactor), character(satyr/golem), type(1/2/3) and entity_doing (action/reaction) and entity_doing_type(Idle/Attacking/Hurt/Die/Walking/Taunt)",
"_____no_output_____"
]
],
[
[
"# Modify the labels (Do - encoding)\nsplits_path = root_path + 'splits/'\nsplits_images = splits_path + 'images/'\nsplits_dataset = splits_path + 'split_dataset.csv'",
"_____no_output_____"
],
[
"df = pd.read_csv(splits_dataset)",
"_____no_output_____"
],
[
"df[\"type\"] = df.type.str.extract('(\\d+)')",
"_____no_output_____"
],
[
"images = df[\"img_name\"]\ntarget = df.drop([\"img_name\"], axis=1)\ntarget = pd.get_dummies(target)",
"_____no_output_____"
],
[
"def processSplitImages(imgPath, filenames):\n images_data = []\n for img in tqdm(filenames):\n imgFullPath = imgPath + img + \".png\"\n img = image.load_img(imgFullPath, target_size=(400,400,3), grayscale=False)\n img = image.img_to_array(img)\n img = img/255\n images_data.append(img)\n features = np.array(images_data)\n return features",
"_____no_output_____"
],
[
"img_features = processSplitImages(splits_images, images)",
"100%|โโโโโโโโโโ| 864/864 [05:21<00:00, 2.68it/s]\n"
],
[
"# Split into train and test . And then augment the train data.\nfeatures_train, features_test, target_train, target_test = train_test_split(img_features, target, stratify=target, test_size=0.2)\ndel img_features, target",
"_____no_output_____"
],
[
"# Augmenting train data -> Not able to allocate enough RAM\n#feature_train_augmented, target_augmented = augmentData(features_train, target_train)",
"100%|โโโโโโโโโโ| 691/691 [00:38<00:00, 17.82it/s]\n"
],
[
"op_shape = target_train.shape[1]",
"_____no_output_____"
],
[
"model = Sequential()\nmodel.add(Conv2D(filters=16, kernel_size=(5, 5), activation=\"relu\", input_shape=(400,400,3)))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=32, kernel_size=(10, 10), activation='relu'))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=64, kernel_size=(10, 10), activation=\"relu\"))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Conv2D(filters=64, kernel_size=(5, 5), activation='relu'))\nmodel.add(MaxPooling2D(pool_size=(2, 2)))\nmodel.add(Dropout(0.25))\nmodel.add(Flatten())\nmodel.add(Dense(128, activation='relu'))\nmodel.add(Dropout(0.5))\nmodel.add(Dense(64, activation='relu'))\nmodel.add(Dropout(0.5))\nmodel.add(Dense(op_shape, activation='sigmoid'))",
"_____no_output_____"
],
[
"model.compile(optimizer='adam', loss='binary_crossentropy', metrics=['accuracy'])",
"_____no_output_____"
],
[
"from keras.callbacks import ModelCheckpoint\n\nfilepath=root_path + \"weights-{epoch:02d}-{val_accuracy:.3f}.hdf5\"\ncheckpoint = ModelCheckpoint(filepath, monitor='val_accuracy',\n verbose=1, mode='max')\n\ncallbacks_list = [checkpoint]\nmodel.fit(features_train, target_train, epochs=10, validation_data=(features_test, target_test), batch_size=64, callbacks=callbacks_list)",
"Train on 691 samples, validate on 173 samples\nEpoch 1/10\n691/691 [==============================] - 511s 739ms/step - loss: 0.6233 - accuracy: 0.6616 - val_loss: 0.6088 - val_accuracy: 0.8224\n\nEpoch 00001: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-01-0.822.hdf5\nEpoch 2/10\n691/691 [==============================] - 507s 733ms/step - loss: 0.5335 - accuracy: 0.7487 - val_loss: 0.4891 - val_accuracy: 0.8482\n\nEpoch 00002: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-02-0.848.hdf5\nEpoch 3/10\n691/691 [==============================] - 503s 727ms/step - loss: 0.4832 - accuracy: 0.7785 - val_loss: 0.4509 - val_accuracy: 0.8647\n\nEpoch 00003: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-03-0.865.hdf5\nEpoch 4/10\n691/691 [==============================] - 504s 729ms/step - loss: 0.4373 - accuracy: 0.8068 - val_loss: 0.4061 - val_accuracy: 0.8844\n\nEpoch 00004: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-04-0.884.hdf5\nEpoch 5/10\n691/691 [==============================] - 501s 725ms/step - loss: 0.3986 - accuracy: 0.8206 - val_loss: 0.3359 - val_accuracy: 0.9017\n\nEpoch 00005: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-05-0.902.hdf5\nEpoch 6/10\n691/691 [==============================] - 500s 723ms/step - loss: 0.3721 - accuracy: 0.8337 - val_loss: 0.2794 - val_accuracy: 0.9110\n\nEpoch 00006: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-06-0.911.hdf5\nEpoch 7/10\n691/691 [==============================] - 496s 718ms/step - loss: 0.3321 - accuracy: 0.8546 - val_loss: 0.2288 - val_accuracy: 0.9175\n\nEpoch 00007: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-07-0.918.hdf5\nEpoch 8/10\n691/691 [==============================] - 496s 717ms/step - loss: 0.3238 - accuracy: 0.8553 - val_loss: 0.2294 - val_accuracy: 0.9299\n\nEpoch 00008: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-08-0.930.hdf5\nEpoch 9/10\n691/691 [==============================] - 494s 715ms/step - loss: 0.2962 - accuracy: 0.8682 - val_loss: 0.2043 - val_accuracy: 0.9403\n\nEpoch 00009: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-09-0.940.hdf5\nEpoch 10/10\n691/691 [==============================] - 496s 718ms/step - loss: 0.2761 - accuracy: 0.8753 - val_loss: 0.1933 - val_accuracy: 0.9538\n\nEpoch 00010: saving model to gdrive/My Drive/causal_scene_generation/game_characters/weights-10-0.954.hdf5\n"
],
[
"",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfb1fb44ee02153cd176f7e781595afababd4f4
| 98,570 |
ipynb
|
Jupyter Notebook
|
nbs/dl1/lesson6-rossmann.ipynb
|
greyestapps/course-v3
|
6f2ed6f7537ca14be958dd160cfbfaee0638e0c5
|
[
"Apache-2.0"
] | 11 |
2019-09-02T17:27:29.000Z
|
2021-01-25T17:40:01.000Z
|
nbs/dl1/lesson6-rossmann.ipynb
|
greyestapps/course-v3
|
6f2ed6f7537ca14be958dd160cfbfaee0638e0c5
|
[
"Apache-2.0"
] | 5 |
2021-09-27T22:01:25.000Z
|
2022-02-27T10:26:46.000Z
|
nbs/dl1/lesson6-rossmann.ipynb
|
greyestapps/course-v3
|
6f2ed6f7537ca14be958dd160cfbfaee0638e0c5
|
[
"Apache-2.0"
] | 9 |
2019-09-05T06:20:14.000Z
|
2021-03-24T06:19:04.000Z
| 55.036293 | 30,756 | 0.62589 |
[
[
[
"%reload_ext autoreload\n%autoreload 2",
"_____no_output_____"
],
[
"from fastai.tabular import *",
"_____no_output_____"
]
],
[
[
"# Rossmann",
"_____no_output_____"
],
[
"## Data preparation",
"_____no_output_____"
],
[
"To create the feature-engineered train_clean and test_clean from the Kaggle competition data, run `rossman_data_clean.ipynb`. One important step that deals with time series is this:\n\n```python\nadd_datepart(train, \"Date\", drop=False)\nadd_datepart(test, \"Date\", drop=False)\n```",
"_____no_output_____"
]
],
[
[
"path = Config().data_path()/'rossmann'\ntrain_df = pd.read_pickle(path/'train_clean')",
"_____no_output_____"
],
[
"train_df.head().T",
"_____no_output_____"
],
[
"n = len(train_df); n",
"_____no_output_____"
]
],
[
[
"### Experimenting with a sample",
"_____no_output_____"
]
],
[
[
"idx = np.random.permutation(range(n))[:2000]\nidx.sort()\nsmall_train_df = train_df.iloc[idx[:1000]]\nsmall_test_df = train_df.iloc[idx[1000:]]\nsmall_cont_vars = ['CompetitionDistance', 'Mean_Humidity']\nsmall_cat_vars = ['Store', 'DayOfWeek', 'PromoInterval']\nsmall_train_df = small_train_df[small_cat_vars + small_cont_vars + ['Sales']]\nsmall_test_df = small_test_df[small_cat_vars + small_cont_vars + ['Sales']]",
"_____no_output_____"
],
[
"small_train_df.head()",
"_____no_output_____"
],
[
"small_test_df.head()",
"_____no_output_____"
],
[
"categorify = Categorify(small_cat_vars, small_cont_vars)\ncategorify(small_train_df)\ncategorify(small_test_df, test=True)",
"_____no_output_____"
],
[
"small_test_df.head()",
"_____no_output_____"
],
[
"small_train_df.PromoInterval.cat.categories",
"_____no_output_____"
],
[
"small_train_df['PromoInterval'].cat.codes[:5]",
"_____no_output_____"
],
[
"fill_missing = FillMissing(small_cat_vars, small_cont_vars)\nfill_missing(small_train_df)\nfill_missing(small_test_df, test=True)",
"_____no_output_____"
],
[
"small_train_df[small_train_df['CompetitionDistance_na'] == True]",
"_____no_output_____"
]
],
[
[
"### Preparing full data set",
"_____no_output_____"
]
],
[
[
"train_df = pd.read_pickle(path/'train_clean')\ntest_df = pd.read_pickle(path/'test_clean')",
"_____no_output_____"
],
[
"len(train_df),len(test_df)",
"_____no_output_____"
],
[
"procs=[FillMissing, Categorify, Normalize]",
"_____no_output_____"
],
[
"cat_vars = ['Store', 'DayOfWeek', 'Year', 'Month', 'Day', 'StateHoliday', 'CompetitionMonthsOpen',\n 'Promo2Weeks', 'StoreType', 'Assortment', 'PromoInterval', 'CompetitionOpenSinceYear', 'Promo2SinceYear',\n 'State', 'Week', 'Events', 'Promo_fw', 'Promo_bw', 'StateHoliday_fw', 'StateHoliday_bw',\n 'SchoolHoliday_fw', 'SchoolHoliday_bw']\n\ncont_vars = ['CompetitionDistance', 'Max_TemperatureC', 'Mean_TemperatureC', 'Min_TemperatureC',\n 'Max_Humidity', 'Mean_Humidity', 'Min_Humidity', 'Max_Wind_SpeedKm_h', \n 'Mean_Wind_SpeedKm_h', 'CloudCover', 'trend', 'trend_DE',\n 'AfterStateHoliday', 'BeforeStateHoliday', 'Promo', 'SchoolHoliday']",
"_____no_output_____"
],
[
"dep_var = 'Sales'\ndf = train_df[cat_vars + cont_vars + [dep_var,'Date']].copy()",
"_____no_output_____"
],
[
"test_df['Date'].min(), test_df['Date'].max()",
"_____no_output_____"
],
[
"cut = train_df['Date'][(train_df['Date'] == train_df['Date'][len(test_df)])].index.max()\ncut",
"_____no_output_____"
],
[
"valid_idx = range(cut)",
"_____no_output_____"
],
[
"df[dep_var].head()",
"_____no_output_____"
],
[
"data = (TabularList.from_df(df, path=path, cat_names=cat_vars, cont_names=cont_vars, procs=procs,)\n .split_by_idx(valid_idx)\n .label_from_df(cols=dep_var, label_cls=FloatList, log=True)\n .add_test(TabularList.from_df(test_df, path=path, cat_names=cat_vars, cont_names=cont_vars))\n .databunch())",
"_____no_output_____"
],
[
"doc(FloatList)",
"_____no_output_____"
]
],
[
[
"## Model",
"_____no_output_____"
]
],
[
[
"max_log_y = np.log(np.max(train_df['Sales'])*1.2)\ny_range = torch.tensor([0, max_log_y], device=defaults.device)",
"_____no_output_____"
],
[
"learn = tabular_learner(data, layers=[1000,500], ps=[0.001,0.01], emb_drop=0.04, \n y_range=y_range, metrics=exp_rmspe)",
"_____no_output_____"
],
[
"learn.model",
"_____no_output_____"
],
[
"len(data.train_ds.cont_names)",
"_____no_output_____"
],
[
"learn.lr_find()",
"LR Finder is complete, type {learner_name}.recorder.plot() to see the graph.\n"
],
[
"learn.recorder.plot()",
"_____no_output_____"
],
[
"learn.fit_one_cycle(5, 1e-3, wd=0.2)",
"_____no_output_____"
],
[
"learn.save('1')",
"_____no_output_____"
],
[
"learn.recorder.plot_losses(last=-1)",
"_____no_output_____"
],
[
"learn.load('1');",
"_____no_output_____"
],
[
"learn.fit_one_cycle(5, 3e-4)",
"_____no_output_____"
],
[
"learn.fit_one_cycle(5, 3e-4)",
"_____no_output_____"
]
],
[
[
"(10th place in the competition was 0.108)",
"_____no_output_____"
]
],
[
[
"test_preds=learn.get_preds(DatasetType.Test)\ntest_df[\"Sales\"]=np.exp(test_preds[0].data).numpy().T[0]\ntest_df[[\"Id\",\"Sales\"]]=test_df[[\"Id\",\"Sales\"]].astype(\"int\")\ntest_df[[\"Id\",\"Sales\"]].to_csv(\"rossmann_submission.csv\",index=False)",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfb38e069ac80a7a0320f12546898c434877655
| 7,793 |
ipynb
|
Jupyter Notebook
|
001_alg/log.ipynb
|
wispwisp/math
|
a67f1cd8937d099371e5d05af83dda8519bc6691
|
[
"MIT"
] | null | null | null |
001_alg/log.ipynb
|
wispwisp/math
|
a67f1cd8937d099371e5d05af83dda8519bc6691
|
[
"MIT"
] | null | null | null |
001_alg/log.ipynb
|
wispwisp/math
|
a67f1cd8937d099371e5d05af83dda8519bc6691
|
[
"MIT"
] | null | null | null | 22.076487 | 129 | 0.356345 |
[
[
[
"import pandas as pd\nimport numpy as np\nimport seaborn as sns\n\nimport matplotlib.pyplot as plt\nimport matplotlib\nmatplotlib.pyplot.style.use('seaborn')\nmatplotlib.rcParams['figure.figsize'] = (15, 5)\n\n%matplotlib inline\n\nimport math\n\nfrom IPython.core.interactiveshell import InteractiveShell\nInteractiveShell.ast_node_interactivity = \"all\"",
"_____no_output_____"
]
],
[
[
"## $a^{n} \\cdot a^{m} = a^{n+m}$\n\n## $\\frac{a^{n}}{a^{m}} = a^{n-m}$\n\n## $\\left(a^{n} \\right)^{m} = a^{n\\cdot m}$\n\n## $a^{x} \\cdot b^{x} = \\left(a \\cdot b \\right)^{x}$\n\n## $\\frac{a^{x}}{b^{x}} = \\left(\\frac{a}{b} \\right)^{x}$\n\n$a^{-n} = \\frac{1}{a^{n}}$\n\n$a^{-1} = \\frac{1}{a}$",
"_____no_output_____"
],
[
"---",
"_____no_output_____"
],
[
"## $\\sqrt[n]{a^{k}} = \\left(\\sqrt[n]{a} \\right)^{k}$\n## $\\left(a^{k} \\right)^{\\frac{1}{n}} = \\left(a^{\\frac{1}{n}} \\right)^{k} = a^{\\frac{k}{n}}$\n\n## $\\sqrt[n]{a} = \\sqrt[n \\cdot k]{a^{k}}$\n\n## $a^{\\frac{1}{n}} = \\left(a^{k} \\right)^{\\frac{1}{n \\cdot k}} = a^{\\frac{k}{n \\cdot k}}$\n\n## $\\sqrt[n]{\\sqrt[k]{a}} = \\sqrt[n \\cdot k]{a}$\n\n## $\\left( a^{\\frac{1}{k}} \\right)^{\\frac{1}{n}} = a^{\\frac{1}{k} \\cdot \\frac{1}{n}} = a^{\\frac{1}{k \\cdot n}}$\n\n#### $\\sqrt[n]{ab} = \\sqrt[n]{a} \\cdot \\sqrt[n]{b}$\n#### $ab^{\\frac{1}{n}} = a^{\\frac{1}{n}} \\cdot b^{\\frac{1}{n}}$\n\n#### $\\sqrt[n]{\\frac{a}{b}} = \\frac{\\sqrt[n]{a}}{\\sqrt[n]{b}}$\n#### $\\left(\\frac{a}{b} \\right)^{\\frac{1}{x}} = \\frac{a^{\\frac{1}{x}}}{b^{\\frac{1}{x}}}$",
"_____no_output_____"
],
[
"---",
"_____no_output_____"
],
[
"## $\\log_{a}b = C \\Leftrightarrow a^{C}=b$\n\n## $\\log_{a}bc = \\log_{a}b + \\log_{a}c$\n\n## $\\log_{a}\\frac{b}{c} = \\log_{a}b - \\log_{a}c$\n\n## $\\log_{a^{k}}b^{m} = \\frac{m}{k}\\log_{a}b$\n\n\n## $a^{\\log_{a}b} = b$\n\n## $\\log_{a}b = \\frac{\\log_{x}b}{\\log_{x}a}$",
"_____no_output_____"
],
[
"# Examples:\n\n$\\sqrt[4]{x} + \\sqrt{x} = 6$\n\n$x^{\\frac{1}{4}} + x^{\\frac{1}{2}} = 6$\n\n$a = x^{\\frac{1}{2}}$\n\n$a^{\\frac{1}{2}} + a = 6$\n\n$a^{\\frac{1}{2}} = 6 - a$\n\n$a = (6-a)^{2}$\n\n---\n\n$2\\sqrt{8b^3} \\cdot 9\\sqrt{18b}$\n\n$2\\sqrt{8b^3} \\cdot 9\\sqrt{18b} = 18\\sqrt{2^3b^3 18b}$\n\n$2\\sqrt{8b^3} \\cdot 9\\sqrt{18b} = 18\\sqrt{2^3b^3 18b} = 18\\sqrt{2^3b^3 \\cdot 2 \\cdot 3^2b}$\n\n$2\\sqrt{8b^3} \\cdot 9\\sqrt{18b} = 18\\sqrt{2^3b^3 18b} = 18\\sqrt{2^3b^3 \\cdot 2 \\cdot 3^2b} =\n18\\sqrt{2^4 b^4 \\cdot 3^2}$\n\n$18\\sqrt{2^4 b^4 \\cdot 3^2} = 18 \\cdot 2^2 \\cdot 3 b^2 = 216b^2$",
"_____no_output_____"
],
[
"# Log Equations",
"_____no_output_____"
],
[
"$\\log_{2} (x+2) = 3$\n\n$x+2 = 2^3$\n\n$x = 6$",
"_____no_output_____"
],
[
"---\n\n$\\log_{9} (3^x) = 15$\n\n$3^x = 9^{15}$\n\n$3^x = 3^{2 \\cdot 15}$\n\n$x = 30$",
"_____no_output_____"
],
[
"---\n\n$\\log_{x} (36) = 2$\n\n$36 = x^2$\n\n$\\pm \\sqrt{36} = x$\n\n$x = \\pm 6$\n\n$\\log_{x}$: $x > 0$",
"_____no_output_____"
],
[
"---\n\n$\\log_{9} x = \\frac{1}{2}$\n\n$x = 9^{\\frac{1}{2}}$\n\n$x = \\sqrt{9}$\n\n$x = 3$",
"_____no_output_____"
],
[
"---\n\n$\\log_{5} 25 = 2x$\n\n$25 = 5^{2x}$\n\n$5^2 = 5^{2x}$\n\n$2 = 2x$",
"_____no_output_____"
],
[
"---\n\n$\\log_{3} (3^{2x} - 3^x + 1) = x$\n\n$3^{2x} - 3^x + 1 = 3^x$\n\n$(3^{x})^2 - 2(3^x) + 1 = 0$\n\n$a = 3^{x}$\n\n$a^2 - 2a + 1 = 0$\n\n$a + b = -2$\n\n$ab = 1$\n\n$(-1,-1)$\n\n$(a-1)^2 = 0$\n\n$a = 1$\n\n$1 = 3^{x}$\n\n$x=0$",
"_____no_output_____"
],
[
"---\n\n$\\log_{16} (3x + 1) = 2$\n\n$\\frac{1}{4} \\log_{2} (3x + 1) = 2$\n\n$\\frac{4}{4} \\log_{2} (3x + 1) = 2 \\cdot 4$\n\n$3x + 1 = 2^8$\n\n$x = 85$",
"_____no_output_____"
],
[
"---\n\n$\\log_{5} x + \\log_{3} x = 0$\n\n$\\frac{\\log_{x} x}{\\log_{x} 5} + \\frac{\\log_{x} x}{\\log_{x} 3} = 0$\n\n$\\frac{1}{\\log_{x} 5} + \\frac{1}{\\log_{x} 3} = 0$\n\n$\\frac{1}{\\log_{x} 5} = -\\frac{1}{\\log_{x} 3}$\n\n$(\\frac{1}{\\log_{x} 5})^{-1} = (\\frac{-1}{\\log_{x} 3})^{-1}$\n\n${\\log_{x} 5} = -\\log_{x} 3$\n\n${\\log_{x} 5} + \\log_{x} 3 = 0$\n\n${\\log_{x} 15} = 0$\n\n$15 = x^0$ Impossible, so $x = 1$:\n\n$\\log_{5} 1 + \\log_{3} 1 = 0$\n\n$0 + 0 = 0$",
"_____no_output_____"
],
[
"---\n\n$\\frac{\\log_{10} 8x}{\\log_{10} |7x+3|} = 1$\n\n$\\log_{10} 8x = \\log_{10} |7x+3|$\n\n$10^{\\log_{10} 8x} = 10^{\\log_{10} |7x+3|}$\n\n$8x = 7x+3$\n\n$x = 3$",
"_____no_output_____"
],
[
"---\n\n$\\log_{5} x - \\log_{25} x + \\log_{\\sqrt{5}} x = -5$\n\n$\\frac{2}{2}\\log_{5} x - \\frac{1}{2}\\log_{5} x + \\frac{2}{2}\\log_{\\sqrt{5}} x = -5$\n\n$\\log_{5} x \\cdot(\\frac{2}{2} - \\frac{1}{2}) + 2\\log_{5} x = -5$\n\n$\\frac{1}{2}\\log_{5} x + 2\\log_{5} x = -5$\n\n$\\log_{5} x \\cdot (\\frac{1}{2} + 2) = -5$\n\n$2.5\\log_{5} x = -5$\n\n$\\log_{5} x = -2$\n\n$5^{\\log_{5} x} = 5^{-2}$\n\n$x = \\frac{1}{5^2}$",
"_____no_output_____"
]
]
] |
[
"code",
"markdown"
] |
[
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
]
] |
cbfb4b47ea1653d7b12ad6550a3c74a8758bc9fa
| 379,021 |
ipynb
|
Jupyter Notebook
|
WIP/vector_transformation_copulas.ipynb
|
junpenglao/Planet_Sakaar_Data_Science
|
73d9605b91b774a56d18c193538691521f679f16
|
[
"MIT"
] | 51 |
2018-04-08T19:53:15.000Z
|
2021-11-24T21:08:25.000Z
|
WIP/vector_transformation_copulas.ipynb
|
junpenglao/Planet_Sakaar_Data_Science
|
73d9605b91b774a56d18c193538691521f679f16
|
[
"MIT"
] | 2 |
2018-05-29T20:50:37.000Z
|
2020-09-12T07:14:08.000Z
|
WIP/vector_transformation_copulas.ipynb
|
junpenglao/Planet_Sakaar_Data_Science
|
73d9605b91b774a56d18c193538691521f679f16
|
[
"MIT"
] | 13 |
2018-07-21T09:53:10.000Z
|
2021-06-07T19:06:26.000Z
| 293.814729 | 94,084 | 0.888062 |
[
[
[
"Code testing for https://github.com/pymc-devs/pymc3/pull/2986",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pymc3 as pm\nimport pymc3.distributions.transforms as tr\nimport theano.tensor as tt\nfrom theano.scan_module import until\nimport theano\nimport matplotlib.pylab as plt\nimport seaborn as sns\n\n%matplotlib inline",
"/usr/local/lib/python3.5/dist-packages/h5py/__init__.py:36: FutureWarning: Conversion of the second argument of issubdtype from `float` to `np.floating` is deprecated. In future, it will be treated as `np.float64 == np.dtype(float).type`.\n from ._conv import register_converters as _register_converters\n"
]
],
[
[
"# Polar transformation\n ",
"_____no_output_____"
]
],
[
[
"# Polar to Cartesian\ndef backward(y):\n # y = [r, theta]\n x = tt.zeros(y.shape)\n x = tt.inc_subtensor(x[0], y[0]*tt.cos(y[1]))\n x = tt.inc_subtensor(x[1], y[0]*tt.sin(y[1]))\n return x\n\ndef forward(x):\n # y = [r, theta]\n y = tt.zeros(x.shape)\n y = tt.inc_subtensor(y[0], tt.sqrt(tt.square(x[0]) + tt.square(x[1])))\n if y[0] != 0:\n if x[1] < 0:\n theta = -tt.arccos(x[0]/y[0])\n else:\n theta = tt.arccos(x[0]/y[0])\n y = tt.inc_subtensor(y[1], theta)\n return y",
"_____no_output_____"
],
[
"y = tt.vector('polar')\ny.tag.test_value=np.asarray([1., np.pi/2])",
"_____no_output_____"
],
[
"f_inv = backward(y)\nJ, _ = theano.scan(lambda i, f, x: tt.grad(f[i], x),\n sequences=tt.arange(f_inv.shape[0]),\n non_sequences=[f_inv, y])",
"_____no_output_____"
],
[
"Jacob_f1 = theano.function([y], J)",
"_____no_output_____"
],
[
"Jacob_f1(np.asarray([1., np.pi/2]))",
"_____no_output_____"
],
[
"J2 = pm.theanof.jacobian(f_inv, [y])\nJacob_f2 = theano.function([y], J2)\nJacob_f2(np.asarray([1., np.pi/2]))",
"_____no_output_____"
],
[
"%timeit Jacob_f1(np.asarray([1., np.pi/2]))\n%timeit Jacob_f2(np.asarray([1., np.pi/2]))",
"93.2 ยตs ยฑ 1.13 ยตs per loop (mean ยฑ std. dev. of 7 runs, 10000 loops each)\n92.2 ยตs ยฑ 2.11 ยตs per loop (mean ยฑ std. dev. of 7 runs, 10000 loops each)\n"
],
[
"class VectorTransform(tr.Transform): \n \n def jacobian_det(self, x): \n f_inv = self.backward(x) \n J, _ = theano.scan(lambda i, f, x: tt.grad(f[i], x), \n sequences=tt.arange(f_inv.shape[0]), \n non_sequences=[f_inv, x]) \n return tt.log(tt.abs_(tt.nlinalg.det(J))) ",
"_____no_output_____"
],
[
"class Nealfun(VectorTransform):\n name = \"Neal_funnel\"\n\n def backward(self, y):\n x = tt.zeros(y.shape)\n x = tt.inc_subtensor(x[0], y[0] / 3.)\n x = tt.inc_subtensor(x[1:], y[1:] / tt.exp(y[0] / 2))\n return x\n\n def forward(self, x):\n y = tt.zeros(x.shape)\n y = tt.inc_subtensor(y[0], x[0] * 3.)\n y = tt.inc_subtensor(y[1:], tt.exp(x[0] * 3. / 2) * x[1:])\n return y",
"_____no_output_____"
],
[
"y = tt.vector('y')\ny.tag.test_value = np.zeros(101)",
"_____no_output_____"
],
[
"nealfun = Nealfun()\nf_inv = nealfun.backward(y)\n\nJ1, _ = theano.scan(lambda i, f, x: tt.grad(f[i], x),\n sequences=tt.arange(f_inv.shape[0]),\n non_sequences=[f_inv, y])\nJacob_f1 = theano.function([y], J1)\n\nJ2 = pm.theanof.jacobian(f_inv, [y])\nJacob_f2 = theano.function([y], J2)",
"_____no_output_____"
],
[
"%timeit Jacob_f1(np.zeros(101))\n%timeit Jacob_f2(np.zeros(101))",
"1.15 ms ยฑ 14.1 ยตs per loop (mean ยฑ std. dev. of 7 runs, 1000 loops each)\n1.22 ms ยฑ 21.3 ยตs per loop (mean ยฑ std. dev. of 7 runs, 1000 loops each)\n"
]
],
[
[
"# Copulas\nBackground reading http://twiecki.github.io/blog/2018/05/03/copulas/ \nMore information https://github.com/tensorflow/probability/blob/master/tensorflow_probability/examples/jupyter_notebooks/Gaussian_Copula.ipynb",
"_____no_output_____"
]
],
[
[
"import scipy.stats as st\nnorm = st.norm()",
"_____no_output_____"
],
[
"def norm_cdf(x):\n return x_unif\n\n\ndef copulas_forward_func(nsample, cov, marg1_ppf, marg2_ppf):\n mvnorm = st.multivariate_normal(mean=[0, 0], cov=cov)\n # Generate random samples from multivariate normal with correlation .5\n x = mvnorm.rvs(nsample)\n x_unif = norm.cdf(x)\n x_trans = np.vstack([marg1_ppf(x_unif[:, 0]), marg2_ppf(x_unif[:, 1])]).T\n return x_trans, x_unif, x",
"_____no_output_____"
],
[
"cov = np.asarray([[1., 0.725], [0.725, 1.]])\nmarg1_ppf = st.gumbel_r().ppf\nmarg2_ppf = st.beta(a=10, b=2).ppf\nx_trans, x_unif, x = copulas_forward_func(10000, cov, marg1_ppf, marg2_ppf)\n\nsns.jointplot(x[:, 0], x[:, 1], kind='kde', stat_func=None)\nsns.jointplot(x_unif[:, 0], x_unif[:, 1], kind='hex',\n stat_func=None, joint_kws=dict(gridsize=50))\nsns.jointplot(x_trans[:, 0], x_trans[:, 1], kind='kde',\n stat_func=None, xlim=(-2, 6), ylim=(.6, 1.0),)\nplt.tight_layout()",
"/usr/local/lib/python3.5/dist-packages/matplotlib/axes/_axes.py:6462: UserWarning: The 'normed' kwarg is deprecated, and has been replaced by the 'density' kwarg.\n warnings.warn(\"The 'normed' kwarg is deprecated, and has been \"\n"
],
[
"xrange = np.linspace(-2, 6, 200)\nplt.hist(x_trans[:, 0], xrange, density='pdf')\nplt.plot(xrange, st.gumbel_r.pdf(xrange));",
"_____no_output_____"
],
[
"def gumbel_cdf(value, mu, beta):\n return tt.exp(-tt.exp(-(value-mu)/beta))",
"_____no_output_____"
]
],
[
[
"Beta CDF",
"_____no_output_____"
]
],
[
[
"from theano.scan_module import until",
"_____no_output_____"
],
[
"max_iter=200\nvalue_, a, b = x_trans[:, 1], 10., 2.\nvalue = theano.shared(np.reshape(value_, (1,len(value_))))\n\nEPS = 3.0e-7\nqab = a + b\nqap = a + 1.0\nqam = a - 1.0\n\ndef _step(i, az, bm, am, bz):\n\n tem = i + i\n d = i * (b - i) * value / ((qam + tem) * (a + tem))\n d =- (a + i) * i * value / ((qap + tem) * (a + tem))\n\n ap = az + d * am\n bp = bz + d * bm\n\n app = ap + d * az\n bpp = bp + d * bz\n\n aold = az\n\n am = ap / bpp\n bm = bp / bpp\n az = app / bpp\n\n bz = tt.ones_like(bz)\n\n return (az, bm, am, bz), until(tt.sum(tt.lt(tt.abs_(az - aold), (EPS * tt.abs_(az)))))\n\n(az, bm, am, bz), _ = theano.scan(_step,\n sequences=[tt.arange(1, max_iter)],\n outputs_info=[tt.ones_like(value),\n tt.ones_like(value), \n tt.ones_like(value), \n 1. - qab * value / qap])",
"_____no_output_____"
],
[
"def cont_fraction_beta(value_, a, b, max_iter=500):\n '''Evaluates the continued fraction form of the incomplete Beta function.\n Derived from implementation by Ali Shoaib (https://goo.gl/HxjIJx).\n '''\n\n EPS = 1.0e-20\n qab = a + b\n qap = a + 1.0\n qam = a - 1.0\n value = theano.shared(value_)\n\n def _step(i, az, bm, am, bz):\n\n tem = i + i\n d = i * (b - i) * value / ((qam + tem) * (a + tem))\n d = - (a + i) * i * value / ((qap + tem) * (a + tem))\n\n ap = az + d * am\n bp = bz + d * bm\n\n app = ap + d * az\n bpp = bp + d * bz\n\n aold = az\n\n am = ap / bpp\n bm = bp / bpp\n az = app / bpp\n\n bz = tt.ones_like(bz)\n\n return (az, bm, am, bz), until(tt.sum(tt.lt(tt.abs_(az - aold), (EPS * tt.abs_(az)))))\n\n (az, bm, am, bz), _ = theano.scan(_step,\n sequences=[tt.arange(1, max_iter)],\n outputs_info=[tt.ones_like(value),\n tt.ones_like(value),\n tt.ones_like(value),\n 1. - qab * value / qap])\n\n return az[-1]\n\n\ndef beta_cdf(value, a, b):\n log_beta = tt.gammaln(a+b) - tt.gammaln(a) - tt.gammaln(b)\n log_beta += a * tt.log(value) + b * tt.log(1 - value)\n cdf = tt.switch(\n tt.lt(value, (a + 1) / (a + b + 2)),\n tt.exp(log_beta) * cont_fraction_beta(value, a, b) / a,\n 1. - tt.exp(log_beta) * cont_fraction_beta(1. - value, b, a) / b\n )\n return cdf",
"_____no_output_____"
],
[
"def normal_ppf(value):\n return -np.sqrt(2.) * tt.erfcinv(2. * value)",
"_____no_output_____"
],
[
"functmp = theano.function([],\n tt.stack([gumbel_cdf(x_trans[:, 0], 0., 1.),\n beta_cdf(x_trans[:, 1], 10., 2.)]).T\n )\nx_ = functmp()\nx_",
"_____no_output_____"
],
[
"x_unif",
"_____no_output_____"
],
[
"np.sum(~np.isfinite(x_))",
"_____no_output_____"
],
[
"with pm.Model() as model:\n # rโผUniform(โ1,1)\n r = pm.Uniform('r',lower=-1, upper=1)\n\n cov = pm.Deterministic('cov', \n tt.stacklists([[1., r],\n [r, 1.]]))\n a = pm.HalfNormal('alpha', 5., testval=10.)\n b = pm.HalfNormal('beta', 2.5, testval=2.)\n loc = pm.Normal('loc', 0., 5., testval=0.)\n scale = pm.HalfNormal('scale', 2.5, testval=1.)\n\n tr_func = normal_ppf(\n tt.stack([gumbel_cdf(x_trans[:, 0], loc, scale),\n beta_cdf(x_trans[:, 1], a, b)]).T\n )\n pm.MvNormal('obs', np.zeros(2), cov=cov, observed=tr_func)\n pm.Gumbel('marg0', loc, scale, observed=x_trans[:, 0])\n pm.Beta('marg1', a, b, observed=x_trans[:, 1])",
"_____no_output_____"
]
],
[
[
"The beta CDF does not quite work - use another distribution instead",
"_____no_output_____"
]
],
[
[
"from scipy.special import logit\nxrange = np.linspace(0, 1, 200)\nplt.hist(x_trans[:, 1], xrange, density='pdf')\nlogitnormpdf = st.norm.pdf(logit(xrange), loc=1.725, scale=.8) * 1/(xrange * (1-xrange))\nplt.plot(xrange, logitnormpdf);",
"_____no_output_____"
],
[
"def logitnorm_cdf(value, mu, sd):\n return .5 + .5*(tt.erf((pm.math.logit(value)-mu)/(np.sqrt(2)*sd)))",
"_____no_output_____"
],
[
"tr_func = normal_ppf(\n tt.stack([gumbel_cdf(x_trans[:, 0], 0., 1.),\n logitnorm_cdf(x_trans[:, 1], 1.725, .8)]).T\n)\nfunctmp = theano.function([], tr_func)\nx_ = functmp()\nsns.jointplot(x_[:, 0], x_[:, 1], kind='kde', stat_func=None);",
"_____no_output_____"
],
[
"np.sum(~np.isfinite(x_[:, 1]))",
"_____no_output_____"
],
[
"with pm.Model() as model:\n # rโผUniform(โ1,1)\n r = pm.Uniform('r',lower=-1, upper=1)\n\n cov = pm.Deterministic('cov', \n tt.stacklists([[1., r],\n [r, 1.]]))\n loc = pm.Normal('loc', 0., 5., testval=0.)\n scale = pm.HalfNormal('scale', 2.5, testval=1.)\n mu = pm.Normal('mu', 1., 1., testval=1.725)\n sd = pm.HalfNormal('sd', .5, testval=.8)\n\n tr_func = normal_ppf(\n tt.stack([gumbel_cdf(x_trans[:, 0], loc, scale),\n logitnorm_cdf(x_trans[:, 1], mu, sd)]).T\n )\n\n pm.MvNormal('obs', np.zeros(2), cov=cov, observed=tr_func)\n pm.Gumbel('marg0', loc, scale, observed=x_trans[:, 0])\n pm.LogitNormal('marg1', mu, sd, observed=x_trans[:, 1])",
"_____no_output_____"
],
[
"with model:\n map1 = pm.find_MAP()\nmap1",
"logp = -27,090, ||grad|| = 771.94: 100%|โโโโโโโโโโ| 19/19 [00:00<00:00, 166.37it/s] \n"
],
[
"_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=map1['loc'], scale=map1['scale']))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nlogitnormpdf = st.norm.pdf(logit(x1), loc=map1['mu'], scale=map1['sd']) * 1/(x1 * (1-x1))\nax[1].plot(x1, logitnormpdf);",
"_____no_output_____"
],
[
"with pm.Model() as model_marg:\n loc = pm.Normal('loc', 0., 5., testval=0.)\n scale = pm.HalfNormal('scale', 2.5, testval=1.)\n mu = pm.Normal('mu', 1., 1., testval=1.725)\n sd = pm.HalfNormal('sd', .5, testval=.8)\n\n pm.Gumbel('marg0', loc, scale, observed=x_trans[:, 0])\n pm.LogitNormal('marg1', mu, sd, observed=x_trans[:, 1])\n map_ = pm.find_MAP()\n\nmap_",
"logp = -6,617.8, ||grad|| = 2,716.4: 100%|โโโโโโโโโโ| 9/9 [00:00<00:00, 937.67it/s]\n"
],
[
"_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=map_['loc'], scale=map_['scale']))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nlogitnormpdf = st.norm.pdf(logit(x1), loc=map_['mu'], scale=map_['sd']) * 1/(x1 * (1-x1))\nax[1].plot(x1, logitnormpdf);",
"_____no_output_____"
],
[
"from pymc3.theanof import gradient\n\ndef jacobian_det(f_inv_x, x):\n grad = tt.reshape(gradient(tt.sum(f_inv_x), [x]), x.shape)\n return tt.log(tt.abs_(grad))",
"_____no_output_____"
],
[
"xt_0 = theano.shared(x_trans[:, 0])\nxt_1 = theano.shared(x_trans[:, 1])\n\nwith pm.Model() as model2:\n # rโผUniform(โ1,1)\n r = pm.Uniform('r',lower=-1, upper=1)\n\n cov = pm.Deterministic('cov', \n tt.stacklists([[1., r],\n [r, 1.]]))\n loc = pm.Normal('loc', 0., 5., testval=0.)\n scale = pm.HalfNormal('scale', 2.5, testval=1.)\n mu = pm.Normal('mu', 1., .5, testval=1.725)\n sd = pm.HalfNormal('sd', .5, testval=.8)\n\n tr_func = normal_ppf(\n tt.stack([gumbel_cdf(xt_0, loc, scale),\n logitnorm_cdf(xt_1, mu, sd)]).T\n )\n\n pm.MvNormal('obs', np.zeros(2), cov=cov, observed=tr_func)\n pm.Potential('jacob_det0', jacobian_det(normal_ppf(gumbel_cdf(xt_0, loc, scale)), xt_0))\n pm.Potential('jacob_det1', jacobian_det(normal_ppf(logitnorm_cdf(xt_1, mu, sd)), xt_1))\n map_ = pm.find_MAP()\n\n_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=map_['loc'], scale=map_['scale']))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nlogitnormpdf = st.norm.pdf(logit(x1), loc=map_['mu'], scale=map_['sd']) * 1/(x1 * (1-x1))\nax[1].plot(x1, logitnormpdf);",
"logp = -6,618.6, ||grad|| = 4,769.4: 0%| | 3/5000 [00:00<01:14, 66.99it/s]\n"
]
],
[
[
"Kumaraswamy distribution",
"_____no_output_____"
]
],
[
[
"from scipy.special import logit\nxrange = np.linspace(0, 1, 200)\nplt.hist(x_trans[:, 1], xrange, density='pdf')\nKumaraswamypdf = lambda x, a, b: a*b*np.power(x, a-1)*np.power(1-np.power(x, a), b-1)\nplt.plot(xrange, Kumaraswamypdf(xrange, 8, 2));",
"_____no_output_____"
],
[
"def Kumaraswamy_cdf(value, a, b):\n return 1 - tt.pow(1 - tt.pow(value, a), b)",
"_____no_output_____"
],
[
"tr_func = normal_ppf(\n tt.stack([gumbel_cdf(x_trans[:, 0], 0., 1.),\n Kumaraswamy_cdf(x_trans[:, 1], 8, 2)]).T\n)\nfunctmp = theano.function([], tr_func)\nx_ = functmp()\nsns.jointplot(x_[:, 0], x_[:, 1], kind='kde', stat_func=None);",
"_____no_output_____"
],
[
"np.sum(~np.isfinite(x_[:, 1]))",
"_____no_output_____"
],
[
"with pm.Model() as model_marg:\n a = pm.HalfNormal('alpha', 5., testval=10.)\n b = pm.HalfNormal('beta', 2.5, testval=2.)\n loc = pm.Normal('loc', 0., 5., testval=0.)\n scale = pm.HalfNormal('scale', 2.5, testval=1.)\n\n pm.Gumbel('marg0', loc, scale, observed=x_trans[:, 0])\n pm.Kumaraswamy('marg1', a, b, observed=x_trans[:, 1])\n map_ = pm.find_MAP()",
"logp = -6,704.3, ||grad|| = 5,298.7: 100%|โโโโโโโโโโ| 11/11 [00:00<00:00, 391.90it/s]\n"
],
[
"_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=map_['loc'], scale=map_['scale']))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nax[1].plot(x1, Kumaraswamypdf(x1, map_['alpha'], map_['beta']));",
"_____no_output_____"
],
[
"with pm.Model() as model2:\n # rโผUniform(โ1,1)\n r = pm.Uniform('r',lower=-1, upper=1)\n\n cov = pm.Deterministic('cov', \n tt.stacklists([[1., r],\n [r, 1.]]))\n\n a = pm.HalfNormal('alpha', 5.)\n b = pm.HalfNormal('beta', 2.5)\n loc = pm.Normal('loc', 0., 5.)\n scale = pm.HalfNormal('scale', 2.5)\n\n tr_func = normal_ppf(\n tt.stack([gumbel_cdf(xt_0, loc, scale),\n Kumaraswamy_cdf(xt_1, a, b)]).T\n )\n\n pm.MvNormal('obs', np.zeros(2), cov=cov, observed=tr_func)\n pm.Potential('jacob_det0', jacobian_det(normal_ppf(gumbel_cdf(xt_0, loc, scale)), xt_0))\n pm.Potential('jacob_det1', jacobian_det(normal_ppf(Kumaraswamy_cdf(xt_1, a, b)), xt_1))\n map_ = pm.find_MAP()\n\n_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=map_['loc'], scale=map_['scale']))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nax[1].plot(x1, Kumaraswamypdf(x1, map_['alpha'], map_['beta']));",
"logp = -2,492, ||grad|| = 722.33: 100%|โโโโโโโโโโ| 19/19 [00:00<00:00, 75.77it/s] \n"
],
[
"map_",
"_____no_output_____"
],
[
"with model2:\n trace = pm.sample()\n\n_, ax = plt.subplots(1, 2, figsize=(10, 3))\nx0 = np.linspace(-2, 6, 200)\nax[0].hist(x_trans[:, 0], x0, density='pdf')\nax[0].plot(x0, st.gumbel_r.pdf(x0, loc=trace['loc'].mean(), scale=trace['scale'].mean()))\n\nx1 = np.linspace(0, 1, 200)\nax[1].hist(x_trans[:, 1], x1, density='pdf')\nax[1].plot(x1, Kumaraswamypdf(x1, trace['alpha'].mean(), trace['beta'].mean()));",
"Auto-assigning NUTS sampler...\nInitializing NUTS using jitter+adapt_diag...\nMultiprocess sampling (4 chains in 4 jobs)\nNUTS: [scale, loc, beta, alpha, r]\nSampling 4 chains: 100%|โโโโโโโโโโ| 4000/4000 [01:50<00:00, 11.78draws/s]\nThe acceptance probability does not match the target. It is 0.8870644389498042, but should be close to 0.8. Try to increase the number of tuning steps.\nThe acceptance probability does not match the target. It is 0.8822052335254932, but should be close to 0.8. Try to increase the number of tuning steps.\n"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfb5d71d13d158b0fb0c3e72c0ddb511379220a
| 8,307 |
ipynb
|
Jupyter Notebook
|
jupyter notebooks/Mean Normalization and Data Separation.ipynb
|
olivererxleben/aipnd
|
3ff786dd7dfa86fe903b0d5a753e78e3b9069383
|
[
"MIT"
] | null | null | null |
jupyter notebooks/Mean Normalization and Data Separation.ipynb
|
olivererxleben/aipnd
|
3ff786dd7dfa86fe903b0d5a753e78e3b9069383
|
[
"MIT"
] | null | null | null |
jupyter notebooks/Mean Normalization and Data Separation.ipynb
|
olivererxleben/aipnd
|
3ff786dd7dfa86fe903b0d5a753e78e3b9069383
|
[
"MIT"
] | null | null | null | 33.768293 | 506 | 0.624052 |
[
[
[
"# Mean Normalization\n\nIn machine learning we use large amounts of data to train our models. Some machine learning algorithms may require that the data is *normalized* in order to work correctly. The idea of normalization, also known as *feature scaling*, is to ensure that all the data is on a similar scale, *i.e.* that all the data takes on a similar range of values. For example, we might have a dataset that has values between 0 and 5,000. By normalizing the data we can make the range of values be between 0 and 1.\n\nIn this lab you will be performing a particular form of feature scaling known as *mean normalization*. Mean normalization will not only scale the data but will also ensure your data has zero mean. \n\n# To Do:\n\nYou will start by importing NumPy and creating a rank 2 ndarray of random integers between 0 and 5,000 (inclusive) with 1000 rows and 20 columns. This array will simulate a dataset with a wide range of values. Fill in the code below",
"_____no_output_____"
]
],
[
[
"# import NumPy into Python\n\n\n# Create a 1000 x 20 ndarray with random integers in the half-open interval [0, 5001).\nX = \n\n# print the shape of X\n",
"_____no_output_____"
]
],
[
[
"Now that you created the array we will mean normalize it. We will perform mean normalization using the following equation:\n\n$\\mbox{Norm_Col}_i = \\frac{\\mbox{Col}_i - \\mu_i}{\\sigma_i}$\n\nwhere $\\mbox{Col}_i$ is the $i$th column of $X$, $\\mu_i$ is average of the values in the $i$th column of $X$, and $\\sigma_i$ is the standard deviation of the values in the $i$th column of $X$. In other words, mean normalization is performed by subtracting from each column of $X$ the average of its values, and then by dividing by the standard deviation of its values. In the space below, you will first calculate the average and standard deviation of each column of $X$. ",
"_____no_output_____"
]
],
[
[
"# Average of the values in each column of X\nave_cols = \n\n# Standard Deviation of the values in each column of X\nstd_cols = ",
"_____no_output_____"
]
],
[
[
"If you have done the above calculations correctly, then `ave_cols` and `std_cols`, should both be vectors with shape `(20,)` since $X$ has 20 columns. You can verify this by filling the code below:",
"_____no_output_____"
]
],
[
[
"# Print the shape of ave_cols\n\n# Print the shape of std_cols\n",
"_____no_output_____"
]
],
[
[
"You can now take advantage of Broadcasting to calculate the mean normalized version of $X$ in just one line of code using the equation above. Fill in the code below",
"_____no_output_____"
]
],
[
[
"# Mean normalize X\nX_norm = ",
"_____no_output_____"
]
],
[
[
"If you have performed the mean normalization correctly, then the average of all the elements in $X_{\\tiny{\\mbox{norm}}}$ should be close to zero. You can verify this by filing the code below:",
"_____no_output_____"
]
],
[
[
"# Print the average of all the values of X_norm\n\n# Print the minimum value of each column of X_norm\n\n# Print the maximum value of each column of X_norm\n",
"_____no_output_____"
]
],
[
[
"You should note that since $X$ was created using random integers, the above values will vary. \n\n# Data Separation\n\nAfter the data has been mean normalized, it is customary in machine learnig to split our dataset into three sets:\n\n1. A Training Set\n2. A Cross Validation Set\n3. A Test Set\n\nThe dataset is usually divided such that the Training Set contains 60% of the data, the Cross Validation Set contains 20% of the data, and the Test Set contains 20% of the data. \n\nIn this part of the lab you will separate `X_norm` into a Training Set, Cross Validation Set, and a Test Set. Each data set will contain rows of `X_norm` chosen at random, making sure that we don't pick the same row twice. This will guarantee that all the rows of `X_norm` are chosen and randomly distributed among the three new sets.\n\nYou will start by creating a rank 1 ndarray that contains a random permutation of the row indices of `X_norm`. You can do this by using the `np.random.permutation()` function. The `np.random.permutation(N)` function creates a random permutation of integers from 0 to `N - 1`. Let's see an example:",
"_____no_output_____"
]
],
[
[
"# We create a random permutation of integers 0 to 4\nnp.random.permutation(5)",
"_____no_output_____"
]
],
[
[
"# To Do\n\nIn the space below create a rank 1 ndarray that contains a random permutation of the row indices of `X_norm`. You can do this in one line of code by extracting the number of rows of `X_norm` using the `shape` attribute and then passing it to the `np.random.permutation()` function. Remember the `shape` attribute returns a tuple with two numbers in the form `(rows,columns)`.",
"_____no_output_____"
]
],
[
[
"# Create a rank 1 ndarray that contains a random permutation of the row indices of `X_norm`\nrow_indices = ",
"_____no_output_____"
]
],
[
[
"Now you can create the three datasets using the `row_indices` ndarray to select the rows that will go into each dataset. Rememeber that the Training Set contains 60% of the data, the Cross Validation Set contains 20% of the data, and the Test Set contains 20% of the data. Each set requires just one line of code to create. Fill in the code below",
"_____no_output_____"
]
],
[
[
"# Make any necessary calculations.\n# You can save your calculations into variables to use later.\n\n\n# Create a Training Set\nX_train = \n\n# Create a Cross Validation Set\nX_crossVal = \n\n# Create a Test Set\nX_test = ",
"_____no_output_____"
]
],
[
[
"If you performed the above calculations correctly, then `X_tain` should have 600 rows and 20 columns, `X_crossVal` should have 200 rows and 20 columns, and `X_test` should have 200 rows and 20 columns. You can verify this by filling the code below:",
"_____no_output_____"
]
],
[
[
"# Print the shape of X_train\n\n# Print the shape of X_crossVal\n\n# Print the shape of X_test",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfb7b166baa10f15608110be14b57062bba3714
| 102,728 |
ipynb
|
Jupyter Notebook
|
ParameterExploration.ipynb
|
GarciaLab/OnsetTimeTransientInputs
|
642c4a4acff8d7f4b2267031c5b6cf1c518db164
|
[
"MIT"
] | null | null | null |
ParameterExploration.ipynb
|
GarciaLab/OnsetTimeTransientInputs
|
642c4a4acff8d7f4b2267031c5b6cf1c518db164
|
[
"MIT"
] | null | null | null |
ParameterExploration.ipynb
|
GarciaLab/OnsetTimeTransientInputs
|
642c4a4acff8d7f4b2267031c5b6cf1c518db164
|
[
"MIT"
] | null | null | null | 112.887912 | 22,832 | 0.804688 |
[
[
[
"In this notebook we will use the boundary exploration algorithm to fully explore the parameter space of a generic Markov chain.\n\nLast updated by: Jonathan Liu, 10/22/2020",
"_____no_output_____"
]
],
[
[
"#Import necessary packages\n%matplotlib inline\nimport numpy as np\nfrom scipy.spatial import ConvexHull\nimport matplotlib.pyplot as plt\nimport scipy.special as sps\nfrom IPython.core.debugger import set_trace\nfrom numba import njit, prange\nimport numba as numba\nfrom datetime import date\nimport time as Time\n\n\n#Set number of threads\nnumba.set_num_threads(28)",
"_____no_output_____"
],
[
"#Simulation for calculating onset times for a generic Markov chain\n#This function is now deprecated since the Gillespie algorithm is much more accurate and efficient\n\ndef CalculatetOn_GenericMarkovChain(time,dt,Q,n,N_cells):\n#Calculates the onset time for a linear Markov chain with forward and backward rates.\n#The transition rate can be time-varying, but is the same\n#global rate for each transition. The model assumes n states, beginning\n#in the 1st state. Using finite timesteps and a Markov chain formalism, it\n#simulates N_cells realizations of the overall time it takes to reach the\n#nth state.\n\n# Inputs:\n# time: simulation time vector\n# dt: simulation timestep\n# Q: 3D transition rate matrix, where q_kji is the transition rate at time k from state i to j for i =/= j and \n# q_kii is the sum of transition rates out of state i\n# n: number of states\n# N_cells: number of cells to simulate\n\n# Outputs:\n# t_on: time to reach the final state for each cell (length = N_cells)\n\n## Setup variables\n t_on = np.empty(N_cells) #Time to transition to final ON state for each cell\n t_on[:] = np.nan\n state = np.zeros(N_cells, dtype=int) #State vector describing current state of each cell\n finished_states = np.zeros(N_cells, dtype=int) #Vector storing finished statuses of each cell\n\n ## Run simulation\n #Loop over time\n #q = waitbar(0,'Running simulation...')\n for i in range(len(time)):\n if np.sum(finished_states) == N_cells: #If all cells have turned on, stop the simulation\n #print('Halting simulation since all cells have turned on.')\n break\n \n #Simulate binomial random variable to see if each cell has transitioned\n #If the input transition rate is a nan, this will manifest as never\n #transitioning.\n \n #Find indices of that have not finished yet\n incompleteCells = np.transpose(np.where(finished_states != 1))\n #Loop over cells\n for j in incompleteCells:\n #The probability that a state i switches is given by -Q_ii * dt\n p = -Q[i,state[j],state[j]] * dt #Probability of transition at this timestep for this cell\n transitioned = np.random.binomial(1,p,1) #Binary transition decision for this cell\n\n #The state to transition to is given by the ratio of the individual rates in the column j over the total rate -Q_ii\n if transitioned == 1:\n Q_temp = np.copy(Q) #Temporary matrix where we will remove Q_ii for this cell and state\n Q_temp[i,state[j],state[j]] = 0\n pState = np.squeeze(Q_temp[i,:,state[j]]/-Q[i,state[j],state[j]])\n #print(Q[i,:,:])\n newState = np.random.choice(n, 1, p=pState)\n #print('cell ' + str(j) + ' transitioned from state ' + str(state[j]) + \\\n # ' to state ' + str(newState) + 'at time ' + str(time[i])) \n state[j] = newState\n \n #Record the time if it transitioned to the new state\n if newState == n-1:\n t_on[j] = time[i]\n\n #See if any states have reached the ON state\n finished_states[state == n-1] = 1\n return t_on",
"_____no_output_____"
],
[
"#Function to generate a random transition matrix for a generic Markov chain with n states, and an irreversible\n#transition into the final state.\n\n#Inputs:\n# n: number of states\n# k_min: minimum transition rate\n# k_max: maximum transition rate\n\n#pseudocode\n#generate 2D matrix based on n\n#loop over each index, if indices are connected by 1 then generate a value (except final state)\n#Calculate diagonal elements from summing columns to zero\n#\n\ndef MakeRandomTransitionMatrix(n, k_min, k_max):\n #Initialize the transition matrix\n Q = np.zeros((n,n))\n \n #Loop through transition indices (note that the final column is all zeros since it's an absorbing state)\n for i in range(n):\n for j in range(n-1):\n #If the indices are exactly one apart (i.e. adjacent states), then make a transition rate\n if np.abs(i-j) == 1:\n Q[i,j] = np.random.uniform(k_min,k_max)\n\n #Calculate the diagonal elements by taking the negative of the sum of the column\n for i in range(n-1):\n Q[i,i] = -np.sum(Q[:,i])\n \n return Q\n",
"_____no_output_____"
],
[
"#Function to mutate a transition matrix for a generic Markov chain with an irreversible transition\n#into the final absorbing state. For each element in Q, the function picks a random number between\n#[1-s,1+s] and multiplies it with the element\n\n#Inputs:\n# Q: n x n transition matrix, where n is the number of states\n# s: mutation factor (q_ij -> q_ij * s)\n\ndef MutateTransitionMatrix(Q,s):\n #Loop over transition matrix indices\n for i in range(Q.shape[0]):\n for j in range(Q.shape[1]):\n #Adjacent states\n if i != j:\n Q[i,j] = Q[i,j] * np.random.uniform(1-s,1+s)\n #Reset the diagonals to zero\n elif i == j:\n Q[i,j] = 0\n #Recalculate the diagonal entries\n for i in range(Q.shape[0]-1):\n Q[i,i] = -np.sum(Q[:,i])\n return Q",
"_____no_output_____"
],
[
"#Simulation for calculating onset times for a generic Markov chain using Gillespie algorithm\n#Using vectorized formulation for faster speed\n\ndef CalculatetOn_GenericMarkovChainGillespie(Q,n,N_cells):\n#Calculates the onset time for a linear Markov chain with forward and backward rates.\n#The transition rate can be time-varying, but is the same\n#global rate for each transition. The model assumes n states, beginning\n#in the 1st state. Using the Gillespie algorithm and a Markov chain formalism, it\n#simulates N_cells realizations of the overall time it takes to reach the\n#nth state.\n\n#For now, this only works with steady transition rates. Later we will modify this to account \n#for time-varying rates.\n\n# Inputs:\n# Q: transition rate matrix, where q_ji is the transition rate from state i to j for i =/= j and \n# q_ii is the sum of transition rates out of state i\n# n: number of states\n# N_cells: number of cells to simulate\n\n# Outputs:\n# t_on: time to reach the final state for each cell (length = N_cells)\n\n## Setup variables\n t_on = np.zeros(N_cells) #Time to transition to final ON state for each cell\n state = np.zeros(N_cells, dtype=int) #State vector describing current state of each cell\n\n ## Run simulation\n # We will simulate waiting times for each transition for each cell and stop once each cell has\n # reached the final state\n \n #Set diagonal entries in transition matrix to nan since self transitions don't count\n for i in range(n):\n Q[i,i] = 0\n \n #Construct the transition vector out of each cell's current state\n Q_states = np.zeros((N_cells,n))\n while np.sum(state) < (n-1)*N_cells:\n Q_states = np.transpose(Q[:,state])\n \n #Generate random numbers in [0,1] for each cell\n randNums = np.random.random(Q_states.shape)\n\n #Calculate waiting times for each entry in the transition matrix\n #Make sure to suppress divide by zero warning\n with np.errstate(divide='ignore'):\n tau = (1/Q_states) * np.log(1/randNums)\n\n #Find the shortest waiting time to figure out which state we transitioned to for each cell\n tau_min = np.amin(tau, axis=1)\n newState = np.argmin(tau, axis=1)\n \n #Replace infinities with zero, corresponding to having reached the final state\n newState[tau_min==np.inf] = n-1\n tau_min[tau_min==np.inf] = 0\n \n #Update the state and add the waiting time to the overall waiting time\n state = newState\n t_on += tau_min\n return t_on",
"_____no_output_____"
],
[
"#Simulation for calculating onset times for a generic Markov chain using Gillespie algorithm\n#Using vectorized formulation for faster speed\n\ndef CalculatetOn_GenericMarkovChainGillespieTime(Q,n,t_d,N_cells):\n#Calculates the onset time for a linear Markov chain with forward and backward rates.\n#The transition rate can be time-varying, but is the same\n#global rate for each transition. The model assumes n states, beginning\n#in the 1st state. Using the Gillespie algorithm and a Markov chain formalism, it\n#simulates N_cells realizations of the overall time it takes to reach the\n#nth state.\n\n#This considers time-dependent transition rates parameterized by a diffusion timescale t_d.\n#The time-dependent rate has the form r ~ (1 - exp(-t/t_d)). For now, we assume only the forwards\n#rates have the time-dependent profile, and that backwards rates are time-independent.\n\n# Inputs:\n# Q: 3D transition rate matrix, where q_kji is the transition rate at time k from state i to j for i =/= j and \n# q_kii is the sum of transition rates out of state i\n# n: number of states\n# t_d: diffusion timescale of time-dependent transition rate\n# N_cells: number of cells to simulate\n\n# Outputs:\n# t_on: time to reach the final state for each cell (length = N_cells)\n\n## Setup variables\n t_on = np.zeros(N_cells) #Time to transition to final ON state for each cell\n time = np.zeros(N_cells) #Vector of current time for each cell\n state = np.zeros(N_cells, dtype=int) #State vector describing current state of each cell\n\n ## Run simulation\n # We will simulate waiting times for each transition for each cell and stop once each cell has\n # reached the final state\n \n #Set diagonal entries in transition matrix to nan since self transitions don't count\n for i in range(n):\n Q[i,i] = 0\n \n #Define the diffusion timescale matrix t_d (finite for forwards rates, effectively 0 for backwards rates)\n t_d_mat = np.zeros((n,n))\n t_d_mat[:,:] = 0.00000001 #Non forwards transitions are essentially 0 diffusive timescale\n for i in range(n):\n for j in range(n-1):\n #Forwards rates\n if i == j + 1:\n t_d_mat[i,j] = t_d\n \n #Construct the transition vector out of each cell's current state\n Q_states = np.zeros((N_cells,n))\n #Construct the diffusion timescale vector for each cell\n t_d_states = np.zeros((N_cells,n))\n while np.sum(state) < (n-1)*N_cells:\n Q_states = np.transpose(Q[:,state])\n t_d_states = np.transpose(t_d_mat[:,state])\n \n #Construct the current time vector for each cell\n time_states = np.transpose(np.tile(time,(n,1)))\n \n \n #Generate random numbers in [0,1] for each cell\n randNums = np.random.random(Q_states.shape)\n\n #Calculate waiting times for each entry in the transition matrix\n #Make sure to suppress divide by zero warning\n \n #For the exponential profile, this uses the lambertw/productlog function. 
The steady-state\n #case corresponds to t_d -> 0.\n with np.errstate(divide='ignore', invalid='ignore'):\n #Temp variables for readability\n a = 1/Q_states * np.log(1/randNums)\n b = -np.exp(-(a + t_d_states * np.exp(-time_states/t_d_states) + time_states)/t_d_states)\n tau = np.real(t_d_states * sps.lambertw(b) + a + t_d_states *\\\n np.exp(-time_states / t_d_states))\n #Find the shortest waiting time to figure out which state we transitioned to for each cell\n tau_min = np.amin(tau, axis=1)\n newState = np.argmin(tau, axis=1)\n \n #Replace infinities with zero, corresponding to having reached the final state\n newState[tau_min==np.inf] = n-1\n tau_min[tau_min==np.inf] = 0\n \n #Update the state and add the waiting time to the overall waiting time\n state = newState\n t_on += tau_min\n time += tau_min\n return t_on",
"_____no_output_____"
],
[
"#Verify time-dependent Gillespie algorithm with naive algorithm\nn = 4\nk_min = 0.5\nk_max = 5\nQ = MakeRandomTransitionMatrix(n,k_min,k_max)\nN_cells = 1000\nt_d = 7\ndt = 0.1\ntime = np.arange(0,20,dt)\n\n#Construct the time-dependent transition matrix for the naive simulation\nQ_timedep = np.zeros((len(time),n,n))\nfor i in range(len(time)):\n for j in range(n):\n for k in range(n):\n if j == k + 1:\n Q_timedep[i,j,k] = Q[j,k] * (1 - np.exp(-time[i]/t_d))\n elif j == k:\n Q_timedep[i,j,k] = 0\n else:\n Q_timedep[i,j,k] = Q[j,k]\n#Fix the diagonal entries\nfor i in range(len(time)):\n for j in range(n):\n Q_timedep[i,j,j] = -np.sum(Q_timedep[i,:,j])\n \nt_on_static = CalculatetOn_GenericMarkovChainGillespie(Q,n,N_cells)\nt_on_timedep = CalculatetOn_GenericMarkovChainGillespieTime(Q,n,t_d,N_cells)\nt_on_naive = CalculatetOn_GenericMarkovChain(time,dt,Q_timedep,n,N_cells)\n\n#Plot distribution\nbins = np.arange(0,15,1)\nplt.figure()\nplt.hist(t_on_static,bins=bins,label='static',alpha=0.5)\nplt.hist(t_on_timedep,bins=bins,label='Gillespie time-dependent',alpha=0.5)\nplt.hist(t_on_naive,bins=bins,label='naive time-dependent',alpha=0.5)\nplt.xlabel('onset time')\nplt.legend()\n",
"_____no_output_____"
]
],
[
[
"# Parameter sweeping of Markov chain model\nWe're going to test the n=3 model, which has 3 free parameters, using the ideal limit of equal, irreversible rates, and considering small deviations in the form of backward rates.",
"_____no_output_____"
]
],
[
[
"# Parameter sweep\nn = 3\nk_min = 0.1\nk_max = 5.1\nk_step = 0.1\nk_range = np.arange(k_min,k_max,k_step)\nN_cells = 10000\ndt = 0.1\ntime = np.arange(0,50,dt)\n\n#Small deviation from ideal Gamma limit\nk_backFrac = np.arange(0,1,0.25) #Fraction backwards/forwards transition\nmeans_dev = np.zeros((len(k_range),len(k_backFrac)))\nCV2s_dev = np.zeros((len(k_range),len(k_backFrac)))\n\nfor i in range(len(k_range)):\n for j in range(len(k_backFrac)):\n Q = np.array([[-k_range[i],k_range[i]*k_backFrac[j],0],\\\n [k_range[i],-(k_range[i]+k_range[i]*k_backFrac[j]),0],\\\n [0,k_range[i],0]])\n t_on = CalculatetOn_GenericMarkovChainGillespie(Q,n,N_cells)\n means_dev[i,j] = np.mean(t_on)\n CV2s_dev[i,j] = np.var(t_on) / np.mean(t_on)**2\n \n#Plot results\nplt.figure()\nplt.plot((n-1)/k_range,(1/(n-1))*np.ones(k_range.shape),'k--',label='Gamma limit')\nfor i in range(len(k_backFrac)):\n plt.plot(means_dev[:,i],CV2s_dev[:,i],'.',label='backFrac = ' + str(k_backFrac[i]))\nplt.xlabel('mean')\nplt.ylabel('CV^2')\nplt.legend()",
"_____no_output_____"
]
],
[
[
"It looks like the effect of increasing the backwards rates is to increase noise independent of mean, as expected.",
"_____no_output_____"
]
],
[
[
"print(sps.lambertw(0))\nprint(sps.lambertw(0.00001))\nprint(sps.lambertw(np.exp(-0)))",
"0j\n(9.999900001499974e-06+0j)\n(0.5671432904097838+0j)\n"
],
[
"#Boundary exploration algorithm\n\n#This function explores the 2D mean-CV2 space for a given model by following the same\n#procedure as Eck et. al. 2020 (eLife). Briefly, it lays down a random parameter set, then\n#slices the space into vertical and horizontal slices, finding the extremal points in each slice.\n#It then mutates the parameter values for these points and recalculates the parameter space.\n#This iterates until the total number of iterations has passed. The function keeps track\n#of the boundary parameter values, as well as the means, CV2s, and overall parameter space areas.\n\n#The models are generic Markov chains with nearest-neighbor transitions, and an irreversible transition\n#into the final state.\n\n#For now we are only considering the steady state\ndef BoundaryExploration(n_initial,iterations,slices,n_states,s,k_start,k_min,k_max,mean_max,\\\n N_cells,plots=True):\n #Inputs:\n # n_initial: number of initial points\n # iterations: number of iterations\n # slices: number of horizontal or vertical slices\n # n_states: number of Markov chain states\n # s: mutation growth factor (should be > 1)\n # k_start: starting mean value for transition rates\n # k_min: minimum transition rate\n # k_max: maximum transition rate\n # mean_meax: maximum bound on mean onset time\n # T: simulation end time\n # dt: simulation timestep\n # N_cells: number of cells in each simulation\n # plots: Boolean to display plots (true by default)\n\n\n #Generate initial set of simulated data with random parameters.\n \n #Initialize with nan arrays\n mean_onset_total = np.zeros(n_initial) #Mean t_on values\n CV2_onset_total = np.zeros(n_initial) #CV2iance t_on values\n Q_values_total = np.zeros((n_initial,n_states,n_states)) #Transition matrices\n \n mean_onset_total[:] = np.nan\n CV2_onset_total[:] = np.nan\n Q_values_total[:,:,:] = np.nan\n \n area_total = [] #Area of boundary\n\n for i in prange(n_initial):\n print('Initializing point ' + str(i+1) + ' of ' + str(n_initial), end='\\r')\n\n #Initialize input parameters randomly.\n #Create transition rates around the starting value with 25% width\n k_min_start = k_start * 0.5\n k_max_start = k_start * 1.5\n Q = MakeRandomTransitionMatrix(n_states,k_min_start,k_max_start)\n\n #Simulate the data with these parameters.\n t_on = CalculatetOn_GenericMarkovChainGillespie(Q,n_states,N_cells)\n\n #If there aren't at least 100 samples, skip this simulation\n if np.sum(np.invert(np.isnan(t_on))) < 100:\n continue\n \n #Calculate mean and CV2\n mean_onset = np.nanmean(t_on)\n CV2_onset = np.nanvar(t_on)/np.nanmean(t_on)**2\n \n #If mean is outside bounds, skip\n if mean_onset > mean_max:\n continue\n\n #Save the results\n mean_onset_total[i] = mean_onset\n CV2_onset_total[i] = CV2_onset\n Q_values_total[i,:,:] = Q\n\n #Calculate the boundary\n \n #Remove nans\n noNanInd = np.invert(np.isnan(mean_onset_total)) #Indices of nans\n mean_onset_total = mean_onset_total[noNanInd]\n CV2_onset_total = CV2_onset_total[noNanInd]\n Q_values_total = Q_values_total[noNanInd,:,:]\n \n #Convex hulls\n points = np.transpose(np.array([mean_onset_total,CV2_onset_total]))\n hull = ConvexHull(points)\n\n area_total.append(hull.area) #Save the area of the convex hull\n vertices = hull.vertices #Indices of the points on the convex hull\n\n #Keep only boundary points\n #mean_onset_total = mean_onset_total[vertices]\n #CV2_onset_total = CV2_onset_total[vertices]\n #Q_values_total = Q_values_total[vertices]\n\n #Plot results\n if plots:\n plt.close('all')\n 
plt.figure('boundaryExploration')\n plt.plot(mean_onset_total,CV2_onset_total,'b.',label='initial set')\n plt.xlabel('mean onset time')\n plt.ylabel('CV2 onset time')\n plt.title('Boundary exploration')\n plt.legend()\n \n\n plt.figure('areaGrowth')\n plt.plot(area_total,'r.-',label='boundary area')\n plt.legend()\n \n\n #Generate new parameter values using the boundary. Divide the initial\n #region into horizontal and vertical slices (number of slices given by\n #function input). Within each horizontal/vertical slice, the two points with the\n #most extreme mean/CV2 in onset time are found. A new point is\n #generated for each of these points with parameter values in a neighborhood\n #of the \"seed\" datapoint. This neighborhood is \"tighter\"\n #than the range of possible values for the initial set so that we can\n #efficiently determine the boundary of the possible parameter space. The\n #advantage of using this slice technique is that the boundary sampling is\n #not biased towards regions of higher density.\n print('')\n for j in range(iterations):\n print('Iteration ' + str(j+1) + ' of ' + str(iterations), end='\\r')\n #Calculate slices in mean onset time\n min_mean_onset = mean_onset_total.min()\n max_mean_onset = mean_onset_total.max()\n mean_onset_slice_values = np.linspace(min_mean_onset,max_mean_onset,slices+1) #slices\n \n #Continue to next iteration if this set has no range in the mean onset\n if len(mean_onset_slice_values) == 0:\n continue\n \n #Calculate slices in CV2 onset time\n min_CV2_onset = CV2_onset_total.min()\n max_CV2_onset = CV2_onset_total.max()\n CV2_onset_slice_values = np.linspace(min_CV2_onset,max_CV2_onset,slices+1) #slices\n \n #Continue to next iteration if this set has no range in the CV2 onset\n if len(CV2_onset_slice_values) == 0:\n continue\n\n #Loop through slices\n newpoints = 1 #Number of new points for each extremal point\n mean_onset_slice = np.zeros((slices,newpoints*4))\n CV2_onset_slice = np.zeros((slices,newpoints*4))\n Q_values_slice = np.zeros((slices,newpoints*4,n_states,n_states))\n \n mean_onset_slice[:,:] = np.nan\n CV2_onset_slice[:,:] = np.nan\n Q_values_slice[:,:,:,:] = np.nan\n\n for i in prange(slices):\n #print('Iteration ' + str(j+1) + ' of ' + str(iterations) + \\\n # ', slice ' + str(i+1) + ' of ' + str(slices), end='\\r')\n #Identify extremal points within slices (4 maximum extremal points\n #per iteration for the max/min for CV2_onset/t_on spread)\n p = 1 #Counter to keep track of new points\n\n #Keep track of new points for each parallel loop\n mean_onset_p = np.zeros(newpoints*4)\n CV2_onset_p = np.zeros(newpoints*4)\n Q_values_p = np.zeros((newpoints*4,n_states,n_states))\n \n mean_onset_p[:] = np.nan\n CV2_onset_p[:] = np.nan\n Q_values_p[:] = np.nan\n\n for p in range(newpoints*4):\n if p <= (newpoints*1):\n index = np.intersect1d(np.where(mean_onset_total >= mean_onset_slice_values[i]),\\\n np.where(mean_onset_total <= mean_onset_slice_values[i+1]))\n if len(index) == 0:\n continue\n else:\n index = np.where(CV2_onset_total[index] == CV2_onset_total[index].min()) #Find index of minimum CV2_onset point\n elif p <= (newpoints*2):\n index = np.intersect1d(np.where(mean_onset_total >= mean_onset_slice_values[i]),\\\n np.where(mean_onset_total <= mean_onset_slice_values[i+1]))\n if len(index) == 0:\n continue\n else:\n index = np.where(CV2_onset_total[index] == CV2_onset_total[index].max()) #Find index of maximum CV2_onset point \n elif p <= (newpoints*3):\n index = np.intersect1d(np.where(CV2_onset_total >= 
CV2_onset_slice_values[i]),\\\n np.where(CV2_onset_total <= CV2_onset_slice_values[i+1]))\n if len(index) == 0:\n continue\n else:\n index = np.where(mean_onset_total[index] == mean_onset_total[index].min()) #Find index of minumum mean_onset point\n elif p <= (newpoints*4):\n np.intersect1d(np.where(CV2_onset_total >= CV2_onset_slice_values[i]),\\\n np.where(CV2_onset_total <= CV2_onset_slice_values[i+1]))\n if len(index) == 0:\n continue\n else:\n index = np.where(mean_onset_total[index] == mean_onset_total[index].max()) #Find index of maximum mean_onset point\n #If no point found, continue\n if len(index) == 0:\n continue\n #If multiple extremal indices found, keep the first one\n if len(index) > 1:\n index = index[0]\n\n #Generate new points.\n Q = MutateTransitionMatrix(np.squeeze(Q_values_total[index,:,:]),s)\n\n #Check to see that new parameters lie within parameter bounds.\n #JL 10/22/2020: Update this to repeat simulations until we get something within bounds\n c = 0\n for i in range(n_states):\n for j in range(n_states):\n if i != j and Q[i,j] != 0 and (Q[i,j] < k_min or Q[i,j] > k_max):\n c = 1\n if c == 1:\n continue\n\n #Simulate the data with these parameters.\n t_on = CalculatetOn_GenericMarkovChainGillespie(Q,n_states,N_cells)\n\n #If there aren't at least 100 samples, skip this simulation\n if np.sum(np.invert(np.isnan(t_on))) < 100:\n continue\n\n #Calculate mean and CV2\n mean_onset = np.nanmean(t_on)\n CV2_onset = np.nanvar(t_on)/np.nanmean(t_on)**2\n\n #If mean onset time is outside bounds, skip\n if mean_onset > mean_max:\n continue\n \n #Save the data for each new point, for this slice\n mean_onset_p[p] = mean_onset\n CV2_onset_p[p] = CV2_onset\n Q_values_p[p,:,:] = Q\n #Save each slice's data into the whole iteration result.\n mean_onset_slice[i,:] = mean_onset_p\n CV2_onset_slice[i,:] = CV2_onset_p\n Q_values_slice[i,:,:,:] = Q_values_p\n\n #print('Slice complete')\n\n #Save data from this iteration.\n for u in range(slices):\n for y in range(newpoints*4):\n #Check if this simulation resulted in anything and skip otherwise\n if np.isnan(mean_onset_slice[u,y]):\n continue\n mean_onset_total = np.concatenate((mean_onset_total,\\\n np.array([mean_onset_slice[u,y]])),axis=0)\n CV2_onset_total = np.concatenate((CV2_onset_total,\\\n np.array([CV2_onset_slice[u,y]])),axis=0)\n Q_values_total = np.concatenate((Q_values_total,np.reshape(Q_values_slice[u,y,:,:]\\\n ,(1,n_states,n_states))),axis=0)\n\n #Calculate the new boundary\n #Convex hulls\n points = np.transpose(np.array([mean_onset_total,CV2_onset_total]))\n hull = ConvexHull(points)\n\n area_total.append(hull.area) #Save the area of the convex hull\n vertices = hull.vertices #Indices of the points on the convex hull\n\n #Keep only boundary points\n mean_onset_total = mean_onset_total[vertices]\n CV2_onset_total = CV2_onset_total[vertices]\n Q_values_total = Q_values_total[vertices]\n\n\n #Plot results\n if plots:\n plt.figure('boundaryExploration')\n plt.plot(mean_onset_total,CV2_onset_total,'k.')\n\n plt.figure('areaGrowth')\n plt.plot(area_total,'r.-',label='boundary area')\n\n #Plot final results\n if plots:\n plt.figure('boundaryExploration')\n plt.plot(mean_onset_total[vertices],CV2_onset_total[vertices],'r.-',label='final boundary')\n plt.xlim(0,mean_max)\n \n #Plot the Gamma distribution limit\n k_Gamma = np.linspace(k_min,k_max,50)\n plt.plot((n_states-1)/k_Gamma,(1/(n_states-1))*np.ones(k_Gamma.shape),'k--',label='Gamma distribution')\n plt.legend()\n\n \n #Save data\n #Store relevant information in numpy 
savefile\n #Saving: mean, CV2, area, Q, k_min, k_max, n_sites\n filename = 'ParameterExplorationResults/' + str(date.today()) + '_n=' + str(n_states) + \\\n '_k_min=' + str(k_min) + '_k_max=' + str(k_max)\n #np.savez(filename, mean_onset = mean_onset_total, CV2_onset = CV2_onset_total,\\\n # area = area_total, Q = Q_values_total)",
"_____no_output_____"
],
[
"#Testing the boundary exploration\nn_initial = 50\niterations = 10\nslices = 10\nn_states = 3\ns = 0.5\nk_start = 2\nk_min = 0\nk_max = 4\nmean_max = 10\nN_cells = 50000\nBoundaryExploration(n_initial,iterations,slices,n_states,s,k_start,k_min,k_max,mean_max,N_cells)",
"Initializing point 50 of 50\nIteration 10 of 10\r"
]
],
[
[
"Interestingly, it looks like the boundary algorithm is having trouble exploring the parameter space. We know from sensitivity explorations that the general model should be able to smoothly explore the area around the Gamma distribution limit, but the exploration algorithm seems to be restricted to the upper left quadrant.\n\nTesting the algorithm with a super naive function (e.g. treating the diagonals of a 2x2 matrix as x and y coordinates) indicates the exploration algorithm itself works. So the issue is something with how the parameter space of the Markov chain model gets explored. My hunch is that the stochastic nature of the simulation is interfering with determining the \"smoothness\" of the underlying feature space.",
"_____no_output_____"
]
]
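,
[
[
"#Hedged illustration (not from the original analysis): the \"super naive\" test mentioned\n#above can be sketched as a deterministic feature map that treats the two diagonal entries\n#of a 2x2 matrix as (x, y) coordinates. Because this map is smooth and noise-free, the\n#exploration machinery should trace out the full [k_min, k_max] x [k_min, k_max] square;\n#the function name below is an assumption for illustration only.\ndef NaiveDiagonalFeatureMap(Q):\n    #x and y are simply the diagonal entries of the matrix\n    return Q[0,0], Q[1,1]\n\n#Example: random matrices within the rate bounds should fill the square uniformly\nQ_test = np.random.uniform(k_min, k_max, size=(2,2))\nprint(NaiveDiagonalFeatureMap(Q_test))",
"_____no_output_____"
]
]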
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfb9155a44e8428b436b01c43dde3ec139d901a
| 456,352 |
ipynb
|
Jupyter Notebook
|
Processing.ipynb
|
kmamykin/askamanager_salary_survey
|
b0aaaa1bbc213d33234b1d6a140361ff0d22c4a3
|
[
"Apache-2.0"
] | 1 |
2019-12-09T18:03:18.000Z
|
2019-12-09T18:03:18.000Z
|
Processing.ipynb
|
kmamykin/askamanager_salary_survey
|
b0aaaa1bbc213d33234b1d6a140361ff0d22c4a3
|
[
"Apache-2.0"
] | null | null | null |
Processing.ipynb
|
kmamykin/askamanager_salary_survey
|
b0aaaa1bbc213d33234b1d6a140361ff0d22c4a3
|
[
"Apache-2.0"
] | null | null | null | 67.051425 | 426 | 0.669422 |
[
[
[
"import pandas as pd\nimport os\nimport hashlib\nimport requests\nfrom bs4 import BeautifulSoup\nfrom bs4.element import Comment\nimport urllib.parse\nfrom tqdm.notebook import tqdm\nimport random\nfrom multiprocessing import Pool\nimport spacy\nimport numpy as np\n",
"_____no_output_____"
],
[
"industries = pd.read_csv(\"industry_categories.csv\")\nindustries.head()",
"_____no_output_____"
],
[
"salary_industries = pd.read_csv(\"Salary-Industries.csv\")\nsalary_industries.head()",
"_____no_output_____"
],
[
"GOOGLE_API_KEY = 'AIzaSyCqd-BAzUsp6a2ICBETWebYYwoA3d3EeWk'\n",
"_____no_output_____"
],
[
"class KeyValueCache:\n \n def __init__(self, data_dir):\n self.data_dir = data_dir\n if not os.path.isdir(self.data_dir):\n os.mkdir(self.data_dir)\n \n def hash_of(self, key):\n return hashlib.md5(key.encode('utf-8')).hexdigest()\n \n def file_for(self, key):\n return os.path.join(self.data_dir, self.hash_of(key) + '.html')\n \n def contains(self, key): \n \"\"\"Checks if there is content for the key\"\"\"\n return os.path.isfile(self.file_for(key))\n \n def get(self, key):\n \"\"\"Returns the value of the key\"\"\"\n with open(self.file_for(key)) as f:\n return f.read()\n \n def put(self, key, value):\n \"\"\"Stores value at the key\"\"\"\n with open(self.file_for(key), 'w') as f:\n f.write(value)\n return value\n \ncache = KeyValueCache(os.path.join('.', '.cache'))\n# print(cache.hash_of(b'abc'))\n# print(cache.file_for(b'abc'))\n# print(cache.contains(b'abc'))\n# print(cache.put(b'abc', 'abc value'))\n# print(cache.get(b'abc'))\n# print(cache.contains(b'abc'))\n",
"_____no_output_____"
]
],
[
[
"requests quickstart: https://requests.kennethreitz.org/en/master/user/quickstart/",
"_____no_output_____"
]
],
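[
[
"# Hedged, minimal illustration of the requests pattern from the quickstart linked above;\n# it is not part of the original pipeline (the URL is a placeholder). The real http_get()\n# defined below layers custom headers, rotating proxies, and a timeout on top of this.\nresponse = requests.get('https://example.com', timeout=10)\nprint(response.status_code)\nprint(response.text[:200])",
"_____no_output_____"
]
],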
[
[
"static_proxies = pd.read_csv(\"utils/trusted_proxies.csv\")['proxy'].to_list()\n",
"_____no_output_____"
],
[
"\ndef request_proxy(url):\n proxies = static_proxies\n return random.choice(proxies)\n\ndef request_user_agent(url):\n agents = [\n # 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_5) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/50.0.2661.102 Safari/537.36',\n # 'Mozilla/5.0 (iPad; U; CPU OS 3_2_1 like Mac OS X; en-us) AppleWebKit/531.21.10 (KHTML, like Gecko) Mobile/7B405',\n # 'Mozilla/5.0 (Linux; Android 8.0.0; SM-G960F Build/R16NW) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/62.0.3202.84 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 7.0; SM-G892A Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/60.0.3112.107 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 7.0; SM-G930VC Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/58.0.3029.83 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; SM-G935S Build/MMB29K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/55.0.2883.91 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; SM-G920V Build/MMB29K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.98 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 5.1.1; SM-G928X Build/LMY47X) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.83 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; Nexus 6P Build/MMB29P) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.83 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 7.1.1; G8231 Build/41.2.A.0.219; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/59.0.3071.125 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; E6653 Build/32.2.A.0.253) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.98 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0; HTC One X10 Build/MRA58K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/61.0.3163.98 Mobile Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0; HTC One M9 Build/MRA58K) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.98 Mobile Safari/537.3'\n # 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/42.0.2311.135 Safari/537.36 Edge/12.246',\n # 'Mozilla/5.0 (X11; CrOS x86_64 8172.45.0) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/51.0.2704.64 Safari/537.36',\n # 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_11_2) AppleWebKit/601.3.9 (KHTML, like Gecko) Version/9.0.2 Safari/601.3.9',\n # 'Mozilla/5.0 (Windows NT 6.1; WOW64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/47.0.2526.111 Safari/537.36',\n # 'Mozilla/5.0 (X11; Ubuntu; Linux x86_64; rv:15.0) Gecko/20100101 Firefox/15.0.1',\n # 'Mozilla/5.0 (Linux; Android 7.0; Pixel C Build/NRD90M; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/52.0.2743.98 Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; SGP771 Build/32.2.A.0.253; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/52.0.2743.98 Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 6.0.1; SHIELD Tablet K1 Build/MRA58K; wv) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/55.0.2883.91 Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 7.0; SM-T827R4 Build/NRD90M) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.116 Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 5.0.2; SAMSUNG SM-T550 Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) SamsungBrowser/3.3 Chrome/38.0.2125.102 Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 4.4.3; KFTHWI Build/KTU84M) AppleWebKit/537.36 (KHTML, like Gecko) Silk/47.1.79 like Chrome/47.0.2526.80 
Safari/537.36',\n # 'Mozilla/5.0 (Linux; Android 5.0.2; LG-V410/V41020c Build/LRX22G) AppleWebKit/537.36 (KHTML, like Gecko) Version/4.0 Chrome/34.0.1847.118 Safari/537.36',\n # from Google desctop\n # 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36',\n # 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36'\n # 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36',\n # 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/61.0.3163.100 Safari/537.36',\n # 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',\n # 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_12_6) AppleWebKit/604.1.38 (KHTML, like Gecko) Version/11.0 Safari/604.1.38',\n # 'Mozilla/5.0 (Windows NT 6.1; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/60.0.3112.113 Safari/537.36',\n # 'Mozilla/5.0 (Windows NT 10.0; WOW64; rv:55.0) Gecko/20100101 Firefox/55.0', \n # 'Mozilla/5.0 (Windows NT 10.0; Win64; x64) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/52.0.2743.116 Safari/537.36 Edge/15.15063'\n # 'Mozilla/5.0 (Windows NT 6.1; WOW64; Trident/7.0; rv:11.0) like Gecko',\n 'Mozilla/5.0 (Macintosh; Intel Mac OS X 10_13_6) AppleWebKit/537.36 (KHTML, like Gecko) Chrome/77.0.3865.120 Safari/537.36'\n ]\n return random.choice(agents)\n\ndef http_get(url):\n headers = {\n 'User-Agent': request_user_agent(url),\n 'Accept': 'text/html,application/xhtml+xml,application/xml;q=1',\n 'Accept-Encoding': 'identity;q=1'\n }\n proxy = request_proxy(url)\n proxies = {\n \"http\": proxy,\n \"https\": proxy\n }\n response = requests.get(url, headers=headers, proxies=proxies, timeout=10, allow_redirects=True)\n # handle HTTPSConnectionPool, ProxyError to retry with a different proxy\n return response\n\ndef cached_get(url, cache):\n \"\"\"gets the content of a url either from cache or by making HTTP request\"\"\"\n if cache.contains(url):\n #print(\"Cached: {}\".format(url))\n return cache.get(url)\n else:\n #print(\"GET: {}\".format(url))\n try:\n response = http_get(url)\n if response: # check for 200\n return cache.put(url, response.text)\n else:\n raise Exception(response.status_code)\n except Exception as err:\n print(\"ERROR: {}\".format(err))\n return None\n\nserp = cached_get('https://www.google.com/search?q=Tech%20-%20IT%20department%20of%20national%20insurance%20company', cache)",
"_____no_output_____"
],
[
"def extract_links(serp):\n def external_url_from_href(href):\n if href.startswith('/url'):\n return urllib.parse.parse_qs(urllib.parse.urlparse('https://www.google.com' + href).query).get('q')[0]\n else:\n return href\n \n soup = BeautifulSoup(serp, 'html.parser')\n hrefs = []\n blocks_v1 = soup.select('#rso div.bkWMgd div.g:not(.g-blk):not(#ZpxfC):not(.gws-trips__outer-card):not(.knavi)')\n #print(\"Elements found: {}\".format(len(blocks_v1)))\n for div in blocks_v1:\n for a in div.select('a:not(.fl):not(.ab_button)'):\n hrefs.append(external_url_from_href(a.get('href')))\n \n blocks_v2 = soup.select('#main div.ZINbbc div.kCrYT > a')\n #print(\"Elements found: {}\".format(len(blocks_v2)))\n for a in blocks_v2:\n hrefs.append(external_url_from_href(a.get('href')))\n return hrefs\n \n #print(a)\n #glinks = ['https://www.google.com' + l.get('href') for l in links]\n #site_links = [urllib.parse.parse_qs(urllib.parse.urlparse(gl).query).get('q')[0] for gl in glinks]\n #return site_links\n\nserp = cached_get('https://www.google.com/search?q=Auto%20rental', cache)\n#print(cache.file_for('https://www.google.com/search?q=Auto%20rental'))\n# serp = cached_get('https://www.google.com/search?q=Accounting', cache)\nextract_links(serp)\n",
"_____no_output_____"
],
[
"def splitup_to_queries(items, separator = None):\n for i in items:\n chunks = i.split(separator) if separator else [i]\n for chunk in chunks:\n yield (i, chunk.strip())\n if len(chunks) > 1:\n yield (i, \" \".join(chunks))\n\ndef queries_to_links(items):\n for industry, query in items:\n search_url = \"https://www.google.com/search?q={}\".format(urllib.parse.quote(query))\n serp = cached_get(search_url, cache)\n if not serp:\n continue\n links = extract_links(serp)\n yield (industry, search_url, links)\n\ndef download_links(items):\n for industry, search_url, links in items:\n for url in links[0:3]:\n link_html = cached_get(url, cache)\n if not link_html:\n yield (industry, 'error', url)\n else:\n yield (industry, 'success', url)\n\ndef first_n_links(items):\n for industry, search_url, links in items:\n for url in links[0:3]:\n yield url\n \ndef download_url(items):\n c = cache\n def download(u):\n result = 'success' if cached_get(url, c) else 'error'\n return (url, result)\n \n with Pool(processes=10) as pool:\n resutls = pool.imap(download, items, chunksize=8)\n \ndef download_data_for_list(list_to_download, separator=None):\n it = download_links(queries_to_links(splitup_to_queries(list_to_download, separator=separator))) \n successes = 0\n errors = 0\n progress = tqdm(it, desc='Downloading', miniters=1)\n for industry, status, url in progress:\n if status == 'error':\n errors = errors + 1\n print(\"ERROR: {}\".format(url))\n else:\n successes = successes + 1\n progress.set_postfix(insudtry = industry, successes = successes, errors = errors)\n\ndef download_serps_for_list(list_to_download, separator=None):\n it = queries_to_links(splitup_to_queries(list_to_download, separator=separator))\n for industry, search_url, urls in tqdm(it, desc='Downloading', miniters=1):\n if len(urls) < 5:\n print(\"{} {}\".format(len(urls), search_url))\n\ndef download_pages_for_list_multiprocess(list_to_download, separator=None):\n urls = first_n_links(queries_to_links(splitup_to_queries(list_to_download, separator=separator)))\n def download(url):\n print(url)\n page = cached_get(url, cache)\n result = 'success' if page else 'error'\n return (url, result)\n with Pool(processes=4) as pool:\n resutls = pool.imap_unordered(download, list(urls), chunksize=4)\n \n#download_data_for_list(industries['industry'].to_list(), separator='/')\ndownload_serps_for_list(salary_industries['Industry Ref'].dropna().sort_values().to_list())\n",
"_____no_output_____"
],
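[
"# Hedged demo (not in the original notebook): splitup_to_queries() yields one query per\n# '/'-separated chunk of a label, plus the re-joined label whenever it was actually split.\nprint(list(splitup_to_queries(['Banking/Finance', 'Retail'], separator='/')))\n# -> [('Banking/Finance', 'Banking'), ('Banking/Finance', 'Finance'),\n#     ('Banking/Finance', 'Banking Finance'), ('Retail', 'Retail')]",
"_____no_output_____"
],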
[
"def industry_to_queries(industry, separator = None):\n chunks = industry.split(separator) if separator else [industry]\n result = [chunk.strip() for chunk in chunks]\n if len(chunks) > 1:\n result.append(\" \".join(chunks))\n return result\n\ndef queries_to_links(items):\n for industry, query in items:\n search_url = \"https://www.google.com/search?q={}\".format(urllib.parse.quote(query))\n serp = cached_get(search_url, cache)\n if not serp:\n continue\n links = extract_links(serp)\n yield (industry, search_url, links)\n\ndef create_industry_term_url_map(items, separator=None):\n records = []\n for industry in items:\n for search_term in industry_to_queries(industry, separator=separator):\n search_url = \"https://www.google.com/search?q={}\".format(urllib.parse.quote(search_term))\n serp = cached_get(search_url, cache)\n if not serp:\n continue\n links = extract_links(serp)\n for link in links:\n records.append((industry, search_term, link))\n return pd.DataFrame.from_records(records, columns=['industry', 'term', 'url'])\n\n# industry_targets_urls = create_industry_term_url_map(industries['industry'].to_list(), separator='/')\n# industry_targets_urls.to_csv(\"./utils/industry_targets_urls.csv\", index=False)\n#industry_inputs_urls = create_industry_term_url_map(salary_industries['Industry Ref'].dropna().sort_values().to_list())\n#industry_inputs_urls.to_csv(\"./utils/industry_inputs_urls.csv\", index=False)\n",
"_____no_output_____"
],
[
"industry_targets_urls = pd.read_csv(\"./utils/industry_targets_urls.csv\")\nindustry_inputs_urls = pd.read_csv(\"./utils/industry_inputs_urls.csv\")\nindustry_inputs_urls.head()\n",
"_____no_output_____"
],
[
"import re\ndef tag_visible(element):\n if element.parent.name in ['style', 'script', 'head', 'title', 'meta', '[document]']:\n return False\n if isinstance(element, Comment):\n return False\n #if any(c.contains('hidden') for c in element.parent['class']):\n # return False\n return True\n\n\ndef text_from_html(body):\n try:\n soup = BeautifulSoup(body, 'html.parser')\n texts = soup.find_all(text=True)\n visible_texts = filter(tag_visible, texts)\n visible_texts = map(lambda s: s.encode('utf-8', 'ignore').decode('utf-8'), visible_texts)\n visible_texts = map(lambda s: re.sub(r'\\s+', ' ', s).strip(), visible_texts)\n visible_texts = filter(lambda s: len(s)>0, visible_texts)\n visible_texts = filter(lambda s: len(s.split(' '))>5, visible_texts)\n return ' '.join(visible_texts)\n except:\n return ''\n \nprint(cache.file_for('https://www.rightway.com/used-vehicles/'))\n#page = cached_get('https://www.britannica.com/topic/finance', cache)\n#text_from_html(page)",
"./.cache/56fbf26e13dfd8a1228eb52741d02115.html\n"
],
[
"def extract_and_combine_text_from_urls(urls):\n pages = [cache.get(url) for url in urls if cache.contains(url)]\n texts = [text_from_html(page) for page in pages]\n return \" \".join(texts)\n\ndef is_downloaded(row):\n return cache.contains(row['url'])\n\ndef file_for_url(row):\n if cache.contains(row['url']):\n return cache.file_for(row['url'])\n else:\n return None\n \ndef extract_text(row):\n if cache.contains(row['url']):\n return text_from_html(cache.get(row['url']))\n else:\n return None\n\ndef create_url_text_file(input_file, out_file):\n df = pd.read_csv(input_file)\n df['is_downloaded'] = df.apply(is_downloaded, axis=1)\n df['file'] = df.apply(file_for_url, axis=1)\n df['text'] = df.apply(extract_text, axis=1)\n df.to_csv(out_file, index=False)\n return df",
"_____no_output_____"
],
[
"industry_targets_url_text = create_url_text_file(\"./utils/industry_targets_urls.csv\", \"./utils/industry_targets_url_text.csv\")\nindustry_targets_url_text.head()",
"_____no_output_____"
],
[
"industry_inputs_url_text = create_url_text_file(\"./utils/industry_inputs_urls.csv\", \"./utils/industry_inputs_url_text.csv\")\nindustry_inputs_url_text.head()\n",
"_____no_output_____"
],
[
"industry_inputs_url_text['url'][industry_inputs_url_text['is_downloaded'] == False]\n",
"_____no_output_____"
],
[
"def combine_texts(series):\n return \" \".join([str(t) for t in series.values])\n\ndef create_text_file(input_file, out_file):\n df = pd.read_csv(input_file)\n df = df.groupby(['industry', 'term']).aggregate({'text': combine_texts}).reset_index()\n df.to_csv(out_file, index=False)\n return df",
"_____no_output_____"
],
[
"industry_targets_text = create_text_file(\"./utils/industry_targets_url_text.csv\", \"./utils/industry_targets_text.csv\")\nindustry_targets_text.head()",
"_____no_output_____"
],
[
"industry_inputs_text = create_text_file(\"./utils/industry_inputs_url_text.csv\", \"./utils/industry_inputs_text.csv\")\nindustry_inputs_text.head()",
"_____no_output_____"
],
[
"#industry_inputs_url_text['text'].apply(lambda r: len(r) if r else 0)\nindustry_inputs_url_text['file'].dropna()",
"_____no_output_____"
],
[
"industry_inputs_url_text['link_rank'] = (industry_inputs_url_text.groupby('term').cumcount()+1)\nprioritized_urls = (\n industry_inputs_url_text.loc[:,['term', 'url', 'link_rank', 'is_downloaded']]\n .query('is_downloaded == False')\n .sort_values(['link_rank', 'term'], ascending=[True, False])\n #['url']\n)\nl = prioritized_urls.query('link_rank == 1')['term'].to_list()\nprioritized_urls[prioritized_urls['term'].isin(l)]['url']",
"_____no_output_____"
],
[
"industry_inputs_url_text.groupby('term').aggregate({ 'is_downloaded': lambda g: any(g)}).query('is_downloaded == False')",
"_____no_output_____"
],
[
"def download(url):\n page = cached_get(url, cache)\n if page:\n return ('success', url)\n else:\n return ('error', url)\n\ndef download_all(urls):\n with Pool(10) as pool:\n return pool.map(func, urls)\n\ndef first_few_url_for_each_term(term_url_df, n): \n for g, rows in term_url_df.groupby('term'):\n for url in rows['url'].to_list()[:n]:\n yield url\n \nl = [download(url) for url in tqdm(list(prioritized_urls[prioritized_urls['term'].isin(l)]['url']))]\n ",
"_____no_output_____"
],
[
"s = 'asd\\ud800sdf'.encode('utf-8', 'ignore').decode('utf-8')\nprint(s)\nsrs = pd.Series()\nsrs.loc[ 0 ] = s\nsrs.to_csv('testcase.csv')",
"asdsdf\n"
]
]
] |
[
"code",
"markdown",
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfb94aec9d069cce7acba425bdeb2415c9fe646
| 29,973 |
ipynb
|
Jupyter Notebook
|
Summarizer.ipynb
|
NoahBraceOfficial/ExperimeNtaLP
|
8a16a14ca0959b36c76b3c8b804e4e53643b90f7
|
[
"MIT"
] | null | null | null |
Summarizer.ipynb
|
NoahBraceOfficial/ExperimeNtaLP
|
8a16a14ca0959b36c76b3c8b804e4e53643b90f7
|
[
"MIT"
] | null | null | null |
Summarizer.ipynb
|
NoahBraceOfficial/ExperimeNtaLP
|
8a16a14ca0959b36c76b3c8b804e4e53643b90f7
|
[
"MIT"
] | null | null | null | 132.039648 | 14,232 | 0.633537 |
[
[
[
"from nltk.corpus import stopwords\nfrom nltk.cluster.util import cosine_distance\nimport numpy as np\nimport networkx as nx",
"_____no_output_____"
],
[
"from nltk.corpus import stopwords\nfrom nltk.cluster.util import cosine_distance\nimport numpy as np\nimport networkx as nx\n\ndef read_article(file_name):\n file = open(file_name, \"r\")\n filedata = file.readlines()\n article = filedata[0].split(\". \")\n sentences = []\n\n for sentence in article:\n print(sentence)\n sentences.append(sentence.replace(\"[^a-zA-Z]\", \" \").split(\" \"))\n sentences.pop() \n \n return sentences\n\ndef sentence_similarity(sent1, sent2, stopwords=None):\n if stopwords is None:\n stopwords = []\n \n sent1 = [w.lower() for w in sent1]\n sent2 = [w.lower() for w in sent2]\n \n all_words = list(set(sent1 + sent2))\n \n vector1 = [0] * len(all_words)\n vector2 = [0] * len(all_words)\n \n # build the vector for the first sentence\n for w in sent1:\n if w in stopwords:\n continue\n vector1[all_words.index(w)] += 1\n \n # build the vector for the second sentence\n for w in sent2:\n if w in stopwords:\n continue\n vector2[all_words.index(w)] += 1\n \n return 1 - cosine_distance(vector1, vector2)\n \ndef build_similarity_matrix(sentences, stop_words):\n # Create an empty similarity matrix\n similarity_matrix = np.zeros((len(sentences), len(sentences)))\n \n for idx1 in range(len(sentences)):\n for idx2 in range(len(sentences)):\n if idx1 == idx2: #ignore if both are same sentences\n continue \n similarity_matrix[idx1][idx2] = sentence_similarity(sentences[idx1], sentences[idx2], stop_words)\n\n return similarity_matrix\n\n\ndef generate_summary(file_name, top_n=5):\n stop_words = stopwords.words('english')\n summarize_text = []\n\n # Step 1 - Read text anc split it\n sentences = read_article(file_name)\n\n # Step 2 - Generate Similary Martix across sentences\n sentence_similarity_martix = build_similarity_matrix(sentences, stop_words)\n\n # Step 3 - Rank sentences in similarity martix\n sentence_similarity_graph = nx.from_numpy_array(sentence_similarity_martix)\n scores = nx.pagerank(sentence_similarity_graph)\n\n # Step 4 - Sort the rank and pick top sentences\n ranked_sentence = sorted(((scores[i],s) for i,s in enumerate(sentences)), reverse=True) \n print(\"Indexes of top ranked_sentence order are \", ranked_sentence) \n\n for i in range(top_n):\n summarize_text.append(\" \".join(ranked_sentence[i][1]))\n\n # Step 5 - Offcourse, output the summarize texr\n print(\"Summarize Text: \\n\", \". \".join(summarize_text))\n\n# let's begin\ngenerate_summary( \"butt.txt\", 10)",
"At 11 p.m\non Friday, Britain will leave the European Union\nBig Ben will not bongโitโs too expensiveโbut the United Kingdom will secede from its defining economic and political relationship of the past fifty years\nIn Parliament Square, under the silent clock, Nigel Farage, the countryโs most influential populist politician since Oswald Mosley, will headline a Brexit celebration event\nThe invitation to the rally asks people to โcome in good voice to sing some patriotic songs and bring along as many Union flags as they can, to wave in a patriotic display of pride.โ Since the referendum, in 2016, Brexiteers have sought to characterize leaving the E.U\nas an emancipatory actโan independence day for a country that once ruled the largest empire in history\nIn March, 2019, when Brexit was first supposed to have taken place, Boris Johnson had resigned from the government\nโIt was meant to be the week when church bells were rung, coins struck, stamps issued and bonfires lit to send beacons of freedom from hilltop to hilltop,โ he rued\nJohnson summoned an image of faithful farm workers โweaving through the moonlit lanes of Sussex, half blind with scrumpy, singing Brexit shanties at the tops of their voices and beating the hedgerows with staves.โ Now that it has come to pass, Brexit Day will be muted by comparison\nAs Prime Minister, Johnson is expected to celebrate with a low-key party in Downing Street\nSome Remainers have pledged not to handle a new fifty-pence coin commissioned by the government to mark the occasion\nThe whole thing feels fairly peevishโthat is to say, properly British\nBut what will Brexit Britain actually be like? At the moment it happens, Britons will cease to be citizens of the E.U\nUntil the end of 2020, however, the country is officially in โa transition period,โ while it negotiates a future trading relationship with the bloc, so nothing will immediately change\nOn the surface, the second phase of the Brexit talks may well resemble the first: the E.U\nwill play hardball, led by its arch-rational lead negotiator, Michel Barnier, of France, and there will be periodic spasms in Westminster, as Johnsonโs government figures out how to respond\nBut a larger, subtler process of divergence will have already begun\nBrexit has always been fascinating to me because it contains a genuinely difficult question: Will a country like Britain be better off trying to navigate the challenges of the twenty-first century deeply embedded in an international bloc, or out on its own, with greater freedom to maneuver and, in theory, listen to its population? 
In 2000, Dani Rodrik, an economics professor at Harvard, first described what he has since called his โimpossibility theorem,โ in which the nation-state, democracy, and global economic integration are mutually incompatible\nโWe can have at most two of these three things,โ he wrote\nBritainโs forty-seven-year membership of the E.U\nturned out to be a case study in the impossible\nIn 2016, a narrow majority of voters chose what they perceived to be democracy and sovereignty over the economic logic of being part of a huge supranational market\nLast October, at the Conservative Party conference, in Manchester, I listened to Stephen Harper and Tony Abbott, the former Prime Ministers of Canada and Australia, making the case that Brexiteers often like to make: that there is no reason why the U.K., with its economy, its history, its language, should not be able to flourish alone, like Japan or South Korea or New Zealand\nDominic Cummings, the campaign director of Vote Leave and now Johnsonโs senior adviser, often quotes David Deutsch, the celebrated physicist, who sees the nation-state as better equipped to correct itself in a complicated world than a vast, slow-moving entity like the E.U\nโError correction is the basic issue,โ Deutsch said\nโAnd I canโt foresee the E.U improving much in this respect.โ But it is striking that these arguments are almost always made by those who wish to shrink the state or, in Cummingsโs case, redesign it altogether\nA long time ago, it was common for British politicians on the left, such as Jeremy Corbyn, to denounce European integration as a project that mostly benefitted the blocโs richest countries and their largest corporations\nBut, in the twenty-first century, the E.U\nhas been a steady advocate for human rights, action on climate change, digital privacy, and the rules-based international orderโeven when that has proved awkward for its members\nLast week, with an eighty-seat majority in the House of Commons and Brexit Day in sight, Johnsonโs government removed an amendment to protect unaccompanied child refugees coming to the U.K\nfrom the legislation that will finally set Britain free\nMargaret Thatcher was one of the architects of the E.U.โs single market\nShe positioned Britain as a free-trading entry point to the more regulated economies of mainland Europe\nYou donโt have to be Henry Kissinger to see that the obvious course for Brexit Britain is to try to becomeโin some formโan offshore competitor\nEarlier this month, Sajid Javid, the Chancellor, who is a former derivatives trader for Deutsche Bank, told the Financial Times that โthere will not be alignmentโ with many E.U\ngoods regulations in the future\nSince the eighties, successive British governments have had a remarkably relaxed attitude to foreign companies playing vital roles in the economy, or owning critical pieces of infrastructure\nThe U.K.โs busiest airport, Heathrow, is owned by an international consortium led by a Spanish company and the Qatari government; its busiest port, Felixstowe, belongs to a Hong Kong investment firm\nOn January 28th, the government gave its approval for Huawei, the Chinese technology company, to help build Britainโs new 5G cell-phone network, over the objections of the U.S\ngovernment and other security partners\nBrexit is often characterized as a populist rebellion against the forces of globalization, but the reality of life outside the E.U\nmeans that they will probably only accelerate\nThings that are distinctive, British, and publicly ownedโlike the BBCโwill be 
vulnerable\nThis week, the national broadcaster, which is under threat from streaming services and ideological opponents in the Conservative Party, announced that it will cut four hundred and fifty jobs from its news division\nBrexit Britain wonโt be unrecognizable\nSince 2017, mainly under the influence of Michael Gove, Johnsonโs wingman during the referendum, the country has committed itself to an ambitious environmental agenda\nLast year, Britain passed legislation to reduce its carbon emissions to net zero by 2050\nLike Theresa May before him, Johnson plans to inject billions of pounds into the National Health Service, the final great edifice of the British welfare state\nLast December, the Conservatives won their decisive election victory with the help of constituencies in northern England, which have traditionally voted for Labour but which backed Brexit\nSince then, Johnsonโs government has promised to โlevel upโ economic prosperity in the U.K\nby signing off on ambitious infrastructure projects, such as HS2, a high-speed rail link that is forecast to cost more than a hundred billion pounds\nAs far as I can see, the signs point to a kind of hybrid nation: more welcoming to foreign capital but more hostile to foreign people; greener but more conservative; freer on the world stage, but weaker\nThere will be shiny totemsโa few new trains and hospitalsโbut the general safety net for most citizens, in terms of their rights and benefits, will be cut away\nBrexit has always been a mythical endpoint, which means that governments will be able to justify a great variety of things in its name\nIt is the perfect project for Johnson, the most flexible of Prime Ministers\nMore than anything, the enterprise appears unstable\nThe most obvious tensions will be geographic\nIn December, the people of Northern Ireland and Scotland voted overwhelmingly for anti-Brexit parties\nThis month, the U.K.โs devolved assemblies in Belfast, Cardiff, and Edinburgh all rejected the governmentโs legislation to take Britain out of the E.U.โvotes that were ignored in London, which is itself not a Brexit city\nAs the Irish columnist Fintan OโToole wrote in the Guardian on Sunday, โWhile Johnson likes to talk of 31 January as โthis pivotal moment in our national story,โ there is neither a settled nation nor a shared story.โ Britain was a deeply divided country before 2016, and it led to Brexit\nWhat will be the next rupture?\n"
]
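,
[
"# Hedged usage sketch (not in the original notebook): sentence_similarity() scores two\n# tokenized sentences as 1 - cosine distance over bag-of-words counts, ignoring stopwords,\n# so overlapping sentences score near 1 and disjoint ones score 0. The example sentences\n# are made up for illustration.\nstop_words = stopwords.words('english')\nprint(sentence_similarity('the cat sat on the mat'.split(), 'the cat ran off the mat'.split(), stop_words))\nprint(sentence_similarity('completely unrelated words'.split(), 'the cat ran off'.split(), stop_words))",
"_____no_output_____"
]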
]
] |
[
"code"
] |
[
[
"code",
"code"
]
] |
cbfbcd416b80a4fce47a42987a08e1d81a8063e1
| 175,483 |
ipynb
|
Jupyter Notebook
|
Model backlog/Train/101-jigsaw-fold3-xlm-roberta-large-best.ipynb
|
dimitreOliveira/Jigsaw-Multilingual-Toxic-Comment-Classification
|
44422e6aeeff227e22dbb5c05101322e9d4aabbe
|
[
"MIT"
] | 4 |
2020-06-23T02:31:07.000Z
|
2020-07-04T11:50:08.000Z
|
Model backlog/Train/101-jigsaw-fold3-xlm-roberta-large-best.ipynb
|
dimitreOliveira/Jigsaw-Multilingual-Toxic-Comment-Classification
|
44422e6aeeff227e22dbb5c05101322e9d4aabbe
|
[
"MIT"
] | null | null | null |
Model backlog/Train/101-jigsaw-fold3-xlm-roberta-large-best.ipynb
|
dimitreOliveira/Jigsaw-Multilingual-Toxic-Comment-Classification
|
44422e6aeeff227e22dbb5c05101322e9d4aabbe
|
[
"MIT"
] | null | null | null | 120.358711 | 62,748 | 0.806015 |
[
[
[
"## Dependencies",
"_____no_output_____"
]
],
[
[
"import json, warnings, shutil, glob\nfrom jigsaw_utility_scripts import *\nfrom scripts_step_lr_schedulers import *\nfrom transformers import TFXLMRobertaModel, XLMRobertaConfig\nfrom tensorflow.keras.models import Model\nfrom tensorflow.keras import optimizers, metrics, losses, layers\n\nSEED = 0\nseed_everything(SEED)\nwarnings.filterwarnings(\"ignore\")",
"_____no_output_____"
]
],
[
[
"## TPU configuration",
"_____no_output_____"
]
],
[
[
"strategy, tpu = set_up_strategy()\nprint(\"REPLICAS: \", strategy.num_replicas_in_sync)\nAUTO = tf.data.experimental.AUTOTUNE",
"Running on TPU grpc://10.0.0.2:8470\nREPLICAS: 8\n"
]
],
[
[
"# Load data",
"_____no_output_____"
]
],
[
[
"database_base_path = '/kaggle/input/jigsaw-data-split-roberta-192-ratio-2-upper/'\nk_fold = pd.read_csv(database_base_path + '5-fold.csv')\nvalid_df = pd.read_csv(\"/kaggle/input/jigsaw-multilingual-toxic-comment-classification/validation.csv\", \n usecols=['comment_text', 'toxic', 'lang'])\n\nprint('Train samples: %d' % len(k_fold))\ndisplay(k_fold.head())\nprint('Validation samples: %d' % len(valid_df))\ndisplay(valid_df.head())\n\nbase_data_path = 'fold_3/'\nfold_n = 3\n# Unzip files\n!tar -xvf /kaggle/input/jigsaw-data-split-roberta-192-ratio-2-upper/fold_3.tar.gz",
"Train samples: 400830\n"
]
],
[
[
"# Model parameters",
"_____no_output_____"
]
],
[
[
"base_path = '/kaggle/input/jigsaw-transformers/XLM-RoBERTa/'\n\nconfig = {\n \"MAX_LEN\": 192,\n \"BATCH_SIZE\": 128,\n \"EPOCHS\": 4,\n \"LEARNING_RATE\": 1e-5, \n \"ES_PATIENCE\": None,\n \"base_model_path\": base_path + 'tf-xlm-roberta-large-tf_model.h5',\n \"config_path\": base_path + 'xlm-roberta-large-config.json'\n}\n\nwith open('config.json', 'w') as json_file:\n json.dump(json.loads(json.dumps(config)), json_file)",
"_____no_output_____"
]
],
[
[
"## Learning rate schedule",
"_____no_output_____"
]
],
[
[
"lr_min = 1e-7\nlr_start = 1e-7\nlr_max = config['LEARNING_RATE']\nstep_size = len(k_fold[k_fold[f'fold_{fold_n}'] == 'train']) // config['BATCH_SIZE']\ntotal_steps = config['EPOCHS'] * step_size\nhold_max_steps = 0\nwarmup_steps = step_size * 1\ndecay = .9997\n\nrng = [i for i in range(0, total_steps, config['BATCH_SIZE'])]\ny = [exponential_schedule_with_warmup(tf.cast(x, tf.float32), warmup_steps, hold_max_steps, \n lr_start, lr_max, lr_min, decay) for x in rng]\n\nsns.set(style=\"whitegrid\")\nfig, ax = plt.subplots(figsize=(20, 6))\nplt.plot(rng, y)\nprint(\"Learning rate schedule: {:.3g} to {:.3g} to {:.3g}\".format(y[0], max(y), y[-1]))",
"Learning rate schedule: 1e-07 to 9.84e-06 to 1.06e-06\n"
]
],
[
[
"# Model",
"_____no_output_____"
]
],
[
[
"module_config = XLMRobertaConfig.from_pretrained(config['config_path'], output_hidden_states=False)\n\ndef model_fn(MAX_LEN):\n input_ids = layers.Input(shape=(MAX_LEN,), dtype=tf.int32, name='input_ids')\n attention_mask = layers.Input(shape=(MAX_LEN,), dtype=tf.int32, name='attention_mask')\n \n base_model = TFXLMRobertaModel.from_pretrained(config['base_model_path'], config=module_config)\n last_hidden_state, _ = base_model({'input_ids': input_ids, 'attention_mask': attention_mask})\n cls_token = last_hidden_state[:, 0, :]\n \n output = layers.Dense(1, activation='sigmoid', name='output')(cls_token)\n \n model = Model(inputs=[input_ids, attention_mask], outputs=output)\n \n return model",
"_____no_output_____"
]
],
[
[
"# Train",
"_____no_output_____"
]
],
[
[
"# Load data\nx_train = np.load(base_data_path + 'x_train.npy')\ny_train = np.load(base_data_path + 'y_train_int.npy').reshape(x_train.shape[1], 1).astype(np.float32)\nx_valid_ml = np.load(database_base_path + 'x_valid.npy')\ny_valid_ml = np.load(database_base_path + 'y_valid.npy').reshape(x_valid_ml.shape[1], 1).astype(np.float32)\n\n#################### ADD TAIL ####################\nx_train = np.hstack([x_train, np.load(base_data_path + 'x_train_tail.npy')])\ny_train = np.vstack([y_train, y_train])\n\nstep_size = x_train.shape[1] // config['BATCH_SIZE']\nvalid_step_size = x_valid_ml.shape[1] // config['BATCH_SIZE']\n\n# Build TF datasets\ntrain_dist_ds = strategy.experimental_distribute_dataset(get_training_dataset(x_train, y_train, config['BATCH_SIZE'], AUTO, seed=SEED))\nvalid_dist_ds = strategy.experimental_distribute_dataset(get_validation_dataset(x_valid_ml, y_valid_ml, config['BATCH_SIZE'], AUTO, repeated=True, seed=SEED))\ntrain_data_iter = iter(train_dist_ds)\nvalid_data_iter = iter(valid_dist_ds)\n\n# Step functions\[email protected]\ndef train_step(data_iter):\n def train_step_fn(x, y):\n with tf.GradientTape() as tape:\n probabilities = model(x, training=True)\n loss = loss_fn(y, probabilities)\n grads = tape.gradient(loss, model.trainable_variables)\n optimizer.apply_gradients(zip(grads, model.trainable_variables))\n train_auc.update_state(y, probabilities)\n train_loss.update_state(loss)\n for _ in tf.range(step_size):\n strategy.experimental_run_v2(train_step_fn, next(data_iter))\n\[email protected]\ndef valid_step(data_iter):\n def valid_step_fn(x, y):\n probabilities = model(x, training=False)\n loss = loss_fn(y, probabilities)\n valid_auc.update_state(y, probabilities)\n valid_loss.update_state(loss)\n for _ in tf.range(valid_step_size):\n strategy.experimental_run_v2(valid_step_fn, next(data_iter))\n\n# Train model\nwith strategy.scope():\n model = model_fn(config['MAX_LEN'])\n optimizer = optimizers.Adam(learning_rate=lambda: \n exponential_schedule_with_warmup(tf.cast(optimizer.iterations, tf.float32), \n warmup_steps, hold_max_steps, lr_start, \n lr_max, lr_min, decay))\n loss_fn = losses.binary_crossentropy\n train_auc = metrics.AUC()\n valid_auc = metrics.AUC()\n train_loss = metrics.Sum()\n valid_loss = metrics.Sum()\n\nmetrics_dict = {'loss': train_loss, 'auc': train_auc, \n 'val_loss': valid_loss, 'val_auc': valid_auc}\n\nhistory = custom_fit(model, metrics_dict, train_step, valid_step, train_data_iter, valid_data_iter, \n step_size, valid_step_size, config['BATCH_SIZE'], config['EPOCHS'], \n config['ES_PATIENCE'], save_last=False)\n# model.save_weights('model.h5')\n\n# Make predictions\nx_train = np.load(base_data_path + 'x_train.npy')\nx_valid = np.load(base_data_path + 'x_valid.npy')\nx_valid_ml_eval = np.load(database_base_path + 'x_valid.npy')\n\ntrain_preds = model.predict(get_test_dataset(x_train, config['BATCH_SIZE'], AUTO))\nvalid_preds = model.predict(get_test_dataset(x_valid, config['BATCH_SIZE'], AUTO))\nvalid_ml_preds = model.predict(get_test_dataset(x_valid_ml_eval, config['BATCH_SIZE'], AUTO))\n\nk_fold.loc[k_fold[f'fold_{fold_n}'] == 'train', f'pred_{fold_n}'] = np.round(train_preds)\nk_fold.loc[k_fold[f'fold_{fold_n}'] == 'validation', f'pred_{fold_n}'] = np.round(valid_preds)\nvalid_df[f'pred_{fold_n}'] = valid_ml_preds\n\n\n# Fine-tune on validation set\n#################### ADD TAIL ####################\nx_valid_ml_tail = np.hstack([x_valid_ml, np.load(database_base_path + 'x_valid_tail.npy')])\ny_valid_ml_tail = np.vstack([y_valid_ml, 
y_valid_ml])\n\nvalid_step_size_tail = x_valid_ml_tail.shape[1] // config['BATCH_SIZE']\n\n# Build TF datasets\ntrain_ml_dist_ds = strategy.experimental_distribute_dataset(get_training_dataset(x_valid_ml_tail, y_valid_ml_tail, \n config['BATCH_SIZE'], AUTO, seed=SEED))\ntrain_ml_data_iter = iter(train_ml_dist_ds)\n\nhistory_ml = custom_fit(model, metrics_dict, train_step, valid_step, train_ml_data_iter, valid_data_iter, \n valid_step_size_tail, valid_step_size, config['BATCH_SIZE'], 1, \n config['ES_PATIENCE'], save_last=False)\n\n# Join history\nfor key in history_ml.keys():\n history[key] += history_ml[key]\n \nmodel.save_weights('model_ml.h5')\n\n# Make predictions\nvalid_ml_preds = model.predict(get_test_dataset(x_valid_ml_eval, config['BATCH_SIZE'], AUTO))\nvalid_df[f'pred_ml_{fold_n}'] = valid_ml_preds\n\n### Delete data dir\nshutil.rmtree(base_data_path)",
"Train for 5010 steps, validate for 62 steps\n\nEPOCH 1/4\ntime: 1715.5s loss: 0.2364 auc: 0.9612 val_loss: 0.2883 val_auc: 0.9203\n\nEPOCH 2/4\ntime: 1520.0s loss: 0.1621 auc: 0.9816 val_loss: 0.3094 val_auc: 0.9190\n\nEPOCH 3/4\ntime: 1519.9s loss: 0.1386 auc: 0.9863 val_loss: 0.3088 val_auc: 0.9098\n\nEPOCH 4/4\ntime: 1520.1s loss: 0.1335 auc: 0.9873 val_loss: 0.3193 val_auc: 0.9096\nTraining finished\nTrain for 125 steps, validate for 62 steps\n\nEPOCH 1/1\ntime: 1624.4s loss: 7.2102 auc: 0.9577 val_loss: 0.1235 val_auc: 0.9818\nTraining finished\n"
]
],
[
[
"## Model loss graph",
"_____no_output_____"
]
],
[
[
"plot_metrics(history)",
"_____no_output_____"
]
],
[
[
"# Model evaluation",
"_____no_output_____"
]
],
[
[
"display(evaluate_model_single_fold(k_fold, fold_n, label_col='toxic_int').style.applymap(color_map))",
"_____no_output_____"
]
],
[
[
"# Confusion matrix",
"_____no_output_____"
]
],
[
[
"train_set = k_fold[k_fold[f'fold_{fold_n}'] == 'train']\nvalidation_set = k_fold[k_fold[f'fold_{fold_n}'] == 'validation'] \nplot_confusion_matrix(train_set['toxic_int'], train_set[f'pred_{fold_n}'], \n validation_set['toxic_int'], validation_set[f'pred_{fold_n}'])",
"_____no_output_____"
]
],
[
[
"# Model evaluation by language",
"_____no_output_____"
]
],
[
[
"display(evaluate_model_single_fold_lang(valid_df, fold_n).style.applymap(color_map))\n# ML fine-tunned preds\ndisplay(evaluate_model_single_fold_lang(valid_df, fold_n, pred_col='pred_ml').style.applymap(color_map))",
"_____no_output_____"
]
],
[
[
"# Visualize predictions",
"_____no_output_____"
]
],
[
[
"pd.set_option('max_colwidth', 120)\nprint('English validation set')\ndisplay(k_fold[['comment_text', 'toxic'] + [c for c in k_fold.columns if c.startswith('pred')]].head(10))\n\nprint('Multilingual validation set')\ndisplay(valid_df[['comment_text', 'toxic'] + [c for c in valid_df.columns if c.startswith('pred')]].head(10))",
"English validation set\n"
]
],
[
[
"# Test set predictions",
"_____no_output_____"
]
],
[
[
"x_test = np.load(database_base_path + 'x_test.npy')\ntest_preds = model.predict(get_test_dataset(x_test, config['BATCH_SIZE'], AUTO))",
"_____no_output_____"
],
[
"submission = pd.read_csv('/kaggle/input/jigsaw-multilingual-toxic-comment-classification/sample_submission.csv')\nsubmission['toxic'] = test_preds\nsubmission.to_csv('submission.csv', index=False)\n\ndisplay(submission.describe())\ndisplay(submission.head(10))",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbfbd2a07b35041c48fc442d2d4663fc03461fff
| 34,489 |
ipynb
|
Jupyter Notebook
|
labs/04_model_monitor/04_model_monitor.ipynb
|
crayon/amazon-sagemaker-immersion-day
|
f2a2e38994f25bd996a39d02a5cef7747255f11d
|
[
"MIT-0"
] | null | null | null |
labs/04_model_monitor/04_model_monitor.ipynb
|
crayon/amazon-sagemaker-immersion-day
|
f2a2e38994f25bd996a39d02a5cef7747255f11d
|
[
"MIT-0"
] | null | null | null |
labs/04_model_monitor/04_model_monitor.ipynb
|
crayon/amazon-sagemaker-immersion-day
|
f2a2e38994f25bd996a39d02a5cef7747255f11d
|
[
"MIT-0"
] | null | null | null | 38.193798 | 739 | 0.624489 |
[
[
[
"# Amazon SageMaker Model Monitor\nThis notebook shows how to:\n* Host a machine learning model in Amazon SageMaker and capture inference requests, results, and metadata \n* Analyze a training dataset to generate baseline constraints\n* Monitor a live endpoint for violations against constraints\n\n---\n## Background\n\nAmazon SageMaker provides every developer and data scientist with the ability to build, train, and deploy machine learning models quickly. Amazon SageMaker is a fully-managed service that encompasses the entire machine learning workflow. You can label and prepare your data, choose an algorithm, train a model, and then tune and optimize it for deployment. You can deploy your models to production with Amazon SageMaker to make predictions and lower costs than was previously possible.\n\nIn addition, Amazon SageMaker enables you to capture the input, output and metadata for invocations of the models that you deploy. It also enables you to analyze the data and monitor its quality. In this notebook, you learn how Amazon SageMaker enables these capabilities.\n\n---\n## Setup\n\nTo get started, make sure you have these prerequisites completed.\n\n* Specify an AWS Region to host your model.\n* An IAM role ARN exists that is used to give Amazon SageMaker access to your data in Amazon Simple Storage Service (Amazon S3). See the documentation for how to fine tune the permissions needed. \n* Create an S3 bucket used to store the data used to train your model, any additional model data, and the data captured from model invocations. For demonstration purposes, you are using the same bucket for these. In reality, you might want to separate them with different security policies.",
"_____no_output_____"
]
],
[
[
"import boto3\nimport os\nimport sagemaker\nfrom sagemaker import get_execution_role\n\nregion = boto3.Session().region_name\nrole = get_execution_role()\nsess = sagemaker.session.Session()\nbucket = sess.default_bucket() \nprefix = 'tf-2-workflow'\n\ns3_capture_upload_path = 's3://{}/{}/monitoring/datacapture'.format(bucket, prefix)\n\nreports_prefix = '{}/reports'.format(prefix)\ns3_report_path = 's3://{}/{}'.format(bucket,reports_prefix)\n\nprint(\"Capture path: {}\".format(s3_capture_upload_path))\nprint(\"Report path: {}\".format(s3_report_path))",
"Capture path: s3://sagemaker-eu-central-1-022505235570/tf-2-workflow/monitoring/datacapture\nReport path: s3://sagemaker-eu-central-1-022505235570/tf-2-workflow/reports\n"
]
],
[
[
"# PART A: Capturing real-time inference data from Amazon SageMaker endpoints\nCreate an endpoint to showcase the data capture capability in action.\n",
"_____no_output_____"
],
[
"### Deploy the model to Amazon SageMaker\nStart with deploying the trained TensorFlow model from lab 03.",
"_____no_output_____"
]
],
[
[
"import boto3\n\ndef get_latest_training_job_name(base_job_name):\n client = boto3.client('sagemaker')\n response = client.list_training_jobs(NameContains=base_job_name, SortBy='CreationTime', \n SortOrder='Descending', StatusEquals='Completed')\n if len(response['TrainingJobSummaries']) > 0 :\n return response['TrainingJobSummaries'][0]['TrainingJobName']\n else:\n raise Exception('Training job not found.')\n\ndef get_training_job_s3_model_artifacts(job_name):\n client = boto3.client('sagemaker')\n response = client.describe_training_job(TrainingJobName=job_name)\n s3_model_artifacts = response['ModelArtifacts']['S3ModelArtifacts']\n return s3_model_artifacts\n\nlatest_training_job_name = get_latest_training_job_name('tf-2-workflow')\nprint(latest_training_job_name)\nmodel_path = get_training_job_s3_model_artifacts(latest_training_job_name)\nprint(model_path)",
"tf-2-workflow-22-09-23-03-015-763adf42\ns3://sagemaker-eu-central-1-022505235570/tf-2-workflow-22-09-23-03-015-763adf42/output/model.tar.gz\n"
]
],
[
[
"Here, you create the model object with the image and model data.",
"_____no_output_____"
]
],
[
[
"from sagemaker.tensorflow.model import TensorFlowModel\n\ntensorflow_model = TensorFlowModel(\n model_data = model_path,\n role = role,\n framework_version = '2.3.1'\n)\n",
"_____no_output_____"
],
[
"from time import gmtime, strftime\nfrom sagemaker.model_monitor import DataCaptureConfig\n\nendpoint_name = 'tf-2-workflow-endpoint-' + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\nprint(endpoint_name)\n\npredictor = tensorflow_model.deploy(\n initial_instance_count=1,\n instance_type='ml.m5.xlarge',\n endpoint_name=endpoint_name,\n data_capture_config=DataCaptureConfig(\n enable_capture=True,\n sampling_percentage=100,\n destination_s3_uri=s3_capture_upload_path\n )\n)",
"update_endpoint is a no-op in sagemaker>=2.\nSee: https://sagemaker.readthedocs.io/en/stable/v2.html for details.\n"
]
],
[
[
"### Prepare dataset\n\nNext, we'll import the dataset. The dataset itself is small and relatively issue-free. For example, there are no missing values, a common problem for many other datasets. Accordingly, preprocessing just involves normalizing the data.",
"_____no_output_____"
]
],
[
[
"import numpy as np\nfrom tensorflow.python.keras.datasets import boston_housing\nfrom sklearn.preprocessing import StandardScaler\n\n(x_train, y_train), (x_test, y_test) = boston_housing.load_data()\nscaler = StandardScaler()\nscaler.fit(x_train)\nx_train = scaler.transform(x_train)\nx_test = scaler.transform(x_test)",
"_____no_output_____"
]
],
[
[
"## Invoke the deployed model\n\nYou can now send data to this endpoint to get inferences in real time. Because you enabled the data capture in the previous steps, the request and response payload, along with some additional metadata, is saved in the Amazon Simple Storage Service (Amazon S3) location you have specified in the DataCaptureConfig.",
"_____no_output_____"
],
[
"This step invokes the endpoint with included sample data for about 3 minutes. Data is captured based on the sampling percentage specified and the capture continues until the data capture option is turned off.",
"_____no_output_____"
]
],
[
[
"%%time \n\nimport time\n\nprint(\"Sending test traffic to the endpoint {}. \\nPlease wait...\".format(endpoint_name))\n\nflat_list =[]\nfor item in x_test:\n result = predictor.predict(item)['predictions'] \n flat_list.append(float('%.1f'%(np.array(result))))\n time.sleep(1.8)\n \nprint(\"Done!\")\nprint('predictions: \\t{}'.format(np.array(flat_list)))",
"Sending test traffic to the endpoint tf-2-workflow-endpoint-2021-03-22-10-41-03. \nPlease wait...\nDone!\npredictions: \t[12.7 19. 21.5 30.4 22.1 21. 28.5 24.8 18.9 19.4 16.9 18.7 17.2 37.\n 13.6 23.6 24. 24.1 15.4 19.5 12.7 12.9 21.4 17.6 24.8 25.6 30.6 24.6\n 12.7 23.7 21.3 13. 34.1 23.1 17.1 12.7 18.9 19.9 22.1 25.7 24.2 26.3\n 15.4 37. 29.1 21.1 27.4 19. 19.5 21.8 29.2 19.7 12.7 15. 36.2 25.1\n 15.4 37.1 35.6 21.6 23.9 17.3 15.3 19.7 22.2 25.1 15.2 25.7 12.7 12.7\n 19.1 25.9 22.3 13.1 26.2 19.5 22.9 22. 31.6 12.7 21.6 37. 19.8 16.9\n 19.5 18.4 17.4 20.3 22.3 33.6 18.6 24.5 25.5 31.3 32.3 21.1 36.7 35.\n 24.3 37. 32.3 22.4]\nCPU times: user 302 ms, sys: 31.2 ms, total: 333 ms\nWall time: 3min 5s\n"
]
],
[
[
"## View captured data\n\nNow list the data capture files stored in Amazon S3. You should expect to see different files from different time periods organized based on the hour in which the invocation occurred. The format of the Amazon S3 path is:\n\n`s3://{destination-bucket-prefix}/{endpoint-name}/{variant-name}/yyyy/mm/dd/hh/filename.jsonl`\n\n<b>Note that the delivery of capture data to Amazon S3 can require a couple of minutes so next cell might error. If this happens, please retry after a minute.</b>",
"_____no_output_____"
]
],
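Since capture delivery to Amazon S3 is eventually consistent, a short polling loop makes the listing robust. This is a hedged sketch rather than part of the original lab: the helper name `wait_for_capture_files` and the polling interval are our own choices, and it reuses the `bucket` variable defined at the top of the notebook.

```python
import time
import boto3

def wait_for_capture_files(bucket_name, prefix, max_attempts=10, delay_seconds=30):
    """Poll S3 until at least one capture file appears, or give up."""
    s3 = boto3.Session().client('s3')
    for attempt in range(max_attempts):
        result = s3.list_objects(Bucket=bucket_name, Prefix=prefix)
        contents = result.get('Contents')
        if contents:  # delivery happened; return the object keys
            return [obj['Key'] for obj in contents]
        print(f"Attempt {attempt + 1}: no capture files yet, retrying in {delay_seconds}s...")
        time.sleep(delay_seconds)
    raise TimeoutError(f"No capture files found under s3://{bucket_name}/{prefix}")

# capture_files = wait_for_capture_files(bucket, 'tf-2-workflow/monitoring/datacapture/')
```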
[
[
"s3_client = boto3.Session().client('s3')\nresult = s3_client.list_objects(Bucket=bucket, Prefix='tf-2-workflow/monitoring/datacapture/')\ncapture_files = [capture_file.get(\"Key\") for capture_file in result.get('Contents')]\nprint(\"Found Capture Files:\")\nprint(\"\\n \".join(capture_files))",
"Found Capture Files:\ntf-2-workflow/monitoring/datacapture/tf-2-workflow-endpoint-2021-03-22-10-41-03/AllTraffic/2021/03/22/10/57-47-993-8069436d-6535-4e5d-be17-b48b52e50cbb.jsonl\n tf-2-workflow/monitoring/datacapture/tf-2-workflow-endpoint-2021-03-22-10-41-03/AllTraffic/2021/03/22/10/58-48-150-ba44bc72-0a55-4d32-a4fd-dab67f8bf28f.jsonl\n"
]
],
[
[
"Next, view the contents of a single capture file. Here you should see all the data captured in an Amazon SageMaker specific JSON-line formatted file. Take a quick peek at the first few lines in the captured file.",
"_____no_output_____"
]
],
[
[
"def get_obj_body(obj_key):\n return s3_client.get_object(Bucket=bucket, Key=obj_key).get('Body').read().decode(\"utf-8\")\n\ncapture_file = get_obj_body(capture_files[-1])\nprint(capture_file[:2000])",
"{\"captureData\":{\"endpointInput\":{\"observedContentType\":\"application/json\",\"mode\":\"INPUT\",\"data\":\"[-0.3846146658753033, -0.4836154708652843, -0.04328034837026877, -0.2568327484687563, -1.2322543225547462, -0.031148842501293867, -2.2507883885350743, 0.7629475894761004, -0.6262490526587586, -0.6072060061360174, 0.32944628645426327, 0.2381714623786403, -0.7177935480963249]\",\"encoding\":\"JSON\"},\"endpointOutput\":{\"observedContentType\":\"application/json\",\"mode\":\"OUTPUT\",\"data\":\"{\\n \\\"predictions\\\": [[23.0875893]\\n ]\\n}\",\"encoding\":\"JSON\"}},\"eventMetadata\":{\"eventId\":\"72c01046-9516-414c-8456-af55d0f9003b\",\"inferenceTime\":\"2021-03-22T10:58:48Z\"},\"eventVersion\":\"0\"}\n{\"captureData\":{\"endpointInput\":{\"observedContentType\":\"application/json\",\"mode\":\"INPUT\",\"data\":\"[1.0109107738040224, -0.4836154708652843, 1.0283257954396188, -0.2568327484687563, 0.19329471283500554, -0.7815981666935935, -0.4411458316241723, -0.452024740593543, 1.6758857724016463, 1.5652874992218142, 0.7844763709927688, 0.44807713457179416, 0.27867865713191414]\",\"encoding\":\"JSON\"},\"endpointOutput\":{\"observedContentType\":\"application/json\",\"mode\":\"OUTPUT\",\"data\":\"{\\n \\\"predictions\\\": [[17.0651016]\\n ]\\n}\",\"encoding\":\"JSON\"}},\"eventMetadata\":{\"eventId\":\"7be44e32-a408-4f42-844f-fbf39282df60\",\"inferenceTime\":\"2021-03-22T10:58:49Z\"},\"eventVersion\":\"0\"}\n{\"captureData\":{\"endpointInput\":{\"observedContentType\":\"application/json\",\"mode\":\"INPUT\",\"data\":\"[2.0426260248948114, -0.4836154708652843, 1.0283257954396188, -0.2568327484687563, 1.2176413250911147, -1.7873695353949273, 0.7342259677159778, -1.095758862382756, 1.6758857724016463, 1.5652874992218142, 0.7844763709927688, 0.44807713457179416, 2.6566864266724344]\",\"encoding\":\"JSON\"},\"endpointOutput\":{\"observedContentType\":\"application/json\",\"mode\":\"OUTPUT\",\"data\":\"{\\n \\\"predictions\\\": [[12.6608448]\\n ]\\n}\",\"encoding\":\"JSON\"}},\"eventMetadata\":{\"eventId\":\"d41c53ff-0f45-4237-8708-9ea0d7b48f63\",\"inferenceTime\":\"2021-03-22T10:58:51Z\"},\"eventVersion\":\"0\"}\n{\"captu\n"
]
],
[
[
"Finally, the contents of a single line is present below in a formatted JSON file so that you can observe a little better.",
"_____no_output_____"
]
],
[
[
"import json\nprint(json.dumps(json.loads(capture_file.split('\\n')[0]), indent=2))",
"{\n \"captureData\": {\n \"endpointInput\": {\n \"observedContentType\": \"application/json\",\n \"mode\": \"INPUT\",\n \"data\": \"[-0.3846146658753033, -0.4836154708652843, -0.04328034837026877, -0.2568327484687563, -1.2322543225547462, -0.031148842501293867, -2.2507883885350743, 0.7629475894761004, -0.6262490526587586, -0.6072060061360174, 0.32944628645426327, 0.2381714623786403, -0.7177935480963249]\",\n \"encoding\": \"JSON\"\n },\n \"endpointOutput\": {\n \"observedContentType\": \"application/json\",\n \"mode\": \"OUTPUT\",\n \"data\": \"{\\n \\\"predictions\\\": [[23.0875893]\\n ]\\n}\",\n \"encoding\": \"JSON\"\n }\n },\n \"eventMetadata\": {\n \"eventId\": \"72c01046-9516-414c-8456-af55d0f9003b\",\n \"inferenceTime\": \"2021-03-22T10:58:48Z\"\n },\n \"eventVersion\": \"0\"\n}\n"
]
],
[
[
"As you can see, each inference request is captured in one line in the jsonl file. The line contains both the input and output merged together. In the example, you provided the ContentType as `text/csv` which is reflected in the `observedContentType` value. Also, you expose the encoding that you used to encode the input and output payloads in the capture format with the `encoding` value.\n\nTo recap, you observed how you can enable capturing the input or output payloads to an endpoint with a new parameter. You have also observed what the captured format looks like in Amazon S3. Next, continue to explore how Amazon SageMaker helps with monitoring the data collected in Amazon S3.",
"_____no_output_____"
],
[
"# PART B: Model Monitor - Baselining and continuous monitoring",
"_____no_output_____"
],
[
"In addition to collecting the data, Amazon SageMaker provides the capability for you to monitor and evaluate the data observed by the endpoints. For this:\n1. Create a baseline with which you compare the realtime traffic. \n1. Once a baseline is ready, setup a schedule to continously evaluate and compare against the baseline.",
"_____no_output_____"
],
[
"## 1. Constraint suggestion with baseline/training dataset",
"_____no_output_____"
],
[
"The training dataset with which you trained the model is usually a good baseline dataset. Note that the training dataset data schema and the inference dataset schema should exactly match (i.e. the number and order of the features).\n\nFrom the training dataset you can ask Amazon SageMaker to suggest a set of baseline `constraints` and generate descriptive `statistics` to explore the data. For this example, upload the training dataset that was used to train the pre-trained model included in this example. If you already have it in Amazon S3, you can directly point to it.",
"_____no_output_____"
],
[
"### Prepare training dataset with headers",
"_____no_output_____"
]
],
[
[
"import pandas as pd\ndt = pd.DataFrame(data = x_train, \n columns = [\"CRIM\", \"ZN\", \"INDUS\", \"CHAS\",\"NOX\",\"RM\",\"AGE\",\"DIS\",\"RAD\",\"TAX\",\"PTRATIO\",\"B\",\"LSTAT\"])\n\ndt.to_csv(\"training-dataset-with-header.csv\", index = False)",
"_____no_output_____"
],
[
"# copy over the training dataset to Amazon S3 (if you already have it in Amazon S3, you could reuse it)\nbaseline_prefix = prefix + '/baselining'\nbaseline_data_prefix = baseline_prefix + '/data'\nbaseline_results_prefix = baseline_prefix + '/results'\n\nbaseline_data_uri = 's3://{}/{}'.format(bucket,baseline_data_prefix)\nbaseline_results_uri = 's3://{}/{}'.format(bucket, baseline_results_prefix)\nprint('Baseline data uri: {}'.format(baseline_data_uri))\nprint('Baseline results uri: {}'.format(baseline_results_uri))\n",
"_____no_output_____"
],
[
"training_data_file = open(\"training-dataset-with-header.csv\", 'rb')\ns3_key = os.path.join(baseline_prefix, 'data', 'training-dataset-with-header.csv')\nboto3.Session().resource('s3').Bucket(bucket).Object(s3_key).upload_fileobj(training_data_file)",
"_____no_output_____"
]
],
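Because the baseline and inference schemas must match exactly, a quick sanity check before baselining can save a failed monitoring run. A minimal sketch, assuming the capture record format shown earlier; the helper name `assert_schema_match` is ours, `dt` is the training DataFrame built above, and `capture_file` is the capture content read a few cells earlier.

```python
import json

def assert_schema_match(baseline_columns, capture_line):
    """Compare the baseline feature count against one captured request."""
    record = json.loads(capture_line)
    features = json.loads(record['captureData']['endpointInput']['data'])
    if len(features) != len(baseline_columns):
        raise ValueError(f"Schema mismatch: baseline has {len(baseline_columns)} "
                         f"features, capture has {len(features)}")
    print(f"OK: both schemas have {len(baseline_columns)} features")

# assert_schema_match(dt.columns, capture_file.split('\n')[0])
```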
[
[
"### Create a baselining job with training dataset",
"_____no_output_____"
],
[
"Now that you have the training data ready in Amazon S3, start a job to `suggest` constraints. `DefaultModelMonitor.suggest_baseline(..)` starts a `ProcessingJob` using an Amazon SageMaker provided Model Monitor container to generate the constraints.",
"_____no_output_____"
]
],
[
[
"from sagemaker.model_monitor import DefaultModelMonitor\nfrom sagemaker.model_monitor.dataset_format import DatasetFormat\n\nmy_default_monitor = DefaultModelMonitor(\n role=role,\n instance_count=1,\n instance_type='ml.m5.xlarge',\n volume_size_in_gb=20,\n max_runtime_in_seconds=3600,\n)\n\nmy_default_monitor.suggest_baseline(\n baseline_dataset=baseline_data_uri+'/training-dataset-with-header.csv',\n dataset_format=DatasetFormat.csv(header=True),\n output_s3_uri=baseline_results_uri,\n wait=True\n)",
"_____no_output_____"
]
],
[
[
"### Explore the generated constraints and statistics",
"_____no_output_____"
]
],
[
[
"s3_client = boto3.Session().client('s3')\nresult = s3_client.list_objects(Bucket=bucket, Prefix=baseline_results_prefix)\nreport_files = [report_file.get(\"Key\") for report_file in result.get('Contents')]\nprint(\"Found Files:\")\nprint(\"\\n \".join(report_files))",
"_____no_output_____"
],
[
"import pandas as pd\n\nbaseline_job = my_default_monitor.latest_baselining_job\nschema_df = pd.io.json.json_normalize(baseline_job.baseline_statistics().body_dict[\"features\"])\nschema_df.head(10)",
"_____no_output_____"
],
[
"constraints_df = pd.io.json.json_normalize(baseline_job.suggested_constraints().body_dict[\"features\"])\nconstraints_df.head(10)",
"_____no_output_____"
]
],
[
[
"## 2. Analyzing collected data for data quality issues",
"_____no_output_____"
],
[
"### Create a schedule",
"_____no_output_____"
],
[
"You can create a model monitoring schedule for the endpoint created earlier. Use the baseline resources (constraints and statistics) to compare against the realtime traffic.",
"_____no_output_____"
],
[
"From the analysis above, you saw how the captured data is saved - that is the standard input and output format for Tensorflow models. But Model Monitor is framework-agnostic, and expects a specific format [explained in the docs](https://docs.aws.amazon.com/sagemaker/latest/dg/model-monitor-pre-and-post-processing.html#model-monitor-pre-processing-script):\n- Input\n - Flattened JSON `{\"feature0\": <value>, \"feature1\": <value>...}`\n - Tabular `\"<value>, <value>...\"`\n- Output:\n - Flattened JSON `{\"prediction0\": <value>, \"prediction1\": <value>...}`\n - Tabular `\"<value>, <value>...\"`\n \nWe need to transform the input records to comply with this requirement. Model Monitor offers _pre-processing scripts_ in Python to transform the input. The cell below has the script that will work for our case.",
"_____no_output_____"
]
],
[
[
"%%writefile preprocessing.py\n\nimport json\n\ndef preprocess_handler(inference_record):\n input_data = json.loads(inference_record.endpoint_input.data)\n input_data = {f\"feature{i}\": val for i, val in enumerate(input_data)}\n \n output_data = json.loads(inference_record.endpoint_output.data)[\"predictions\"][0][0]\n output_data = {\"prediction0\": output_data}\n \n return{**input_data}",
"_____no_output_____"
]
],
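To see the transformation concretely, the handler can be exercised locally on a fake record. A hedged sketch: the `SimpleNamespace` stand-ins merely mimic the attributes (`endpoint_input.data`, `endpoint_output.data`) that the real Model Monitor container passes in.

```python
import json
from types import SimpleNamespace

# Fake inference record shaped like the captured data shown earlier
mock_record = SimpleNamespace(
    endpoint_input=SimpleNamespace(data=json.dumps([0.1, -0.5, 1.2])),
    endpoint_output=SimpleNamespace(data=json.dumps({"predictions": [[23.08]]})),
)

from preprocessing import preprocess_handler
print(preprocess_handler(mock_record))
# -> {'feature0': 0.1, 'feature1': -0.5, 'feature2': 1.2}
```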
[
[
"We'll upload this script to an s3 destination and pass it as the `record_preprocessor_script` parameter to the `create_monitoring_schedule` call.",
"_____no_output_____"
]
],
[
[
"script_s3_dest_path = f\"s3://{bucket}/{prefix}/artifacts/modelmonitor\"\nscript_s3_dest = sagemaker.s3.S3Uploader.upload(\"preprocessing.py\", script_s3_dest_path)\nprint(script_s3_dest)",
"_____no_output_____"
],
[
"from sagemaker.model_monitor import CronExpressionGenerator\nfrom time import gmtime, strftime\n\nmon_schedule_name = 'DEMO-tf-2-workflow-model-monitor-schedule-' + strftime(\"%Y-%m-%d-%H-%M-%S\", gmtime())\nmy_default_monitor.create_monitoring_schedule(\n monitor_schedule_name=mon_schedule_name,\n endpoint_input=predictor.endpoint,\n record_preprocessor_script=script_s3_dest,\n output_s3_uri=s3_report_path,\n statistics=my_default_monitor.baseline_statistics(),\n constraints=my_default_monitor.suggested_constraints(),\n schedule_cron_expression=CronExpressionGenerator.hourly(),\n enable_cloudwatch_metrics=True,\n)",
"_____no_output_____"
]
],
[
[
"### Generating violations artificially\n\nIn order to get some result relevant to monitoring analysis, you can try and generate artificially some inferences with feature values causing specific violations, and then invoke the endpoint with this data\n\nLooking at our RM and AGE features:\n\n- RM - average number of rooms per dwelling\n- AGE - proportion of owner-occupied units built prior to 1940\n\nLet's simulate a situation where the average number of rooms is 0, and proportion of owner-occupied units built is 1000.",
"_____no_output_____"
]
],
[
[
"df_with_violations = pd.read_csv(\"training-dataset-with-header.csv\")\ndf_with_violations[\"RM\"] = 0\ndf_with_violations[\"AGE\"] = 1000\ndf_with_violations",
"_____no_output_____"
]
],
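Out-of-range values are only one violation type; other constraint checks can be provoked the same way. A hedged sketch of two more perturbations (a string where numbers are expected, and missing values); whether these actually fire depends on the constraints the baselining job suggested.

```python
import numpy as np

# Data-type violation: CHAS becomes a string where the baseline saw numbers
df_type_violation = pd.read_csv("training-dataset-with-header.csv")
df_type_violation["CHAS"] = "not-a-number"

# Completeness violation: blank out half of the LSTAT values
df_missing = pd.read_csv("training-dataset-with-header.csv")
df_missing.loc[df_missing.sample(frac=0.5, random_state=0).index, "LSTAT"] = np.nan
```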
[
[
"### Start generating some artificial traffic\nThe cell below starts a thread to send some traffic to the endpoint. Note that you need to stop the kernel to terminate this thread. If there is no traffic, the monitoring jobs are marked as `Failed` since there is no data to process.",
"_____no_output_____"
]
],
[
[
"from threading import Thread\nfrom time import sleep\nimport time\n\ndef invoke_endpoint():\n for item in df_with_violations.to_numpy():\n result = predictor.predict(item)['predictions'] \n time.sleep(1)\n\ndef invoke_endpoint_forever():\n while True:\n invoke_endpoint()\n \nthread = Thread(target = invoke_endpoint_forever)\nthread.start()\n\n# Note that you need to stop the kernel to stop the invocations",
"_____no_output_____"
]
],
[
[
"### Describe and inspect the schedule\nOnce you describe, observe that the MonitoringScheduleStatus changes to Scheduled.",
"_____no_output_____"
]
],
[
[
"desc_schedule_result = my_default_monitor.describe_schedule()\nprint('Schedule status: {}'.format(desc_schedule_result['MonitoringScheduleStatus']))",
"_____no_output_____"
]
],
[
[
"### List executions\nThe schedule starts jobs at the previously specified intervals. Here, you list the latest five executions. Note that if you are kicking this off after creating the hourly schedule, you might find the executions empty. You might have to wait until you cross the hour boundary (in UTC) to see executions kick off. The code below has the logic for waiting.\n\nNote: Even for an hourly schedule, Amazon SageMaker has a buffer period of 20 minutes to schedule your execution. You might see your execution start in anywhere from zero to ~20 minutes from the hour boundary. This is expected and done for load balancing in the backend.",
"_____no_output_____"
]
],
[
[
"mon_executions = my_default_monitor.list_executions()\nprint(\"We created a hourly schedule above and it will kick off executions ON the hour (plus 0 - 20 min buffer.\\nWe will have to wait till we hit the hour...\")\n\nwhile len(mon_executions) == 0:\n print(\"Waiting for the 1st execution to happen...\")\n time.sleep(60)\n mon_executions = my_default_monitor.list_executions() ",
"_____no_output_____"
]
],
[
[
"### Inspect a specific execution (latest execution)\nIn the previous cell, you picked up the latest completed or failed scheduled execution. Here are the possible terminal states and what each of them mean: \n* Completed - This means the monitoring execution completed and no issues were found in the violations report.\n* CompletedWithViolations - This means the execution completed, but constraint violations were detected.\n* Failed - The monitoring execution failed, maybe due to client error (perhaps incorrect role premissions) or infrastructure issues. Further examination of FailureReason and ExitMessage is necessary to identify what exactly happened.\n* Stopped - job exceeded max runtime or was manually stopped.",
"_____no_output_____"
]
],
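A small dispatch over these terminal states keeps any downstream automation explicit. A minimal sketch; the wording of the verdicts is our own, and it only reads keys (`ProcessingJobStatus`, `ExitMessage`, `FailureReason`) that the describe call below also uses.

```python
def summarize_execution(desc):
    """Map a monitoring execution description to a human-readable verdict."""
    status = desc['ProcessingJobStatus']
    if status == 'Completed':
        exit_message = desc.get('ExitMessage', '')
        if 'CompletedWithViolations' in exit_message:
            return 'Completed, but constraint violations were detected'
        return 'Completed with no violations'
    if status == 'Failed':
        return f"Failed: {desc.get('FailureReason', 'unknown reason')}"
    if status == 'Stopped':
        return 'Stopped: exceeded max runtime or manually stopped'
    return f'Non-terminal status: {status}'

# print(summarize_execution(latest_execution.describe()))
```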
[
[
"latest_execution = mon_executions[-1] # latest execution's index is -1, second to last is -2 and so on..\n#time.sleep(60)\nlatest_execution.wait(logs=False)\n\nprint(\"Latest execution status: {}\".format(latest_execution.describe()['ProcessingJobStatus']))\nprint(\"Latest execution result: {}\".format(latest_execution.describe()['ExitMessage']))\n\nlatest_job = latest_execution.describe()\nif (latest_job['ProcessingJobStatus'] != 'Completed'):\n print(\"====STOP==== \\n No completed executions to inspect further. Please wait till an execution completes or investigate previously reported failures.\")",
"_____no_output_____"
],
[
"report_uri=latest_execution.output.destination\nprint('Report Uri: {}'.format(report_uri))",
"_____no_output_____"
]
],
[
[
"### List the generated reports",
"_____no_output_____"
]
],
[
[
"from urllib.parse import urlparse\ns3uri = urlparse(report_uri)\nreport_bucket = s3uri.netloc\nreport_key = s3uri.path.lstrip('/')\nprint('Report bucket: {}'.format(report_bucket))\nprint('Report key: {}'.format(report_key))\n\ns3_client = boto3.Session().client('s3')\nresult = s3_client.list_objects(Bucket=report_bucket, Prefix=report_key)\nreport_files = [report_file.get(\"Key\") for report_file in result.get('Contents')]\nprint(\"Found Report Files:\")\nprint(\"\\n \".join(report_files))",
"_____no_output_____"
]
],
[
[
"### Violations report",
"_____no_output_____"
],
[
"If there are any violations compared to the baseline, they will be listed here.",
"_____no_output_____"
]
],
[
[
"violations = my_default_monitor.latest_monitoring_constraint_violations()\npd.set_option('display.max_colwidth', -1)\nconstraints_df = pd.io.json.json_normalize(violations.body_dict[\"violations\"])\nconstraints_df.head(10)",
"_____no_output_____"
]
],
[
[
"## Delete the resources\n\nYou can keep your endpoint running to continue capturing data. If you do not plan to collect more data or use this endpoint further, you should delete the endpoint to avoid incurring additional charges. Note that deleting your endpoint does not delete the data that was captured during the model invocations. That data persists in Amazon S3 until you delete it yourself.\n\nBut before that, you need to delete the schedule first.",
"_____no_output_____"
]
],
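If you also want to remove the captured data itself, the S3 objects must be deleted explicitly. A hedged sketch using the `bucket` variable from earlier; the call is irreversible, so it is left commented out.

```python
def delete_s3_prefix(bucket_name, prefix):
    """Delete every object under the given S3 prefix."""
    s3 = boto3.resource('s3')
    s3.Bucket(bucket_name).objects.filter(Prefix=prefix).delete()

# delete_s3_prefix(bucket, 'tf-2-workflow/monitoring/datacapture/')  # irreversible!
```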
[
[
"my_default_monitor.delete_monitoring_schedule()\ntime.sleep(120) # actually wait for the deletion",
"_____no_output_____"
],
[
"predictor.delete_endpoint()",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbfbda11ff2a528e986557a42f95f493cd2aede7
| 186,468 |
ipynb
|
Jupyter Notebook
|
.ipynb_checkpoints/toxic-comments-classifications-using-ml-checkpoint.ipynb
|
Amrit1Gurung/Customers-Churn-Prediction
|
0b189704620abea28ae984c8c707fb99e1d64a5f
|
[
"MIT"
] | null | null | null |
.ipynb_checkpoints/toxic-comments-classifications-using-ml-checkpoint.ipynb
|
Amrit1Gurung/Customers-Churn-Prediction
|
0b189704620abea28ae984c8c707fb99e1d64a5f
|
[
"MIT"
] | null | null | null |
.ipynb_checkpoints/toxic-comments-classifications-using-ml-checkpoint.ipynb
|
Amrit1Gurung/Customers-Churn-Prediction
|
0b189704620abea28ae984c8c707fb99e1d64a5f
|
[
"MIT"
] | null | null | null | 55.070289 | 37,320 | 0.692773 |
[
[
[
"### Importing the required libraries ###",
"_____no_output_____"
]
],
[
[
"import numpy as np\nimport pandas as pd\nfrom matplotlib import pyplot as plt\n%matplotlib inline\nimport seaborn as sns\nimport re\nimport zipfile\n",
"_____no_output_____"
]
],
[
[
"### UNZIP files ###",
"_____no_output_____"
]
],
[
[
"# Will unzip the files so that you can see them..\nwith zipfile.ZipFile(\"/kaggle/input/jigsaw-toxic-comment-classification-challenge/train.csv.zip\",\"r\") as z:\n z.extractall(\".\")\n",
"_____no_output_____"
],
[
"import os\nfor dirname, _, filenames in os.walk('/kaggle/working/'):\n for filename in filenames:\n print(os.path.join(dirname, filename))",
"/kaggle/working/train.csv\n/kaggle/working/__notebook__.ipynb\n"
]
],
[
[
"### Reading the Train File ###",
"_____no_output_____"
]
],
[
[
"# prepare text samples and their labels\nprint('Loading in comments...')\n\ndata = pd.read_csv(\"/kaggle/working/train.csv\")\nprint(data.head())",
"Loading in comments...\n id comment_text toxic \\\n0 0000997932d777bf Explanation\\nWhy the edits made under my usern... 0 \n1 000103f0d9cfb60f D'aww! He matches this background colour I'm s... 0 \n2 000113f07ec002fd Hey man, I'm really not trying to edit war. It... 0 \n3 0001b41b1c6bb37e \"\\nMore\\nI can't make any real suggestions on ... 0 \n4 0001d958c54c6e35 You, sir, are my hero. Any chance you remember... 0 \n\n severe_toxic obscene threat insult identity_hate \n0 0 0 0 0 0 \n1 0 0 0 0 0 \n2 0 0 0 0 0 \n3 0 0 0 0 0 \n4 0 0 0 0 0 \n"
],
[
"# Feature Imformation \ndata.columns",
"_____no_output_____"
],
[
"# Data Dimension \n\ndata.shape ",
"_____no_output_____"
],
[
"cols_target = ['obscene','insult','toxic','severe_toxic','identity_hate','threat']",
"_____no_output_____"
],
[
"# Check Missing Value \n\nprint(data[\"comment_text\"].isna().sum())\n\n# dropna ",
"0\n"
],
[
"# check missing values in numeric columns\ndata.describe()",
"_____no_output_____"
],
[
"unlabelled_in_all = data[(data['toxic']!=1) & (data['severe_toxic']!=1) &\n (data['obscene']!=1) & (data['threat']!=1) &\n (data['insult']!=1) & (data['identity_hate']!=1)]\nprint('Percentage of unlabelled comments or good comments is ', len(unlabelled_in_all)/len(data)*100)",
"Percentage of unlabelled comments or good comments is 89.83211235124176\n"
],
[
"labelled_in_all = data[(data['toxic']==1) & (data['severe_toxic']==1) &\n (data['obscene']==1) & (data['threat']==1) &\n (data['insult']==1) & (data['identity_hate']==1)]\nprint('Percentage of comments which is present in all categories is ', len(labelled_in_all)/len(data)*100)",
"Percentage of comments which is present in all categories is 0.019427088882065038\n"
],
[
"# let's see the total rows in train, test data and the numbers for the various categories\nprint('Total rows in train is {}'.format(len(data)))\nprint(data[cols_target].sum())",
"Total rows in train is 159571\nobscene 8449\ninsult 7877\ntoxic 15294\nsevere_toxic 1595\nidentity_hate 1405\nthreat 478\ndtype: int64\n"
]
],
[
[
"Next, let's examine the correlations among the target variables.",
"_____no_output_____"
]
],
[
[
"target_data = data[cols_target]\ncolormap = plt.cm.plasma\nplt.figure(figsize=(7,7))\nplt.title('Correlation of features & targets',y=1.05,size=14)\nsns.heatmap(target_data.astype(float).corr(),linewidths=0.1,vmax=1.0,square=True,cmap=colormap,\n linecolor='white',annot=True)",
"_____no_output_____"
]
],
[
[
"Indeed, it looks like some of the labels are higher correlated, e.g. insult-obscene has the highest at 0.74, followed by toxic-obscene and toxic-insult.",
"_____no_output_____"
],
[
"### Now this kind of problem is ###\n\n1) Multi class problem and not Binary\n\n2) Also all classes are not independent but rather dependent or correlated \n\n3) A comment can belong to multiple classes at the same time for e.g. comment can be toxic and insulting at the same time\n\nLet us simplify the problem by first classifying the comments as \"block\" vs \"allow\" ",
"_____no_output_____"
]
],
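For reference, the original multi-label formulation can also be attacked directly by fitting one binary classifier per label. A hedged sketch using scikit-learn's `OneVsRestClassifier`; it builds its own TF-IDF features here since the notebook constructs them only later, and it is meant as an illustration rather than a tuned model.

```python
from sklearn.multiclass import OneVsRestClassifier
from sklearn.linear_model import LogisticRegression
from sklearn.feature_extraction.text import TfidfVectorizer

# One logistic regression per target column, all sharing the same TF-IDF features
vect_ml = TfidfVectorizer(max_features=10000, stop_words='english')
X_ml = vect_ml.fit_transform(data['comment_text'])

multi_label_clf = OneVsRestClassifier(LogisticRegression(max_iter=2000))
multi_label_clf.fit(X_ml, data[cols_target])  # y is a 0/1 indicator matrix
```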
[
[
"data['block'] =data[cols_target].sum(axis =1)\nprint(data['block'].value_counts())\ndata['block'] = data['block'] > 0 \ndata['block'] = data['block'].astype(int)\nprint(data['block'].value_counts())\n",
"0 143346\n1 6360\n3 4209\n2 3480\n4 1760\n5 385\n6 31\nName: block, dtype: int64\n0 143346\n1 16225\nName: block, dtype: int64\n"
],
[
"# look at the count plot for text length\nsns.set()\nsns.countplot(x=\"block\" , data = data )\nplt.show()",
"_____no_output_____"
],
[
"# Event Rate \n\nprint(\"Percentage Event Rate : \" , round(100*data['block'].sum()/data.shape[0],2) , \"%\")",
"Percentage Event Rate : 10.17 %\n"
]
],
[
[
"### Let us focus on comments ###",
"_____no_output_____"
]
],
[
[
"# Let's look at the character length for the rows in the training data and record these\ndata['char_length'] = data['comment_text'].apply(lambda x: len(str(x)))",
"_____no_output_____"
],
[
"# look at the histogram plot for text length\nsns.set()\ndata['char_length'].hist()\nplt.show()",
"_____no_output_____"
]
],
[
[
"Most of the text length are within 500 characters, with some up to 5,000 characters long.\n\n",
"_____no_output_____"
],
[
"### Clean the Comments Text ###",
"_____no_output_____"
]
],
[
[
"def clean_text(text):\n text = text.lower()\n text = re.sub(r\"what's\", \"what is \", text)\n text = re.sub(r\"\\'s\", \" \", text)\n text = re.sub(r\"\\'ve\", \" have \", text)\n text = re.sub(r\"can't\", \"cannot \", text)\n text = re.sub(r\"n't\", \" not \", text)\n text = re.sub(r\"i'm\", \"i am \", text)\n text = re.sub(r\"\\'re\", \" are \", text)\n text = re.sub(r\"\\'d\", \" would \", text)\n text = re.sub(r\"\\'ll\", \" will \", text)\n text = re.sub(r\"\\'scuse\", \" excuse \", text)\n text = re.sub('\\W', ' ', text)\n text = re.sub('\\s+', ' ', text)\n text = text.strip(' ')\n return text",
"_____no_output_____"
],
[
"%%time \n# clean the comment_text in train_df [Thanks to Pulkit Jha for the useful pointer.]\ndata['comment_text'] = data['comment_text'].map(lambda com : clean_text(com))",
"CPU times: user 14.1 s, sys: 88.3 ms, total: 14.2 s\nWall time: 14.2 s\n"
],
[
"from sklearn.model_selection import train_test_split\n\nX_train, X_test, y_train, y_test = train_test_split(data['comment_text'], data['block'], test_size=0.2, random_state=42)",
"_____no_output_____"
],
[
"print(X_train.shape, X_test.shape)\nprint(y_train.shape, y_test.shape)\n",
"(127656,) (31915,)\n(127656,) (31915,)\n"
],
[
"# import and instantiate TfidfVectorizer\nfrom sklearn.feature_extraction.text import CountVectorizer\nfrom sklearn.feature_extraction.text import TfidfVectorizer\nvect = TfidfVectorizer(max_features = 10000, stop_words='english')\n#vect = TfidfVectorizer(stop_words='english')\nprint(vect)",
"TfidfVectorizer(max_features=10000, stop_words='english')\n"
],
[
"%%time \n# learn the vocabulary in the training data, then use it to create a document-term matrix\nX_train_dtm = vect.fit_transform(X_train)\n# examine the document-term matrix created from X_train\nX_train_dtm",
"CPU times: user 9.42 s, sys: 92.7 ms, total: 9.51 s\nWall time: 9.52 s\n"
],
[
"X_train_dtm.shape",
"_____no_output_____"
],
[
"100*2792162/ (127656*10000)",
"_____no_output_____"
],
[
"%%time\n# transform the test data using the earlier fitted vocabulary, into a document-term matrix\nX_test_dtm = vect.transform(X_test)\n# examine the document-term matrix from X_test\nX_test_dtm",
"CPU times: user 2.36 s, sys: 0 ns, total: 2.36 s\nWall time: 2.36 s\n"
]
],
[
[
"## Lets us build a binary classifier using Logistic Regression ##",
"_____no_output_____"
]
],
[
[
"# import and instantiate the Logistic Regression model\nfrom sklearn.linear_model import LogisticRegression\nfrom sklearn.metrics import accuracy_score\nfrom sklearn.metrics import confusion_matrix\nlogreg = LogisticRegression(C=1, max_iter = 2000)\n\n\n\n# train the model using X_train_dtm & y_train\nlogreg.fit(X_train_dtm, y_train)\n# compute the training accuracy\ny_pred_train = logreg.predict(X_train_dtm)\nprint('Training accuracy is {}'.format(accuracy_score(y_train, y_pred_train)))\n# compute the predicted probabilities for X_test_dtm\ny_pred_test = logreg.predict(X_test_dtm)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\n",
"Training accuracy is 0.9594613649182178\nTest accuracy is 0.9563214789284036\n[[28507 164]\n [ 1230 2014]]\n"
],
[
"#28507 -> comments are good and predeicted as good \n#2014 -> comments are block and predicted as block\n#164 -> comments are good but predicted as block\n#1230 -> comments are block but predicted as good\n",
"_____no_output_____"
],
[
"(28507 + 2014)/(28507+2014+164+1230)\n",
"_____no_output_____"
],
[
"import sklearn.metrics as metrics\n# calculate the fpr and tpr for all thresholds of the classification\nprobs = logreg.predict_proba(X_test_dtm)\npreds = probs[:,1]\nfpr, tpr, threshold = metrics.roc_curve(y_test, preds)\nroc_auc = metrics.auc(fpr, tpr)\n\n# method I: plt\nimport matplotlib.pyplot as plt\nplt.title('Receiver Operating Characteristic')\nplt.plot(fpr, tpr, 'b', label = 'AUC = %0.2f' % roc_auc)\nplt.legend(loc = 'lower right')\nplt.plot([0, 1], [0, 1],'r--')\nplt.xlim([0, 1])\nplt.ylim([0, 1])\nplt.ylabel('True Positive Rate')\nplt.xlabel('False Positive Rate')\nplt.show()\n\n",
"_____no_output_____"
]
],
[
[
"# Welcome to the curse of Accuracy, F1(help) to the rescue #",
"_____no_output_____"
]
],
[
[
"from sklearn.metrics import f1_score\n\n\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))\n \n",
"F1 score on Test data : 0.7428992991516045\n"
]
],
[
[
"### In case of Class Imbalance - we use F1 score as a general measure for the model performance ###\n\nDepending on the Business case - we need to fine tune the model \n\nThere is a Precision vs Recall Trade off \n\nIf you want to capture all toxic tweets - then some of the good twwets will be misclassified as bad tweets ",
"_____no_output_____"
]
],
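Instead of guessing thresholds by hand, the precision-recall curve can be scanned for the threshold that maximizes F1. A minimal sketch reusing `logreg`, `X_test_dtm`, and `y_test` from the cells above.

```python
import numpy as np
from sklearn.metrics import precision_recall_curve

probs = logreg.predict_proba(X_test_dtm)[:, 1]
precision, recall, thresholds = precision_recall_curve(y_test, probs)

# F1 at every candidate threshold (clip avoids division by zero)
f1_scores = 2 * precision * recall / np.clip(precision + recall, 1e-12, None)
best = np.argmax(f1_scores[:-1])  # the last precision/recall point has no threshold
print(f"Best threshold: {thresholds[best]:.3f}, F1 at that threshold: {f1_scores[best]:.4f}")
```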
[
[
"y_pred_test = logreg.predict_proba(X_test_dtm)[:,1]\n#print(y_pred_test)\ny_pred_test = y_pred_test >= 0.2 # by default it is 0.5\ny_pred_test = y_pred_test.astype(int)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Test accuracy is 0.9515588281372396\n[[27777 894]\n [ 652 2592]]\nF1 score on Test data : 0.7702823179791977\n"
]
],
[
[
"# Let us use a tree base model #",
"_____no_output_____"
]
],
[
[
"%%time \n\nfrom sklearn.metrics import f1_score\nfrom sklearn.tree import DecisionTreeClassifier \n\ndt_clf = DecisionTreeClassifier()\n# train the model using X_train_dtm & y_train\ndt_clf.fit(X_train_dtm, y_train)\n# compute the training accuracy\ny_pred_train = dt_clf.predict(X_train_dtm)\nprint('Training accuracy is {}'.format(accuracy_score(y_train, y_pred_train)))\n# compute the predicted probabilities for X_test_dtm\ny_pred_test = dt_clf.predict(X_test_dtm)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Training accuracy is 0.9987701322303691\nTest accuracy is 0.9425661914460285\n[[27835 836]\n [ 997 2247]]\nF1 score on Test data : 0.7102892366050261\nCPU times: user 3min 35s, sys: 62.2 ms, total: 3min 35s\nWall time: 3min 36s\n"
]
],
[
[
"### Lets us try an Ensemble of Trees ###",
"_____no_output_____"
]
],
[
[
"%%time \nfrom sklearn.metrics import f1_score\nfrom sklearn.ensemble import RandomForestClassifier \nfrom sklearn.tree import DecisionTreeClassifier \n\nrf_clf = RandomForestClassifier()\n\n# train the model using X_train_dtm & y_train\nrf_clf.fit(X_train_dtm, y_train)\n# compute the training accuracy\ny_pred_train = rf_clf.predict(X_train_dtm)\nprint('Training accuracy is {}'.format(accuracy_score(y_train, y_pred_train)))\n# compute the predicted probabilities for X_test_dtm\ny_pred_test = rf_clf.predict(X_test_dtm)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Training accuracy is 0.9987544651250235\nTest accuracy is 0.9576061413128623\n[[28290 381]\n [ 972 2272]]\nF1 score on Test data : 0.7705613023571307\nCPU times: user 5min 53s, sys: 163 ms, total: 5min 53s\nWall time: 5min 54s\n"
],
[
"# Fine Tuning Random Forest \n\ny_pred_test = rf_clf.predict_proba(X_test_dtm)[:,1]\ny_pred_test = y_pred_test >= 0.05 # by default it is 0.5\ny_pred_test = y_pred_test.astype(int)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Test accuracy is 0.8090239699201003\n[[22765 5906]\n [ 189 3055]]\nF1 score on Test data : 0.5006145022531749\n"
],
[
"%%time\nfrom sklearn.metrics import f1_score\nfrom sklearn.linear_model import PassiveAggressiveClassifier \n\npa_clf = PassiveAggressiveClassifier()\n\n# train the model using X_train_dtm & y_train\npa_clf.fit(X_train_dtm, y_train)\n# compute the training accuracy\ny_pred_train = pa_clf.predict(X_train_dtm)\nprint('Training accuracy is {}'.format(accuracy_score(y_train, y_pred_train)))\n# compute the predicted probabilities for X_test_dtm\ny_pred_test = pa_clf.predict(X_test_dtm)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Training accuracy is 0.9754888136867832\nTest accuracy is 0.9519348268839104\n[[28066 605]\n [ 929 2315]]\nF1 score on Test data : 0.7511356262167422\nCPU times: user 762 ms, sys: 14 ms, total: 776 ms\nWall time: 504 ms\n"
]
],
[
[
"### Passive Aggresive Classifier does not support prediction probability - so can't be fined ###",
"_____no_output_____"
]
],
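One workaround is to wrap the model in scikit-learn's `CalibratedClassifierCV`, which fits a calibration layer on top of the raw `decision_function` scores and thereby exposes `predict_proba`. A hedged sketch; calibration retrains the model with cross-validation, so it was not part of the original timing above.

```python
from sklearn.calibration import CalibratedClassifierCV
from sklearn.linear_model import PassiveAggressiveClassifier

calibrated_pa = CalibratedClassifierCV(PassiveAggressiveClassifier(), cv=3)
calibrated_pa.fit(X_train_dtm, y_train)

# Now a probability threshold can be tuned, just like for logistic regression
y_proba = calibrated_pa.predict_proba(X_test_dtm)[:, 1]
y_pred = (y_proba >= 0.2).astype(int)
```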
[
[
"%%time \nfrom sklearn.metrics import f1_score\nimport xgboost \n\nxgb = xgboost.XGBClassifier()\n# train the model using X_train_dtm & y_train\nxgb.fit(X_train_dtm, y_train)\n# compute the training accuracy\ny_pred_train = xgb.predict(X_train_dtm)\nprint('Training accuracy is {}'.format(accuracy_score(y_train, y_pred_train)))\n# compute the predicted probabilities for X_test_dtm\ny_pred_test = xgb.predict(X_test_dtm)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Training accuracy is 0.9605032274237012\nTest accuracy is 0.9544414851950493\n[[28513 158]\n [ 1296 1948]]\nF1 score on Test data : 0.7282242990654205\nCPU times: user 2min 7s, sys: 541 ms, total: 2min 8s\nWall time: 36 s\n"
],
[
"# Fine Tuning XGBOOST\n\ny_pred_test = xgb.predict_proba(X_test_dtm)[:,1]\ny_pred_test = y_pred_test >= 0.06 # by default it is 0.5\ny_pred_test = y_pred_test.astype(int)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Test accuracy is 0.8847877173742754\n[[25405 3266]\n [ 411 2833]]\nF1 score on Test data : 0.6064433265546398\n"
]
],
[
[
"### Advance Models - LightGBM ### ",
"_____no_output_____"
]
],
[
[
"import lightgbm \n\nparameters = {\n 'application': 'binary',\n 'objective': 'binary',\n 'metric': 'auc',\n 'is_unbalance': 'true',\n 'boosting': 'gbdt',\n 'num_leaves': 31,\n 'feature_fraction': 0.5,\n 'bagging_fraction': 0.5,\n 'bagging_freq': 20,\n 'learning_rate': 0.05,\n 'verbose': 0\n}\n\ntrain_data = lightgbm.Dataset(X_train_dtm, label=y_train)\ntest_data = lightgbm.Dataset(X_test_dtm, label=y_test)\n\nclf = lightgbm.train(parameters,\n train_data,\n valid_sets=test_data,\n num_boost_round=500,\n early_stopping_rounds=10)\n\n\n\n\n",
"[1]\tvalid_0's auc: 0.783092\nTraining until validation scores don't improve for 10 rounds\n[2]\tvalid_0's auc: 0.84214\n[3]\tvalid_0's auc: 0.861219\n[4]\tvalid_0's auc: 0.874613\n[5]\tvalid_0's auc: 0.877839\n[6]\tvalid_0's auc: 0.884924\n[7]\tvalid_0's auc: 0.889691\n[8]\tvalid_0's auc: 0.890547\n[9]\tvalid_0's auc: 0.893395\n[10]\tvalid_0's auc: 0.892991\n[11]\tvalid_0's auc: 0.893584\n[12]\tvalid_0's auc: 0.894835\n[13]\tvalid_0's auc: 0.895268\n[14]\tvalid_0's auc: 0.895774\n[15]\tvalid_0's auc: 0.895769\n[16]\tvalid_0's auc: 0.896405\n[17]\tvalid_0's auc: 0.897059\n[18]\tvalid_0's auc: 0.897579\n[19]\tvalid_0's auc: 0.899455\n[20]\tvalid_0's auc: 0.900329\n[21]\tvalid_0's auc: 0.901626\n[22]\tvalid_0's auc: 0.902794\n[23]\tvalid_0's auc: 0.903408\n[24]\tvalid_0's auc: 0.904566\n[25]\tvalid_0's auc: 0.906046\n[26]\tvalid_0's auc: 0.90721\n[27]\tvalid_0's auc: 0.907912\n[28]\tvalid_0's auc: 0.907802\n[29]\tvalid_0's auc: 0.908702\n[30]\tvalid_0's auc: 0.908989\n[31]\tvalid_0's auc: 0.909595\n[32]\tvalid_0's auc: 0.909745\n[33]\tvalid_0's auc: 0.910031\n[34]\tvalid_0's auc: 0.910882\n[35]\tvalid_0's auc: 0.91148\n[36]\tvalid_0's auc: 0.912151\n[37]\tvalid_0's auc: 0.913136\n[38]\tvalid_0's auc: 0.913915\n[39]\tvalid_0's auc: 0.914295\n[40]\tvalid_0's auc: 0.915343\n[41]\tvalid_0's auc: 0.916305\n[42]\tvalid_0's auc: 0.917158\n[43]\tvalid_0's auc: 0.917423\n[44]\tvalid_0's auc: 0.918092\n[45]\tvalid_0's auc: 0.918734\n[46]\tvalid_0's auc: 0.919429\n[47]\tvalid_0's auc: 0.920148\n[48]\tvalid_0's auc: 0.920909\n[49]\tvalid_0's auc: 0.92152\n[50]\tvalid_0's auc: 0.92198\n[51]\tvalid_0's auc: 0.92241\n[52]\tvalid_0's auc: 0.923005\n[53]\tvalid_0's auc: 0.923413\n[54]\tvalid_0's auc: 0.923824\n[55]\tvalid_0's auc: 0.924233\n[56]\tvalid_0's auc: 0.924727\n[57]\tvalid_0's auc: 0.925279\n[58]\tvalid_0's auc: 0.925787\n[59]\tvalid_0's auc: 0.926529\n[60]\tvalid_0's auc: 0.926783\n[61]\tvalid_0's auc: 0.927296\n[62]\tvalid_0's auc: 0.928063\n[63]\tvalid_0's auc: 0.928735\n[64]\tvalid_0's auc: 0.929292\n[65]\tvalid_0's auc: 0.930003\n[66]\tvalid_0's auc: 0.93049\n[67]\tvalid_0's auc: 0.931133\n[68]\tvalid_0's auc: 0.931424\n[69]\tvalid_0's auc: 0.931839\n[70]\tvalid_0's auc: 0.932426\n[71]\tvalid_0's auc: 0.932774\n[72]\tvalid_0's auc: 0.932909\n[73]\tvalid_0's auc: 0.933261\n[74]\tvalid_0's auc: 0.933541\n[75]\tvalid_0's auc: 0.933716\n[76]\tvalid_0's auc: 0.934096\n[77]\tvalid_0's auc: 0.934273\n[78]\tvalid_0's auc: 0.934485\n[79]\tvalid_0's auc: 0.934682\n[80]\tvalid_0's auc: 0.934912\n[81]\tvalid_0's auc: 0.935309\n[82]\tvalid_0's auc: 0.935822\n[83]\tvalid_0's auc: 0.936294\n[84]\tvalid_0's auc: 0.9367\n[85]\tvalid_0's auc: 0.936941\n[86]\tvalid_0's auc: 0.937098\n[87]\tvalid_0's auc: 0.937478\n[88]\tvalid_0's auc: 0.937819\n[89]\tvalid_0's auc: 0.938205\n[90]\tvalid_0's auc: 0.938639\n[91]\tvalid_0's auc: 0.938898\n[92]\tvalid_0's auc: 0.939222\n[93]\tvalid_0's auc: 0.939412\n[94]\tvalid_0's auc: 0.939531\n[95]\tvalid_0's auc: 0.939712\n[96]\tvalid_0's auc: 0.939901\n[97]\tvalid_0's auc: 0.940018\n[98]\tvalid_0's auc: 0.940191\n[99]\tvalid_0's auc: 0.940452\n[100]\tvalid_0's auc: 0.940784\n[101]\tvalid_0's auc: 0.94113\n[102]\tvalid_0's auc: 0.941375\n[103]\tvalid_0's auc: 0.941759\n[104]\tvalid_0's auc: 0.94212\n[105]\tvalid_0's auc: 0.942593\n[106]\tvalid_0's auc: 0.942878\n[107]\tvalid_0's auc: 0.943084\n[108]\tvalid_0's auc: 0.943373\n[109]\tvalid_0's auc: 0.943667\n[110]\tvalid_0's auc: 0.94392\n[111]\tvalid_0's auc: 0.943956\n[112]\tvalid_0's auc: 0.944151\n[113]\tvalid_0's auc: 
0.944369\n[114]\tvalid_0's auc: 0.944445\n[115]\tvalid_0's auc: 0.944627\n[116]\tvalid_0's auc: 0.944635\n[117]\tvalid_0's auc: 0.944818\n[118]\tvalid_0's auc: 0.944939\n[119]\tvalid_0's auc: 0.945036\n[120]\tvalid_0's auc: 0.945197\n[121]\tvalid_0's auc: 0.945559\n[122]\tvalid_0's auc: 0.945812\n[123]\tvalid_0's auc: 0.945949\n[124]\tvalid_0's auc: 0.946244\n[125]\tvalid_0's auc: 0.946465\n[126]\tvalid_0's auc: 0.946591\n[127]\tvalid_0's auc: 0.946757\n[128]\tvalid_0's auc: 0.946938\n[129]\tvalid_0's auc: 0.947117\n[130]\tvalid_0's auc: 0.947222\n[131]\tvalid_0's auc: 0.947374\n[132]\tvalid_0's auc: 0.947416\n[133]\tvalid_0's auc: 0.947536\n[134]\tvalid_0's auc: 0.947667\n[135]\tvalid_0's auc: 0.947893\n[136]\tvalid_0's auc: 0.947902\n[137]\tvalid_0's auc: 0.94808\n[138]\tvalid_0's auc: 0.948223\n[139]\tvalid_0's auc: 0.94838\n[140]\tvalid_0's auc: 0.948578\n[141]\tvalid_0's auc: 0.948696\n[142]\tvalid_0's auc: 0.948873\n[143]\tvalid_0's auc: 0.948938\n[144]\tvalid_0's auc: 0.949134\n[145]\tvalid_0's auc: 0.949311\n[146]\tvalid_0's auc: 0.949409\n[147]\tvalid_0's auc: 0.949501\n[148]\tvalid_0's auc: 0.949588\n[149]\tvalid_0's auc: 0.94967\n[150]\tvalid_0's auc: 0.949849\n[151]\tvalid_0's auc: 0.949904\n[152]\tvalid_0's auc: 0.950066\n[153]\tvalid_0's auc: 0.950157\n[154]\tvalid_0's auc: 0.950296\n[155]\tvalid_0's auc: 0.950287\n[156]\tvalid_0's auc: 0.950271\n[157]\tvalid_0's auc: 0.950299\n[158]\tvalid_0's auc: 0.950401\n[159]\tvalid_0's auc: 0.95046\n[160]\tvalid_0's auc: 0.950443\n[161]\tvalid_0's auc: 0.950571\n[162]\tvalid_0's auc: 0.950632\n[163]\tvalid_0's auc: 0.950799\n[164]\tvalid_0's auc: 0.950914\n[165]\tvalid_0's auc: 0.950976\n[166]\tvalid_0's auc: 0.951008\n[167]\tvalid_0's auc: 0.951131\n[168]\tvalid_0's auc: 0.951273\n[169]\tvalid_0's auc: 0.951412\n[170]\tvalid_0's auc: 0.951466\n[171]\tvalid_0's auc: 0.951538\n[172]\tvalid_0's auc: 0.951633\n[173]\tvalid_0's auc: 0.951721\n[174]\tvalid_0's auc: 0.951759\n[175]\tvalid_0's auc: 0.951817\n[176]\tvalid_0's auc: 0.951834\n[177]\tvalid_0's auc: 0.951864\n[178]\tvalid_0's auc: 0.951919\n[179]\tvalid_0's auc: 0.951996\n[180]\tvalid_0's auc: 0.95202\n[181]\tvalid_0's auc: 0.952129\n[182]\tvalid_0's auc: 0.952185\n[183]\tvalid_0's auc: 0.952244\n[184]\tvalid_0's auc: 0.952357\n[185]\tvalid_0's auc: 0.952404\n[186]\tvalid_0's auc: 0.952459\n[187]\tvalid_0's auc: 0.952548\n[188]\tvalid_0's auc: 0.952575\n[189]\tvalid_0's auc: 0.952579\n[190]\tvalid_0's auc: 0.952622\n[191]\tvalid_0's auc: 0.952612\n[192]\tvalid_0's auc: 0.952656\n[193]\tvalid_0's auc: 0.952684\n[194]\tvalid_0's auc: 0.95269\n[195]\tvalid_0's auc: 0.952738\n[196]\tvalid_0's auc: 0.952752\n[197]\tvalid_0's auc: 0.952874\n[198]\tvalid_0's auc: 0.952968\n[199]\tvalid_0's auc: 0.95298\n[200]\tvalid_0's auc: 0.953058\n[201]\tvalid_0's auc: 0.953042\n[202]\tvalid_0's auc: 0.953116\n[203]\tvalid_0's auc: 0.953232\n[204]\tvalid_0's auc: 0.95326\n[205]\tvalid_0's auc: 0.953282\n[206]\tvalid_0's auc: 0.953318\n[207]\tvalid_0's auc: 0.953383\n[208]\tvalid_0's auc: 0.953409\n[209]\tvalid_0's auc: 0.953405\n[210]\tvalid_0's auc: 0.953505\n[211]\tvalid_0's auc: 0.953539\n[212]\tvalid_0's auc: 0.953614\n[213]\tvalid_0's auc: 0.953666\n[214]\tvalid_0's auc: 0.953676\n[215]\tvalid_0's auc: 0.953744\n[216]\tvalid_0's auc: 0.953809\n[217]\tvalid_0's auc: 0.953838\n[218]\tvalid_0's auc: 0.953814\n[219]\tvalid_0's auc: 0.95383\n[220]\tvalid_0's auc: 0.953826\n[221]\tvalid_0's auc: 0.953968\n[222]\tvalid_0's auc: 0.954086\n[223]\tvalid_0's auc: 0.954196\n[224]\tvalid_0's auc: 
0.954245\n[225]\tvalid_0's auc: 0.954302\n[226]\tvalid_0's auc: 0.954397\n[227]\tvalid_0's auc: 0.954471\n[228]\tvalid_0's auc: 0.954548\n[229]\tvalid_0's auc: 0.95462\n[230]\tvalid_0's auc: 0.954673\n[231]\tvalid_0's auc: 0.954754\n[232]\tvalid_0's auc: 0.954796\n[233]\tvalid_0's auc: 0.954871\n[234]\tvalid_0's auc: 0.954892\n[235]\tvalid_0's auc: 0.954957\n[236]\tvalid_0's auc: 0.954992\n[237]\tvalid_0's auc: 0.955002\n[238]\tvalid_0's auc: 0.955062\n[239]\tvalid_0's auc: 0.955022\n[240]\tvalid_0's auc: 0.955086\n[241]\tvalid_0's auc: 0.955127\n[242]\tvalid_0's auc: 0.955191\n[243]\tvalid_0's auc: 0.955251\n[244]\tvalid_0's auc: 0.955333\n[245]\tvalid_0's auc: 0.955451\n[246]\tvalid_0's auc: 0.955491\n[247]\tvalid_0's auc: 0.955573\n[248]\tvalid_0's auc: 0.955613\n[249]\tvalid_0's auc: 0.955644\n[250]\tvalid_0's auc: 0.955658\n[251]\tvalid_0's auc: 0.955709\n[252]\tvalid_0's auc: 0.955771\n[253]\tvalid_0's auc: 0.95583\n[254]\tvalid_0's auc: 0.955892\n[255]\tvalid_0's auc: 0.955915\n[256]\tvalid_0's auc: 0.95601\n[257]\tvalid_0's auc: 0.956023\n[258]\tvalid_0's auc: 0.956008\n[259]\tvalid_0's auc: 0.956102\n[260]\tvalid_0's auc: 0.95614\n[261]\tvalid_0's auc: 0.956195\n[262]\tvalid_0's auc: 0.956311\n[263]\tvalid_0's auc: 0.956321\n[264]\tvalid_0's auc: 0.95642\n[265]\tvalid_0's auc: 0.95646\n[266]\tvalid_0's auc: 0.956472\n[267]\tvalid_0's auc: 0.956436\n[268]\tvalid_0's auc: 0.956454\n[269]\tvalid_0's auc: 0.956496\n[270]\tvalid_0's auc: 0.956584\n[271]\tvalid_0's auc: 0.956627\n[272]\tvalid_0's auc: 0.956621\n[273]\tvalid_0's auc: 0.956651\n[274]\tvalid_0's auc: 0.956674\n[275]\tvalid_0's auc: 0.956708\n[276]\tvalid_0's auc: 0.956734\n[277]\tvalid_0's auc: 0.956792\n[278]\tvalid_0's auc: 0.956803\n[279]\tvalid_0's auc: 0.956883\n[280]\tvalid_0's auc: 0.95693\n[281]\tvalid_0's auc: 0.956964\n[282]\tvalid_0's auc: 0.957007\n[283]\tvalid_0's auc: 0.957083\n[284]\tvalid_0's auc: 0.95707\n[285]\tvalid_0's auc: 0.957124\n[286]\tvalid_0's auc: 0.957207\n[287]\tvalid_0's auc: 0.957244\n[288]\tvalid_0's auc: 0.957306\n[289]\tvalid_0's auc: 0.957349\n[290]\tvalid_0's auc: 0.957408\n[291]\tvalid_0's auc: 0.957445\n[292]\tvalid_0's auc: 0.957502\n[293]\tvalid_0's auc: 0.95758\n[294]\tvalid_0's auc: 0.957599\n[295]\tvalid_0's auc: 0.957609\n[296]\tvalid_0's auc: 0.957614\n[297]\tvalid_0's auc: 0.957632\n[298]\tvalid_0's auc: 0.957682\n[299]\tvalid_0's auc: 0.957702\n[300]\tvalid_0's auc: 0.957733\n[301]\tvalid_0's auc: 0.957814\n[302]\tvalid_0's auc: 0.957865\n[303]\tvalid_0's auc: 0.957903\n[304]\tvalid_0's auc: 0.957921\n[305]\tvalid_0's auc: 0.95791\n[306]\tvalid_0's auc: 0.95795\n[307]\tvalid_0's auc: 0.957935\n[308]\tvalid_0's auc: 0.957934\n[309]\tvalid_0's auc: 0.957971\n[310]\tvalid_0's auc: 0.958039\n[311]\tvalid_0's auc: 0.95808\n[312]\tvalid_0's auc: 0.958136\n[313]\tvalid_0's auc: 0.958162\n[314]\tvalid_0's auc: 0.958157\n[315]\tvalid_0's auc: 0.958151\n[316]\tvalid_0's auc: 0.958165\n[317]\tvalid_0's auc: 0.958181\n[318]\tvalid_0's auc: 0.958215\n[319]\tvalid_0's auc: 0.958195\n[320]\tvalid_0's auc: 0.958216\n[321]\tvalid_0's auc: 0.958264\n[322]\tvalid_0's auc: 0.958296\n[323]\tvalid_0's auc: 0.958379\n[324]\tvalid_0's auc: 0.958453\n[325]\tvalid_0's auc: 0.958499\n[326]\tvalid_0's auc: 0.958549\n[327]\tvalid_0's auc: 0.958597\n[328]\tvalid_0's auc: 0.95862\n[329]\tvalid_0's auc: 0.958657\n[330]\tvalid_0's auc: 0.958716\n[331]\tvalid_0's auc: 0.958747\n[332]\tvalid_0's auc: 0.958805\n[333]\tvalid_0's auc: 0.958825\n[334]\tvalid_0's auc: 0.958834\n[335]\tvalid_0's auc: 
0.958848\n[336]\tvalid_0's auc: 0.95889\n[337]\tvalid_0's auc: 0.958881\n[338]\tvalid_0's auc: 0.958905\n[339]\tvalid_0's auc: 0.958919\n[340]\tvalid_0's auc: 0.958963\n[341]\tvalid_0's auc: 0.959018\n[342]\tvalid_0's auc: 0.959014\n[343]\tvalid_0's auc: 0.959012\n[344]\tvalid_0's auc: 0.959021\n[345]\tvalid_0's auc: 0.958992\n[346]\tvalid_0's auc: 0.958932\n[347]\tvalid_0's auc: 0.958981\n[348]\tvalid_0's auc: 0.959007\n[349]\tvalid_0's auc: 0.958992\n[350]\tvalid_0's auc: 0.958951\n[351]\tvalid_0's auc: 0.958946\n[352]\tvalid_0's auc: 0.958914\n[353]\tvalid_0's auc: 0.958843\n[354]\tvalid_0's auc: 0.958838\nEarly stopping, best iteration is:\n[344]\tvalid_0's auc: 0.959021\n"
],
[
"# Fine Tuning LIGHT GBM\n\ny_pred_test = clf.predict(X_test_dtm)\ny_pred_test = y_pred_test >= 0.35 # by default it is 0.5\ny_pred_test = y_pred_test.astype(int)\nprint('Test accuracy is {}'.format(accuracy_score(y_test,y_pred_test)))\nprint(confusion_matrix(y_test,y_pred_test))\nprint(\"F1 score on Test data : \" ,f1_score(y_test,y_pred_test))",
"Test accuracy is 0.8930910230299233\n[[25598 3073]\n [ 339 2905]]\nF1 score on Test data : 0.6300151810887009\n"
]
],
[
[
"## Model Explanation ##",
"_____no_output_____"
]
],
[
[
"import eli5\n\neli5.show_weights(logreg,vec = vect, top = 15) # logistic regression\n# will give you top 15 features or words which makes a comment toxic ",
"/opt/conda/lib/python3.7/site-packages/sklearn/utils/deprecation.py:143: FutureWarning: The sklearn.metrics.scorer module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.metrics. Anything that cannot be imported from sklearn.metrics is now part of the private API.\n warnings.warn(message, FutureWarning)\n/opt/conda/lib/python3.7/site-packages/sklearn/utils/deprecation.py:143: FutureWarning: The sklearn.feature_selection.base module is deprecated in version 0.22 and will be removed in version 0.24. The corresponding classes / functions should instead be imported from sklearn.feature_selection. Anything that cannot be imported from sklearn.feature_selection is now part of the private API.\n warnings.warn(message, FutureWarning)\n"
],
[
"eli5.show_weights(xgb,vec = vect,top = 15) # XGBoost\n# will give you top 15 features or words which makes a comment toxic ",
"_____no_output_____"
]
],
[
[
"## Tweets Explanation ##",
"_____no_output_____"
]
],
[
[
"X_test.iloc[718]",
"_____no_output_____"
],
[
"eli5.show_prediction(logreg, vec = vect, doc = X_test.iloc[718]) ",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbfbdb8dafd57689f59b90289b154e92ae1c89fe
| 51,488 |
ipynb
|
Jupyter Notebook
|
model-layer/text-module/oommix/notebook/exp4.ipynb
|
nowhyun/METIS
|
78e99861b076ab7c075858c48509920bb4e97796
|
[
"MIT"
] | 67 |
2018-11-15T20:01:02.000Z
|
2021-11-22T01:37:17.000Z
|
model-layer/text-module/oommix/notebook/exp4.ipynb
|
nowhyun/METIS
|
78e99861b076ab7c075858c48509920bb4e97796
|
[
"MIT"
] | 28 |
2018-12-05T05:57:52.000Z
|
2021-04-20T10:18:35.000Z
|
model-layer/text-module/oommix/notebook/exp4.ipynb
|
nowhyun/METIS
|
78e99861b076ab7c075858c48509920bb4e97796
|
[
"MIT"
] | 55 |
2018-12-03T07:57:10.000Z
|
2021-09-30T14:40:57.000Z
| 39.154373 | 1,289 | 0.340235 |
[
[
[
"import os\nimport json\nimport numpy as np\nimport pandas as pd\nimport seaborn as sns\nimport scipy as scp\n\ndef mean(x):\n return scp.stats.trim_mean(x, 0.2)\n\ndef std(x):\n x = np.array(x)\n x.sort()\n low = int(0.2 * len(x))\n high = int(0.8 * len(x))\n \n return x[low:high].std(ddof=0)\n\nparam_dir = os.path.join(\"..\", \"out\", \"param\")\nresults = []\nfor fname in os.listdir(param_dir):\n with open(os.path.join(param_dir, fname)) as f:\n results.append(json.load(f))\ndf = pd.DataFrame(data=results)\n#print(df)\n\nexp1 = df.groupby([\"dataset\", \"num_train_data\", \"mix_strategy\"])[\"test_acc\"].agg([mean, std, \"count\"])\nexp1",
"/home/sh0416/anaconda3/envs/python3.8/lib/python3.8/site-packages/numpy/core/_methods.py:233: RuntimeWarning: Degrees of freedom <= 0 for slice\n ret = _var(a, axis=axis, dtype=dtype, out=out, ddof=ddof,\n/home/sh0416/anaconda3/envs/python3.8/lib/python3.8/site-packages/numpy/core/_methods.py:194: RuntimeWarning: invalid value encountered in true_divide\n arrmean = um.true_divide(\n/home/sh0416/anaconda3/envs/python3.8/lib/python3.8/site-packages/numpy/core/_methods.py:226: RuntimeWarning: invalid value encountered in double_scalars\n ret = ret.dtype.type(ret / rcount)\n"
],
[
"import numpy as np\nimport matplotlib.pyplot as plt\n\nsns.set_theme(context=\"paper\", style=\"ticks\", font_scale=1.7)\nmatrix = np.zeros((6, 7))\n\nfig, ax = plt.subplots(figsize=(8, 6))\nfor idx, row in df[(df[\"dataset\"]==\"amazon_review_polarity\")&(df[\"mix_strategy\"]==\"oommix\")].groupby([\"m_layer\", \"d_layer\"])[\"test_acc\"].mean().iteritems():\n matrix[(idx[0]//2, idx[1]//2)] = 100*row\nmask = matrix == 0\n\nax = sns.heatmap(matrix, mask=mask, linewidths=1.5, cmap=\"YlGnBu\",\n xticklabels=[0, 2, 4, 6, 8, 10, 12],\n yticklabels=[0, 2, 4, 6, 8, 10],\n annot=True, fmt=\".2f\")\nax.set_ylabel(\"Generator layer\", fontsize=20)\nax.set_xlabel(\"Discriminator layer\", fontsize=20)\nplt.tight_layout()\nplt.savefig(\"exp4.png\", dpi=200)",
"_____no_output_____"
],
[
"df[(df[\"mix_strategy\"]==\"oommix\")&(df[\"dataset\"]==\"yahoo_answer\")&(df[\"num_train_data\"]==500)]",
"_____no_output_____"
],
[
"import torch\nckpt = torch.load(\"../out/ckpt/model_bd9700bb-488.pth\", map_location=torch.device(\"cpu\"))",
"_____no_output_____"
],
[
"sum(v.numel() for k, v in ckpt.items())\n",
"_____no_output_____"
],
[
"df[(df[\"dataset\"]==\"yahoo_answer\")&(df[\"num_train_data\"] == 2000)]",
"_____no_output_____"
],
[
"df[(df[\"dataset\"]==\"dbpedia\")&(df[\"num_train_data\"] == 35000)]",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfbdddcea308f8f0b76af83e4a89167deee0b4e
| 97,612 |
ipynb
|
Jupyter Notebook
|
Colab_Notebooks/EDA.ipynb
|
Molecular-Exploration/toxicity-classification
|
64ea8068086b1d7b758d96fc68b4f2008ce4cf0e
|
[
"MIT"
] | null | null | null |
Colab_Notebooks/EDA.ipynb
|
Molecular-Exploration/toxicity-classification
|
64ea8068086b1d7b758d96fc68b4f2008ce4cf0e
|
[
"MIT"
] | null | null | null |
Colab_Notebooks/EDA.ipynb
|
Molecular-Exploration/toxicity-classification
|
64ea8068086b1d7b758d96fc68b4f2008ce4cf0e
|
[
"MIT"
] | 1 |
2021-04-05T22:41:06.000Z
|
2021-04-05T22:41:06.000Z
| 101.99791 | 34,592 | 0.473722 |
[
[
[
"from google.colab import drive\ndrive.mount('/content/drive')",
"Mounted at /content/drive\n"
],
[
"import numpy as np\nimport pandas as pd",
"_____no_output_____"
],
[
"train_srp53 = pd.read_csv('/content/drive/MyDrive/Molecular Exploration/Data/sr-p53.smiles',\n sep='\\t',\n names=['smiles', 'id', 'target'])",
"_____no_output_____"
],
[
"train_srp53.head()",
"_____no_output_____"
],
[
"len(train_srp53)",
"_____no_output_____"
],
[
"sum(train_srp53.target)",
"_____no_output_____"
],
[
"!pip install -q SmilesPE",
"_____no_output_____"
]
],
[
[
"### Tokenization of string compounds with SmilesPE (Byte pair encoding library with built-in tokenizers)",
"_____no_output_____"
]
],
[
[
"from SmilesPE.pretokenizer import atomwise_tokenizer\n\nsmi = 'CC[N+](C)(C)Cc1ccccc1Br'\ntoks = atomwise_tokenizer(smi)\nprint(toks)",
"['C', 'C', '[N+]', '(', 'C', ')', '(', 'C', ')', 'C', 'c', '1', 'c', 'c', 'c', 'c', 'c', '1', 'Br']\n"
]
],
[
[
"***example of pretrained SMILES byte-pair encoding***",
"_____no_output_____"
]
],
[
[
"import requests\nfile_url = 'https://raw.githubusercontent.com/XinhaoLi74/SmilesPE/master/SPE_ChEMBL.txt'\n\nr = requests.get(file_url, stream = True)\n\nwith open('/content/drive/MyDrive/Molecular Exploration/Data/BPE_codes.txt', 'wb') as file:\n for block in r.iter_content(chunk_size = 1024):\n if block:\n file.write(block)",
"_____no_output_____"
],
[
"import codecs\nfrom SmilesPE.tokenizer import *\n\nspe_vob= codecs.open('/content/drive/MyDrive/DATA_2040/Molecular Exploration/Data/BPE_codes.txt')\nspe = SPE_Tokenizer(spe_vob)\n\nsmi = 'CC[N+](C)(C)Cc1ccccc1Br'\nbpe_encoding = spe.tokenize(smi)\n\n# should get >>> 'CC [N+](C) (C)C c1ccccc1 Br'",
"_____no_output_____"
]
],
[
[
"*The output of the byte-pair encoding is a space-separated string of tokens, each token being a string. The example output below would be the input sequence to a model.*",
"_____no_output_____"
]
],
[
[
"bpe_encoding.split(' ')",
"_____no_output_____"
]
],
[
[
"### Looking at the byte-pair encoding alphabet across the whole (~8000 large) dataset",
"_____no_output_____"
]
],
[
[
"# initialize the pretrained BP encoder\nspe = SPE_Tokenizer(spe_vob)\n\n# initialize empyt vocabulary set\nalphabet = set()\n\n# traverse through data adding byte-pair tokens to vocabulary\nfor smi in train_srp53.smiles:\n bpe_encoding = spe.tokenize(smi)\n tkns = set(bpe_encoding.split(' '))\n alphabet = alphabet.union(tkns)",
"_____no_output_____"
]
],
[
[
"***The alphabet for this training set is 1096 elements -- the whole alphabet used to train this BP encoder is ~3000 ==> what do we do to prepare for getting test samples with tokens unseen in the training set?***",
"_____no_output_____"
]
],
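One common remedy for unseen tokens -- an assumption here, not something this notebook implements -- is to reserve an `<unk>` id and fall back to it during encoding:

```python
# hypothetical vocabulary built from the training alphabet, with id 0 reserved for <unk>
token2id = {tok: i + 1 for i, tok in enumerate(sorted(alphabet))}
UNK_ID = 0

def encode(smi):
    # map each byte-pair token to its id, falling back to UNK_ID for unseen tokens
    return [token2id.get(tok, UNK_ID) for tok in spe.tokenize(smi).split(' ')]
```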
[
[
"len(alphabet)",
"_____no_output_____"
],
[
"from matplotlib import pyplot as plt",
"_____no_output_____"
],
[
"def smiles_to_token(row):\n return atomwise_tokenizer(row['smiles'])\n\ntrain_srp53['tokens'] = train_srp53.apply(lambda row: smiles_to_token(row), axis=1)",
"_____no_output_____"
],
[
"train_srp53.head()",
"_____no_output_____"
],
[
"vocab = set()\nfor smi in train_srp53.smiles:\n tok = atomwise_tokenizer(smi)\n tokens = set(tok)\n vocab = vocab.union(tokens)\n\ndef CountFrequency(my_list):\n \n # Creating an empty dictionary \n freq = {}\n for item in my_list:\n if item in freq:\n freq[item] += 1\n else:\n freq[item] = 1\n \n return freq\n\ntoken_appears_once = {}\ntoken_freq = {}\ntoken_prop = {k:[] for k in vocab}\nsmile_lengths = []\n\nfor i, row in train_srp53.iterrows():\n\n token_dict = CountFrequency(row['tokens'])\n\n smile_lengths.append(len(row['tokens']))\n\n for token, count in token_dict.items():\n\n if token in token_appears_once.keys():\n token_appears_once[token] += 1\n else:\n token_appears_once[token] = 1\n\n if token in token_freq.keys():\n token_freq[token] += count\n else:\n token_freq[token] = count\n\n for tok in token_prop.keys():\n\n token_prop[tok].append(row['tokens'].count(tok) / len(row['tokens']))\n ",
"_____no_output_____"
],
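As an aside, the hand-rolled `CountFrequency` helper does the same job as `collections.Counter` from the standard library; a drop-in sketch:

```python
from collections import Counter

def count_frequency(my_list):
    # token -> number of occurrences, equivalent to CountFrequency above
    return dict(Counter(my_list))
```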
[
"print(token_appears_once['N'])\nprint(token_freq['N'])\nprint(len(token_prop['N']))\nprint(len(smile_lengths))",
"4486\n10031\n8634\n8634\n"
]
],
[
[
"## EDA Plots",
"_____no_output_____"
]
],
[
[
"import plotly.express as px\nfrom heapq import nlargest\n \ndef dict_to_df(d, N):\n \n # N largest values in dictionary\n # Using nlargest\n res = nlargest(N, d, key = d.get)\n \n df = pd.DataFrame(columns=['Token', 'Count'])\n df['Token'] = res\n\n counts = [d[token] for token in res]\n df['Count'] = counts\n\n return df",
"_____no_output_____"
],
[
"token_appears_df = dict_to_df(token_appears_once, 30)\nfig = px.bar(token_appears_df, x='Token', y='Count')\nfig.show()",
"_____no_output_____"
],
[
"token_freq_df = dict_to_df(token_freq, 30)\nfig = px.bar(token_freq_df, x='Token', y='Count', log_y=True)\nfig.show()",
"_____no_output_____"
],
[
"fig = px.histogram(pd.DataFrame(smile_lengths, columns=['Lengths']), x = 'Lengths')\nfig.show()",
"_____no_output_____"
],
[
"prop_df = dict_to_df(token_prop, len(vocab))\nprop_df['Average'] = prop_df.apply(lambda row: np.mean(row.Count), axis=1)\nprop_df_sorted = prop_df.sort_values(by=['Average'], ascending=False)\n\n# fig = px.box(prop_df_sorted.head(10), x='Token', y='Count')\n# fig.show()",
"_____no_output_____"
],
[
"# prop_dict = {'C': token_prop['C'], '(':token_prop['(']}\n# # prop_df = pd.DataFrame([token_prop['C'], token_prop['('], token_prop[')']], columns=['Carbon', '(', ')'])\n# prop_df = pd.DataFrame(prop_dict)\n\n# fig = px.box(prop_df, y=)\n# fig.show()",
"_____no_output_____"
],
[
"",
"_____no_output_____"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfbe7053d1e0a87d9ebd8a28b92971e36f9325c
| 53,368 |
ipynb
|
Jupyter Notebook
|
3_Training_a_Model.ipynb
|
cixuuz/plagiarism-detector
|
d85e60ce9ba14a39c473c3ab68d245ff291ec959
|
[
"MIT"
] | 1 |
2020-10-18T05:19:47.000Z
|
2020-10-18T05:19:47.000Z
|
3_Training_a_Model.ipynb
|
cixuuz/plagiarism-detector
|
d85e60ce9ba14a39c473c3ab68d245ff291ec959
|
[
"MIT"
] | null | null | null |
3_Training_a_Model.ipynb
|
cixuuz/plagiarism-detector
|
d85e60ce9ba14a39c473c3ab68d245ff291ec959
|
[
"MIT"
] | 3 |
2020-08-04T07:04:43.000Z
|
2021-12-22T13:12:48.000Z
| 49.277932 | 1,234 | 0.628729 |
[
[
[
"# Plagiarism Detection Model\n\nNow that you've created training and test data, you are ready to define and train a model. Your goal in this notebook, will be to train a binary classification model that learns to label an answer file as either plagiarized or not, based on the features you provide the model.\n\nThis task will be broken down into a few discrete steps:\n\n* Upload your data to S3.\n* Define a binary classification model and a training script.\n* Train your model and deploy it.\n* Evaluate your deployed classifier and answer some questions about your approach.\n\nTo complete this notebook, you'll have to complete all given exercises and answer all the questions in this notebook.\n> All your tasks will be clearly labeled **EXERCISE** and questions as **QUESTION**.\n\nIt will be up to you to explore different classification models and decide on a model that gives you the best performance for this dataset.\n\n---",
"_____no_output_____"
],
[
"## Load Data to S3\n\nIn the last notebook, you should have created two files: a `training.csv` and `test.csv` file with the features and class labels for the given corpus of plagiarized/non-plagiarized text data. \n\n>The below cells load in some AWS SageMaker libraries and creates a default bucket. After creating this bucket, you can upload your locally stored data to S3.\n\nSave your train and test `.csv` feature files, locally. To do this you can run the second notebook \"2_Plagiarism_Feature_Engineering\" in SageMaker or you can manually upload your files to this notebook using the upload icon in Jupyter Lab. Then you can upload local files to S3 by using `sagemaker_session.upload_data` and pointing directly to where the training data is saved.",
"_____no_output_____"
]
],
[
[
"import pandas as pd\nimport boto3\nimport sagemaker",
"_____no_output_____"
],
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\n# session and role\nsagemaker_session = sagemaker.Session()\nrole = sagemaker.get_execution_role()\n\n# create an S3 bucket\nbucket = sagemaker_session.default_bucket()",
"_____no_output_____"
]
],
[
[
"## EXERCISE: Upload your training data to S3\n\nSpecify the `data_dir` where you've saved your `train.csv` file. Decide on a descriptive `prefix` that defines where your data will be uploaded in the default S3 bucket. Finally, create a pointer to your training data by calling `sagemaker_session.upload_data` and passing in the required parameters. It may help to look at the [Session documentation](https://sagemaker.readthedocs.io/en/stable/session.html#sagemaker.session.Session.upload_data) or previous SageMaker code examples.\n\nYou are expected to upload your entire directory. Later, the training script will only access the `train.csv` file.",
"_____no_output_____"
]
],
[
[
"# should be the name of directory you created to save your features data\ndata_dir = 'plagiarism_data'\n\n# set prefix, a descriptive name for a directory \nprefix = 'sagemaker/plagiarism-data'\n\n# upload all data to S3\ninput_data = sagemaker_session.upload_data(path=data_dir, bucket=bucket, key_prefix=prefix)\nprint(input_data)",
"s3://sagemaker-us-west-2-203336335427/sagemaker/plagiarism-data\n"
]
],
[
[
"### Test cell\n\nTest that your data has been successfully uploaded. The below cell prints out the items in your S3 bucket and will throw an error if it is empty. You should see the contents of your `data_dir` and perhaps some checkpoints. If you see any other files listed, then you may have some old model files that you can delete via the S3 console (though, additional files shouldn't affect the performance of model developed in this notebook).",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\n# confirm that data is in S3 bucket\nempty_check = []\nfor obj in boto3.resource('s3').Bucket(bucket).objects.all():\n empty_check.append(obj.key)\n print(obj.key)\n\nassert len(empty_check) !=0, 'S3 bucket is empty.'\nprint('Test passed!')",
"counties/kmeans-2019-06-18-19-53-42-962/output/model.tar.gz\ncounties/pca-2019-06-18-02-33-24-120/output/model.tar.gz\ncounties/pca-2019-06-18-18-35-22-487/output/model.tar.gz\ncreditcard/linear-learner-2019-06-19-00-44-35-049/output/model.tar.gz\ncreditcard/linear-learner-2019-06-19-01-14-12-340/output/model.tar.gz\ncreditcard/linear-learner-2019-06-19-01-29-12-272/output/model.tar.gz\nkmeans-2019-06-18-19-36-55-355/output/model.tar.gz\nsagemaker-pytorch-2019-06-11-18-54-24-679/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-11-19-04-58-540/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-12-01-17-41-931/output/model.tar.gz\nsagemaker-pytorch-2019-06-12-01-17-41-931/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-12-01-57-38-889/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-12-02-16-44-055/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-12-02-28-00-891/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-14-01-02-55-277/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-14-01-17-38-838/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-17-00-20-38-492/output/model.tar.gz\nsagemaker-pytorch-2019-06-17-00-20-38-492/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-17-00-39-07-677/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-20-01-05-08-275/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-20-01-09-37-484/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-00-58-08-500/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-03-00-920/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-06-13-121/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-07-18-193/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-10-40-154/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-19-53-285/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-25-37-707/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-32-12-745/source/sourcedir.tar.gz\nsagemaker-pytorch-2019-06-22-01-36-38-148/source/sourcedir.tar.gz\nsagemaker-record-sets/KMeans-2019-06-18-19-35-22-411/.amazon.manifest\nsagemaker-record-sets/KMeans-2019-06-18-19-35-22-411/matrix_0.pbr\nsagemaker-record-sets/KMeans-2019-06-18-19-53-37-757/.amazon.manifest\nsagemaker-record-sets/KMeans-2019-06-18-19-53-37-757/matrix_0.pbr\nsagemaker-record-sets/LinearLearner-2019-06-19-00-43-01-922/.amazon.manifest\nsagemaker-record-sets/LinearLearner-2019-06-19-00-43-01-922/matrix_0.pbr\nsagemaker-record-sets/PCA-2019-06-18-02-32-59-817/.amazon.manifest\nsagemaker-record-sets/PCA-2019-06-18-02-32-59-817/matrix_0.pbr\nsagemaker-record-sets/PCA-2019-06-18-18-35-20-709/.amazon.manifest\nsagemaker-record-sets/PCA-2019-06-18-18-35-20-709/matrix_0.pbr\nsagemaker/moon-data/sagemaker-pytorch-2019-06-20-01-05-08-275/output/model.tar.gz\nsagemaker/moon-data/train.csv\nsagemaker/plagiarism-data/sagemaker-pytorch-2019-06-22-01-32-12-745/output/model.tar.gz\nsagemaker/plagiarism-data/sagemaker-pytorch-2019-06-22-01-36-38-148/output/model.tar.gz\nsagemaker/plagiarism-data/test.csv\nsagemaker/plagiarism-data/train.csv\nsagemaker/sentiment_rnn/train.csv\nsagemaker/sentiment_rnn/word_dict.pkl\nTest passed!\n"
]
],
[
[
"---\n\n# Modeling\n\nNow that you've uploaded your training data, it's time to define and train a model!\n\nThe type of model you create is up to you. For a binary classification task, you can choose to go one of three routes:\n* Use a built-in classification algorithm, like LinearLearner.\n* Define a custom Scikit-learn classifier, a comparison of models can be found [here](https://scikit-learn.org/stable/auto_examples/classification/plot_classifier_comparison.html).\n* Define a custom PyTorch neural network classifier. \n\nIt will be up to you to test out a variety of models and choose the best one. Your project will be graded on the accuracy of your final model. \n \n---\n\n## EXERCISE: Complete a training script \n\nTo implement a custom classifier, you'll need to complete a `train.py` script. You've been given the folders `source_sklearn` and `source_pytorch` which hold starting code for a custom Scikit-learn model and a PyTorch model, respectively. Each directory has a `train.py` training script. To complete this project **you only need to complete one of these scripts**; the script that is responsible for training your final model.\n\nA typical training script:\n* Loads training data from a specified directory\n* Parses any training & model hyperparameters (ex. nodes in a neural network, training epochs, etc.)\n* Instantiates a model of your design, with any specified hyperparams\n* Trains that model \n* Finally, saves the model so that it can be hosted/deployed, later\n\n### Defining and training a model\nMuch of the training script code is provided for you. Almost all of your work will be done in the `if __name__ == '__main__':` section. To complete a `train.py` file, you will:\n1. Import any extra libraries you need\n2. Define any additional model training hyperparameters using `parser.add_argument`\n2. Define a model in the `if __name__ == '__main__':` section\n3. Train the model in that same section\n\nBelow, you can use `!pygmentize` to display an existing `train.py` file. Read through the code; all of your tasks are marked with `TODO` comments. \n\n**Note: If you choose to create a custom PyTorch model, you will be responsible for defining the model in the `model.py` file,** and a `predict.py` file is provided. If you choose to use Scikit-learn, you only need a `train.py` file; you may import a classifier from the `sklearn` library.",
"_____no_output_____"
]
],
[
[
"# directory can be changed to: source_sklearn or source_pytorch\n!pygmentize source_sklearn/train.py",
"\u001b[34mfrom\u001b[39;49;00m \u001b[04m\u001b[36m__future__\u001b[39;49;00m \u001b[34mimport\u001b[39;49;00m print_function\r\n\r\n\u001b[34mimport\u001b[39;49;00m \u001b[04m\u001b[36margparse\u001b[39;49;00m\r\n\u001b[34mimport\u001b[39;49;00m \u001b[04m\u001b[36mos\u001b[39;49;00m\r\n\u001b[34mimport\u001b[39;49;00m \u001b[04m\u001b[36mpandas\u001b[39;49;00m \u001b[34mas\u001b[39;49;00m \u001b[04m\u001b[36mpd\u001b[39;49;00m\r\n\r\n\u001b[34mfrom\u001b[39;49;00m \u001b[04m\u001b[36msklearn.externals\u001b[39;49;00m \u001b[34mimport\u001b[39;49;00m joblib\r\n\r\n\u001b[37m## TODO: Import any additional libraries you need to define a model\u001b[39;49;00m\r\n\r\n\r\n\u001b[37m# Provided model load function\u001b[39;49;00m\r\n\u001b[34mdef\u001b[39;49;00m \u001b[32mmodel_fn\u001b[39;49;00m(model_dir):\r\n \u001b[33m\"\"\"Load model from the model_dir. This is the same model that is saved\u001b[39;49;00m\r\n\u001b[33m in the main if statement.\u001b[39;49;00m\r\n\u001b[33m \"\"\"\u001b[39;49;00m\r\n \u001b[34mprint\u001b[39;49;00m(\u001b[33m\"\u001b[39;49;00m\u001b[33mLoading model.\u001b[39;49;00m\u001b[33m\"\u001b[39;49;00m)\r\n \r\n \u001b[37m# load using joblib\u001b[39;49;00m\r\n model = joblib.load(os.path.join(model_dir, \u001b[33m\"\u001b[39;49;00m\u001b[33mmodel.joblib\u001b[39;49;00m\u001b[33m\"\u001b[39;49;00m))\r\n \u001b[34mprint\u001b[39;49;00m(\u001b[33m\"\u001b[39;49;00m\u001b[33mDone loading model.\u001b[39;49;00m\u001b[33m\"\u001b[39;49;00m)\r\n \r\n \u001b[34mreturn\u001b[39;49;00m model\r\n\r\n\r\n\u001b[37m## TODO: Complete the main code\u001b[39;49;00m\r\n\u001b[34mif\u001b[39;49;00m \u001b[31m__name__\u001b[39;49;00m == \u001b[33m'\u001b[39;49;00m\u001b[33m__main__\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m:\r\n \r\n \u001b[37m# All of the model parameters and training parameters are sent as arguments\u001b[39;49;00m\r\n \u001b[37m# when this script is executed, during a training job\u001b[39;49;00m\r\n \r\n \u001b[37m# Here we set up an argument parser to easily access the parameters\u001b[39;49;00m\r\n parser = argparse.ArgumentParser()\r\n\r\n \u001b[37m# SageMaker parameters, like the directories for training data and saving models; set automatically\u001b[39;49;00m\r\n \u001b[37m# Do not need to change\u001b[39;49;00m\r\n parser.add_argument(\u001b[33m'\u001b[39;49;00m\u001b[33m--output-data-dir\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m, \u001b[36mtype\u001b[39;49;00m=\u001b[36mstr\u001b[39;49;00m, default=os.environ[\u001b[33m'\u001b[39;49;00m\u001b[33mSM_OUTPUT_DATA_DIR\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m])\r\n parser.add_argument(\u001b[33m'\u001b[39;49;00m\u001b[33m--model-dir\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m, \u001b[36mtype\u001b[39;49;00m=\u001b[36mstr\u001b[39;49;00m, default=os.environ[\u001b[33m'\u001b[39;49;00m\u001b[33mSM_MODEL_DIR\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m])\r\n parser.add_argument(\u001b[33m'\u001b[39;49;00m\u001b[33m--data-dir\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m, \u001b[36mtype\u001b[39;49;00m=\u001b[36mstr\u001b[39;49;00m, default=os.environ[\u001b[33m'\u001b[39;49;00m\u001b[33mSM_CHANNEL_TRAIN\u001b[39;49;00m\u001b[33m'\u001b[39;49;00m])\r\n \r\n \u001b[37m## TODO: Add any additional arguments that you will need to pass into your model\u001b[39;49;00m\r\n \r\n \u001b[37m# args holds all passed-in arguments\u001b[39;49;00m\r\n args = parser.parse_args()\r\n\r\n \u001b[37m# Read in csv training file\u001b[39;49;00m\r\n training_dir = args.data_dir\r\n train_data = pd.read_csv(os.path.join(training_dir, 
\u001b[33m\"\u001b[39;49;00m\u001b[33mtrain.csv\u001b[39;49;00m\u001b[33m\"\u001b[39;49;00m), header=\u001b[36mNone\u001b[39;49;00m, names=\u001b[36mNone\u001b[39;49;00m)\r\n\r\n \u001b[37m# Labels are in the first column\u001b[39;49;00m\r\n train_y = train_data.iloc[:,\u001b[34m0\u001b[39;49;00m]\r\n train_x = train_data.iloc[:,\u001b[34m1\u001b[39;49;00m:]\r\n \r\n \r\n \u001b[37m## --- Your code here --- ##\u001b[39;49;00m\r\n \r\n\r\n \u001b[37m## TODO: Define a model \u001b[39;49;00m\r\n model = \u001b[36mNone\u001b[39;49;00m\r\n \r\n \r\n \u001b[37m## TODO: Train the model\u001b[39;49;00m\r\n \r\n \r\n \r\n \u001b[37m## --- End of your code --- ##\u001b[39;49;00m\r\n \r\n\r\n \u001b[37m# Save the trained model\u001b[39;49;00m\r\n joblib.dump(model, os.path.join(args.model_dir, \u001b[33m\"\u001b[39;49;00m\u001b[33mmodel.joblib\u001b[39;49;00m\u001b[33m\"\u001b[39;49;00m))\r\n"
]
],
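For comparison, the `TODO` sections of the Scikit-learn script could be completed along these lines -- a minimal sketch assuming a RandomForestClassifier, whereas this notebook ultimately trains a PyTorch model instead:

```python
# inside the `if __name__ == '__main__':` block of source_sklearn/train.py
from sklearn.ensemble import RandomForestClassifier

# define a model (hyperparameters are illustrative, not tuned)
model = RandomForestClassifier(n_estimators=100, random_state=1)

# train the model on the features/labels loaded above
model.fit(train_x, train_y)
```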
[
[
"### Provided code\n\nIf you read the code above, you can see that the starter code includes a few things:\n* Model loading (`model_fn`) and saving code\n* Getting SageMaker's default hyperparameters\n* Loading the training data by name, `train.csv` and extracting the features and labels, `train_x`, and `train_y`\n\nIf you'd like to read more about model saving with [joblib for sklearn](https://scikit-learn.org/stable/modules/model_persistence.html) or with [torch.save](https://pytorch.org/tutorials/beginner/saving_loading_models.html), click on the provided links.",
"_____no_output_____"
],
[
"---\n# Create an Estimator\n\nWhen a custom model is constructed in SageMaker, an entry point must be specified. This is the Python file which will be executed when the model is trained; the `train.py` function you specified above. To run a custom training script in SageMaker, construct an estimator, and fill in the appropriate constructor arguments:\n\n* **entry_point**: The path to the Python script SageMaker runs for training and prediction.\n* **source_dir**: The path to the training script directory `source_sklearn` OR `source_pytorch`.\n* **entry_point**: The path to the Python script SageMaker runs for training and prediction.\n* **source_dir**: The path to the training script directory `train_sklearn` OR `train_pytorch`.\n* **entry_point**: The path to the Python script SageMaker runs for training.\n* **source_dir**: The path to the training script directory `train_sklearn` OR `train_pytorch`.\n* **role**: Role ARN, which was specified, above.\n* **train_instance_count**: The number of training instances (should be left at 1).\n* **train_instance_type**: The type of SageMaker instance for training. Note: Because Scikit-learn does not natively support GPU training, Sagemaker Scikit-learn does not currently support training on GPU instance types.\n* **sagemaker_session**: The session used to train on Sagemaker.\n* **hyperparameters** (optional): A dictionary `{'name':value, ..}` passed to the train function as hyperparameters.\n\nNote: For a PyTorch model, there is another optional argument **framework_version**, which you can set to the latest version of PyTorch, `1.0`.\n\n## EXERCISE: Define a Scikit-learn or PyTorch estimator\n\nTo import your desired estimator, use one of the following lines:\n```\nfrom sagemaker.sklearn.estimator import SKLearn\n```\n```\nfrom sagemaker.pytorch import PyTorch\n```",
"_____no_output_____"
]
],
[
[
"# your import and estimator code, here\n# import a PyTorch wrapper\nfrom sagemaker.pytorch import PyTorch\n\n# specify an output path\noutput_path = f\"s3://{bucket}/{prefix}\"\n\n# instantiate a pytorch estimator\nestimator = PyTorch(\n entry_point=\"train.py\",\n source_dir=\"source_pytorch\",\n role=role,\n framework_version=\"1.0\",\n train_instance_count=1,\n train_instance_type=\"ml.c4.xlarge\",\n output_path=output_path,\n sagemaker_session=sagemaker_session,\n hyperparameters={\n \"input_features\": 2,\n \"hidden_dim\": 20, \n \"output_dim\": 1,\n \"epochs\": 160\n })\n",
"_____no_output_____"
]
],
[
[
"## EXERCISE: Train the estimator\n\nTrain your estimator on the training data stored in S3. This should create a training job that you can monitor in your SageMaker console.",
"_____no_output_____"
]
],
[
[
"train_data_path = input_data + \"/train.csv\"\nprint(train_data_path)",
"s3://sagemaker-us-west-2-203336335427/sagemaker/plagiarism-data/train.csv\n"
],
[
"%%time\n\n# Train your estimator on S3 training data\nestimator.fit({'train': train_data_path})",
"2019-06-22 14:23:18 Starting - Starting the training job...\n2019-06-22 14:23:33 Starting - Launching requested ML instances.........\n2019-06-22 14:25:00 Starting - Preparing the instances for training......\n2019-06-22 14:26:15 Downloading - Downloading input data\n2019-06-22 14:26:15 Training - Downloading the training image..\n\u001b[31mbash: cannot set terminal process group (-1): Inappropriate ioctl for device\u001b[0m\n\u001b[31mbash: no job control in this shell\u001b[0m\n\u001b[31m2019-06-22 14:26:28,875 sagemaker-containers INFO Imported framework sagemaker_pytorch_container.training\u001b[0m\n\u001b[31m2019-06-22 14:26:28,878 sagemaker-containers INFO No GPUs detected (normal if no gpus installed)\u001b[0m\n\u001b[31m2019-06-22 14:26:28,890 sagemaker_pytorch_container.training INFO Block until all host DNS lookups succeed.\u001b[0m\n\u001b[31m2019-06-22 14:26:30,309 sagemaker_pytorch_container.training INFO Invoking user training script.\u001b[0m\n\u001b[31m2019-06-22 14:26:30,542 sagemaker-containers INFO Module train does not provide a setup.py. \u001b[0m\n\u001b[31mGenerating setup.py\u001b[0m\n\u001b[31m2019-06-22 14:26:30,543 sagemaker-containers INFO Generating setup.cfg\u001b[0m\n\u001b[31m2019-06-22 14:26:30,543 sagemaker-containers INFO Generating MANIFEST.in\u001b[0m\n\u001b[31m2019-06-22 14:26:30,543 sagemaker-containers INFO Installing module with the following command:\u001b[0m\n\u001b[31m/usr/bin/python -m pip install -U . \u001b[0m\n\u001b[31mProcessing /opt/ml/code\u001b[0m\n\u001b[31mBuilding wheels for collected packages: train\n Running setup.py bdist_wheel for train: started\n Running setup.py bdist_wheel for train: finished with status 'done'\n Stored in directory: /tmp/pip-ephem-wheel-cache-1cbf4gh0/wheels/35/24/16/37574d11bf9bde50616c67372a334f94fa8356bc7164af8ca3\u001b[0m\n\u001b[31mSuccessfully built train\u001b[0m\n\u001b[31mInstalling collected packages: train\u001b[0m\n\u001b[31mSuccessfully installed train-1.0.0\u001b[0m\n\u001b[31mYou are using pip version 18.1, however version 19.1.1 is available.\u001b[0m\n\u001b[31mYou should consider upgrading via the 'pip install --upgrade pip' command.\u001b[0m\n\u001b[31m2019-06-22 14:26:32,183 sagemaker-containers INFO No GPUs detected (normal if no gpus installed)\u001b[0m\n\u001b[31m2019-06-22 14:26:32,195 sagemaker-containers INFO Invoking user script\n\u001b[0m\n\u001b[31mTraining Env:\n\u001b[0m\n\u001b[31m{\n \"additional_framework_parameters\": {},\n \"channel_input_dirs\": {\n \"train\": \"/opt/ml/input/data/train\"\n },\n \"current_host\": \"algo-1\",\n \"framework_module\": \"sagemaker_pytorch_container.training:main\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"hyperparameters\": {\n \"hidden_dim\": 20,\n \"input_features\": 2,\n \"epochs\": 160,\n \"output_dim\": 1\n },\n \"input_config_dir\": \"/opt/ml/input/config\",\n \"input_data_config\": {\n \"train\": {\n \"TrainingInputMode\": \"File\",\n \"S3DistributionType\": \"FullyReplicated\",\n \"RecordWrapperType\": \"None\"\n }\n },\n \"input_dir\": \"/opt/ml/input\",\n \"is_master\": true,\n \"job_name\": \"sagemaker-pytorch-2019-06-22-14-23-17-874\",\n \"log_level\": 20,\n \"master_hostname\": \"algo-1\",\n \"model_dir\": \"/opt/ml/model\",\n \"module_dir\": \"s3://sagemaker-us-west-2-203336335427/sagemaker-pytorch-2019-06-22-14-23-17-874/source/sourcedir.tar.gz\",\n \"module_name\": \"train\",\n \"network_interface_name\": \"eth0\",\n \"num_cpus\": 4,\n \"num_gpus\": 0,\n \"output_data_dir\": \"/opt/ml/output/data\",\n \"output_dir\": 
\"/opt/ml/output\",\n \"output_intermediate_dir\": \"/opt/ml/output/intermediate\",\n \"resource_config\": {\n \"current_host\": \"algo-1\",\n \"hosts\": [\n \"algo-1\"\n ],\n \"network_interface_name\": \"eth0\"\n },\n \"user_entry_point\": \"train.py\"\u001b[0m\n\u001b[31m}\n\u001b[0m\n\u001b[31mEnvironment variables:\n\u001b[0m\n\u001b[31mSM_HOSTS=[\"algo-1\"]\u001b[0m\n\u001b[31mSM_NETWORK_INTERFACE_NAME=eth0\u001b[0m\n\u001b[31mSM_HPS={\"epochs\":160,\"hidden_dim\":20,\"input_features\":2,\"output_dim\":1}\u001b[0m\n\u001b[31mSM_USER_ENTRY_POINT=train.py\u001b[0m\n\u001b[31mSM_FRAMEWORK_PARAMS={}\u001b[0m\n\u001b[31mSM_RESOURCE_CONFIG={\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"}\u001b[0m\n\u001b[31mSM_INPUT_DATA_CONFIG={\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}}\u001b[0m\n\u001b[31mSM_OUTPUT_DATA_DIR=/opt/ml/output/data\u001b[0m\n\u001b[31mSM_CHANNELS=[\"train\"]\u001b[0m\n\u001b[31mSM_CURRENT_HOST=algo-1\u001b[0m\n\u001b[31mSM_MODULE_NAME=train\u001b[0m\n\u001b[31mSM_LOG_LEVEL=20\u001b[0m\n\u001b[31mSM_FRAMEWORK_MODULE=sagemaker_pytorch_container.training:main\u001b[0m\n\u001b[31mSM_INPUT_DIR=/opt/ml/input\u001b[0m\n\u001b[31mSM_INPUT_CONFIG_DIR=/opt/ml/input/config\u001b[0m\n\u001b[31mSM_OUTPUT_DIR=/opt/ml/output\u001b[0m\n\u001b[31mSM_NUM_CPUS=4\u001b[0m\n\u001b[31mSM_NUM_GPUS=0\u001b[0m\n\u001b[31mSM_MODEL_DIR=/opt/ml/model\u001b[0m\n\u001b[31mSM_MODULE_DIR=s3://sagemaker-us-west-2-203336335427/sagemaker-pytorch-2019-06-22-14-23-17-874/source/sourcedir.tar.gz\u001b[0m\n\u001b[31mSM_TRAINING_ENV={\"additional_framework_parameters\":{},\"channel_input_dirs\":{\"train\":\"/opt/ml/input/data/train\"},\"current_host\":\"algo-1\",\"framework_module\":\"sagemaker_pytorch_container.training:main\",\"hosts\":[\"algo-1\"],\"hyperparameters\":{\"epochs\":160,\"hidden_dim\":20,\"input_features\":2,\"output_dim\":1},\"input_config_dir\":\"/opt/ml/input/config\",\"input_data_config\":{\"train\":{\"RecordWrapperType\":\"None\",\"S3DistributionType\":\"FullyReplicated\",\"TrainingInputMode\":\"File\"}},\"input_dir\":\"/opt/ml/input\",\"is_master\":true,\"job_name\":\"sagemaker-pytorch-2019-06-22-14-23-17-874\",\"log_level\":20,\"master_hostname\":\"algo-1\",\"model_dir\":\"/opt/ml/model\",\"module_dir\":\"s3://sagemaker-us-west-2-203336335427/sagemaker-pytorch-2019-06-22-14-23-17-874/source/sourcedir.tar.gz\",\"module_name\":\"train\",\"network_interface_name\":\"eth0\",\"num_cpus\":4,\"num_gpus\":0,\"output_data_dir\":\"/opt/ml/output/data\",\"output_dir\":\"/opt/ml/output\",\"output_intermediate_dir\":\"/opt/ml/output/intermediate\",\"resource_config\":{\"current_host\":\"algo-1\",\"hosts\":[\"algo-1\"],\"network_interface_name\":\"eth0\"},\"user_entry_point\":\"train.py\"}\u001b[0m\n\u001b[31mSM_USER_ARGS=[\"--epochs\",\"160\",\"--hidden_dim\",\"20\",\"--input_features\",\"2\",\"--output_dim\",\"1\"]\u001b[0m\n\u001b[31mSM_OUTPUT_INTERMEDIATE_DIR=/opt/ml/output/intermediate\u001b[0m\n\u001b[31mSM_CHANNEL_TRAIN=/opt/ml/input/data/train\u001b[0m\n\u001b[31mSM_HP_HIDDEN_DIM=20\u001b[0m\n\u001b[31mSM_HP_INPUT_FEATURES=2\u001b[0m\n\u001b[31mSM_HP_EPOCHS=160\u001b[0m\n\u001b[31mSM_HP_OUTPUT_DIM=1\u001b[0m\n\u001b[31mPYTHONPATH=/usr/local/bin:/usr/lib/python36.zip:/usr/lib/python3.6:/usr/lib/python3.6/lib-dynload:/usr/local/lib/python3.6/dist-packages:/usr/lib/python3/dist-packages\n\u001b[0m\n\u001b[31mInvoking script with the following command:\n\u001b[0m\n\u001b[31m/usr/bin/python 
-m train --epochs 160 --hidden_dim 20 --input_features 2 --output_dim 1\n\n\u001b[0m\n\n2019-06-22 14:26:44 Uploading - Uploading generated training model\n2019-06-22 14:26:44 Completed - Training job completed\n\u001b[31mUsing device cpu.\u001b[0m\n\u001b[31mGet train data loader.\u001b[0m\n\u001b[31mEpoch: 1, Loss: 0.7071206739970616\u001b[0m\n\u001b[31mEpoch: 2, Loss: 0.7037808809961591\u001b[0m\n\u001b[31mEpoch: 3, Loss: 0.7021009496280125\u001b[0m\n\u001b[31mEpoch: 4, Loss: 0.6933521117482867\u001b[0m\n\u001b[31mEpoch: 5, Loss: 0.6866948519434247\u001b[0m\n\u001b[31mEpoch: 6, Loss: 0.6780169095311847\u001b[0m\n\u001b[31mEpoch: 7, Loss: 0.6839605229241508\u001b[0m\n\u001b[31mEpoch: 8, Loss: 0.6744157075881958\u001b[0m\n\u001b[31mEpoch: 9, Loss: 0.651854259627206\u001b[0m\n\u001b[31mEpoch: 10, Loss: 0.6555183359554836\u001b[0m\n\u001b[31mEpoch: 11, Loss: 0.6628325070653643\u001b[0m\n\u001b[31mEpoch: 12, Loss: 0.6543110864503043\u001b[0m\n\u001b[31mEpoch: 13, Loss: 0.6452595591545105\u001b[0m\n\u001b[31mEpoch: 14, Loss: 0.6578087210655212\u001b[0m\n\u001b[31mEpoch: 15, Loss: 0.6551924007279533\u001b[0m\n\u001b[31mEpoch: 16, Loss: 0.6593806062425885\u001b[0m\n\u001b[31mEpoch: 17, Loss: 0.6477518166814532\u001b[0m\n\u001b[31mEpoch: 18, Loss: 0.6344135659081596\u001b[0m\n\u001b[31mEpoch: 19, Loss: 0.6481472594397408\u001b[0m\n\u001b[31mEpoch: 20, Loss: 0.6454513158117022\u001b[0m\n\u001b[31mEpoch: 21, Loss: 0.6502796070916312\u001b[0m\n\u001b[31mEpoch: 22, Loss: 0.6392346620559692\u001b[0m\n\u001b[31mEpoch: 23, Loss: 0.6326409237725394\u001b[0m\n\u001b[31mEpoch: 24, Loss: 0.6241270133427211\u001b[0m\n\u001b[31mEpoch: 25, Loss: 0.6339142577988761\u001b[0m\n\u001b[31mEpoch: 26, Loss: 0.6388565301895142\u001b[0m\n\u001b[31mEpoch: 27, Loss: 0.635871980871473\u001b[0m\n\u001b[31mEpoch: 28, Loss: 0.6284005727086749\u001b[0m\n\u001b[31mEpoch: 29, Loss: 0.6229295475142342\u001b[0m\n\u001b[31mEpoch: 30, Loss: 0.6126991595540728\u001b[0m\n\u001b[31mEpoch: 31, Loss: 0.6224521824291774\u001b[0m\n\u001b[31mEpoch: 32, Loss: 0.6029389074870518\u001b[0m\n\u001b[31mEpoch: 33, Loss: 0.6246544207845416\u001b[0m\n\u001b[31mEpoch: 34, Loss: 0.6066625033106122\u001b[0m\n\u001b[31mEpoch: 35, Loss: 0.6361497981207711\u001b[0m\n\u001b[31mEpoch: 36, Loss: 0.605789235660008\u001b[0m\n\u001b[31mEpoch: 37, Loss: 0.6173808404377529\u001b[0m\n\u001b[31mEpoch: 38, Loss: 0.6165845819881984\u001b[0m\n\u001b[31mEpoch: 39, Loss: 0.6048610891614642\u001b[0m\n\u001b[31mEpoch: 40, Loss: 0.603884858744485\u001b[0m\n\u001b[31mEpoch: 41, Loss: 0.5992041230201721\u001b[0m\n\u001b[31mEpoch: 42, Loss: 0.6058305842535836\u001b[0m\n\u001b[31mEpoch: 43, Loss: 0.6031211018562317\u001b[0m\n\u001b[31mEpoch: 44, Loss: 0.5870840890066964\u001b[0m\n\u001b[31mEpoch: 45, Loss: 0.5785234570503235\u001b[0m\n\u001b[31mEpoch: 46, Loss: 0.5978949069976807\u001b[0m\n\u001b[31mEpoch: 47, Loss: 0.6109089638505664\u001b[0m\n\u001b[31mEpoch: 48, Loss: 0.5906146849904742\u001b[0m\n\u001b[31mEpoch: 49, Loss: 0.582529604434967\u001b[0m\n\u001b[31mEpoch: 50, Loss: 0.5762825948851449\u001b[0m\n\u001b[31mEpoch: 51, Loss: 0.5893087642533439\u001b[0m\n\u001b[31mEpoch: 52, Loss: 0.5761790190424237\u001b[0m\n\u001b[31mEpoch: 53, Loss: 0.5832474402018956\u001b[0m\n\u001b[31mEpoch: 54, Loss: 0.593135586806706\u001b[0m\n\u001b[31mEpoch: 55, Loss: 0.5601540293012347\u001b[0m\n\u001b[31mEpoch: 56, Loss: 0.5599925475461143\u001b[0m\n\u001b[31mEpoch: 57, Loss: 0.5537959848131452\u001b[0m\n\u001b[31mEpoch: 58, Loss: 0.5797965739454541\u001b[0m\n\u001b[31mEpoch: 59, Loss: 
0.5551231844084603\u001b[0m\n\u001b[31mEpoch: 60, Loss: 0.580093617950167\u001b[0m\n\u001b[31mEpoch: 61, Loss: 0.5606128488268171\u001b[0m\n\u001b[31mEpoch: 62, Loss: 0.5406584228788104\u001b[0m\n\u001b[31mEpoch: 63, Loss: 0.5698254193578448\u001b[0m\n\u001b[31mEpoch: 64, Loss: 0.5662476548126766\u001b[0m\n\u001b[31mEpoch: 65, Loss: 0.5479061688695636\u001b[0m\n\u001b[31mEpoch: 66, Loss: 0.5619021568979535\u001b[0m\n\u001b[31mEpoch: 67, Loss: 0.5370438184056964\u001b[0m\n\u001b[31mEpoch: 68, Loss: 0.5428952063832965\u001b[0m\n\u001b[31mEpoch: 69, Loss: 0.531809708901814\u001b[0m\n\u001b[31mEpoch: 70, Loss: 0.5416172742843628\u001b[0m\n\u001b[31mEpoch: 71, Loss: 0.5373765357903072\u001b[0m\n\u001b[31mEpoch: 72, Loss: 0.5277367915425982\u001b[0m\n\u001b[31mEpoch: 73, Loss: 0.5386972086770194\u001b[0m\n\u001b[31mEpoch: 74, Loss: 0.5482018036501748\u001b[0m\n\u001b[31mEpoch: 75, Loss: 0.5396692284515926\u001b[0m\n\u001b[31mEpoch: 76, Loss: 0.5409984886646271\u001b[0m\n\u001b[31mEpoch: 77, Loss: 0.5290544203349522\u001b[0m\n\u001b[31mEpoch: 78, Loss: 0.5177273537431445\u001b[0m\n\u001b[31mEpoch: 79, Loss: 0.5357641407421657\u001b[0m\n\u001b[31mEpoch: 80, Loss: 0.5237000371728625\u001b[0m\n\u001b[31mEpoch: 81, Loss: 0.5373982063361576\u001b[0m\n\u001b[31mEpoch: 82, Loss: 0.5206293208258492\u001b[0m\n\u001b[31mEpoch: 83, Loss: 0.5282896842275348\u001b[0m\n\u001b[31mEpoch: 84, Loss: 0.5116998212678092\u001b[0m\n\u001b[31mEpoch: 85, Loss: 0.5148033499717712\u001b[0m\n\u001b[31mEpoch: 86, Loss: 0.5245996458189828\u001b[0m\n\u001b[31mEpoch: 87, Loss: 0.5077688183103289\u001b[0m\n\u001b[31mEpoch: 88, Loss: 0.5155066336904254\u001b[0m\n\u001b[31mEpoch: 89, Loss: 0.5165118958268847\u001b[0m\n\u001b[31mEpoch: 90, Loss: 0.5185291171073914\u001b[0m\n\u001b[31mEpoch: 91, Loss: 0.521134010383061\u001b[0m\n\u001b[31mEpoch: 92, Loss: 0.5066553269113813\u001b[0m\n\u001b[31mEpoch: 93, Loss: 0.5268582318510328\u001b[0m\n\u001b[31mEpoch: 94, Loss: 0.49145709191049847\u001b[0m\n\u001b[31mEpoch: 95, Loss: 0.49894087655203684\u001b[0m\n\u001b[31mEpoch: 96, Loss: 0.5112712468419757\u001b[0m\n\u001b[31mEpoch: 97, Loss: 0.48671725392341614\u001b[0m\n\u001b[31mEpoch: 98, Loss: 0.5093451951231275\u001b[0m\n\u001b[31mEpoch: 99, Loss: 0.5039170512131282\u001b[0m\n\u001b[31mEpoch: 100, Loss: 0.4992073050567082\u001b[0m\n\u001b[31mEpoch: 101, Loss: 0.4788713668073927\u001b[0m\n\u001b[31mEpoch: 102, Loss: 0.4924752286502293\u001b[0m\n\u001b[31mEpoch: 103, Loss: 0.47363746591976713\u001b[0m\n\u001b[31mEpoch: 104, Loss: 0.4894975083214896\u001b[0m\n\u001b[31mEpoch: 105, Loss: 0.4861730805465153\u001b[0m\n\u001b[31mEpoch: 106, Loss: 0.4948428784097944\u001b[0m\n\u001b[31mEpoch: 107, Loss: 0.47396051457950045\u001b[0m\n\u001b[31mEpoch: 108, Loss: 0.45400748934064594\u001b[0m\n\u001b[31mEpoch: 109, Loss: 0.470756219966071\u001b[0m\n\u001b[31mEpoch: 110, Loss: 0.4751508746828352\u001b[0m\n\u001b[31mEpoch: 111, Loss: 0.4811019003391266\u001b[0m\n\u001b[31mEpoch: 112, Loss: 0.467385470867157\u001b[0m\n\u001b[31mEpoch: 113, Loss: 0.4589931198528835\u001b[0m\n\u001b[31mEpoch: 114, Loss: 0.46190227781023296\u001b[0m\n\u001b[31mEpoch: 115, Loss: 0.46634296008518766\u001b[0m\n\u001b[31mEpoch: 116, Loss: 0.4664492394242968\u001b[0m\n\u001b[31mEpoch: 117, Loss: 0.48572804246629986\u001b[0m\n\u001b[31mEpoch: 118, Loss: 0.47120015961783274\u001b[0m\n\u001b[31mEpoch: 119, Loss: 0.4487059073788779\u001b[0m\n\u001b[31mEpoch: 120, Loss: 0.4761164401258741\u001b[0m\n\u001b[31mEpoch: 121, Loss: 0.46036294954163687\u001b[0m\n\u001b[31mEpoch: 122, 
Loss: 0.44780620081084116\u001b[0m\n\u001b[31mEpoch: 123, Loss: 0.4454659947327205\u001b[0m\n\u001b[31mEpoch: 124, Loss: 0.44227770396641325\u001b[0m\n\u001b[31mEpoch: 125, Loss: 0.4427126092570169\u001b[0m\n\u001b[31mEpoch: 126, Loss: 0.43006190231868197\u001b[0m\n\u001b[31mEpoch: 127, Loss: 0.43163258263043\u001b[0m\n\u001b[31mEpoch: 128, Loss: 0.4291735717228481\u001b[0m\n\u001b[31mEpoch: 129, Loss: 0.42120826670101713\u001b[0m\n\u001b[31mEpoch: 130, Loss: 0.4262322613171169\u001b[0m\n\u001b[31mEpoch: 131, Loss: 0.4448316012110029\u001b[0m\n\u001b[31mEpoch: 132, Loss: 0.43400382144110544\u001b[0m\n\u001b[31mEpoch: 133, Loss: 0.45890144790921894\u001b[0m\n\u001b[31mEpoch: 134, Loss: 0.43256732395717074\u001b[0m\n\u001b[31mEpoch: 135, Loss: 0.4140941415514265\u001b[0m\n\u001b[31mEpoch: 136, Loss: 0.41966648612703594\u001b[0m\n\u001b[31mEpoch: 137, Loss: 0.4354919067450932\u001b[0m\n\u001b[31mEpoch: 138, Loss: 0.42835437825747896\u001b[0m\n\u001b[31mEpoch: 139, Loss: 0.43030776722090586\u001b[0m\n\u001b[31mEpoch: 140, Loss: 0.4246128669806889\u001b[0m\n\u001b[31mEpoch: 141, Loss: 0.4140352521623884\u001b[0m\n\u001b[31mEpoch: 142, Loss: 0.4148673287459782\u001b[0m\n\u001b[31mEpoch: 143, Loss: 0.4313798078468868\u001b[0m\n\u001b[31mEpoch: 144, Loss: 0.4247971943446568\u001b[0m\n\u001b[31mEpoch: 145, Loss: 0.38768674220357624\u001b[0m\n\u001b[31mEpoch: 146, Loss: 0.4048320140157427\u001b[0m\n\u001b[31mEpoch: 147, Loss: 0.4054703286715916\u001b[0m\n\u001b[31mEpoch: 148, Loss: 0.415200229201998\u001b[0m\n\u001b[31mEpoch: 149, Loss: 0.4073622226715088\u001b[0m\n\u001b[31mEpoch: 150, Loss: 0.3949581469808306\u001b[0m\n\u001b[31mEpoch: 151, Loss: 0.3985371525798525\u001b[0m\n\u001b[31mEpoch: 152, Loss: 0.39647333536829266\u001b[0m\n\u001b[31mEpoch: 153, Loss: 0.3768504432269505\u001b[0m\n\u001b[31mEpoch: 154, Loss: 0.40951140012059895\u001b[0m\n\u001b[31mEpoch: 155, Loss: 0.3976847401687077\u001b[0m\n\u001b[31mEpoch: 156, Loss: 0.3805894191776003\u001b[0m\n\u001b[31mEpoch: 157, Loss: 0.40720964329583303\u001b[0m\n\u001b[31mEpoch: 158, Loss: 0.39968447600092205\u001b[0m\n\u001b[31mEpoch: 159, Loss: 0.3937579478536333\u001b[0m\n\u001b[31mEpoch: 160, Loss: 0.41074818798473905\u001b[0m\n\u001b[31m2019-06-22 14:26:34,269 sagemaker-containers INFO Reporting training SUCCESS\u001b[0m\nBillable seconds: 52\nCPU times: user 486 ms, sys: 13.7 ms, total: 500 ms\nWall time: 3min 42s\n"
]
],
[
[
"## EXERCISE: Deploy the trained model\n\nAfter training, deploy your model to create a `predictor`. If you're using a PyTorch model, you'll need to create a trained `PyTorchModel` that accepts the trained `<model>.model_data` as an input parameter and points to the provided `source_pytorch/predict.py` file as an entry point. \n\nTo deploy a trained model, you'll use `<model>.deploy`, which takes in two arguments:\n* **initial_instance_count**: The number of deployed instances (1).\n* **instance_type**: The type of SageMaker instance for deployment.\n\nNote: If you run into an instance error, it may be because you chose the wrong training or deployment instance_type. It may help to refer to your previous exercise code to see which types of instances we used.",
"_____no_output_____"
]
],
[
[
"%%time\n\n# uncomment, if needed\nfrom sagemaker.pytorch import PyTorchModel\n\nmodel = PyTorchModel(\n entry_point=\"predict.py\",\n role=role, \n framework_version=\"1.0\",\n model_data=estimator.model_data,\n source_dir=\"source_pytorch\"\n)\n\n# deploy your model to create a predictor\npredictor = model.deploy(initial_instance_count=1, instance_type=\"ml.t2.medium\")\n",
"--------------------------------------------------------------------------------------!CPU times: user 635 ms, sys: 44.5 ms, total: 680 ms\nWall time: 7min 15s\n"
]
],
[
[
"---\n# Evaluating Your Model\n\nOnce your model is deployed, you can see how it performs when applied to our test data.\n\nThe provided cell below, reads in the test data, assuming it is stored locally in `data_dir` and named `test.csv`. The labels and features are extracted from the `.csv` file.",
"_____no_output_____"
]
],
[
[
"\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\nimport os\n\n# read in test data, assuming it is stored locally\ntest_data = pd.read_csv(os.path.join(data_dir, \"test.csv\"), header=None, names=None)\n\n# labels are in the first column\ntest_y = test_data.iloc[:,0]\ntest_x = test_data.iloc[:,1:]",
"_____no_output_____"
]
],
[
[
"## EXERCISE: Determine the accuracy of your model\n\nUse your deployed `predictor` to generate predicted, class labels for the test data. Compare those to the *true* labels, `test_y`, and calculate the accuracy as a value between 0 and 1.0 that indicates the fraction of test data that your model classified correctly. You may use [sklearn.metrics](https://scikit-learn.org/stable/modules/classes.html#module-sklearn.metrics) for this calculation.\n\n**To pass this project, your model should get at least 90% test accuracy.**",
"_____no_output_____"
]
],
[
[
"# First: generate predicted, class labels\ntest_y_preds = predictor.predict(test_x)\n\n\n\"\"\"\nDON'T MODIFY ANYTHING IN THIS CELL THAT IS BELOW THIS LINE\n\"\"\"\n# test that your model generates the correct number of labels\nassert len(test_y_preds)==len(test_y), 'Unexpected number of predictions.'\nprint('Test passed!')",
"Test passed!\n"
],
[
"# Second: calculate the test accuracy\nfrom sklearn.metrics import accuracy_score\n\naccuracy = accuracy_score(test_y, test_y_preds)\n\nprint(accuracy)\n\n\n## print out the array of predicted and true labels, if you want\nprint('\\nPredicted class labels: ')\nprint(test_y_preds)\nprint('\\nTrue class labels: ')\nprint(test_y.values)",
"0.96\n\nPredicted class labels: \n[[1.]\n [1.]\n [1.]\n [1.]\n [1.]\n [1.]\n [1.]\n [0.]\n [0.]\n [0.]\n [0.]\n [0.]\n [1.]\n [1.]\n [1.]\n [1.]\n [1.]\n [1.]\n [0.]\n [1.]\n [0.]\n [1.]\n [1.]\n [0.]\n [0.]]\n\nTrue class labels: \n[1 1 1 1 1 1 0 0 0 0 0 0 1 1 1 1 1 1 0 1 0 1 1 0 0]\n"
]
],
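Note that the endpoint returns a column of floats while `test_y` holds integer labels; `accuracy_score` copes with this here, but explicitly rounding and flattening first is the safer habit (a sketch):

```python
import numpy as np
from sklearn.metrics import accuracy_score

# squeeze the (25, 1) float column into a flat int vector before scoring
preds = np.round(np.squeeze(test_y_preds)).astype(int)
print(accuracy_score(test_y, preds))
```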
[
[
"### Question 1: How many false positives and false negatives did your model produce, if any? And why do you think this is?",
"_____no_output_____"
],
[
"** Answer**: \n",
"_____no_output_____"
]
],
[
[
"# code to evaluate the endpoint on test data\n# returns a variety of model metrics\ndef evaluate(test_preds, test_labels, verbose=True):\n # rounding and squeezing array\n test_preds = np.squeeze(np.round(test_preds))\n \n # calculate true positives, false positives, true negatives, false negatives\n tp = np.logical_and(test_labels, test_preds).sum()\n fp = np.logical_and(1-test_labels, test_preds).sum()\n tn = np.logical_and(1-test_labels, 1-test_preds).sum()\n fn = np.logical_and(test_labels, 1-test_preds).sum()\n \n # calculate binary classification metrics\n recall = tp / (tp + fn)\n precision = tp / (tp + fp)\n accuracy = (tp + tn) / (tp + fp + tn + fn)\n \n # print metrics\n if verbose:\n print(pd.crosstab(test_labels, test_preds, rownames=['actuals'], colnames=['predictions']))\n print(\"\\n{:<11} {:.3f}\".format('Recall:', recall))\n print(\"{:<11} {:.3f}\".format('Precision:', precision))\n print(\"{:<11} {:.3f}\".format('Accuracy:', accuracy))\n print()\n \n return {'TP': tp, 'FP': fp, 'FN': fn, 'TN': tn, \n 'Precision': precision, 'Recall': recall, 'Accuracy': accuracy}\n\n\nmetrics = evaluate(test_y_preds, test_y.values, True)",
"predictions 0.0 1.0\nactuals \n0 9 1\n1 0 15\n\nRecall: 1.000\nPrecision: 0.938\nAccuracy: 0.960\n\n"
]
],
[
[
"false positives is 1 and false negatives is 0. The result is pretty good. The reason may be 1. sample is small, 2. features is not enough and didn't describe too much characters of the text. ",
"_____no_output_____"
],
[
"### Question 2: How did you decide on the type of model to use? ",
"_____no_output_____"
],
[
"** Answer**:\nThe basic model of sklearn and pytorch are all linear model. The problem is linear inseparable. Thus, deep learning is better for this problem because the pytorch model stacks two layer linear models. ",
"_____no_output_____"
],
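The `model.py` behind this training job is not shown in the notebook; a minimal two-layer binary classifier consistent with the hyperparameters used above (`input_features=2`, `hidden_dim=20`, `output_dim=1`) might look like this -- an assumption, not the author's exact code:

```python
import torch.nn as nn

class BinaryClassifier(nn.Module):
    def __init__(self, input_features, hidden_dim, output_dim):
        super().__init__()
        # two stacked layers with a nonlinearity, so the decision boundary need not be linear
        self.fc1 = nn.Linear(input_features, hidden_dim)
        self.fc2 = nn.Linear(hidden_dim, output_dim)
        self.relu = nn.ReLU()
        self.sig = nn.Sigmoid()

    def forward(self, x):
        return self.sig(self.fc2(self.relu(self.fc1(x))))
```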
[
"----\n## EXERCISE: Clean up Resources\n\nAfter you're done evaluating your model, **delete your model endpoint**. You can do this with a call to `.delete_endpoint()`. You need to show, in this notebook, that the endpoint was deleted. Any other resources, you may delete from the AWS console, and you will find more instructions on cleaning up all your resources, below.",
"_____no_output_____"
]
],
[
[
"# uncomment and fill in the line below!\npredictor.delete_endpoint()\n",
"_____no_output_____"
]
],
[
[
"### Deleting S3 bucket\n\nWhen you are *completely* done with training and testing models, you can also delete your entire S3 bucket. If you do this before you are done training your model, you'll have to recreate your S3 bucket and upload your training data again.",
"_____no_output_____"
]
],
[
[
"# deleting bucket, uncomment lines below\n\n# bucket_to_delete = boto3.resource('s3').Bucket(bucket)\n# bucket_to_delete.objects.all().delete()",
"_____no_output_____"
]
],
[
[
"### Deleting all your models and instances\n\nWhen you are _completely_ done with this project and do **not** ever want to revisit this notebook, you can choose to delete all of your SageMaker notebook instances and models by following [these instructions](https://docs.aws.amazon.com/sagemaker/latest/dg/ex1-cleanup.html). Before you delete this notebook instance, I recommend at least downloading a copy and saving it, locally.",
"_____no_output_____"
],
[
"---\n## Further Directions\n\nThere are many ways to improve or add on to this project to expand your learning or make this more of a unique project for you. A few ideas are listed below:\n* Train a classifier to predict the *category* (1-3) of plagiarism and not just plagiarized (1) or not (0).\n* Utilize a different and larger dataset to see if this model can be extended to other types of plagiarism.\n* Use language or character-level analysis to find different (and more) similarity features.\n* Write a complete pipeline function that accepts a source text and submitted text file, and classifies the submitted text as plagiarized or not.\n* Use API Gateway and a lambda function to deploy your model to a web application.\n\nThese are all just options for extending your work. If you've completed all the exercises in this notebook, you've completed a real-world application, and can proceed to submit your project. Great job!",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
]
] |
cbfbe7869a2df9bb93fa554b1ce81d5482748e0c
| 71,198 |
ipynb
|
Jupyter Notebook
|
udacity-dl/3-linear-algebra/Linear Regression Project.ipynb
|
yu-george/ml
|
8eedc62df9e7f37312bba39fa45bb9e9c6361028
|
[
"Unlicense"
] | null | null | null |
udacity-dl/3-linear-algebra/Linear Regression Project.ipynb
|
yu-george/ml
|
8eedc62df9e7f37312bba39fa45bb9e9c6361028
|
[
"Unlicense"
] | null | null | null |
udacity-dl/3-linear-algebra/Linear Regression Project.ipynb
|
yu-george/ml
|
8eedc62df9e7f37312bba39fa45bb9e9c6361028
|
[
"Unlicense"
] | null | null | null | 53.451952 | 15,196 | 0.710399 |
[
[
[
"# ไปปๆ้ไธไธชไฝ ๅๆฌข็ๆดๆฐ๏ผ่ฟ่ฝๅธฎไฝ ๅพๅฐ็จณๅฎ็็ปๆ\nseed = 2333 # todo",
"_____no_output_____"
]
],
[
[
"# ๆฌข่ฟๆฅๅฐ็บฟๆงๅๅฝ้กน็ฎ\n\n่ฅ้กน็ฎไธญ็้ข็ฎๆๅฐ้พๆฒกๅฎๆไนๆฒกๅ
ณ็ณป๏ผๆไปฌ้ผๅฑไฝ ๅธฆ็้ฎ้ขๆไบค้กน็ฎ๏ผ่ฏๅฎกไบบไผ็ปไบไฝ ่ฏธๅคๅธฎๅฉใ\n\nๆๆ้ๅ้ข้ฝๅฏไปฅไธๅ๏ผไธๅฝฑๅ้กน็ฎ้่ฟใๅฆๆไฝ ๅไบ๏ผ้ฃไน้กน็ฎ่ฏๅฎกไผๅธฎไฝ ๆนๆน๏ผไนไผๅ ไธบ้ๅ้จๅๅ้่ๅคๅฎไธบไธ้่ฟใ\n\nๅ
ถไธญ้ไปฃ็ ้ขๅฏไปฅๆไบคๆๅๅๆซๆ็ pdf ๆไปถ๏ผๆไฝฟ็จ Latex ๅจๆๆกฃไธญ็ดๆฅๅ็ญใ",
"_____no_output_____"
],
[
"# 1 ็ฉ้ต่ฟ็ฎ\n\n## 1.1 ๅๅปบไธไธช 4*4 ็ๅไฝ็ฉ้ต",
"_____no_output_____"
]
],
[
[
"# ่ฟไธช้กน็ฎ่ฎพ่ฎกๆฅๅธฎไฝ ็ๆ python list ๅ็บฟๆงไปฃๆฐ\n# ไฝ ไธ่ฝ่ฐ็จไปปไฝNumPyไปฅๅ็ธๅ
ณ็็งๅญฆ่ฎก็ฎๅบๆฅๅฎๆไฝไธ\n\n\n# ๆฌ้กน็ฎ่ฆๆฑ็ฉ้ต็ปไธไฝฟ็จไบ็ปดๅ่กจ่กจ็คบ๏ผๅฆไธ๏ผ\nA = [[1,2,3], \n [2,3,3], \n [1,2,5]]\n\nB = [[1,2,3,5], \n [2,3,3,5], \n [1,2,5,1]]\n\n# ๅ้ไน็จไบ็ปดๅ่กจ่กจ็คบ\nC = [[1],\n [2],\n [3]]\n\n#TODO ๅๅปบไธไธช 4*4 ๅไฝ็ฉ้ต\nI = [[1,0,0,0],\n [0,1,0,0],\n [0,0,1,0],\n [0,0,0,1]]",
"_____no_output_____"
]
],
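A general n x n identity helper in the same list-of-lists convention -- a side note, not required by the assignment:

```python
def identity(n):
    # 1 on the diagonal, 0 elsewhere
    return [[1 if r == c else 0 for c in range(n)] for r in range(n)]
```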
[
[
"## 1.2 ่ฟๅ็ฉ้ต็่กๆฐๅๅๆฐ",
"_____no_output_____"
]
],
[
[
"# TODO ่ฟๅ็ฉ้ต็่กๆฐๅๅๆฐ\ndef shape(M):\n return len(M), len(M[0])",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ shape ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_shape",
".\n----------------------------------------------------------------------\nRan 1 test in 0.001s\n\nOK\n"
]
],
[
[
"## 1.3 ๆฏไธชๅ
็ด ๅ่ไบๅ
ฅๅฐ็นๅฎๅฐๆฐๆฐไฝ",
"_____no_output_____"
]
],
[
[
"# TODO ๆฏไธชๅ
็ด ๅ่ไบๅ
ฅๅฐ็นๅฎๅฐๆฐๆฐไฝ\n# ็ดๆฅไฟฎๆนๅๆฐ็ฉ้ต๏ผๆ ่ฟๅๅผ\ndef matxRound(M, decPts=4):\n for row in range(len(M)):\n for col in range(len(M[row])):\n M[row][col] = round(M[row][col], decPts)",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ matxRound ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_matxRound",
".\n----------------------------------------------------------------------\nRan 1 test in 0.010s\n\nOK\n"
]
],
[
[
"## 1.4 ่ฎก็ฎ็ฉ้ต็่ฝฌ็ฝฎ",
"_____no_output_____"
]
],
[
[
"# TODO ่ฎก็ฎ็ฉ้ต็่ฝฌ็ฝฎ\ndef transpose(M):\n rows, cols = shape(M)\n new = []\n for col in range(cols):\n new.append([])\n for row in range(rows):\n new[col].append(0)\n for row in range(rows):\n for col in range(cols):\n new[col][row] = M[row][col]\n return new\n\n# Pythonic approach\n# return [list(col) for col in zip(*M)]",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ transpose ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_transpose",
".\n----------------------------------------------------------------------\nRan 1 test in 0.030s\n\nOK\n"
]
],
[
[
"## 1.5 ่ฎก็ฎ็ฉ้ตไนๆณ AB",
"_____no_output_____"
]
],
[
[
"# TODO ่ฎก็ฎ็ฉ้ตไนๆณ AB๏ผๅฆๆๆ ๆณ็ธไนๅraise ValueError\ndef dot(a, b):\n return sum(i * j for i, j in zip(a, b))\n\ndef matxMultiply(A, B):\n n, m = shape(A)\n p, q = shape(B)\n if m != p:\n raise ValueError('dimension of matrices don\\'t match')\n result = []\n for row in range(n):\n result.append([])\n for col in range(q):\n # ith row jth column = dot product of ith row of A and jth column of B\n # im not using [row][col] to index but rather just appending...\n # just because i can\n result[row].append(dot(A[row], [i[col] for i in B]))\n return result",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ matxMultiply ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_matxMultiply",
".\n----------------------------------------------------------------------\nRan 1 test in 0.124s\n\nOK\n"
]
],
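A quick sanity check of `matxMultiply` with the example matrices from section 1.1 (not part of the graded tests):

```python
A = [[1, 2, 3], [2, 3, 3], [1, 2, 5]]
C = [[1], [2], [3]]
print(matxMultiply(A, C))  # [[14], [17], [20]]
```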
[
[
"---\n\n# 2 Gaussian Jordan ๆถๅ
ๆณ\n\n## 2.1 ๆ้ ๅขๅนฟ็ฉ้ต\n\n$ A = \\begin{bmatrix}\n a_{11} & a_{12} & ... & a_{1n}\\\\\n a_{21} & a_{22} & ... & a_{2n}\\\\\n a_{31} & a_{22} & ... & a_{3n}\\\\\n ... & ... & ... & ...\\\\\n a_{n1} & a_{n2} & ... & a_{nn}\\\\\n\\end{bmatrix} , b = \\begin{bmatrix}\n b_{1} \\\\\n b_{2} \\\\\n b_{3} \\\\\n ... \\\\\n b_{n} \\\\\n\\end{bmatrix}$\n\n่ฟๅ $ Ab = \\begin{bmatrix}\n a_{11} & a_{12} & ... & a_{1n} & b_{1}\\\\\n a_{21} & a_{22} & ... & a_{2n} & b_{2}\\\\\n a_{31} & a_{22} & ... & a_{3n} & b_{3}\\\\\n ... & ... & ... & ...& ...\\\\\n a_{n1} & a_{n2} & ... & a_{nn} & b_{n} \\end{bmatrix}$",
"_____no_output_____"
]
],
[
[
"# TODO ๆ้ ๅขๅนฟ็ฉ้ต๏ผๅ่ฎพA๏ผb่กๆฐ็ธๅ\ndef augmentMatrix(A, b):\n# assert n == shape(b)[0] # this is in the assumption!\n new = [A[row] + b[row] for row in range(len(A))]\n return new",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ augmentMatrix ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_augmentMatrix",
".\n----------------------------------------------------------------------\nRan 1 test in 0.004s\n\nOK\n"
]
],
[
[
"## 2.2 ๅ็ญ่กๅๆข\n- ไบคๆขไธค่ก\n- ๆๆ่กไนไปฅไธไธช้้ถๅธธๆฐ\n- ๆๆ่กๅ ไธๅฆไธ่ก็่ฅๅนฒๅ๏ผ",
"_____no_output_____"
]
],
[
[
"# TODO r1 <---> r2\n# ็ดๆฅไฟฎๆนๅๆฐ็ฉ้ต๏ผๆ ่ฟๅๅผ\ndef swapRows(M, r1, r2):\n if r1 != r2:\n M[r1], M[r2] = M[r2], M[r1]",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ swapRows ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_swapRows",
".\n----------------------------------------------------------------------\nRan 1 test in 0.004s\n\nOK\n"
],
[
"## TODO r1 <--- r1 * scale\n# scaleไธบ0ๆฏ้ๆณ่พๅ
ฅ๏ผ่ฆๆฑ raise ValueError\n# ็ดๆฅไฟฎๆนๅๆฐ็ฉ้ต๏ผๆ ่ฟๅๅผ\ndef scaleRow(M, r, scale):\n if scale == 0:\n raise ValueError('cannot scale a matrix by zero')\n for col in range(len(M[r])):\n M[r][col] *= scale",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ scaleRow ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_scaleRow",
".\n----------------------------------------------------------------------\nRan 1 test in 0.005s\n\nOK\n"
],
[
"# TODO r1 <--- r1 + r2*scale\n# ็ดๆฅไฟฎๆนๅๆฐ็ฉ้ต๏ผๆ ่ฟๅๅผ\ndef addScaledRow(M, r1, r2, scale):\n if scale == 0:\n raise ValueError('cannot scale a matrix by zero')\n for col in range(len(M[r1])):\n M[r1][col] += scale * M[r2][col]\n ",
"_____no_output_____"
],
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ addScaledRow ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_addScaledRow",
".\n----------------------------------------------------------------------\nRan 1 test in 0.002s\n\nOK\n"
]
],
[
[
"## 2.3 Gaussian Jordan ๆถๅ
ๆณๆฑ่งฃ Ax = b",
"_____no_output_____"
],
[
"### 2.3.1 ็ฎๆณ\n\nๆญฅ้ชค1 ๆฃๆฅA๏ผbๆฏๅฆ่กๆฐ็ธๅ\n\nๆญฅ้ชค2 ๆ้ ๅขๅนฟ็ฉ้ตAb\n\nๆญฅ้ชค3 ้ๅ่ฝฌๆขAbไธบๅ็ฎ่ก้ถๆขฏๅฝข็ฉ้ต [ไธญๆ็ปดๅบ้พๆฅ](https://zh.wikipedia.org/wiki/%E9%98%B6%E6%A2%AF%E5%BD%A2%E7%9F%A9%E9%98%B5#.E5.8C.96.E7.AE.80.E5.90.8E.E7.9A.84-.7Bzh-hans:.E8.A1.8C.3B_zh-hant:.E5.88.97.3B.7D-.E9.98.B6.E6.A2.AF.E5.BD.A2.E7.9F.A9.E9.98.B5)\n \n ๅฏนไบAb็ๆฏไธๅ๏ผๆๅไธๅ้คๅค๏ผ\n ๅฝๅๅไธบๅc\n ๅฏปๆพๅcไธญ ๅฏน่ง็บฟไปฅๅๅฏน่ง็บฟไปฅไธๆๆๅ
็ด ๏ผ่ก c~N๏ผ็็ปๅฏนๅผ็ๆๅคงๅผ\n ๅฆๆ็ปๅฏนๅผๆๅคงๅผไธบ0\n ้ฃไนAไธบๅฅๅผ็ฉ้ต๏ผ่ฟๅNone (ไฝ ๅฏไปฅๅจ้ๅ้ฎ้ข2.4ไธญ่ฏๆไธบไปไน่ฟ้Aไธๅฎๆฏๅฅๅผ็ฉ้ต)\n ๅฆๅ\n ไฝฟ็จ็ฌฌไธไธช่กๅๆข๏ผๅฐ็ปๅฏนๅผๆๅคงๅผๆๅจ่กไบคๆขๅฐๅฏน่ง็บฟๅ
็ด ๆๅจ่ก๏ผ่กc๏ผ \n ไฝฟ็จ็ฌฌไบไธช่กๅๆข๏ผๅฐๅc็ๅฏน่ง็บฟๅ
็ด ็ผฉๆพไธบ1\n ๅคๆฌกไฝฟ็จ็ฌฌไธไธช่กๅๆข๏ผๅฐๅc็ๅ
ถไปๅ
็ด ๆถไธบ0\n \nๆญฅ้ชค4 ่ฟๅAb็ๆๅไธๅ\n\n**ๆณจ๏ผ** ๆไปฌๅนถๆฒกๆๆ็
งๅธธ่งๆนๆณๅ
ๆ็ฉ้ต่ฝฌๅไธบ่ก้ถๆขฏๅฝข็ฉ้ต๏ผๅ่ฝฌๆขไธบๅ็ฎ่ก้ถๆขฏๅฝข็ฉ้ต๏ผ่ๆฏไธๆญฅๅฐไฝใๅฆๆไฝ ็ๆๅธธ่งๆนๆณ็่ฏ๏ผๅฏไปฅๆ่ไธไธไธค่
็็ญไปทๆงใ",
"_____no_output_____"
],
[
"### 2.3.2 ็ฎๆณๆจๆผ\n\nไธบไบๅ
ๅไบ่งฃGaussian Jordanๆถๅ
ๆณ็่ฎก็ฎๆต็จ๏ผ่ฏทๆ นๆฎGaussian Jordanๆถๅ
ๆณ๏ผๅๅซๆๅจๆจๆผ็ฉ้ตAไธบ***ๅฏ้็ฉ้ต***๏ผ็ฉ้ตAไธบ***ๅฅๅผ็ฉ้ต***ไธค็งๆ
ๅตใ",
"_____no_output_____"
],
[
"#### ๆจๆผ็คบไพ \n\n\n$Ab = \\begin{bmatrix}\n -7 & 5 & -1 & 1\\\\\n 1 & -3 & -8 & 1\\\\\n -10 & -2 & 9 & 1\\end{bmatrix}$\n\n$ --> $\n$\\begin{bmatrix}\n 1 & \\frac{1}{5} & -\\frac{9}{10} & -\\frac{1}{10}\\\\\n 0 & -\\frac{16}{5} & -\\frac{71}{10} & \\frac{11}{10}\\\\\n 0 & \\frac{32}{5} & -\\frac{73}{10} & \\frac{3}{10}\\end{bmatrix}$\n\n$ --> $\n$\\begin{bmatrix}\n 1 & 0 & -\\frac{43}{64} & -\\frac{7}{64}\\\\\n 0 & 1 & -\\frac{73}{64} & \\frac{3}{64}\\\\\n 0 & 0 & -\\frac{43}{4} & \\frac{5}{4}\\end{bmatrix}$\n\n$ --> $\n$\\begin{bmatrix}\n 1 & 0 & 0 & -\\frac{3}{16}\\\\\n 0 & 1 & 0 & -\\frac{59}{688}\\\\\n 0 & 0 & 1 & -\\frac{5}{43}\\end{bmatrix}$\n \n\n#### ๆจๆผๆไปฅไธ่ฆๆฑ:\n1. ๅฑ็คบๆฏไธๅ็ๆถๅ
็ปๆ, ๆฏๅฆ3*3็็ฉ้ต, ้่ฆๅไธๆญฅ\n2. ็จๅๆฐๆฅ่กจ็คบ\n3. ๅๆฐไธ่ฝๅ็บฆๅ\n4. ๆไปฌๅทฒ็ป็ปๅบไบlatex็่ฏญๆณ,ไฝ ๅช่ฆๆ้ถๆนๆไฝ ่ฆ็ๆฐๅญ(ๆๅๆฐ)ๅณๅฏ\n5. ๆฃๆฅไฝ ็็ญๆก, ๅฏไปฅ็จ[่ฟไธช](http://www.math.odu.edu/~bogacki/cgi-bin/lat.cgi?c=sys), ๆ่
ๅ้ข้่ฟๅๅ
ๆต่ฏๅ็`gj_Solve`\n\n_ไฝ ๅฏไปฅ็จpython็ [fractions](https://docs.python.org/2/library/fractions.html) ๆจกๅ่พ
ๅฉไฝ ็็บฆๅ_",
"_____no_output_____"
],
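[
"A minimal sketch of the `fractions` module mentioned above (standard library; arithmetic stays exact and results print fully reduced):\n\n```python\nfrom fractions import Fraction\n\nprint(Fraction(-9, 10) * 4) # -18/5\nprint(Fraction(3, 1) + 4 * Fraction(-9, 10)) # -3/5\n```",
"_____no_output_____"
],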
[
"#### ไปฅไธๅผๅงไฝ ็ๅฐ่ฏๅง!",
"_____no_output_____"
]
],
[
[
"# ไธ่ฆไฟฎๆน่ฟ้๏ผ\nfrom helper import *\nA = generateMatrix(3,seed,singular=False)\nb = np.ones(shape=(3,1),dtype=int) # it doesn't matter\nAb = augmentMatrix(A.tolist(),b.tolist()) # ่ฏท็กฎไฟไฝ ็ๅขๅนฟ็ฉ้ตๅทฒ็ปๅๅฅฝไบ\nprintInMatrixFormat(Ab,padding=3,truncating=0)",
"-10, 9, 5 || 1 \n -4, 3, -4 || 1 \n -2, 3, 5 || 1 \n"
]
],
[
[
"่ฏทๆ็
ง็ฎๆณ็ๆญฅ้ชค3๏ผ้ๆญฅๆจๆผ***ๅฏ้็ฉ้ต***็ๅๆขใ\n\nๅจไธ้ขๅๅบๆฏไธๆฌกๅพช็ฏไฝๆง่กไนๅ็ๅขๅนฟ็ฉ้ตใ\n\n่ฆๆฑ๏ผ\n1. ๅๅๆฐ่ฟ็ฎ\n2. ไฝฟ็จ`\\frac{n}{m}`ๆฅๆธฒๆๅๆฐ๏ผๅฆไธ๏ผ\n - $\\frac{n}{m}$\n - $-\\frac{a}{b}$\n\n\n$ Ab = \\begin{bmatrix}\n -10 & 9 & 5 & 1 \\\\\n -4 & 3 & -4 & 1 \\\\\n -2 & 3 & 5 & 1 \\end{bmatrix}$\n\n$ --> \\begin{bmatrix}\n 1 & -\\frac{9}{10} & -\\frac{5}{10} & -\\frac{1}{10} \\\\\n 0 & -\\frac{3}{5} & -6 & \\frac{3}{5} \\\\\n 0 & \\frac{6}{5} & 4 & \\frac{4}{5} \\end{bmatrix}$\n \n$ --> \\begin{bmatrix}\n 1 & 0 & \\frac{5}{2} & \\frac{1}{2} \\\\\n 0 & 1 & \\frac{10}{3} & \\frac{2}{3} \\\\\n 0 & 0 & -4 & 1 \\end{bmatrix}$\n \n$ --> \\begin{bmatrix}\n 1 & 0 & 0 & \\frac{9}{8} \\\\\n 0 & 1 & 0 & \\frac{3}{2} \\\\\n 0 & 0 & 1 & -\\frac{1}{4} \\end{bmatrix}$",
"_____no_output_____"
]
],
[
[
"# ไธ่ฆไฟฎๆน่ฟ้๏ผ\nA = generateMatrix(3,seed,singular=True)\nb = np.ones(shape=(3,1),dtype=int)\nAb = augmentMatrix(A.tolist(),b.tolist()) # ่ฏท็กฎไฟไฝ ็ๅขๅนฟ็ฉ้ตๅทฒ็ปๅๅฅฝไบ\nprintInMatrixFormat(Ab,padding=3,truncating=0)",
" 6, 1, 9 || 1 \n 0, 1, 1 || 1 \n -6, 1, -7 || 1 \n"
]
],
[
[
"่ฏทๆ็
ง็ฎๆณ็ๆญฅ้ชค3๏ผ้ๆญฅๆจๆผ***ๅฅๅผ็ฉ้ต***็ๅๆขใ\n\nๅจไธ้ขๅๅบๆฏไธๆฌกๅพช็ฏไฝๆง่กไนๅ็ๅขๅนฟ็ฉ้ตใ\n\n่ฆๆฑ๏ผ\n1. ๅๅๆฐ่ฟ็ฎ\n2. ไฝฟ็จ`\\frac{n}{m}`ๆฅๆธฒๆๅๆฐ๏ผๅฆไธ๏ผ\n - $\\frac{n}{m}$\n - $-\\frac{a}{b}$\n\n\n$ Ab = \\begin{bmatrix}\n 6 & 1 & 9 & 1 \\\\\n 0 & 1 & 1 & 1 \\\\\n -6 & 1 & -7 & 1 \\end{bmatrix}$\n\n$ --> \\begin{bmatrix}\n 1 & \\frac{1}{6} & \\frac{3}{2} & \\frac{1}{6} \\\\\n 0 & 1 & 1 & 1 \\\\\n 0 & 2 & 2 & 2 \\end{bmatrix}$\n \n$ --> \\begin{bmatrix}\n 1 & 0 & \\frac{4}{3} & 0 \\\\\n 0 & 1 & 1 & 0 \\\\\n 0 & 0 & 0 & 0 \\end{bmatrix}$\n ",
"_____no_output_____"
],
[
"### 2.3.3 ๅฎ็ฐ Gaussian Jordan ๆถๅ
ๆณ",
"_____no_output_____"
]
],
[
[
"# TODO ๅฎ็ฐ Gaussain Jordan ๆนๆณๆฑ่งฃ Ax = b\n\n\"\"\" Gaussian Jordan ๆนๆณๆฑ่งฃ Ax = b.\n ๅๆฐ\n A: ๆน้ต \n b: ๅๅ้\n decPts: ๅ่ไบๅ
ฅไฝๆฐ๏ผ้ป่ฎคไธบ4\n epsilon: ๅค่ฏปๆฏๅฆไธบ0็้ๅผ๏ผ้ป่ฎค 1.0e-16\n \n ่ฟๅๅๅ้ x ไฝฟๅพ Ax = b \n ่ฟๅNone๏ผๅฆๆ A๏ผb ้ซๅบฆไธๅ\n ่ฟๅNone๏ผๅฆๆ A ไธบๅฅๅผ็ฉ้ต\n\"\"\"\n\nfrom pprint import pprint\n\ndef gj_Solve(A, b, decPts=4, epsilon=1.0e-16):\n size = len(A)\n if size != len(b):\n return None\n C = augmentMatrix(A, b)\n for c in range(size):\n absCol = [abs(row[c]) for row in C[c:]]\n maxNum = max(absCol)\n if maxNum < epsilon: # singular matrix\n return None\n swapRows(C, absCol.index(maxNum) + c, c)\n scaleRow(C, c, 1 / C[c][c])\n for r in range(size):\n if r == c or abs(C[r][c]) < epsilon:\n continue\n addScaledRow(C, r, c, -C[r][c])\n solution = [[row[-1]] for row in C]\n return solution",
"_____no_output_____"
],
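[
"A quick hand-checkable example before running the unit test (the expected solution was verified by substitution: $2\\cdot0.8 + 1.4 = 3$ and $0.8 + 3\\cdot1.4 = 5$):\n\n```python\nA = [[2, 1], [1, 3]]\nb = [[3], [5]]\nprint(gj_Solve(A, b)) # expected approximately [[0.8], [1.4]]\n```",
"_____no_output_____"
],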
[
"# ่ฟ่กไปฅไธไปฃ็ ๆต่ฏไฝ ็ gj_Solve ๅฝๆฐ\n%run -i -e test.py LinearRegressionTestCase.test_gj_Solve",
".\n----------------------------------------------------------------------\nRan 1 test in 4.122s\n\nOK\n"
]
],
[
[
"###### (้ๅ) 2.4 ็ฎๆณๆญฃ็กฎๅคๆญไบๅฅๅผ็ฉ้ต๏ผ\n\nๅจ็ฎๆณ็ๆญฅ้ชค3 ไธญ๏ผๅฆๆๅ็ฐๆไธๅๅฏน่ง็บฟๅๅฏน่ง็บฟไปฅไธๆๆๅ
็ด ้ฝไธบ0๏ผ้ฃไนๅๆญๅฎ่ฟไธช็ฉ้ตไธบๅฅๅผ็ฉ้ตใ\n\nๆไปฌ็จๆญฃๅผ็่ฏญ่จๆ่ฟฐ่ฟไธชๅฝ้ข๏ผๅนถ่ฏๆไธบ็ใ\n\n่ฏๆไธ้ข็ๅฝ้ข๏ผ\n\n**ๅฆๆๆน้ต A ๅฏไปฅ่ขซๅไธบ4ไธช้จๅ: ** \n\n$ A = \\begin{bmatrix}\n I & X \\\\\n Z & Y \\\\\n\\end{bmatrix} , \\text{ๅ
ถไธญ I ไธบๅไฝ็ฉ้ต๏ผZ ไธบๅ
จ0็ฉ้ต๏ผY ็็ฌฌไธๅๅ
จ0}$๏ผ\n\n**้ฃไนAไธบๅฅๅผ็ฉ้ตใ**\n\nๆ็คบ๏ผไปๅค็ง่งๅบฆ้ฝๅฏไปฅๅฎๆ่ฏๆ\n- ่่็ฉ้ต Y ๅ ็ฉ้ต A ็็งฉ\n- ่่็ฉ้ต Y ๅ ็ฉ้ต A ็่กๅๅผ\n- ่่็ฉ้ต A ็ๆไธๅๆฏๅ
ถไปๅ็็บฟๆง็ปๅ",
"_____no_output_____"
],
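[
"Before the formal proof, a quick numeric sanity check of the claim (a sketch; `np` is in scope from the `helper` import above). Build a matrix of this block shape and confirm its determinant is 0:\n\n```python\n# I is 2x2, Z is the 2x2 zero block, and the first column of Y is all zeros\nA_blk = np.array([[1, 0, 5, 6],\n                  [0, 1, 7, 8],\n                  [0, 0, 0, 1],\n                  [0, 0, 0, 2]])\nprint(np.linalg.det(A_blk)) # 0.0 up to floating point, i.e. singular\n```",
"_____no_output_____"
],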
[
"**TODO** ่ฏๆ๏ผ\n\n็ฑไบ $A,\\ I$ ไธบๆน้ต๏ผๅ $Y$ ไธบๆน้ต๏ผๆ็ฅ๏ผ$X$ ็่กๆฐไธ $Z$ ็ๅๆฐไธ่ดใ \n่ฎพ $n$ ไธบ $I$ ็ๅคงๅฐ๏ผๅนถ่ฎพ $x_1,\\ x_2,\\dots,\\ x_n$ ไธบ็ฉ้ต $X$ ็็ฌฌไธๅ็ๅ
็ด ๏ผ่ฎพ $A_i$ ไธบ็ฉ้ต $A$ ็็ฌฌ $i$ ๅใ\n\n็ฑไบ $Y$ ็็ฌฌไธๅๅไธบ 0๏ผ้ฃไน $A_{n+1}$๏ผ$Y$ ๆๅจ็่ฟไธๅ๏ผๅฏไปฅ่ขซ่กจ็คบไธบ\n$$ A_{n+1} = x_1 A_1 + x_2 A_2 + \\cdots + x_n A_n = \\sum_{i=1}^n x_i A_i $$\n\nๅ ๆญค $A_{n+1}$ ่ฟไธๅๅนถไธๆฏ็บฟๆง็ฌ็ซ็๏ผๆฏ $A$ ๅ $n$ ๅ็็บฟๆง็ปๅ๏ผ๏ผๅๅ $A$ ไธบๆน้ต๏ผๅๅพๅบ $A$ ไธๆปก็งฉ๏ผๅณ $A$ ๆฏๅฅๅผ็ฉ้ต๏ผ่ฏๆฏใ",
"_____no_output_____"
],
[
"# 3 ็บฟๆงๅๅฝ",
"_____no_output_____"
],
[
"## 3.1 ้ๆบ็ๆๆ ทๆฌ็น",
"_____no_output_____"
]
],
[
[
"# ไธ่ฆไฟฎๆน่ฟ้๏ผ\n# ่ฟ่กไธๆฌกๅฐฑๅคไบ๏ผ\nfrom helper import *\nfrom matplotlib import pyplot as plt\n%matplotlib inline\n\nX,Y = generatePoints(seed,num=100)\n\n## ๅฏ่งๅ\nplt.xlim((-5,5))\nplt.xlabel('x',fontsize=18)\nplt.ylabel('y',fontsize=18)\nplt.scatter(X,Y,c='b')\nplt.show()",
"_____no_output_____"
]
],
[
[
"## 3.2 ๆๅไธๆก็ด็บฟ\n\n### 3.2.1 ็ๆตไธๆก็ด็บฟ",
"_____no_output_____"
]
],
[
[
"#TODO ่ฏท้ๆฉๆ้ๅ็็ด็บฟ y = mx + b\nm1 = 2/5\nb1 = 13.8\n\n# ไธ่ฆไฟฎๆน่ฟ้๏ผ\nplt.xlim((-5,5))\nx_vals = plt.axes().get_xlim()\ny_vals = [m1*x+b1 for x in x_vals]\nplt.plot(x_vals, y_vals, '-', color='r')\n\nplt.xlabel('x',fontsize=18)\nplt.ylabel('y',fontsize=18)\nplt.scatter(X,Y,c='b')\n\nplt.show()",
"_____no_output_____"
]
],
[
[
"### 3.2.2 ่ฎก็ฎๅนณๅๅนณๆน่ฏฏๅทฎ (MSE)",
"_____no_output_____"
],
[
"ๆไปฌ่ฆ็ผ็จ่ฎก็ฎๆ้็ด็บฟ็ๅนณๅๅนณๆน่ฏฏๅทฎ(MSE), ๅณๆฐๆฎ้ไธญๆฏไธช็นๅฐ็ด็บฟ็Yๆนๅ่ท็ฆป็ๅนณๆน็ๅนณๅๆฐ๏ผ่กจ่พพๅผๅฆไธ๏ผ\n$$\nMSE = \\frac{1}{n}\\sum_{i=1}^{n}{(y_i - mx_i - b)^2}\n$$",
"_____no_output_____"
]
],
[
[
"# TODO ๅฎ็ฐไปฅไธๅฝๆฐๅนถ่พๅบๆ้็ด็บฟ็MSE\n\ndef calculateMSE(X,Y,m,b):\n error = 0\n for x, y in zip(X, Y):\n diff = (y - m * x - b) ** 2\n error += diff\n error /= len(X)\n return error\n\n# Pythonic approach\n# return sum([(y - m * x - b) ** 2 for x, y in zip(X, Y)]) / len(Y)\n\nprint(calculateMSE(X,Y,m1,b1))",
"1.489379289068421\n"
]
],
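[
[
"As a quick sanity check of `calculateMSE` on hand-computable toy data (not the project dataset): for points $(1,1), (2,2), (3,2)$ and the line $y = x$, the residuals are $0, 0, -1$, so the MSE should be $1/3$:\n\n```python\nprint(calculateMSE([1, 2, 3], [1, 2, 2], 1, 0)) # 0.3333...\n```",
"_____no_output_____"
]
],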
[
[
"### 3.2.3 ่ฐๆดๅๆฐ $m, b$ ๆฅ่ทๅพๆๅฐ็ๅนณๆนๅนณๅ่ฏฏๅทฎ\n\nไฝ ๅฏไปฅ่ฐๆด3.2.1ไธญ็ๅๆฐ $m1,b1$ ่ฎฉ่็นๅๅ่ฆ็ๅจ็บข็บฟๅจๅด๏ผ็ถๅๅพฎ่ฐ $m1, b1$ ่ฎฉMSEๆๅฐใ",
"_____no_output_____"
],
[
"## 3.3 (้ๅ) ๆพๅฐๅๆฐ $m, b$ ไฝฟๅพๅนณๆนๅนณๅ่ฏฏๅทฎๆๅฐ\n\n**่ฟไธ้จๅ้่ฆ็ฎๅ็ๅพฎ็งฏๅ็ฅ่ฏ( $ (x^2)' = 2x $ )ใๅ ไธบ่ฟๆฏไธไธช็บฟๆงไปฃๆฐ้กน็ฎ๏ผๆไปฅ่ฎพไธบ้ๅใ**\n\nๅๅๆไปฌๆๅจ่ฐ่ๅๆฐ๏ผๅฐ่ฏๆพๅฐๆๅฐ็ๅนณๆนๅนณๅ่ฏฏๅทฎใไธ้ขๆไปฌ่ฆ็ฒพ็กฎๅพๆฑ่งฃ $m, b$ ไฝฟๅพๅนณๆนๅนณๅ่ฏฏๅทฎๆๅฐใ\n\nๅฎไน็ฎๆ ๅฝๆฐ $E$ ไธบ\n$$\nE = \\frac{1}{2}\\sum_{i=1}^{n}{(y_i - mx_i - b)^2}\n$$\n\nๅ ไธบ $E = \\frac{n}{2}MSE$, ๆไปฅ $E$ ๅๅฐๆๅฐๅผๆถ๏ผ$MSE$ ไนๅๅฐๆๅฐๅผใ่ฆๆพๅฐ $E$ ็ๆๅฐๅผ๏ผๅณ่ฆๆพๅฐ $m, b$ ไฝฟๅพ $E$ ็ธๅฏนไบ $m$, $E$ ็ธๅฏนไบ $b$ ็ๅๅฏผๆฐ็ญไบ0. \n\nๅ ๆญคๆไปฌ่ฆ่งฃไธ้ข็ๆน็จ็ปใ\n\n$$\n\\begin{cases}\n\\displaystyle\n\\frac{\\partial E}{\\partial m} =0 \\\\\n\\\\\n\\displaystyle\n\\frac{\\partial E}{\\partial b} =0 \\\\\n\\end{cases}\n$$\n\n### 3.3.1 ่ฎก็ฎ็ฎๆ ๅฝๆฐ็ธๅฏนไบๅๆฐ็ๅฏผๆฐ\n้ฆๅ
ๆไปฌ่ฎก็ฎไธคไธชๅผๅญๅทฆ่พน็ๅผ\n\n่ฏๆ/่ฎก็ฎ๏ผ\n$$\n\\frac{\\partial E}{\\partial m} = \\sum_{i=1}^{n}{-x_i(y_i - mx_i - b)}\n$$\n\n$$\n\\frac{\\partial E}{\\partial b} = \\sum_{i=1}^{n}{-(y_i - mx_i - b)}\n$$",
"_____no_output_____"
],
[
"**TODO** ่ฏๆ:\n\n่ฎพ $u_i = (y_i - mx_i - b)$๏ผๆไปฌๅฏไปฅ้ๅ็ฎๆ ๅฝๆฐ $E$ ไธบ\n$$ E = \\frac{1}{2} \\sum_{i=1}^n u_i^2 $$\n่ฎพ $E_i = \\frac{1}{2} u_i^2$๏ผๆไปฌๅฏไปฅๅ้ๅ็ฎๆ ๅฝๆฐ $E$ ไธบ๏ผtrivially๏ผ\n$$ E = \\sum_{i=1}^n E_i$$\nๅ๏ผๆไปฌๆ\n$$ \\frac{\\partial E}{\\partial m} = \\sum_{i=1}^n \\frac{\\partial E_i}{\\partial m} \\quad\\textrm{ๅ}\\quad \\frac{\\partial E}{\\partial b} = \\sum_{i=1}^n \\frac{\\partial E_i}{\\partial b} $$\n\nๆไปฌๅ
ๆฑ $E_i$ ๅ
ณไบ $m$ ็ๅๅฏผๆฐ๏ผๅณ $\\frac{\\partial E_i}{\\partial m}$๏ผๆ นๆฎ้พๅผๆณๅ๏ผๆไปฌๆ๏ผๅฏนไบไปปๆ็ $i \\in \\{1,2,\\dots,n\\}$\n$$ \\frac{\\partial E_i}{\\partial m} = \\frac{\\mathrm{d} E_i}{\\mathrm{d} u_i} \\cdot \\frac{\\partial u_i}{\\partial m} $$\nๅจ $u_i$ ไธญ๏ผๅชๆไธ้กน $-mx_i$ ๆฏๅ $m$ ็ธๅ
ณ็๏ผๅ ๆญค $\\frac{\\partial u_i}{\\partial m} = -x_i$๏ผๅณ\n$$ \\frac{\\partial E_i}{\\partial m} = u_i\\cdot (-x_i) = -x_i(y_i - mx_i - b) $$\nๅณ\n$$ \\frac{\\partial E}{\\partial m} = \\sum_{i=1}^n -x_i(y_i - mx_i - b) $$\n\nๅ็๏ผๆไปฌๅฏๅพ\n$$ \\frac{\\partial E_i}{\\partial b} = \\frac{\\partial E_i}{\\partial u_i}\\cdot\\frac{\\partial u_i}{\\partial b} = u_i\\cdot (-1) = -(y_i - mx_i - b) $$\nๅณ\n$$ \\frac{\\partial E}{\\partial b} = \\sum_{i=1}^n -(y_i - mx_i - b) $$\n่ฏๆฏใ",
"_____no_output_____"
],
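[
"The two formulas can also be spot-checked numerically with a central finite difference (a sketch on toy data; any sufficiently small `h` works):\n\n```python\nxs, ys = [1, 2, 3], [1, 2, 2]\nE = lambda m, b: 0.5 * sum((y - m*x - b)**2 for x, y in zip(xs, ys))\ndEdm = lambda m, b: sum(-x*(y - m*x - b) for x, y in zip(xs, ys))\n\nh, m0, b0 = 1e-6, 0.3, 0.1\nnumeric = (E(m0 + h, b0) - E(m0 - h, b0)) / (2 * h)\nprint(numeric, dEdm(m0, b0)) # the two values should agree closely\n```",
"_____no_output_____"
],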
[
"### 3.3.2 ๅฎไพๆจๆผ\n\n็ฐๅจๆไปฌๆไบไธไธชไบๅ
ไบๆฌกๆน็จ็ป\n\n$$\n\\begin{cases}\n\\displaystyle\n\\sum_{i=1}^{n}{-x_i(y_i - mx_i - b)} =0 \\\\\n\\displaystyle\n\\sum_{i=1}^{n}{-(y_i - mx_i - b)} =0 \\\\\n\\end{cases}\n$$\n\nไธบไบๅ ๅผบ็่งฃ๏ผๆไปฌ็จไธไธชๅฎ้
ไพๅญๆผ็ปใ\n\nๆไปฌ่ฆ็จไธไธช็น $(1,1),\\ (2,2),\\ (3,2)$ ๆฅๆๅไธๆก็ด็บฟ $y = mx + b$, ่ฏทๅๅบ\n\n- ็ฎๆ ๅฝๆฐ $E$, \n- ไบๅ
ไบๆฌกๆน็จ็ป๏ผ\n- ๅนถๆฑ่งฃๆไผๅๆฐ $m, b$",
"_____no_output_____"
],
[
"**TODO** ๅๅบ็ฎๆ ๅฝๆฐ๏ผๆน็จ็ปๅๆไผๅๆฐ\n\n1. ็ฎๆ ๅฝๆฐ $E$\n\n\\begin{align}\n E &= \\frac{1}{2}\\sum_{i=1}^n(y_i - mx_i - b)^2 \\\\\n &= \\frac{1}{2}\\left((1 - m - b)^2 + (2 - 2m - b)^2 + (3 - 2m - b)^2\\right) \\\\\n &= \\frac{1}{2}\\left(9m^2 - 22m + 10mb - 12b + 3b^2 + 14\\right) \\\\\n &= \\frac{9}{2}m^2 - 11m + 5mb - 6b + \\frac{3}{2}b^2 + 7\n\\end{align}\n\n2. ไบๅ
ไบๆฌกๆน็จ็ป\n\n$$\\begin{cases}\n \\displaystyle \\sum_{i=1}^n -x_i(y_i - mx_i - b) = 0 \\\\\n \\displaystyle \\sum_{i=1}^n -(y_i - mx_i - b) = 0 \\\\\n\\end{cases} \\; \\Rightarrow \\;\n\\begin{cases}\n \\displaystyle -(1 - m - b) - 2(2 - 2m - b) - 3(2 - 3m - b) = 0 \\\\\n \\displaystyle -(1 - m - b) - (2 - 2m - b) - (2 - 3m - b) = 0 \\\\\n\\end{cases} \\; \\Rightarrow \\;\n\\begin{cases}\n \\displaystyle -11 + 14m + 6b = 0 \\\\\n \\displaystyle -5 + 8m + 3b = 0 \\\\\n\\end{cases}$$\n\n3. ๆฑ่งฃๆไผๅๆฐ $m,\\ b$\n\n\\begin{cases}\n -11 + 14m + 6b = 0 \\\\\n -5 + 8m + 3b = 0 \\\\\n\\end{cases}\n\n$$ (1) + -2 \\times (2) $$\n$$ \\Downarrow $$\n\n\\begin{cases}\n \\displaystyle -1 - 2m = 0 \\quad \\\\\n \\displaystyle -5 + 8m + 3b = 0 \\quad \\\\\n\\end{cases}\n\n$$ \\therefore\\ m = -\\frac{1}{2} $$\n\n$$ \\therefore\\ b = \\frac{11 - 14m}{6} = \\frac{18}{6} = 3 $$\n\n$$\\therefore\\begin{cases}\n \\displaystyle m = -\\frac{1}{2}\\\\\n \\displaystyle b = 3\\\\\n\\end{cases}$$",
"_____no_output_____"
],
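[
"The optimal parameters above can be double-checked exactly with the `fractions` module (a sketch; the coefficients $14, 6, 11$ and $6, 3, 5$ come from the system derived above):\n\n```python\nfrom fractions import Fraction as F\n\nm, b = F(1, 2), F(2, 3)\nassert 14*m + 6*b == 11\nassert 6*m + 3*b == 5\nprint(m, b) # 1/2 2/3\n```",
"_____no_output_____"
],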
[
"### 3.3.3 ๅฐๆน็จ็ปๅๆ็ฉ้ตๅฝขๅผ\n\nๆไปฌ็ไบๅ
ไบๆฌกๆน็จ็ปๅฏไปฅ็จๆด็ฎๆด็็ฉ้ตๅฝขๅผ่กจ่พพ๏ผๅฐๆน็จ็ปๅๆ็ฉ้ตๅฝขๅผๆดๆๅฉไบๆไปฌไฝฟ็จ Gaussian Jordan ๆถๅ
ๆณๆฑ่งฃใ\n\n่ฏท่ฏๆ \n$$\n\\begin{bmatrix}\n \\frac{\\partial E}{\\partial m} \\\\\n \\frac{\\partial E}{\\partial b} \n\\end{bmatrix} = X^TXh - X^TY\n$$\n\nๅ
ถไธญๅ้ $Y$, ็ฉ้ต $X$ ๅ ๅ้ $h$ ๅๅซไธบ :\n$$\nY = \\begin{bmatrix}\n y_1 \\\\\n y_2 \\\\\n ... \\\\\n y_n\n\\end{bmatrix}\n,\nX = \\begin{bmatrix}\n x_1 & 1 \\\\\n x_2 & 1\\\\\n ... & ...\\\\\n x_n & 1 \\\\\n\\end{bmatrix},\nh = \\begin{bmatrix}\n m \\\\\n b \\\\\n\\end{bmatrix}\n$$",
"_____no_output_____"
],
[
"**TODO** ่ฏๆ:\n\n้ฆๅ
๏ผ\n$$ X^\\top X = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n x_i^2 & \\displaystyle \\sum_{i=1}^n x_i \\\\\n \\displaystyle \\sum_{i=1}^n x_i & \\displaystyle \\sum_{i=1}^n 1\n\\end{bmatrix}$$\nๅ ๆญค๏ผ\n$$ X^\\top Xh = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n mx_i^2 + \\sum_{i=1}^n bx_i \\\\\n \\displaystyle \\sum_{i=1}^n mx_i + \\sum_{i=1}^n b\n\\end{bmatrix}$$\nๅฆๅค๏ผ\n$$ X^\\top Y = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n x_i y_i \\\\\n \\displaystyle \\sum_{i=1}^n y_i\n\\end{bmatrix}$$\nๅ ๆญค๏ผ\n$$ X^\\top Xh - X^\\top Y = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n mx_i^2 + \\sum_{i=1}^n bx_i - \\sum_{i=1}^n x_i y_i \\\\\n \\displaystyle \\sum_{i=1}^n mx_i + \\sum_{i=1}^n b - \\sum_{i=1}^n y_i\n\\end{bmatrix} = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n (mx_i^2 + bx_i - x_i y_i) \\\\\n \\displaystyle \\sum_{i=1}^n (mx_i + b - y_i)\n\\end{bmatrix} = \\begin{bmatrix}\n \\displaystyle \\sum_{i=1}^n -x_i(y_i - mx_i - b) \\\\\n \\displaystyle \\sum_{i=1}^n -(y_i - mx_i - b)\n\\end{bmatrix} = \\begin{bmatrix}\n \\displaystyle \\frac{\\partial E}{\\partial m} \\\\\n \\displaystyle \\frac{\\partial E}{\\partial b} \n\\end{bmatrix}$$\n่ฏๆฏใ",
"_____no_output_____"
],
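[
"The identity can also be verified numerically on toy data (a sketch; `np` is in scope via `helper`): the analytic gradient and $X^TXh - X^TY$ should coincide for any choice of $m, b$:\n\n```python\nxs = np.array([1., 2., 3.])\nys = np.array([1., 2., 2.])\nXm = np.column_stack([xs, np.ones_like(xs)]) # the X matrix defined above\nYv = ys.reshape(-1, 1)\nm0, b0 = 0.3, 0.1\nh = np.array([[m0], [b0]])\n\nlhs = Xm.T.dot(Xm).dot(h) - Xm.T.dot(Yv)\ngrad = np.array([[np.sum(-xs*(ys - m0*xs - b0))],\n                 [np.sum(-(ys - m0*xs - b0))]])\nprint(lhs.ravel(), grad.ravel()) # identical up to floating point\n```",
"_____no_output_____"
],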
[
"่ณๆญคๆไปฌ็ฅ้๏ผ้่ฟๆฑ่งฃๆน็จ $X^TXh = X^TY$ ๆฅๆพๅฐๆไผๅๆฐใ่ฟไธชๆน็จๅๅ้่ฆ๏ผไปๆไธไธชๅๅญๅซๅ **Normal Equation**๏ผไนๆ็ด่ง็ๅ ไฝๆไนใไฝ ๅฏไปฅๅจ [ๅญ็ฉบ้ดๆๅฝฑ](http://open.163.com/movie/2010/11/J/U/M6V0BQC4M_M6V2AJLJU.html) ๅ [ๆๅฝฑ็ฉ้ตไธๆๅฐไบไน](http://open.163.com/movie/2010/11/P/U/M6V0BQC4M_M6V2AOJPU.html) ็ๅฐๆดๅคๅ
ณไบ่ฟไธชๆน็จ็ๅ
ๅฎนใ",
"_____no_output_____"
],
[
"### 3.4 ๆฑ่งฃ $X^TXh = X^TY$ \n\nๅจ3.3 ไธญ๏ผๆไปฌ็ฅ้็บฟๆงๅๅฝ้ฎ้ข็ญไปทไบๆฑ่งฃ $X^TXh = X^TY$ (ๅฆๆไฝ ้ๆฉไธๅ3.3๏ผๅฐฑๅๆข็็ธไฟกๅง๏ผๅๅ)",
"_____no_output_____"
]
],
[
[
"# TODO ๅฎ็ฐ็บฟๆงๅๅฝ\n'''\nๅๆฐ๏ผX, Y ๅญๅจ็ไธไธๅฏนๅบ็ๆจชๅๆ ไธ็บตๅๆ ็ไธคไธชไธ็ปดๆฐ็ป\n่ฟๅ๏ผm๏ผb ๆตฎ็นๆฐ\n'''\ndef linearRegression(X,Y):\n X = [[i, 1] for i in X]\n Y = [[i] for i in Y]\n A = matxMultiply(transpose(X), X)\n b = matxMultiply(transpose(X), Y)\n x = gj_Solve(A, b)\n return x[0][0], x[1][0]\n\nm2,b2 = linearRegression(X,Y)\nassert isinstance(m2,float),\"m is not a float\"\nassert isinstance(b2,float),\"b is not a float\"\nprint(m2,b2)",
"0.30111301039028665 13.867374963487162\n"
]
],
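[
[
"As a hedged cross-check, `np.polyfit` fits a degree-1 polynomial by least squares and should agree with the normal-equation solution up to numerical precision:\n\n```python\nm_ref, b_ref = np.polyfit(X, Y, 1)\nprint(m_ref, b_ref) # should closely match m2, b2 above\n```",
"_____no_output_____"
]
],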
[
[
"ไฝ ๆฑๅพ็ๅๅฝ็ปๆๆฏไปไน๏ผ\n่ฏทไฝฟ็จ่ฟ่กไปฅไธไปฃ็ ๅฐๅฎ็ปๅบๆฅใ",
"_____no_output_____"
]
],
[
[
"# ่ฏทไธ่ฆไฟฎๆนไธ้ข็ไปฃ็ \nx1,x2 = -5,5\ny1,y2 = x1*m2+b2, x2*m2+b2\n\nplt.xlim((-5,5))\nplt.xlabel('x',fontsize=18)\nplt.ylabel('y',fontsize=18)\nplt.scatter(X,Y,c='b')\nplt.plot((x1,x2),(y1,y2),'r')\nplt.title('y = {m:.4f}x + {b:.4f}'.format(m=m2,b=b2))\nplt.show()",
"_____no_output_____"
]
],
[
[
"ไฝ ๆฑๅพ็ๅๅฝ็ปๆๅฏนๅฝๅๆฐๆฎ้็MSEๆฏๅคๅฐ๏ผ",
"_____no_output_____"
]
],
[
[
"print(calculateMSE(X,Y,m2,b2))",
"1.402604878826932\n"
]
]
] |
[
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code",
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfbf995df079d25fdd8e5d1a9f594a8953c1cc8
| 299,004 |
ipynb
|
Jupyter Notebook
|
batch_processing.ipynb
|
ptrbortolotti/pCrunch
|
df2488891d8a0d884cb90edd5bb0412ac0af248f
|
[
"Apache-2.0"
] | 5 |
2020-06-30T14:23:18.000Z
|
2021-09-02T08:06:24.000Z
|
batch_processing.ipynb
|
ptrbortolotti/pCrunch
|
df2488891d8a0d884cb90edd5bb0412ac0af248f
|
[
"Apache-2.0"
] | 6 |
2021-03-30T21:17:35.000Z
|
2022-01-10T16:50:44.000Z
|
batch_processing.ipynb
|
ptrbortolotti/pCrunch
|
df2488891d8a0d884cb90edd5bb0412ac0af248f
|
[
"Apache-2.0"
] | 9 |
2020-05-18T14:33:18.000Z
|
2022-01-05T08:38:18.000Z
| 202.440081 | 73,992 | 0.8714 |
[
[
[
"# Batch Processing!\n#### A notebook to show some of the capilities available through the pCunch package\n\nThis is certainly not an exhaustive look at everything that the pCrunch module can do, but should hopefully provide some insight. \n...or, maybe I'm just procrastinating doing more useful work.",
"_____no_output_____"
]
],
[
[
"# Python Modules and instantiation\nimport numpy as np\nimport matplotlib.pyplot as plt \nimport pandas as pd\nimport time\nimport os\n# %matplotlib widget\n# ROSCO toolbox modules \nfrom ROSCO_toolbox import utilities as rosco_utilities\n# WISDEM modules\nfrom wisdem.aeroelasticse.Util import FileTools\n# Batch Analysis tools\nfrom pCrunch import Processing, Analysis\nfrom pCrunch import pdTools\n\n# Instantiate fast_IO\nfast_io = rosco_utilities.FAST_IO()\nfast_pl = rosco_utilities.FAST_Plots()\n\nimport importlib\nProcessing = importlib.reload(Processing)\nAnalysis = importlib.reload(Analysis)",
"_____no_output_____"
]
],
[
[
"## Define file paths and filenames\nI'm loading a case matrix that is output when using wisdem.aeroelasticse.CaseGen_General to run a series of batch runs to initialize the output files here. \n\nNote that this isn't necessary, just my workflow in this notebook.",
"_____no_output_____"
]
],
[
[
"# point to some file paths\noutfile_base = '/Users/nabbas/Documents/Projects/ROSCO_dev/DLC_Analysis/DLC_Outputs/5MW_Land_DLC11/'\nfname_case_matrix = os.path.join(outfile_base,'case_matrix.yaml')",
"_____no_output_____"
],
[
"# Load case matrix into datafraome\ncase_matrix = FileTools.load_yaml(fname_case_matrix, package=1)\ncm = pd.DataFrame(case_matrix)\n\n# pull wind speed values from InflowWind filenames\nwindspeeds, seed, IECtype, cmw = Processing.get_windspeeds(cm, return_df=True)\ncmw.head()",
"_____no_output_____"
]
],
[
[
"#### Comparison cases\nI'm comparing two different controllers here, so I'm going to define two lists of output filenames, each corresponding to the output files from each controller",
"_____no_output_____"
]
],
[
[
"# Define controllers we care to separate things by\ncontrollers = list(set(cmw[('ServoDyn', 'DLL_FileName')]))\ncontrollers\n\n# Parse find outfiles names\noutfiles = []\nfor cont in controllers:\n case_names = cmw[cmw[('ServoDyn','DLL_FileName')]==cont]['Case_Name']\n outnames = list( outfile_base + case_names + '.outb' )\n outfiles.append(outnames)",
"_____no_output_____"
]
],
[
[
"### outfiles\nIn the end, we just need a list of OpenFAST output files. Here, we have a structure that looks something like `[[], []]`. This could be extended any amount like `[[],[],...,[], []]`, or just be one list of strings `[]`.",
"_____no_output_____"
],
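[
"As a hedged illustration (the flat-list file names here are hypothetical, not from this study), both of these shapes are valid inputs:\n\n```python\n# One flat list: a single dataset of OpenFAST binary outputs (hypothetical names)\noutfiles_single = ['run_00.outb', 'run_01.outb']\n\n# Nested lists: one dataset per design, which is what we built above\noutfiles_nested = [outfiles[0], outfiles[1]]\n```",
"_____no_output_____"
],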
[
"## Now we can do some processing!\n\nFirst, let's load the FAST_Processing class and initialize some parameters.\n",
"_____no_output_____"
]
],
[
[
"fp = Processing.FAST_Processing()\nfp.OpenFAST_outfile_list = outfiles\nfp.dataset_names = ['DLC1.1', 'DLC1.3']\nfp.to = 30\nfp.parallel_analysis = True\nfp.save_LoadRanking = False\nfp.save_SummaryStats = False\nfp.verbose=True\n\n# # Can defined specific variables for load ranking if desired\n# fp.ranking_vars = [[\"RotSpeed\"], \n# [\"OoPDefl1\", \"OoPDefl2\", \"OoPDefl3\"], \n# ['RootMxc1', 'RootMxc2', 'RootMxc3'],\n# ['TwrBsFyt'],\n# ] ",
"_____no_output_____"
]
],
[
[
"#### The fast way to compare things.\nWe could now collect all of the summary stats and load rankings using:\n```\nstats,load_rankings = fp.batch_processing()\n```\nIn `fp.batch_processing()` most of the analysis is done for any structure of data. I'm going to step through things a bit more piecewise in this notebook, however.\n\nNOTE: The goal in `batch_processing` is to have a \"do anything\" script. It is a work in progress, but getting there...",
"_____no_output_____"
]
],
[
[
"# stats,load_rankings = fp.batch_processing()",
"_____no_output_____"
]
],
[
[
"## Design Comparisons\nWe can use fp.design_comparison to compare multiple sets of runs (like we are in this case...). This will generate summary stats and load rankings, running in parrallel when it can and is told to. `fp.batch_processing()` functionally does the same thing if we give it an outfile matrix with equal size lists. We'll show the design comparison here to show a break down",
"_____no_output_____"
]
],
[
[
"stats, load_ranking = fp.design_comparison(outfiles)",
"_____no_output_____"
]
],
[
[
"#### Breaking it down further...\n\n`fp.batch_processing()` calls `Analysis.Loads_Analysls.full_loads_analysis()` to load openfast data, generate stats, and calculate load rankings. Because we defined `fp.parallel_analysis=True` this process was parallelized. This helps for speed and memory reasons, because now every openfast run is not saved. `fp.batch_processing()` then takes all of the output data and parses it back together. \n\nSeparately, we call call `Analysis.Loads_Analysls.full_loads_analysis()` with `return_FastData=True` and all of the fast data will be returned. Because we are comparing data though, we'll stick with the design comparison tools.\n",
"_____no_output_____"
],
[
"#### Loading data\nWe can also just load previously parsed data if we ran `FAST_Processing` with the `save_LoadRankings` and `save_SummaryStates` flags as True.",
"_____no_output_____"
]
],
[
[
"# Or load stats and load rankings\nroot = '/Users/nabbas/Documents/Projects/ROSCO_dev/DLC_Analysis/DLC_Outputs/5MW_Land_DLC11/stats/'\nlrfile = [root+'dataset0_LoadRanking.yaml', root+'dataset1_LoadRanking.yaml']\nsfile = [root+'dataset0_stats.yaml', root+'dataset1_stats.yaml']\nfname_case_matrix = root+'../case_matrix.yaml'\n\nstats = [FileTools.load_yaml(sf, package=1) for sf in sfile]\nload_rankings = [FileTools.load_yaml(lf, package=1) for lf in lrfile]\ncase_matrix = FileTools.load_yaml(fname_case_matrix, package=1)\ncm = pd.DataFrame(case_matrix)",
"_____no_output_____"
]
],
[
[
"### We can look at our data a bit further with pandas dataframes\nThe data here is just for a few runs for simplicity. Usually you'd do this for a LOT more cases...",
"_____no_output_____"
]
],
[
[
"stats_df = pdTools.dict2df(stats, names=['ROSCO', 'Legacy'])\nstats_df.head()",
"_____no_output_____"
]
],
[
[
"### Load Ranking\nLets re-run the load ranking for the sake of example. We'll have to load the analysis tools, and then run the load ranking for the stats we just found",
"_____no_output_____"
]
],
[
[
"fa = Analysis.Loads_Analysis()\nfa.t0 = 30\nfa.verbose = False",
"_____no_output_____"
]
],
[
[
"Define the ranking variables and statiscits of interest. Note that `len(ranking_vars) == len(ranking_stats)`! We can pass this a list of stats (multiple runs), a dictionary with one run of stats, or a pandas dataframe with the requisite stats. If the inner list contains multiple OpenFAST channels, the load_rankings function will find the min/max/mean of the collection of the channels (e.g., max out of plane tip deflection of all three blades). \n\nWe'll also output a dictionary and a pandas DataFrame from `fa.load_ranking()`",
"_____no_output_____"
]
],
[
[
"fa.ranking_vars = [['TwrBsFxt'], ['OoPDefl1', 'OoPDefl2', 'OoPDefl3']]\nfa.ranking_stats = ['max', 'min']\nload_ranking, load_ranking_df = fa.load_ranking(stats_df, get_df=True)\nload_ranking_df.head()",
"_____no_output_____"
]
],
[
[
"This is organized for each iteration of `[ranking_vars, ranking_stats]`. The stats are ordered accordingly, and `(stat)_case_idx` refers to the case name index of each load. ",
"_____no_output_____"
],
[
"## Wind speed related analysis\nWe often want to make sense of some batch output data with data binned by windspeed. We can leverage the case-matrix from our output data to figure out the input wind speeds. Of course, `('InflowWind', 'Filename')` must exist in the case matrix. Lets load the wind speeds, save them, and append them to the case matrix as `('InflowWind', 'WindSpeed')`.",
"_____no_output_____"
]
],
[
[
"windspeed, seed, IECtype, cmw = Processing.get_windspeeds(cm, return_df=True)\ncmw",
"_____no_output_____"
]
],
[
[
"### AEP\nNow that we know the wind speeds that we were operating at, we can find the AEP. We define the turbine class here, and the cumulative distribution or probability density function \nfor the Weibull distribution per IEC 61400 is generated. We can then calculate the AEP. \n\nIf we first want to verify the PDF, we initialize the `power_production` function, define the turbine class, and can plot a PDF (or CDF) for a given range of wind speeds:",
"_____no_output_____"
]
],
[
[
"pp = Analysis.Power_Production()\npp.turbine_class = 2\nVrange = np.arange(2,26) # Range of wind speeds being considered\nweib_prob = pp.prob_WindDist(Vrange,disttype='pdf')\nplt.close('all')\nplt.plot(Vrange, weib_prob)\nplt.grid(True)\nplt.xlabel(\"Wind Speed m/s\")\nplt.ylabel('Probability')\nplt.title('Probability Density Function \\n IEC Class 2 Wind Speeds ')\nplt.show()\n",
"_____no_output_____"
]
],
[
[
"To get the AEP, we need to provide the wind speeds that the simulations were run for, and the corresponding average power results. Internally, in power_production.AEP, the mean power for a given average wind sped is multiplied times the wind speed's probability, then extrapolated to represent yearly production. \n\nNote: this might throw a python warning due to some poor pandas indexing practices - to be cleaned up eventually!\n\nTo get the AEP for each, the process is simple:",
"_____no_output_____"
]
],
[
[
"AEP = pp.AEP(stats, windspeeds)\nprint('AEP = {}'.format(AEP))",
"WARNING: Assuming the input windspeed array is duplicated for each dataset.\nAEP = [22792675.10042736 22727677.57731066]\n"
]
],
[
[
"##### About the wind speed warning:\nHere, we get a warning about the input windspeed array. This is because we passed the complete array output from Processing.get_windspeeds to the AEP function. The input windspeeds to power_production.AEP must satisfy either of the following two conditions:\n- each wind speed value corresponds to each each statistic value, so `len(windspeeds) = len(stats_df)`\n- each wind speed value corresponds to each run in the case matrix, so `len(windspeeds) = len(cm)`\n\nIf the second of these conditions is satisfied, it is assumed that each dataset has the same wind speeds corresponding to each run. So, in this case, the wind speeds corresponding to DLC_1.1 and DLC_1.3 should be the same. ",
"_____no_output_____"
],
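[
"A minimal sketch of checking this alignment before calling `pp.AEP` (reusing variables already defined in this notebook):\n\n```python\n# Either one windspeed per row of the summary stats, or one per case-matrix run\nassert len(windspeeds) in (len(stats_df), len(cm)), \\\\\n    'windspeeds must align with either the stats or the case matrix'\n```",
"_____no_output_____"
],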
[
"## Plotting\nFinally, we can make some plots. There are a few tools we have at our disposal here. First, we can look at more plots that show our design performance as a function of wind speed. Notably, we can pass the stats dictionary or dataframe to these statistics-related scripts.\n\nCurrently, `an_plts.stat_curve()` can plot a \"statistics curve\" for of two types, a bar or a line graph. \n\nA bar graph is useful to compare design cases easily:",
"_____no_output_____"
]
],
[
[
"plt.close()\nan_plts = Analysis.wsPlotting()\nan_plts.stat_curve(windspeed, stats, 'TwrBsFxt', 'bar', names=['ROSCO', 'Legacy'])\nplt.show()",
"_____no_output_____"
]
],
[
[
"A line graph can be useful to show turbulent wind curves. Here we show the means with a first level of errorbars corresponding to standard deviations, and a second level showing minimums and maximums.",
"_____no_output_____"
]
],
[
[
"an_plts.stat_curve(windspeed, stats, 'GenPwr', 'line', stat_idx=0, names=['ROSCO'])\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Load Ranking (soon)\nWe can plot the load rankings... \n... pulling this into `Analysis.py` is in progress.\n\nFirst, we define how we will classify our comparisons. Most commonly this would be `('IEC','DLC')`, but I'm comparing controllers here. The `classifier_type` functionally refers to the channel of the case matrix to separate the data by, and the `classifier_names` are simply labels for the classifiers. ",
"_____no_output_____"
]
],
[
[
"# Define a classification channel from the case-matrix \nclassifier_type = ('ServoDyn', 'DLL_FileName')\nclassifier_names = ['ROSCO', 'legacy']",
"_____no_output_____"
],
[
"# Plot load rankings\nfig_list, ax_list = an_plts.plot_load_ranking(load_ranking, cm, classifier_type, classifier_names=classifier_names, n_rankings=10, caseidx_labels=True)\n\n# modify axis labels\nfor ax in ax_list:\n ax.set_xlabel('Controller [-]', fontsize=10, fontweight='bold')\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Time domain plotting\nWe can also look at our data from the time domain results. \n\nWe can compare any number of channels using the ROSCO toolbox plotting tools. First we'll load two cases to plot together, then plot the time histories.",
"_____no_output_____"
]
],
[
[
"# Load some time domain cases \nfilenames = [outfiles[0][70], outfiles[1][70]] # select the 70th run from each dataset\nfast_data = fast_io.load_FAST_out(filenames, tmin=30)\n\n# Change names so the legends make sense\nfast_data[0]['meta']['name'] = 'ROSCO'\nfast_data[1]['meta']['name'] = 'Legacy'",
"_____no_output_____"
],
[
"# Define the plots we want to make (can be as many or as few channels and plots as you would like...)\ncases = {'Baseline': ['Wind1VelX', 'GenPwr', 'BldPitch1', 'GenTq', 'RotSpeed'],\n 'Blade' : ['OoPDefl1', 'RootMyb1']}\n\n# plot\nfast_pl.plot_fast_out(cases, fast_data)\nplt.show()",
"_____no_output_____"
]
],
[
[
"### Spectral Analysis\n\nWe can additionally do some frequency domain analysis. Here, `spec_cases` is defined by `(channel, run)` where the run index corresponds to the desired plotting index in the loaded fast data.\n\n",
"_____no_output_____"
]
],
[
[
"spec_cases = [('RootMyb1', 0), ('TwrBsFxt', 1)]\ntwrfreq = .0716\ntwrfreq_label = ['Tower']\nfig, ax = fast_pl.plot_spectral(fast_data, spec_cases, \n show_RtSpeed=True, RtSpeed_idx=[0],\n add_freqs=[twrfreq], add_freq_labels=twrfreq_label,\n averaging='Welch')\nax.set_title('DLC_1.1')\nplt.show()",
"[WARN] dt from tmax-tmin different from dt from t2-t1\n[WARN] dt from tmax-tmin different from dt from t2-t1\n"
]
],
[
[
"### Other fun plots\n\nFinally, we can plot the data distribution of any channels from our fast output data",
"_____no_output_____"
]
],
[
[
"channels = ['GenPwr']\ncaseid = [0,1]\nan_plts.distribution(fast_data, channels, caseid, names=['ROSCO', 'Legacy'])\nplt.show()",
"_____no_output_____"
]
],
[
[
"## In conclusion...\nIf you made it this far, thanks for reading... \n\nThere are a number of smaller subfunctionalities that are also available within these tools shows above. Perhaps most importantly, everything is fairly modularar - the hope being that these can provide some high-level tools that everyone can assimilate into their own workflows without too much disruption.\n\nPlease add, contribute, fix, etc... That would be great for everyone involved!",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown",
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
]
] |
cbfbfa66f198ce344a7c280d97e70e9f0262c11b
| 173,664 |
ipynb
|
Jupyter Notebook
|
pc_vs_lda_on_digits.ipynb
|
snayan06/Implimentation-of-PCA-algorithm-of-machine-learning
|
49146185c82944c9f75233abdb856b17a762dd73
|
[
"Apache-2.0"
] | 3 |
2019-10-22T16:57:24.000Z
|
2021-07-30T07:21:24.000Z
|
pc_vs_lda_on_digits.ipynb
|
snayan06/Implimentation-of-PCA-algorithm-of-machine-learning
|
49146185c82944c9f75233abdb856b17a762dd73
|
[
"Apache-2.0"
] | null | null | null |
pc_vs_lda_on_digits.ipynb
|
snayan06/Implimentation-of-PCA-algorithm-of-machine-learning
|
49146185c82944c9f75233abdb856b17a762dd73
|
[
"Apache-2.0"
] | null | null | null | 469.362162 | 82,558 | 0.908743 |
[
[
[
"print(__doc__)\n\nimport matplotlib.pyplot as plt\n\nfrom sklearn import datasets\nfrom sklearn.decomposition import PCA\nfrom sklearn.discriminant_analysis import LinearDiscriminantAnalysis",
"Automatically created module for IPython interactive environment\n"
],
[
"digits = datasets.load_digits()\nprint (digits)\n",
"{'data': array([[ 0., 0., 5., ..., 0., 0., 0.],\n [ 0., 0., 0., ..., 10., 0., 0.],\n [ 0., 0., 0., ..., 16., 9., 0.],\n ...,\n [ 0., 0., 1., ..., 6., 0., 0.],\n [ 0., 0., 2., ..., 12., 0., 0.],\n [ 0., 0., 10., ..., 12., 1., 0.]]), 'target': array([0, 1, 2, ..., 8, 9, 8]), 'target_names': array([0, 1, 2, 3, 4, 5, 6, 7, 8, 9]), 'images': array([[[ 0., 0., 5., ..., 1., 0., 0.],\n [ 0., 0., 13., ..., 15., 5., 0.],\n [ 0., 3., 15., ..., 11., 8., 0.],\n ...,\n [ 0., 4., 11., ..., 12., 7., 0.],\n [ 0., 2., 14., ..., 12., 0., 0.],\n [ 0., 0., 6., ..., 0., 0., 0.]],\n\n [[ 0., 0., 0., ..., 5., 0., 0.],\n [ 0., 0., 0., ..., 9., 0., 0.],\n [ 0., 0., 3., ..., 6., 0., 0.],\n ...,\n [ 0., 0., 1., ..., 6., 0., 0.],\n [ 0., 0., 1., ..., 6., 0., 0.],\n [ 0., 0., 0., ..., 10., 0., 0.]],\n\n [[ 0., 0., 0., ..., 12., 0., 0.],\n [ 0., 0., 3., ..., 14., 0., 0.],\n [ 0., 0., 8., ..., 16., 0., 0.],\n ...,\n [ 0., 9., 16., ..., 0., 0., 0.],\n [ 0., 3., 13., ..., 11., 5., 0.],\n [ 0., 0., 0., ..., 16., 9., 0.]],\n\n ...,\n\n [[ 0., 0., 1., ..., 1., 0., 0.],\n [ 0., 0., 13., ..., 2., 1., 0.],\n [ 0., 0., 16., ..., 16., 5., 0.],\n ...,\n [ 0., 0., 16., ..., 15., 0., 0.],\n [ 0., 0., 15., ..., 16., 0., 0.],\n [ 0., 0., 2., ..., 6., 0., 0.]],\n\n [[ 0., 0., 2., ..., 0., 0., 0.],\n [ 0., 0., 14., ..., 15., 1., 0.],\n [ 0., 4., 16., ..., 16., 7., 0.],\n ...,\n [ 0., 0., 0., ..., 16., 2., 0.],\n [ 0., 0., 4., ..., 16., 2., 0.],\n [ 0., 0., 5., ..., 12., 0., 0.]],\n\n [[ 0., 0., 10., ..., 1., 0., 0.],\n [ 0., 2., 16., ..., 1., 0., 0.],\n [ 0., 0., 15., ..., 15., 0., 0.],\n ...,\n [ 0., 4., 16., ..., 16., 6., 0.],\n [ 0., 8., 16., ..., 16., 8., 0.],\n [ 0., 1., 8., ..., 12., 1., 0.]]]), 'DESCR': \".. _digits_dataset:\\n\\nOptical recognition of handwritten digits dataset\\n--------------------------------------------------\\n\\n**Data Set Characteristics:**\\n\\n :Number of Instances: 5620\\n :Number of Attributes: 64\\n :Attribute Information: 8x8 image of integer pixels in the range 0..16.\\n :Missing Attribute Values: None\\n :Creator: E. Alpaydin (alpaydin '@' boun.edu.tr)\\n :Date: July; 1998\\n\\nThis is a copy of the test set of the UCI ML hand-written digits datasets\\nhttps://archive.ics.uci.edu/ml/datasets/Optical+Recognition+of+Handwritten+Digits\\n\\nThe data set contains images of hand-written digits: 10 classes where\\neach class refers to a digit.\\n\\nPreprocessing programs made available by NIST were used to extract\\nnormalized bitmaps of handwritten digits from a preprinted form. From a\\ntotal of 43 people, 30 contributed to the training set and different 13\\nto the test set. 32x32 bitmaps are divided into nonoverlapping blocks of\\n4x4 and the number of on pixels are counted in each block. This generates\\nan input matrix of 8x8 where each element is an integer in the range\\n0..16. This reduces dimensionality and gives invariance to small\\ndistortions.\\n\\nFor info on NIST preprocessing routines, see M. D. Garris, J. L. Blue, G.\\nT. Candela, D. L. Dimmick, J. Geist, P. J. Grother, S. A. Janet, and C.\\nL. Wilson, NIST Form-Based Handprint Recognition System, NISTIR 5469,\\n1994.\\n\\n.. topic:: References\\n\\n - C. Kaynak (1995) Methods of Combining Multiple Classifiers and Their\\n Applications to Handwritten Digit Recognition, MSc Thesis, Institute of\\n Graduate Studies in Science and Engineering, Bogazici University.\\n - E. Alpaydin, C. Kaynak (1998) Cascading Classifiers, Kybernetika.\\n - Ken Tang and Ponnuthurai N. Suganthan and Xi Yao and A. 
Kai Qin.\\n Linear dimensionalityreduction using relevance weighted LDA. School of\\n Electrical and Electronic Engineering Nanyang Technological University.\\n 2005.\\n - Claudio Gentile. A New Approximate Maximal Margin Classification\\n Algorithm. NIPS. 2000.\"}\n"
],
[
"print(digits.data.shape)\nimport matplotlib.pyplot as plt \nplt.gray() \nplt.matshow(digits.images[1]) \nplt.show() ",
"(1797, 64)\n"
],
[
"X = digits.data\ny = digits.target\ntarget_names = digits.target_names",
"_____no_output_____"
],
[
"pca = PCA(n_components=8)\nX_r = pca.fit(X).transform(X)\nprint(X_r)",
"[[-1.25945764e+00 2.12749022e+01 -9.46305678e+00 ... -7.44130762e+00\n 3.24250335e+00 2.55576259e+00]\n [ 7.95761349e+00 -2.07686915e+01 4.43951359e+00 ... -6.48635966e+00\n 2.12331026e+00 -4.61813926e+00]\n [ 6.99190463e+00 -9.95601456e+00 2.95857208e+00 ... -4.50872909e+00\n 1.85877798e+00 -1.64246107e+01]\n ...\n [ 1.08012869e+01 -6.96025080e+00 5.59954511e+00 ... -1.23748424e+01\n -4.49060616e+00 -7.41304819e+00]\n [-4.87209984e+00 1.24239614e+01 -1.01708616e+01 ... -4.47917801e-03\n -2.99996478e+00 -4.35616059e+00]\n [-3.44395163e-01 6.36553706e+00 1.07737055e+01 ... -3.04915294e+00\n -1.16045173e+01 6.67389380e-01]]\n"
],
[
"lda = LinearDiscriminantAnalysis(n_components=8)\nX_r2 = lda.fit(X, y).transform(X)\nprint(X_r2)",
"[[-2.0146322 -5.62348616 -0.18659403 ... 0.57975458 -0.10934851\n -0.18350667]\n [ 0.2209674 3.59240033 2.14901657 ... -2.22254446 -0.12311509\n -3.39210556]\n [ 2.23485453 2.70950363 4.26992781 ... -1.2814336 2.51256615\n 1.08370006]\n ...\n [-1.04697178 1.48367733 3.04170263 ... 2.03913572 1.54913444\n 1.99315402]\n [ 0.40787292 -1.99167301 -0.36583552 ... 0.14266115 0.91659536\n 0.52685635]\n [ 0.17414501 0.88717463 1.37776831 ... 1.83958022 -0.05149654\n 3.18833067]]\n"
],
[
"# Percentage of variance explained for each components\nprint('explained variance ratio (first two components): %s'\n % str(pca.explained_variance_ratio_))\n\nprint('explained variance ratio (first two components): %s'\n % str(lda.explained_variance_ratio_))\n\nplt.figure()\ncolors = ['navy', 'turquoise', 'darkorange','crimson','pink','olive','darkmagenta','lavender','coral','lightcyan']\nlw = 9\n\nfor color, i, target_name in zip(colors, [0, 1, 2, 3, 4, 5, 6, 7, 8, 9], target_names):\n plt.scatter(X_r[y == i, 0], X_r[y == i, 1], color=color, alpha=.8, lw=lw,\n label=target_name)\nplt.legend(loc='best', shadow=False, scatterpoints=1)\nplt.title('PCA of digits dataset')\n\nplt.figure()\nfor color, i, target_name in zip(colors, [0, 1, 2,3,4,5,6,7,8,9], target_names):\n plt.scatter(X_r2[y == i, 0], X_r2[y == i, 1], alpha=.8, color=color,\n label=target_name)\nplt.legend(loc='best', shadow=False, scatterpoints=1)\nplt.title('LDA of digits dataset')\n\nplt.show()",
"explained variance ratio (first two components): [0.14890594 0.13618771 0.11794594 0.08409979 0.05782415 0.04916908\n 0.04315933 0.03661357]\nexplained variance ratio (first two components): [0.28912041 0.18262788 0.16962345 0.1167055 0.08301253 0.06565685\n 0.04310127 0.0293257 ]\n"
],
[
"",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfbfe3c21262b15aa88c49ad39c7957cfae283c
| 51,637 |
ipynb
|
Jupyter Notebook
|
Notebooks/Models/PCA.ipynb
|
nextBillyonair/DPM
|
840ffaafe15c208b200b74094ffa8fe493b4c975
|
[
"MIT"
] | 1 |
2021-07-20T14:02:55.000Z
|
2021-07-20T14:02:55.000Z
|
Notebooks/Models/PCA.ipynb
|
nextBillyonair/DPM
|
840ffaafe15c208b200b74094ffa8fe493b4c975
|
[
"MIT"
] | null | null | null |
Notebooks/Models/PCA.ipynb
|
nextBillyonair/DPM
|
840ffaafe15c208b200b74094ffa8fe493b4c975
|
[
"MIT"
] | null | null | null | 71.42047 | 15,692 | 0.800182 |
[
[
[
"import torch\nfrom dpm.models import pca, PCA\nimport numpy as np\nfrom scipy.stats import ortho_group\nfrom dpm.models.decomposition import ProbabilisticPCA,PPCA_Variational, PPCA_Variational_V2\nfrom dpm.visualize import plot_stats",
"_____no_output_____"
],
[
"def build_toy_dataset(N, D, K, sigma=1):\n z_train = np.random.normal(0.0, 1.0, size=(N, K))\n z_train = z_train - z_train.mean(0)\n w = ortho_group.rvs(D)[:, :K]\n w = torch.tensor(w).float()\n return torch.tensor(z_train).float().mm(w.t()), w, z_train\n\nN = 5000 # number of data points\nD = 2 # data dimensionality\nK = 1 # latent dimensionality\n\nx_train, w, z_train = build_toy_dataset(N, D, K)\nx_train.shape, w.shape",
"_____no_output_____"
],
[
"w",
"_____no_output_____"
],
[
"ppca = ProbabilisticPCA(2, 1, noise=0.0001)\nvariational_dist = PPCA_Variational_V2(ppca)",
"_____no_output_____"
],
[
"ppca.num_parameters",
"_____no_output_____"
],
[
"ppca.W",
"_____no_output_____"
],
[
"stats = None",
"_____no_output_____"
],
[
"stats = ppca.fit(x_train, variational_dist, stats=stats, epochs=1000)",
"_____no_output_____"
],
[
"plot_stats(stats)",
"_____no_output_____"
],
[
"ppca.W / (ppca.W ** 2).sum()",
"_____no_output_____"
],
[
"w / (w ** 2).sum()",
"_____no_output_____"
],
[
"from matplotlib import pyplot as plt\nplt.scatter(x_train[:, 0], x_train[:, 1])\nplt.show()",
"_____no_output_____"
],
[
"variational_dist.sample(x_train).shape",
"_____no_output_____"
],
[
"ppca.W.shape",
"_____no_output_____"
],
[
"x_train, ppca.sample(variational_dist.sample(x_train))",
"_____no_output_____"
],
[
"ppca.sample(batch_size=64)",
"_____no_output_____"
],
[
"ppca.transform(x_train)",
"_____no_output_____"
],
[
"z_train",
"_____no_output_____"
],
[
"def pca(X, k=2):\n if not isinstance(X, torch.Tensor):\n X = torch.tensor(X).float()\n X = X - X.mean(dim=0, keepdim=True)\n U, S, V = torch.svd(X)\n# return U, S, V\n return torch.mm(X, V[:k].t())",
"_____no_output_____"
],
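[
"# Hedged sanity check of the pca() helper above (a sketch, not part of dpm):\n# the toy data was built as z @ w.T with an orthonormal w, so it is exactly 1-D;\n# projecting onto the top principal direction and reconstructing should give\n# near-zero error.\nXc = x_train - x_train.mean(dim=0, keepdim=True)\n_, _, V_chk = torch.svd(Xc)\nscores = Xc.mm(V_chk[:, :1]) # (N, 1) principal scores\nrecon = scores.mm(V_chk[:, :1].t()) # rank-1 reconstruction\nprint((recon - Xc).pow(2).mean()) # expected to be close to 0",
"_____no_output_____"
],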
[
"ret = pca(x_train, 1)\n",
"_____no_output_____"
],
[
"ret",
"_____no_output_____"
],
[
"U,S,V = ret",
"_____no_output_____"
],
[
"x_train.mm(w)",
"_____no_output_____"
],
[
"torch.mm(x_train, V[:1].t())",
"_____no_output_____"
],
[
"from matplotlib import pyplot as plt\nplt.scatter(x_train[:, 0], x_train[:, 1])\nplt.show()",
"_____no_output_____"
],
[
"ret, _ = torch.sort(ret)",
"_____no_output_____"
],
[
"x_train, _ = torch.sort(x_train)",
"_____no_output_____"
],
[
"(ret.abs() - x_train.float().abs()).pow(2).sum()",
"_____no_output_____"
],
[
"m = PCA(1)",
"_____no_output_____"
],
[
"ret = m.fit_transform(x_train)\nm.V",
"_____no_output_____"
],
[
"x_train, _ = torch.sort(x_train)",
"_____no_output_____"
],
[
"ret, _ = torch.sort(ret)",
"_____no_output_____"
],
[
"(ret.abs() - x_train.float().abs()).pow(2).sum()",
"_____no_output_____"
],
[
"x_train.shape",
"_____no_output_____"
]
]
] |
[
"code"
] |
[
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfc00ddc31ee8fa2cce4638497442ea75bb4344
| 26,285 |
ipynb
|
Jupyter Notebook
|
tf_keras_test.ipynb
|
laicheil/force2019
|
e39f29724228f82a633f2de9eaae1fb3c64449c0
|
[
"Unlicense"
] | null | null | null |
tf_keras_test.ipynb
|
laicheil/force2019
|
e39f29724228f82a633f2de9eaae1fb3c64449c0
|
[
"Unlicense"
] | null | null | null |
tf_keras_test.ipynb
|
laicheil/force2019
|
e39f29724228f82a633f2de9eaae1fb3c64449c0
|
[
"Unlicense"
] | null | null | null | 49.039179 | 7,349 | 0.646795 |
[
[
[
"<a href=\"https://colab.research.google.com/github/laicheil/force2019/blob/master/tf_keras_test.ipynb\" target=\"_parent\"><img src=\"https://colab.research.google.com/assets/colab-badge.svg\" alt=\"Open In Colab\"/></a>",
"_____no_output_____"
]
],
[
[
"!pip3 install --upgrade laicheil.force2019==0.post0.dev6\nfrom laicheil.force2019 import something\nsomething()",
"_____no_output_____"
],
[
"from tensorflow.python.client import device_lib\nimport numpy as np\ndevice_lib.list_local_devices()",
"_____no_output_____"
]
],
[
[
"Upload the data files",
"_____no_output_____"
]
],
[
[
"from google.colab import files\n\nuploaded = files.upload()\n\n#for fn in uploaded.keys():\n# print('User uploaded file \"{name}\" with length {length} bytes'.format(\n# name=fn, length=len(uploaded[fn])))",
"_____no_output_____"
],
[
"!mkdir hackathon_training_data \n!unzip hackathon_training_data.zip -d hackathon_training_data\n!ls hackathon_training_data/",
"_____no_output_____"
],
[
"import os\nimport json\ndata_path = 'hackathon_training_data'\nlist_of_files = os.listdir(data_path)\nnum_of_files = len(list_of_files)\nfirst_file_path = os.path.join(data_path, list_of_files[0])\n#print (first_file_path)\nwith open(first_file_path,'r') as read_file:\n shape_of_files = (num_of_files,) + np.asarray(json.load(read_file)).shape + (1, )\n#print (shape_of_files)\ndata = np.zeros((shape_of_files))\nlabels = np.zeros(num_of_files)\nlabels_ce = np.zeros((num_of_files,2))\nfor i, filename in enumerate(os.listdir(data_path)):\n full_path = os.path.join(data_path,filename)\n labels[i] = int(filename.startswith('good'))\n labels_ce[i, int(filename.startswith('good'))] = 1\n with open(full_path,'r') as read_file:\n data[i, :, :, 0] = np.asarray(json.load(read_file))\n\nprint('labels shape', labels.shape)\nprint('labels for CE shape', labels_ce.shape)\nprint('data shape', data.shape)",
"labels shape (200,)\nlabels for CE shape (200, 2)\ndata shape (200, 40, 39, 1)\n"
],
[
"print(labels)\nprint(os.listdir(data_path))",
"_____no_output_____"
]
],
[
[
"\n\nImage data generators for the inputs",
"_____no_output_____"
]
],
[
[
"from tensorflow.keras.preprocessing import image\nfrom sklearn.model_selection import train_test_split\n\ndatagen = image.ImageDataGenerator (\n featurewise_center = True,\n featurewise_std_normalization=True,\n vertical_flip=True,\n horizontal_flip=True,\n rotation_range=90)\ndatagen.fit (data)\n\ntrain_samples, validation_samples, train_labels, validation_labels = train_test_split(data, labels, test_size=.334)\n\ntrain_generator = datagen.flow(train_samples, train_labels, batch_size=32)\nvalidation_generator = datagen.flow(validation_samples , validation_labels , batch_size=32)\n\n\ntrain_samples_ce, validation_samples_ce, train_labels_ce, validation_labels_ce = train_test_split(data, labels_ce, test_size=.334)\ntrain_ce_generator = datagen.flow(train_samples_ce, train_labels_ce, batch_size=32)\nvalidation_ce_generator = datagen.flow(validation_samples_ce , validation_labels_ce , batch_size=32)\n\ntest_ce_genera",
"_____no_output_____"
]
],
[
[
"[link text](https://)Loading the ResNet50 model from the tensorflow-keras library",
"_____no_output_____"
]
],
[
[
"from tensorflow.keras.applications.resnet50 import ResNet50\nfrom tensorflow.keras.applications.resnet50 import preprocess_input, decode_predictions\n\n\nimport tensorflow as tf\n\nconfig = tf.ConfigProto()\nconfig.gpu_options.allow_growth = True\n#config.gpu_options.per_process_gpu_memory_fraction = 0.33\n\nfrom tensorflow.python.keras import backend as K\n#with tf.device('/device:GPU:0'):\nK.set_session (tf.Session (config = config))\n\nprint('DONE LOADING MODEL')",
"DONE LOADING MODEL\n"
]
],
[
[
"Callbacks",
"_____no_output_____"
]
],
[
[
"\nimport datetime\n\nnow = datetime.datetime.now ()\ndate_str = now.strftime('%Y%m%d%H%M')\ncheckpoint_init_name = 'init_chkpnt_'+date_str+'.hdf5'\nfrom tensorflow.python.keras.callbacks import CSVLogger, EarlyStopping, ModelCheckpoint\ncallbacks = [ \n EarlyStopping (monitor='val_acc', patience=9, verbose=1),\n ModelCheckpoint(checkpoint_init_name, monitor='val_acc', save_best_only=True, save_weights_only=True, verbose=1)\n ]",
"_____no_output_____"
],
[
"from tensorflow.keras.models import Model\nfrom tensorflow.keras.layers import Input,Lambda, Dense, Flatten\nfrom tensorflow.image import grayscale_to_rgb\n\n## inputs\ninputs = Input (shape=data.shape[1:])#samples.shape[1:]\n#\n## from grayscale to RGB, Xception needs 3 Channel input\nx = Lambda (lambda x: grayscale_to_rgb (x), name='grayscale_to_rgb') (inputs) \nbase_model = ResNet50(weights='imagenet', input_tensor=x,include_top=False)\noutput = Flatten()(base_model.output)\noutput = Dense(1000, activation='relu')(output)\noutput = Dense(100, activation='relu')(output)\noutput = Dense(2, activation='softmax')(output)\n## The model\nnum_layers = len(base_model.layers)\n#for i, layer in enumerate (base_model.layers):\n# layer.trainable = i < 8 or i > num_layers-8\nmodel = Model (inputs=inputs, outputs=output)\nmodel.compile(optimizer='nadam',\n loss='categorical_crossentropy',\n metrics=['accuracy'])",
"/usr/local/lib/python3.6/dist-packages/keras_applications/resnet50.py:265: UserWarning: The output shape of `ResNet50(include_top=False)` has been changed since Keras 2.2.0.\n warnings.warn('The output shape of `ResNet50(include_top=False)` '\n"
],
[
"print(model.output_shape)",
"(None, 2)\n"
]
],
[
[
"Train",
"_____no_output_____"
]
],
[
[
"model.fit_generator(train_ce_generator, steps_per_epoch=int(train_samples.shape[0]), epochs=100,validation_data=validation_ce_generator)\nevaluation = model.evaluate_generator(validation_ce_generator)\n\nprint(evaluation)",
"_____no_output_____"
]
],
[
[
"Train using kfold",
"_____no_output_____"
]
],
[
[
"from sklearn.model_selection import KFold\n\nk_checkpoint_basename = 'CHK_' + date_str + '_K'\nkf = KFold (shuffle=True, n_splits=5)\nlast_good_model_weights = ''\nk=0\nfor train_index, test_index in kf.split(data, labels_ce):\n print('At fold K=',k,' with ', len(train_index), ' samples out of total ', data.shape[0])\n kf_filepath=k_checkpoint_basename + str(k) + '.hdf5'\n callbacks[-1].filepath = kf_filepath\n history = model.fit_generator (generator = datagen.flow(data[train_index], labels_ce[train_index], batch_size=16), \n validation_data = datagen.flow(data[test_index] , labels_ce[test_index] , batch_size=16),\n steps_per_epoch = int(data.shape[0]/4), \n epochs = 2, \n callbacks = callbacks)\n if os.path.isfile(kf_filepath):\n #model.load_weights (kf_filepath) #Load best\n last_good_model_weights = kf_filepath\n if os.path.isfile(last_good_model_weights):\n model.load_weights (last_good_model_weights)\n evaluation = model.evaluate_generator(test_ce_generator)\n print ('Evaluation Mean Squared Error on test data for k =', k, 'is:', evaluation*100.)\n folds_map [k] = {\n 'evaluation' : evaluation,\n 'history' : history,\n 'filepath' : kf_filepath } \n k += 1\n",
"At fold K= 0 with 160 samples out of total 200\nEpoch 1/2\n49/50 [============================>.] - ETA: 1s - loss: 0.9980 - acc: 0.5038\nEpoch 00001: val_acc did not improve from 0.57500\n50/50 [==============================] - 63s 1s/step - loss: 0.9927 - acc: 0.5050 - val_loss: 0.6934 - val_acc: 0.4750\nEpoch 2/2\n49/50 [============================>.] - ETA: 0s - loss: 0.7190 - acc: 0.5128\nEpoch 00002: val_acc did not improve from 0.57500\n50/50 [==============================] - 15s 292ms/step - loss: 0.7186 - acc: 0.5088 - val_loss: 0.6936 - val_acc: 0.4750\n"
],
[
"\nevaluation = model.evaluate_generator(validation_ce_generator)\npredict = model.predict_generator(validation_ce_generator)\n\nprint(evaluation)\nprint(predict)",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
]
] |
cbfc00f5d7fd18beefbf67ce46e764aaf4656c95
| 13,579 |
ipynb
|
Jupyter Notebook
|
docs/examples/use_cases/mxnet/mxnet-resnet50.ipynb
|
cclauss/DALI
|
38e2ca2a7004f602f069b8d87437d7a23996b491
|
[
"ECL-2.0",
"Apache-2.0"
] | null | null | null |
docs/examples/use_cases/mxnet/mxnet-resnet50.ipynb
|
cclauss/DALI
|
38e2ca2a7004f602f069b8d87437d7a23996b491
|
[
"ECL-2.0",
"Apache-2.0"
] | null | null | null |
docs/examples/use_cases/mxnet/mxnet-resnet50.ipynb
|
cclauss/DALI
|
38e2ca2a7004f602f069b8d87437d7a23996b491
|
[
"ECL-2.0",
"Apache-2.0"
] | null | null | null | 42.040248 | 998 | 0.558951 |
[
[
[
"# MXNet with DALI - ResNet 50 example\n\n## Overview\n\nThis example shows, how to use DALI pipelines with Apache MXNet.\n\n## ResNet 50 pipeline\n\nLet us first define a few global constants.",
"_____no_output_____"
]
],
[
[
"from __future__ import print_function\nfrom nvidia.dali.pipeline import Pipeline\nimport nvidia.dali.ops as ops\nimport nvidia.dali.types as types\n\nN = 8 # number of GPUs\nbatch_size = 128 # batch size per GPU\n\ndb_folder = \"/data/imagenet/train-480-val-256-recordio/\"",
"_____no_output_____"
]
],
[
[
"### The training pipeline\n\nThe training pipeline consists of the following steps:\n * Data is first read from MXNet's recordIO file (the reader op is given a name `Reader` for later use)\n * Then, images are decoded using nvJPEG\n * RGB images are then randomly cropped and resized to the final size of (224, 224) pixels\n * Finally, the batch is transposed from NHWC layout to NCHW layout, normalized and randomly mirrored.\n \n`DALIClassificationIterator`, which we will use for interfacing with MXNet in this example, requires outputs of the pipeline to follow (image, label) structure.",
"_____no_output_____"
]
],
[
[
"class HybridTrainPipe(Pipeline):\n def __init__(self, batch_size, num_threads, device_id, num_gpus):\n super(HybridTrainPipe, self).__init__(batch_size, num_threads, device_id, seed = 12 + device_id)\n self.input = ops.MXNetReader(path = [db_folder+\"train.rec\"], index_path=[db_folder+\"train.idx\"],\n random_shuffle = True, shard_id = device_id, num_shards = num_gpus,\n pad_last_batch=True)\n self.decode = ops.ImageDecoderRandomCrop(device = \"mixed\",\n output_type = types.RGB,\n random_aspect_ratio = [0.8, 1.25],\n random_area = [0.1, 1.0],\n num_attempts = 100)\n self.resize = ops.Resize(device = \"gpu\", resize_x = 224, resize_y = 224)\n self.cmnp = ops.CropMirrorNormalize(device = \"gpu\",\n dtype = types.FLOAT,\n output_layout = types.NCHW,\n crop = (224, 224),\n mean = [0.485 * 255,0.456 * 255,0.406 * 255],\n std = [0.229 * 255,0.224 * 255,0.225 * 255])\n self.coin = ops.CoinFlip(probability = 0.5)\n\n def define_graph(self):\n rng = self.coin()\n self.jpegs, self.labels = self.input(name = \"Reader\")\n images = self.decode(self.jpegs)\n images = self.resize(images)\n output = self.cmnp(images, mirror = rng)\n return [output, self.labels]\n",
"_____no_output_____"
]
],
[
[
"### The validation pipeline\n\nThe validation pipeline is similar to the training pipeline, but omits the random resized crop and random mirroring steps, as well as shuffling the data coming from the reader.",
"_____no_output_____"
]
],
[
[
"class HybridValPipe(Pipeline):\n def __init__(self, batch_size, num_threads, device_id, num_gpus):\n super(HybridValPipe, self).__init__(batch_size, num_threads, device_id, seed = 12 + device_id)\n self.input = ops.MXNetReader(path = [db_folder+\"val.rec\"], index_path=[db_folder+\"val.idx\"],\n random_shuffle = False, shard_id = device_id, num_shards = num_gpus,\n pad_last_batch=True)\n self.decode = ops.ImageDecoder(device = \"mixed\", output_type = types.RGB)\n self.cmnp = ops.CropMirrorNormalize(device = \"gpu\",\n dtype = types.FLOAT,\n output_layout = types.NCHW,\n crop = (224, 224),\n mean = [0.485 * 255,0.456 * 255,0.406 * 255],\n std = [0.229 * 255,0.224 * 255,0.225 * 255])\n\n def define_graph(self):\n self.jpegs, self.labels = self.input(name = \"Reader\")\n images = self.decode(self.jpegs)\n output = self.cmnp(images)\n return [output, self.labels]\n",
"_____no_output_____"
],
[
"trainpipes = [HybridTrainPipe(batch_size=batch_size, num_threads=2, device_id = i, num_gpus = N) for i in range(N)]\nvalpipes = [HybridValPipe(batch_size=batch_size, num_threads=2, device_id = i, num_gpus = N) for i in range(N)]",
"_____no_output_____"
]
],
[
[
"### Using the MXNet plugin\n\nMXNet data iterators need to know what is the size of the dataset. Since DALI pipelines may consist of multiple readers, potentially with differently sized datasets, we need to specify the reader which we ask for the epoch size. That is why we gave a name to readers in both training and validation pipelines.\n\nIn order to get the epoch size out of the reader, we need to build one of the training and one of the validation pipelines.",
"_____no_output_____"
]
],
[
[
"trainpipes[0].build()\nvalpipes[0].build()",
"_____no_output_____"
],
[
"print(\"Training pipeline epoch size: {}\".format(trainpipes[0].epoch_size(\"Reader\")))\nprint(\"Validation pipeline epoch size: {}\".format(valpipes[0].epoch_size(\"Reader\")))",
"Training pipeline epoch size: 1281167\nValidation pipeline epoch size: 50000\n"
]
],
[
[
"Now we can make MXNet iterators out of our pipelines, using `DALIClassificationIterator` class.",
"_____no_output_____"
]
],
[
[
"from nvidia.dali.plugin.mxnet import DALIClassificationIterator\ndali_train_iter = DALIClassificationIterator(trainpipes, reader_name=\"Reader\", fill_last_batch=False)\ndali_val_iter = DALIClassificationIterator(valpipes, reader_name=\"Reader\", fill_last_batch=False)",
"_____no_output_____"
]
],
[
[
"## Training with MXNet\n\nOnce we have MXNet data iterators from `DALIClassificationIterator`, we can use them instead of MXNet's`mx.io.ImageRecordIter`. Here we show modified `train_imagenet.py` example that uses our DALI pipelines.",
"_____no_output_____"
]
],
[
[
"import os.path\nimport argparse\nimport logging\nlogging.basicConfig(level=logging.DEBUG)\nfrom resnetn.common import find_mxnet, data, fit\nimport mxnet as mx\n\ngpus_string = \"\".join(str(list(range(N)))).replace('[','').replace(']','')\n\ns = ['--gpu', gpus_string,\n '--batch-size', str(batch_size * N),\n '--num-epochs', '1',\n '--data-train', '/data/imagenet/train-480-val-256-recordio/train.rec',\n '--data-val', '/data/imagenet/train-480-val-256-recordio/val.rec',\n '--disp-batches', '100',\n '--network', 'resnet-v1',\n '--num-layers', '50',\n '--data-nthreads', '40',\n '--min-random-scale', '0.533',\n '--max-random-shear-ratio', '0',\n '--max-random-rotate-angle', '0',\n '--max-random-h', '0',\n '--max-random-l', '0',\n '--max-random-s', '0',\n '--dtype', 'float16']\n\n# parse args\nparser = argparse.ArgumentParser(description=\"train imagenet-1k\",\n formatter_class=argparse.ArgumentDefaultsHelpFormatter)\nfit.add_fit_args(parser)\ndata.add_data_args(parser)\ndata.add_data_aug_args(parser)\n# use a large aug level\ndata.set_data_aug_level(parser, 3)\nparser.set_defaults(\n # network\n network = 'resnet',\n num_layers = 50,\n # data\n num_classes = 1000,\n num_examples = 1281167,\n image_shape = '3,224,224',\n min_random_scale = 1, # if input image has min size k, suggest to use\n # 256.0/x, e.g. 0.533 for 480\n # train\n num_epochs = 80,\n lr_step_epochs = '30,60',\n dtype = 'float32'\n )\nargs = parser.parse_args(s)\n\n\n# load network\nfrom importlib import import_module\nnet = import_module('resnetn.symbols.'+args.network)\nsym = net.get_symbol(1000, 50, \"3,224,224\", dtype='float16')\n\ndef get_dali_iter(args, kv=None):\n return (dali_train_iter, dali_val_iter)\n\n# train\n#fit.fit(args, sym, data.get_rec_iter)\nfit.fit(args, sym, get_dali_iter)",
"INFO:root:start with arguments Namespace(batch_size=1024, benchmark=0, data_nthreads=40, data_train='/data/imagenet/train-480-val-256-recordio/train.rec', data_train_idx='', data_val='/data/imagenet/train-480-val-256-recordio/val.rec', data_val_idx='', disp_batches=100, dtype='float16', gc_threshold=0.5, gc_type='none', gpus='0, 1, 2, 3, 4, 5, 6, 7', image_shape='3,224,224', initializer='default', kv_store='device', load_epoch=None, loss='', lr=0.1, lr_factor=0.1, lr_step_epochs='30,60', macrobatch_size=0, max_random_aspect_ratio=0.25, max_random_h=0, max_random_l=0, max_random_rotate_angle=0, max_random_s=0, max_random_scale=1, max_random_shear_ratio=0.0, min_random_scale=0.533, model_prefix=None, mom=0.9, monitor=0, network='resnet-v1', num_classes=1000, num_epochs=1, num_examples=1281167, num_layers=50, optimizer='sgd', pad_size=0, random_crop=1, random_mirror=1, rgb_mean='123.68,116.779,103.939', test_io=0, top_k=0, warmup_epochs=5, warmup_strategy='linear', wd=0.0001)\nINFO:root:Epoch[0] Batch [100]\tSpeed: 4407.30 samples/sec\taccuracy=0.001141\nINFO:root:Epoch[0] Batch [200]\tSpeed: 4444.77 samples/sec\taccuracy=0.003184\nINFO:root:Epoch[0] Batch [300]\tSpeed: 4395.88 samples/sec\taccuracy=0.006074\nINFO:root:Epoch[0] Batch [400]\tSpeed: 4384.70 samples/sec\taccuracy=0.011182\nINFO:root:Epoch[0] Batch [500]\tSpeed: 4389.42 samples/sec\taccuracy=0.017441\nINFO:root:Epoch[0] Batch [600]\tSpeed: 4382.10 samples/sec\taccuracy=0.026377\nINFO:root:Epoch[0] Batch [700]\tSpeed: 4388.26 samples/sec\taccuracy=0.036611\nINFO:root:Epoch[0] Batch [800]\tSpeed: 4383.51 samples/sec\taccuracy=0.047139\nINFO:root:Epoch[0] Batch [900]\tSpeed: 4402.73 samples/sec\taccuracy=0.057686\nINFO:root:Epoch[0] Batch [1000]\tSpeed: 4392.32 samples/sec\taccuracy=0.067861\nINFO:root:Epoch[0] Batch [1100]\tSpeed: 4384.42 samples/sec\taccuracy=0.079248\nINFO:root:Epoch[0] Batch [1200]\tSpeed: 4385.37 samples/sec\taccuracy=0.090088\nINFO:root:Epoch[0] Train-accuracy=0.098537\nINFO:root:Epoch[0] Time cost=295.153\nWARNING:root:DALI iterator does not support resetting while epoch is not finished. Ignoring...\nINFO:root:Epoch[0] Validation-accuracy=0.104393\n"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
]
] |
cbfc038dd926f3c81755a4e606fee92fa6fd11e6
| 63,353 |
ipynb
|
Jupyter Notebook
|
code/manipulate_regonline_output.ipynb
|
mattgiguere/EPRV
|
238b9e8ca8b3086ea569b5b4a387e1a204202080
|
[
"MIT"
] | null | null | null |
code/manipulate_regonline_output.ipynb
|
mattgiguere/EPRV
|
238b9e8ca8b3086ea569b5b4a387e1a204202080
|
[
"MIT"
] | null | null | null |
code/manipulate_regonline_output.ipynb
|
mattgiguere/EPRV
|
238b9e8ca8b3086ea569b5b4a387e1a204202080
|
[
"MIT"
] | null | null | null | 30.949194 | 364 | 0.485802 |
[
[
[
"#manipulate_regonline_output\n\nThis notebook reads the RegOnline output into a pandas DataFrame and reworks it to have each row contain the attendee, the Doppler Primer Session, the Monday Breakout session, and the Tuesday breakout session in each row.",
"_____no_output_____"
]
],
[
[
"import re\nimport numpy as np\nimport pandas as pd\nimport matplotlib.pyplot as plt\nimport matplotlib.image as mpimg\nimport matplotlib",
"_____no_output_____"
],
[
"#%matplotlib inline",
"_____no_output_____"
]
],
[
[
"### Read the RegOnline output into a pandas DataFrame",
"_____no_output_____"
]
],
[
[
"df = pd.read_excel('/Users/matt/projects/EPRV/data/AttendeeReportCrop_20150703.xls', encoding='utf-8')",
"_____no_output_____"
],
[
"df.columns",
"_____no_output_____"
],
[
"df.loc[36:37]",
"_____no_output_____"
]
],
[
[
"### Extract the Sunday Sessions\n\nRegOnline outputs multiple entries for each person, and each entry differs by the `AgendaItem`. `AgendaItem`s exist for all sessions happening on all days. In this section, we extract the sessions happening on Sunday, which are all prefixed by \"Doppler Primer: \".",
"_____no_output_____"
]
],
[
[
"sundf = df[df['AgendaItem'].str.contains('Doppler Primer:')].copy()\nlen(sundf)",
"_____no_output_____"
]
],
[
[
"Let's create two new columns in our DataFrame: the `Primer`, and the `PrimerID`. The `Primer` column will contain the name of the Doppler Primer session (minus the `Doppler Primer: ` prefix), and the `PrimerID` will be a session identifier that will later be used in plotting.",
"_____no_output_____"
]
],
[
[
"sundf['PrimerID'] = 0",
"_____no_output_____"
],
[
"sundf['Primer'] = [re.search(r'(.*):\\s(.*)$', item).group(2) for item in sundf['AgendaItem']]",
"_____no_output_____"
],
[
"sundf[['AgendaItem', 'Primer']].head(3)",
"_____no_output_____"
],
[
"sundf['Primer'].unique()",
"_____no_output_____"
]
],
[
[
"Now loop through the five unique sessions, updating the `PrimerID` column for each participant:",
"_____no_output_____"
]
],
[
[
"dopID = 0\nfor agItem in sundf['Primer'].unique():\n sundf.loc[sundf['Primer'] == agItem, 'PrimerID'] = dopID\n dopID += 1",
"_____no_output_____"
]
],
[
[
"Create an abbreviated code for each session. This will be added to the nametag to spark conversation among participants.",
"_____no_output_____"
]
],
[
[
"sun_ses = ['IC', 'DC', 'SM', 'SA', 'NA']",
"_____no_output_____"
]
],
[
[
"A quick preview of the first few rows to see the result:",
"_____no_output_____"
]
],
[
[
"sundf[['AgendaItem', 'Primer', 'PrimerID']].head(4)",
"_____no_output_____"
]
],
[
[
"### Extract the Monday Sessions\n\nNow to do the same for the Monday sessions.",
"_____no_output_____"
]
],
[
[
"mondf = df[df['AgendaItem'].str.contains('Monday Break-out:')].copy()\nlen(mondf)",
"_____no_output_____"
],
[
"mondf['MonID'] = 0\n\nmondf['Monday'] = [re.search(r'(.*):\\s(.*)$', item).group(2) for item in mondf['AgendaItem']]\n\nmondf['Monday'].unique()\n\nmonID = 0\nfor agItem in mondf['Monday'].unique():\n mondf.loc[mondf['Monday'] == agItem, 'MonID'] = monID\n monID += 1",
"_____no_output_____"
],
[
"mondf['Monday'].unique()",
"_____no_output_____"
],
[
"mon_ses = ['FS', 'NA', 'TC', 'BC', 'FC']",
"_____no_output_____"
],
[
"mondf[['AgendaItem', 'Monday', 'MonID']].head(4)",
"_____no_output_____"
]
],
[
[
"### Extract Tuesday Sessions",
"_____no_output_____"
]
],
[
[
"tuedf = df[df['AgendaItem'].str.contains('Tuesday Break-out:')].copy()\nlen(tuedf)",
"_____no_output_____"
],
[
"tuedf['TueID'] = 0\n\ntuedf['Tuesday'] = [re.search(r'(.*):\\s(.*)$', item).group(2) for item in tuedf['AgendaItem']]\n\ntuedf['Tuesday'].unique()\n\ntuesID = 0\nfor agItem in tuedf['Tuesday'].unique():\n tuedf.loc[tuedf['Tuesday'] == agItem, 'TueID'] = tuesID\n tuesID += 1",
"_____no_output_____"
],
[
"tuedf['Tuesday'].unique()",
"_____no_output_____"
],
[
"tue_ses = ['ST', 'DC', 'LB', 'PS', 'NA']",
"_____no_output_____"
],
[
"tuedf[['AgendaItem', 'Tuesday', 'TueID']].head(4)",
"_____no_output_____"
]
],
[
[
"### Combine the DataFrames\n\nWe only need to join on one field. However, pandas does something weird, where it creates multiple `GroupId_x` columns when joining multiple times. The simple solution is just to join on multiple columns since we know they're all consistent.",
"_____no_output_____"
]
],
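    [
      [
        "To see why, here is a tiny illustration with hypothetical toy frames (not part of the registration data):\n\n```python\nimport pandas as pd\n\na = pd.DataFrame({'RegId': [1], 'GroupId': [10]})\nb = pd.DataFrame({'RegId': [1], 'GroupId': [10], 'Primer': ['IC']})\n\n# joining on 'RegId' alone duplicates the shared 'GroupId' column:\nprint(pd.merge(a, b, on='RegId').columns.tolist())\n# -> ['RegId', 'GroupId_x', 'GroupId_y', 'Primer']\n\n# joining on both shared columns keeps a single 'GroupId':\nprint(pd.merge(a, b, on=['RegId', 'GroupId']).columns.tolist())\n# -> ['RegId', 'GroupId', 'Primer']\n```",
        "_____no_output_____"
      ]
    ],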
[
[
"fulldf = df[['RegId', 'GroupId', 'FirstName', 'LastName', 'Company']]",
"_____no_output_____"
],
[
"print(len(fulldf))\nfulldf = fulldf.drop_duplicates()\nprint(len(fulldf))\nprint(len(sundf))\nprint(len(mondf))\nprint(len(tuedf))",
"396\n137\n126\n135\n135\n"
],
[
"fulldf.columns",
"_____no_output_____"
],
[
"sundf.columns",
"_____no_output_____"
],
[
"newdf = pd.merge(fulldf, sundf, on=['RegId', 'GroupId', 'FirstName', 'LastName', 'Company'], how='left')\nprint(len(newdf))\n\nnewdf = pd.merge(newdf, mondf, on=['RegId', 'GroupId', 'FirstName', 'LastName', 'Company'], how='left')\nprint(len(newdf))\n\nnewdf = pd.merge(newdf, tuedf, on=['RegId', 'GroupId', 'FirstName', 'LastName', 'Company'], how='left')\nprint(len(newdf))",
"137\n137\n137\n"
],
[
"newdf.head(5)",
"_____no_output_____"
],
[
"newdf.columns",
"_____no_output_____"
]
],
[
[
"Now create a new DataFrame that is a subset of the `newdf` with only the columns of interest. Also, make sure the DataFrame is sorted by lastname, the index is reset, and it's a copy of `newdf` instead of a pointer to `newdf`.",
"_____no_output_____"
]
],
[
[
"finaldf = newdf[['FirstName', 'LastName', 'Company', 'Primer', 'PrimerID', 'Monday', 'MonID', 'Tuesday', 'TueID']].sort('LastName').reset_index().copy()",
"_____no_output_____"
],
[
"finaldf.head(5)",
"_____no_output_____"
],
[
"len(finaldf)",
"_____no_output_____"
],
[
"finaldf.columns",
"_____no_output_____"
]
],
[
[
"Now replace all empty cells for \"Company\" to a very general location:",
"_____no_output_____"
]
],
[
[
"finaldf.Company = ['Earth' if pd.isnull(company_el) else company_el for company_el in finaldf.Company]",
"_____no_output_____"
]
],
[
[
"Replace NaNs for PrimerID with the \"Not Attending\" ID:",
"_____no_output_____"
]
],
[
[
"finaldf.PrimerID = [4 if pd.isnull(primerid_el) else primerid_el for primerid_el in finaldf.PrimerID]",
"_____no_output_____"
]
],
[
[
"Check for NaNs in the Monday ID:",
"_____no_output_____"
]
],
[
[
"len(finaldf[pd.isnull(finaldf['MonID'])])",
"_____no_output_____"
]
],
[
[
"Replace NaNs for the MonID with the \"Not Attending\" ID:",
"_____no_output_____"
]
],
[
[
"finaldf.MonID = [4 if pd.isnull(monid_el) else monid_el for monid_el in finaldf.MonID]",
"_____no_output_____"
],
[
"len(finaldf[pd.isnull(finaldf['MonID'])])",
"_____no_output_____"
]
],
[
[
"Replace NaNs for the TueID with the \"Not Attending\" ID:",
"_____no_output_____"
]
],
[
[
"len(finaldf[pd.isnull(finaldf['TueID'])])",
"_____no_output_____"
],
[
"finaldf.TueID = [4 if pd.isnull(tueid_el) else tueid_el for tueid_el in finaldf.TueID]",
"_____no_output_____"
],
[
"len(finaldf[pd.isnull(finaldf['TueID'])])",
"_____no_output_____"
]
],
[
[
"Test out the wrap-around text for institute for participants that have long institution names. This regular expression will look for institutions (or Companies, as RegOnline refers to them), and find items that have a '/', and if no '/', either a '-', ',', or 'at' in the text. If so, add a newline character to make the text wrap around to the next line.\n\nWe'll first test the output on a participant's institution that contains both a '/' and a '-':",
"_____no_output_____"
]
],
[
[
"p = re.compile ('(/|^(?!.*/).*-|^(?!.*/).*,|^(?!.*/).*\\sat\\s)')\np.subn(r'\\1\\n', finaldf.loc[2].Company)[0]",
"_____no_output_____"
]
],
[
[
"And test a cell that is long, contains `at`, but `at` is part of a longer word:",
"_____no_output_____"
]
],
[
[
"p.subn(r'\\1\\n', finaldf.loc[53].Company)[0]",
"_____no_output_____"
]
],
[
[
"And a quick test on a few more institutions:",
"_____no_output_____"
]
],
[
[
"[p.sub(r'\\1\\n', company_el) if len(company_el) > 30 else company_el for company_el in finaldf.head(5).Company.values]",
"_____no_output_____"
]
],
[
[
"Now update the full `Company` column of the DataFrame:",
"_____no_output_____"
]
],
[
[
"finaldf.Company = [p.sub(r'\\1\\n', company_el) if len(company_el) > 30 else company_el for company_el in finaldf.Company.values]",
"_____no_output_____"
]
],
[
[
"## Plot Labels\n\nNow that we have our DataFrame cleaned up the way we want it we can print the data to the Avery 5392 format. This format contains 6 4\"x3\" nametags per sheet.",
"_____no_output_____"
]
],
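    [
      [
        "The axes rectangles used below follow directly from the page geometry, expressed in figure-fraction units on an 8.5\"x11\" sheet (a quick check of the constants, not new functionality):\n\n```python\nwidth = 4.0 / 8.5     # tag width:  ~0.4706\nheight = 3.0 / 11.0   # tag height: ~0.273\nleft = (8.5 - 2 * 4.0) / 8.5 / 2   # left-column margin: ~0.0294\nbottom = 1.0 / 11.0   # bottom-row offset: ~0.091; rows then sit at 0.091, 0.364, 0.637\n```",
        "_____no_output_____"
      ]
    ],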
[
[
"png = mpimg.imread('/Users/matt/projects/EPRV/images/NameTag2.png')",
"_____no_output_____"
],
[
"png.shape",
"_____no_output_____"
],
[
"import matplotlib.font_manager as mfm\nfontpaths = fontpaths=['/System/Library/Fonts/',\n '/Library/Fonts',\n '/Library/Fonts/Microsoft',\n '/usr/X11/lib/X11/fonts',\n '/opt/X11/share/fonts',\n '/Users/matt/Library/Fonts']\n\nblaa = mfm.findSystemFonts(fontpaths=fontpaths)",
"_____no_output_____"
],
[
"colors = ['#FFE2A9', '#4BA4D8', '#768085', '#BF5338', '#335B8F']\ncolors2 = ['#335B8F', '#BF5338', '#768085', '#4BA4D8', '#FFE2A9']\ncolors3 = ['#4BA4D8', '#FFE2A9', '#BF5338', '#768085', '#335B8F']\n\ncirc_ypos = 775\nname_dict = {'family': 'YaleNew-Roman',\n 'color': '#D6E8E1',\n 'weight': 'bold',\n 'size': 28\n }\n\ncompany_dict = {'family': 'YaleNew-Roman',\n 'color': '#D6E8E1',\n 'weight': 'bold',\n 'size': 16\n }\n\ncircle_dict = {'family': 'YaleNew-Roman',\n 'color': '#1D2523',\n 'weight': 'normal',\n 'size': 20\n }\n\n\ndef change_name_size(name, name_dict):\n if len(name) < 16:\n name_dict['size'] = 28\n elif ((len(name) >= 16) and (len(name) < 19)):\n name_dict['size'] = 24\n elif ((len(name) >= 19) and (len(name) < 24)):\n name_dict['size'] = 20\n elif ((len(name) >= 24) and (len(name) < 30)):\n name_dict['size'] = 17\n else:\n name_dict['size'] = 16\n return name_dict\n \n\ndef change_company_size(company, company_dict):\n newlines = len(re.findall(r'\\n', finaldf.loc[0].Company))\n if newlines == 0:\n if len(company) < 15:\n company_dict['size'] = 18\n elif ((len(company) >= 15) and (len(company) < 30)):\n company_dict['size'] = 14\n elif ((len(company) >= 30) and (len(company) < 40)):\n company_dict['size'] = 12\n elif ((len(company) >= 40) and (len(company) < 50)):\n company_dict['size'] = 10\n else:\n company_dict['size'] = 8\n else:\n if len(company) < 15:\n company_dict['size'] = 18\n elif ((len(company) >= 15) and (len(company) < 40)):\n company_dict['size'] = 14\n elif ((len(company) >= 40) and (len(company) < 50)):\n company_dict['size'] = 12\n else:\n company_dict['size'] = 10\n return company_dict\n \n\n# The HP Color LaserJet CP4020 offsets things by 1/16th of an inch left-to-right.\n# This fudge factor should fix that:\nhrz_fdg = 1. 
/ 16./ 8.5\nleftarr = np.array([0.0294, 0.5, 0.0294, 0.5, 0.0294, 0.5]) + hrz_fdg\nbottomarr = [0.091, 0.091, 0.364, 0.364, 0.637, 0.637]\nwidth = 0.4706\nheight = 0.273\n\n# loop through the total number of pages:\nfor page in range(int(np.ceil((len(finaldf))/6.))):\n print('Now on page: {}'.format(page))\n fig = plt.figure(figsize=(8.5, 11))\n for indx in range(6):\n # add an if statement to handle the last page if there are less than\n # six participants remaining:\n if ((page*6 + indx) < len(finaldf)):\n rect = [leftarr[indx], bottomarr[indx], width, height]\n ax = fig.add_axes(rect)\n ax.imshow(png)\n ax.get_xaxis().set_visible(False)\n ax.get_yaxis().set_visible(False)\n\n print(u'Now making name tag for: {} {}'.format(finaldf.loc[page*6 + indx].FirstName, finaldf.loc[page*6 + indx].LastName))\n \n #add name text:\n name = finaldf.loc[page*6 + indx].FirstName + ' ' + finaldf.loc[page*6 + indx].LastName \n this_name_dict = change_name_size(name, name_dict)\n ax.text(600, 500, name, fontdict=this_name_dict, horizontalalignment='center')\n\n #add company text:\n company = finaldf.loc[page*6 + indx].Company\n this_co_dict = change_company_size(company, company_dict)\n ax.text(600, 625, company, fontdict=this_co_dict, horizontalalignment='center')\n\n #add circles for sessions:\n circ1 = plt.Circle((750, circ_ypos), 70, color=colors[int(finaldf.loc[page*6 + indx].PrimerID)])\n fig.gca().add_artist(circ1)\n ax.text(750, circ_ypos + 27.5, sun_ses[int(finaldf.loc[page*6 + indx].PrimerID)], fontdict=circle_dict, horizontalalignment='center')\n\n circ2 = plt.Circle((925, circ_ypos), 70, color=colors2[int(finaldf.loc[page*6 + indx].MonID)])\n fig.gca().add_artist(circ2)\n ax.text(925, circ_ypos + 27.5, mon_ses[int(finaldf.loc[page*6 + indx].MonID)], fontdict=circle_dict, horizontalalignment='center')\n\n circ3 = plt.Circle((1100, circ_ypos), 70, color=colors3[int(finaldf.loc[page*6 + indx].TueID)])\n fig.gca().add_artist(circ3)\n ax.text(1100, circ_ypos + 27.5, tue_ses[int(finaldf.loc[page*6 + indx].TueID)], fontdict=circle_dict, horizontalalignment='center')\n\n plt.savefig('../nametags/nameTags_bold_p'+str(page)+'.png', dpi=300)\n",
"Now on page: 0\nNow making name tag for: Arthur Adams\nNow making name tag for: Rachel Akeson\nNow making name tag for: Guillem Anglada-Escude\nNow making name tag for: Ruth Angus\nNow making name tag for: Pamela Arriagada\nNow making name tag for: Mariona Badenas\nNow on page: 1\nNow making name tag for: Roman Baluev\nNow making name tag for: Fabienne Bastien\nNow making name tag for: Ozgur Basturk\nNow making name tag for: Sarbani Basu\nNow making name tag for: Florian Bauer\nNow making name tag for: Jacob Bean\nNow on page: 2\nNow making name tag for: Thomas Beatty\nNow making name tag for: Eric Bechter\nNow making name tag for: Megan Bedell\nNow making name tag for: Sagi Ben-Ami\nNow making name tag for: Cullen Blake\nNow making name tag for: Andreas Boesch\nNow on page: 3\nNow making name tag for: Adam Bolton\nNow making name tag for: Francesco Borsa\nNow making name tag for: Francois Bouchy\nNow making name tag for: Brendan Bowler\nNow making name tag for: Tabetha Boyajian\nNow making name tag for: John Brewer\nNow on page: 4\nNow making name tag for: Lars A. Buchhave\nNow making name tag for: Jennifer Burt\nNow making name tag for: Richard Capps\nNow making name tag for: Maria Federica Cersullo\nNow making name tag for: Abhijit Chakraborty\nNow making name tag for: Jessi Cisewski\nNow on page: 5\nNow making name tag for: William Cochran\nNow making name tag for: Uriel Conod\nNow making name tag for: Matthew Cornachione\nNow making name tag for: Justin Crepp\nNow making name tag for: Mario Damasso\nNow making name tag for: Allen Davis\nNow on page: 6\nNow making name tag for: Rebekah Dawson\nNow making name tag for: Fabio Del Sordo\nNow making name tag for: Rodrigo Diaz\nNow making name tag for: Scott Diddams\nNow making name tag for: Courtney Dressing\nNow making name tag for: Xavier Dumusque\nNow on page: 7\nNow making name tag for: Jason Eastman\nNow making name tag for: Michael Endl\nNow making name tag for: Joรฃo Faria\nNow making name tag for: Tobias Feger\nNow making name tag for: Pedro Figueira\nNow making name tag for: Debra Fischer\nNow on page: 8\nNow making name tag for: Eric Ford\nNow making name tag for: Daniel Foreman-Mackey\nNow making name tag for: BJ Fulton\nNow making name tag for: Gabor Furesz\nNow making name tag for: Peter Gao\nNow making name tag for: Bernard Gaudi\nNow on page: 9\nNow making name tag for: Matteo Genoni\nNow making name tag for: Paolo Giacobbe\nNow making name tag for: Matt Giguere\nNow making name tag for: Steve Girvin\nNow making name tag for: Erica Gonzales\nNow making name tag for: Frank Grundahl\nNow on page: 10\nNow making name tag for: Guillaume HEBRARD\nNow making name tag for: Sam Halverson\nNow making name tag for: Artie Hatzes\nNow making name tag for: Raphaelle Haywood\nNow making name tag for: Guillaume Hebrard\nNow making name tag for: Enrique Herrero\nNow on page: 11\nNow making name tag for: David Hogg\nNow making name tag for: Joshua Hopgood\nNow making name tag for: Andrew Howard\nNow making name tag for: John Johnson\nNow making name tag for: Paul Jorden\nNow making name tag for: Colby Jurgenson\nNow on page: 12\nNow making name tag for: Marco Landoni\nNow making name tag for: Antonino Francesco Lanza\nNow making name tag for: David Latham\nNow making name tag for: Gregory Laughlin\nNow making name tag for: Christophe Lovis\nNow making name tag for: Zaira M. 
Berdiรฑas\nNow on page: 13\nNow making name tag for: Bo Ma\nNow making name tag for: Gregory Mace\nNow making name tag for: Suvrath Mahadevan\nNow making name tag for: Luca Malavolta\nNow making name tag for: Rosemary Mardling\nNow making name tag for: Tyler McCracken\nNow on page: 14\nNow making name tag for: Nate McCrady\nNow making name tag for: Michael Mossman\nNow making name tag for: Ati Motalebi\nNow making name tag for: Claire Moutou\nNow making name tag for: Benjamin Nelson\nNow making name tag for: Grzegorz Nowak\nNow on page: 15\nNow making name tag for: Nikhil Padmanabhan\nNow making name tag for: Hannu Parviainen\nNow making name tag for: Francesco Pepe\nNow making name tag for: Mario Perez\nNow making name tag for: David Phillips\nNow making name tag for: Peter Plavchan\nNow on page: 16\nNow making name tag for: Lisa Prato\nNow making name tag for: Sam Quinn\nNow making name tag for: Andreas Quirrenbach\nNow making name tag for: Jayadev Rajagopal\nNow making name tag for: Vinesh Rajpaul\nNow making name tag for: Gert Raskin\nNow on page: 17\nNow making name tag for: Ansgar Reiners\nNow making name tag for: Paul Robertson\nNow making name tag for: Albert Rosich\nNow making name tag for: Arpita Roy\nNow making name tag for: Nuno Santos\nNow making name tag for: Luis Fernando Sarmiento\nNow on page: 18\nNow making name tag for: Bun'ei Sato\nNow making name tag for: David Sawyer\nNow making name tag for: Joseph Schmitt\nNow making name tag for: Christian Schwab\nNow making name tag for: Andreas Seifahrt\nNow making name tag for: Evan Sinukoff\nNow on page: 19\nNow making name tag for: David Sliski\nNow making name tag for: Scott Smith\nNow making name tag for: Alessandro Sozzetti\nNow making name tag for: Gudmundur Stefansson\nNow making name tag for: Klaus Strassmeier\nNow making name tag for: Julian Stuermer\nNow on page: 20\nNow making name tag for: Andrew Szentgyorgyi\nNow making name tag for: Damien Sรฉgransan\nNow making name tag for: Angelle Tanner\nNow making name tag for: Ryan Terrien\nNow making name tag for: Renรฉ Tronsgaard Rasmussen\nNow making name tag for: Stephane Udry\nNow on page: 21\nNow making name tag for: Jeffrey Valenti\nNow making name tag for: Sharon Xuesong Wang\nNow making name tag for: Ji Wang\nNow making name tag for: Michael Weber\nNow making name tag for: Lauren Weiss\nNow making name tag for: Robert Wittenmyer\nNow on page: 22\nNow making name tag for: Jason Wright\nNow making name tag for: Mesut YILMAZ\nNow making name tag for: Xu Yi\nNow making name tag for: Mesut Yilmaz\nNow making name tag for: Mathias Zechmeister\n"
],
[
"finaldf.columns",
"_____no_output_____"
],
[
"finaldf.FirstName.values",
"_____no_output_____"
],
[
"finaldf.LastName.values",
"_____no_output_____"
],
[
"hrz_fdg = 1. / 16./ 8.5\nleftarr = np.array([0.0294, 0.5, 0.0294, 0.5, 0.0294, 0.5])\n",
"_____no_output_____"
],
[
"leftarr + hrz_fdg",
"_____no_output_____"
]
]
] |
[
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code",
"markdown",
"code"
] |
[
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code"
],
[
"markdown"
],
[
"code",
"code",
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code"
],
[
"markdown"
],
[
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code",
"code"
]
] |
cbfc062b6e15485602e7ae08382f0f1fae22a6ef
| 16,219 |
ipynb
|
Jupyter Notebook
|
Reports/0124-Using_Optimization_in_Hyperparameter_settings_in_Deep_Learning.ipynb
|
sille1994/Optimization-of-Hyperparameters-in-a-Convolution-Neural-Network
|
5976467bfa35e2679c0fa85c73187c6b6381d115
|
[
"MIT"
] | null | null | null |
Reports/0124-Using_Optimization_in_Hyperparameter_settings_in_Deep_Learning.ipynb
|
sille1994/Optimization-of-Hyperparameters-in-a-Convolution-Neural-Network
|
5976467bfa35e2679c0fa85c73187c6b6381d115
|
[
"MIT"
] | null | null | null |
Reports/0124-Using_Optimization_in_Hyperparameter_settings_in_Deep_Learning.ipynb
|
sille1994/Optimization-of-Hyperparameters-in-a-Convolution-Neural-Network
|
5976467bfa35e2679c0fa85c73187c6b6381d115
|
[
"MIT"
] | 1 |
2020-09-25T16:17:14.000Z
|
2020-09-25T16:17:14.000Z
| 52.830619 | 1,958 | 0.684444 |
[
[
[
"# <center>Using Optimization in Hyperparameter settings in Deep Learning</center>\n\n<center>by Cecilie Dura Andrรฉ</center>\n\n\n\n\n<img src=\"https://blog.ml.cmu.edu/wp-content/uploads/2018/12/heatmap.001-min.jpeg\" width=\"90%\">\n<p style=\"text-align: right;\">Image from: https://blog.ml.cmu.edu/2018/12/12/massively-parallel-hyperparameter-optimization/</p>\n",
"_____no_output_____"
],
[
"---",
"_____no_output_____"
],
[
"In Deep Learning people often has to explore network structure, regularization, and optimization to get the best model. Thus, automated hyperparameter optimization (HPO) is needed and it is shown that tailed solution to a problem leads to the state-of-the-art performance of the model (Feurer and Hutter 2019). It also leads to fair comparisons of the model with different hyperparameters, thus the reproducibility of the studies would become better (J Bergstra, Yamins, and Cox).\n\nThe most basic HPO method is called Grid search or full factorial design. Here the user selects a couple of given values for each of the hyperparameters. Then, a grid search can be used to evaluate the Cartesian product. This requires a lot of computational memory and this model also suffers from the curse of dimensionality, when the number of hyperparameters becomes too big (โDesign and Analysis of Experiments by Douglas Montgomery: A Supplement for Using JMPโ 2013).\n\nAlternatively to grid search there is also something called random search (James Bergstra and Bengio). Here a selected finite search space is given and the model can randomly select values for the hyperparameters within the search space. This method is preferred over grid search if one of the hyperparameters are more important than others (Hutter, Hoos, and Leyton-Brown). This method will often with time fine the optimum, but it takes a longer time than guided search methods.\n\nPopulation-based methods, e.g. genetic algorithms, evolutionary algorithms, and evolutionary strategies, can also be used. Here a set of configurations is maintained. Small changes and different combinations are used to find a better configuration.\n\nBayesian optimization is the preferred method for HPO in tuning deep neural networks. By using Bayesian optimization in deep learning state-of-the-art results have been seen in image classification (Snoek, Larochelle, and Adams), (Snoek et al. 2015). Bayesian optimization is an iterative model, which first calculates the probabilistic surrogate model fitted to all observations. An acquisition function then determines different points, trade-offs, e.g.",
"_____no_output_____"
],
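    [
        "As a small illustration of the random-search baseline described above, here is a minimal sketch (the search space and the `train_and_score` function are hypothetical placeholders, not taken from any of the cited works):\n\n```python\nimport math\nimport random\n\n# hypothetical search space: log-uniform learning rate, categorical batch size\nsearch_space = {'lr': (1e-5, 1e-1), 'batch_size': [32, 64, 128]}\n\ndef sample_config():\n    lo, hi = search_space['lr']\n    lr = 10 ** random.uniform(math.log10(lo), math.log10(hi))\n    return {'lr': lr, 'batch_size': random.choice(search_space['batch_size'])}\n\ndef train_and_score(cfg):\n    # hypothetical stand-in: train a model with cfg, return validation accuracy\n    return random.random()\n\nbest_score, best_cfg = -float('inf'), None\nfor _ in range(20):\n    cfg = sample_config()\n    score = train_and_score(cfg)\n    if score > best_score:\n        best_score, best_cfg = score, cfg\n```",
        "_____no_output_____"
    ],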
[
"---\n# References\n\nBergstra, James, and Yoshua Bengio. โRandom Search for Hyper-Parameter Optimization,โ 25.\n\nBergstra, J, D Yamins, and D D Cox. โMaking a Science of Model Search: Hyperparameter Optimization in Hundreds of Dimensions for Vision Architectures,โ 9. โDesign and Analysis of Experiments by Douglas Montgomery: A Supplement for Using JMP.โ 2013, 26.\n\nFeurer, Matthias, and Frank Hutter. 2019. โHyperparameter Optimization.โ In Automated Machine Learning, edited by Frank Hutter, Lars Kotthoff, and Joaquin Vanschoren, 3โ33. Cham: Springer International Publishing. http://link.springer.com/10.1007/978-3-030-05318-5_1.\n\nHutter, Frank, Holger Hoos, and Kevin Leyton-Brown. โAn Efficient Approach for Assessing Hyperparameter Importance,โ 9.\n\nSnoek, Jasper, Hugo Larochelle, and Ryan P Adams. โPractical Bayesian Optimization of Machine Learning Algorithms,โ 9.\n\nSnoek, Jasper, Oren Rippel, Kevin Swersky, Ryan Kiros, Nadathur Satish, Narayanan Sundaram, Md Mostofa Ali Patwary, \n\nPrabhat, and Ryan P. Adams. 2015. โScalable Bayesian Optimization Using Deep Neural Networks.โ arXiv:1502.05700 [stat], July. http://arxiv.org/abs/1502.05700.",
"_____no_output_____"
]
]
] |
[
"markdown"
] |
[
[
"markdown",
"markdown",
"markdown",
"markdown"
]
] |