{ "cells": [ { "cell_type": "markdown", "metadata": {}, "source": [ "# Vision Transformer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "原論文 \n", "https://openreview.net/forum?id=YicbFdNTTy" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## アルゴリズムの理解" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "ViT は大きく以下の 3 つの部分で構成されている(画像クラス分類の場合)\n", "\n", "- Input Layer\n", " - 入力(画像)をパッチに分割\n", " - 「クラストークン」 と 「パッチ」 のベクトルを出力\n", "- Encoder\n", " - Self-Attention 処理\n", " - 「クラストークン」 を出力\n", "- MLPHead\n", " - 入力画像に対するラベルを予測(クラス分類器)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_01](image/vit_01.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Input Layer" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "1. パッチに分割\n", "2. 埋め込み (Embedding)\n", "3. CLS (Class Token)\n", "4. 位置埋め込み (Positional Embedding)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "説明のため パッチ 4 つ で図解" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- パッチに分割" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_patch](image/vit_patch.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- 埋め込み" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_patch2emb](image/vit_patch2emb.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- クラストークンと位置埋め込み" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_patch2clsemb](image/vit_patch2clsemb.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- 実装" ] }, { "cell_type": "code", "execution_count": 2, "metadata": { "gather": { "logged": 1667634927256 } }, "outputs": [], "source": [ "import torch\n", "import torch.nn as nn\n", "import torch.nn.functional as F" ] }, { "cell_type": "code", "execution_count": 3, "metadata": { "gather": { "logged": 1667634927440 } }, "outputs": [], "source": [ "class VitInputLayer(nn.Module):\n", " def __init__(self,\n", " 
in_channels:int=3,\n", " emb_dim:int=384,\n", " nump_patch_row:int=2,\n", " image_size:int=32):\n", "\n", " super(VitInputLayer, self).__init__()\n", " self.in_channels = in_channels\n", " self.emb_dim = emb_dim\n", " self.num_patch_row = nump_patch_row\n", " self.image_size = image_size\n", "\n", " # パッチ数\n", " self.num_patch = self.num_patch_row**2\n", " # パッチのサイズ : 画像 1 辺が 32 なら patch_size = 16\n", " self.patch_size = int(self.image_size // self.num_patch_row)\n", "\n", " # 入力画像のパッチ分割・埋め込み\n", " self.patch_emb_layer = nn.Conv2d(\n", " in_channels=self.in_channels,\n", " out_channels=self.emb_dim,\n", " kernel_size=self.patch_size,\n", " stride=self.patch_size)\n", "\n", " # CLS\n", " self.cls_token = nn.Parameter(torch.randn(1, 1, emb_dim))\n", "\n", " # Position Embedding\n", " # CLS が先頭に結合されているため長さ emb_dim の位置埋め込みベクトルを(パッチ数 +1)個用意\n", " self.pos_emb = nn.Parameter(torch.randn(1, self.num_patch+1, emb_dim))\n", "\n", " def forward(self, x:torch.Tensor) -> torch.Tensor:\n", " # パッチの埋め込み & flatten\n", "\n", " ## Patch の埋め込み (B, C, H, W) -> (B, D, H/P, W/P)\n", " z_0 = self.patch_emb_layer(x)\n", "\n", " ## パッチの flatten (B, D, H/P, W/P) -> (B, D, Np)\n", " ## Np はパッチの数 (=H*W/P^2)\n", " z_0 = z_0.flatten(2)\n", "\n", " ## 軸の入れ替え\n", " z_0 = z_0.transpose(1, 2)\n", "\n", " # パッチの埋め込みの先頭に CLS を結合\n", " ## (B, Np, D) -> (B, N, D) : N = (Np + 1)\n", " ## cls_token は (1, 1, D) なので repeat で (B, 1, D) に変換(複製)して結合する\n", " z_0 = torch.cat([self.cls_token.repeat(repeats=(x.size(0), 1, 1)), z_0], dim=1)\n", "\n", " # Position Embedding の加算\n", " ## (B, N, D) -> (B, N, D)\n", " z_0 = z_0 + self.pos_emb\n", "\n", " return z_0" ] }, { "cell_type": "code", "execution_count": 4, "metadata": { "gather": { "logged": 1667634927589 } }, "outputs": [ { "data": { "text/plain": [ "torch.Size([1, 5, 384])" ] }, "execution_count": 4, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# check\n", "x = torch.randn(1, 3, 32, 32)\n", "input_layer = VitInputLayer()\n", "z_0 = 
input_layer(x)\n", "z_0.shape" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Encoder\n", "#### Self-Attention(自己注意)\n", "1. パッチ内の情報の抽出\n", " - -> 埋め込み\n", "2. 自分との類似度測定\n", " - -> ベクトル同士の内積\n", "3. 類似度に基づいた合体\n", " - -> 内積の値を係数にした加重和\n", " - -> 係数 : 内積の Softmax で算出\n", " - -> 加重和 : Attention Weight" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Self-Attention のイメージ" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_self_attention_image](image/vit_self_attention_image.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "Self-Attention でも、埋め込みによって情報の抽出を行う。 \n", "線形層を 3 つ用意し、それぞれの線形層で埋め込んだあとの各ベクトルを以下のように呼ぶ。\n", "\n", "- q (query)\n", "- k (key)\n", "- v (value)\n", "\n", "q, k, v ともに全く同じベクトルから埋め込んだ結果だが、それぞれ異なる線形層を用いて埋め込まれているため、異なる値を取る。 \n", "\n", "q, k, v に分ける表現は動画サイトの動画検索に例えるとわかりやすい。\n", "\n", "- q : 検索キーワード\n", "- k : 動画タイトル\n", "- v : 動画\n", "\n", "検索キーワードから動画を検索する際は、検索キーワードと動画のタイトルの一致度を見る。 \n", "Self-Attention も同様に、q, k の類似度を計算し、その類似度をもとに v の加重和を行う。\n", "\n", "内積 -> 行列積 (ソフトマックスで正規化) -> 類似度になる。" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_self_attention](image/vit_self_attention.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Multi-Head Self-Attention\n", "パッチ同士の関係は 1 つの Attension Weight が保持している。 \n", "この Attention Weight が複数あれば、各パッチ間の関係を、Attention Weight の数だけ学習できる。 \n", "1 つのパッチから複数個の q, k, v を埋め込み、複数の Attention Weight を獲得すれば良い。 \n", "ハイパーパラメータ「ヘッドの数」で指定する。" ] }, { "cell_type": "code", "execution_count": 5, "metadata": { "gather": { "logged": 1667634927869 } }, "outputs": [], "source": [ "class MultiHeadSelfAttention(nn.Module):\n", " def __init__(self,\n", " emb_dim:int=384,\n", " head:int=3,\n", " dropout:float=0.):\n", "\n", " super(MultiHeadSelfAttention, self).__init__()\n", " self.head = head\n", " self.emb_dim = emb_dim\n", " self.head_dim = emb_dim // head\n", " # D_h の 二乗根:qk^T を割るための係数\n", " self.sqrt_dh = self.head**0.5\n", "\n", " # 入力を 
query, key, value に埋め込むための線形層\n", " self.w_q = nn.Linear(emb_dim, emb_dim, bias=False)\n", " self.w_k = nn.Linear(emb_dim, emb_dim, bias=False)\n", " self.w_v = nn.Linear(emb_dim, emb_dim, bias=False)\n", "\n", " # Dropout\n", " self.attn_drop = nn.Dropout(dropout)\n", "\n", " # MHSA の結果を出力に埋め込むための線形層\n", " self.w_o = nn.Sequential(\n", " nn.Linear(emb_dim, emb_dim),\n", " nn.Dropout(dropout))\n", "\n", " def forward(self, z:torch.Tensor) -> torch.Tensor:\n", " batch_size, num_patch, _ = z.size()\n", "\n", " # 埋め込み:(B, N, D) -> (B, N, D)\n", " q = self.w_q(z)\n", " k = self.w_k(z)\n", " v = self.w_v(z)\n", "\n", " # (q, k, v) を head に分ける\n", " ## まずベクトルを head の個数に分ける\n", " ## (B, N, D) -> (B, N, h, D//h)\n", " q = q.view(batch_size, num_patch, self.head, self.head_dim)\n", " k = k.view(batch_size, num_patch, self.head, self.head_dim)\n", " v = v.view(batch_size, num_patch, self.head, self.head_dim)\n", " ## Self-Attention ができるように(バッチサイズ、ヘッド、トークン数、バッチのベクトル)の形状にする\n", " ## (B, N, h, D//h) -> (B, h, N, D///h)\n", " q = q.transpose(1, 2)\n", " k = k.transpose(1, 2)\n", " v = v.transpose(1, 2)\n", "\n", " # 内積\n", " ## (B, h, N, D//h) -> (B, h, D//h, N)\n", " k_T = k.transpose(2, 3)\n", " ## (B, h, N, D//h) x (B, h, D//h, N) -> (B, h, N, N)\n", " dots = (q @ k_T) / self.sqrt_dh\n", " ## 列方向にソフトマックス\n", " attn = F.softmax(dots, dim=-1)\n", " attn = self.attn_drop(attn)\n", "\n", " # 加重和\n", " ## (B, h, N, N) x (B, h, N, D//h) -> (B, h, N, D//h)\n", " out = attn @ v\n", " ## (B, h, N, D//h) -> (B, N, h, D//h)\n", " out = out.transpose(1, 2)\n", " ## (B, N, h, D//h) -> (B, N, D)\n", " out = out.reshape(batch_size, num_patch, self.emb_dim)\n", "\n", " # 出力層\n", " ## (B, N, D) -> (B, N, D)\n", " out = self.w_o(out)\n", "\n", " return out" ] }, { "cell_type": "code", "execution_count": 6, "metadata": { "gather": { "logged": 1667634928013 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "torch.Size([1, 5, 384])\n", "torch.Size([1, 5, 384])\n" ] } ], 
"source": [ "# check\n", "print(z_0.shape)\n", "mhsa = MultiHeadSelfAttention()\n", "out = mhsa(z_0)\n", "print(out.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### Encoder Block" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- LayerNormalization\n", "- Multi-Head Self-Attention\n", "- MLP(活性化関数:GERU)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit_encoder_block](image/vit_encoder_block.png)" ] }, { "cell_type": "code", "execution_count": 7, "metadata": { "gather": { "logged": 1667634928152 } }, "outputs": [], "source": [ "class VitEncoderBlock(nn.Module):\n", " def __init__(self,\n", " emb_dim:int=384,\n", " head:int=8,\n", " hidden_dim:int=384*4,\n", " dropout:float=0.\n", " ):\n", "\n", " super(VitEncoderBlock, self).__init__()\n", "\n", " # 1 つ目の LayerNorm\n", " self.ln1 = nn.LayerNorm(emb_dim)\n", " # mhsa\n", " self.msa = MultiHeadSelfAttention(\n", " emb_dim=emb_dim,\n", " head=head,\n", " dropout=dropout\n", " )\n", "\n", " # 2 つ目の LayerNorm\n", " self.ln2 = nn.LayerNorm(emb_dim)\n", " # MLP\n", " self.mlp = nn.Sequential(\n", " nn.Linear(emb_dim, hidden_dim),\n", " nn.GELU(),\n", " nn.Dropout(dropout),\n", " nn.Linear(hidden_dim, emb_dim),\n", " nn.Dropout(dropout)\n", " )\n", "\n", " def forward(self, z:torch.Tensor) -> torch.Tensor:\n", " # Encoder Block の前半\n", " out = self.msa(self.ln1(z)) + z\n", " # Encoder Block の後半\n", " out = self.mlp(self.ln2(out)) + out\n", "\n", " return out" ] }, { "cell_type": "code", "execution_count": 8, "metadata": { "gather": { "logged": 1667634928319 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "torch.Size([1, 5, 384])\n" ] } ], "source": [ "# check\n", "vit_enc = VitEncoderBlock()\n", "z_1 = vit_enc(z_0)\n", "print(z_1.shape)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "### ViT 全体" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- Input Layer\n", "- Encoder\n", "- MLP Head" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ "![vit_mlp](image/vit_mlp.png)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- 全体像" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "![vit](image/vit.png)" ] }, { "cell_type": "code", "execution_count": 9, "metadata": { "gather": { "logged": 1667634928446 } }, "outputs": [], "source": [ "class Vit(nn.Module):\n", " def __init__(self,\n", " in_channels:int=3,\n", " num_classes:int=10,\n", " emb_dim:int=384,\n", " num_patch_row:int=2,\n", " image_size:int=32,\n", " num_blocks:int=7,\n", " head:int=8,\n", " hidden_dim:int=384*4,\n", " dropout:float=0.\n", " ):\n", "\n", " super(Vit, self).__init__()\n", "\n", " # Input Layer\n", " self.input_layer = VitInputLayer(\n", " in_channels,\n", " emb_dim,\n", " num_patch_row,\n", " image_size)\n", "\n", " # Encoder (Encoder Block の多段)\n", " self.encoder = nn.Sequential(*[\n", " VitEncoderBlock(\n", " emb_dim=emb_dim,\n", " head=head,\n", " hidden_dim=hidden_dim,\n", " dropout=dropout\n", " )\n", " for _ in range(num_blocks)])\n", "\n", " # MLP Head\n", " self.mlp_head = nn.Sequential(\n", " nn.LayerNorm(emb_dim),\n", " nn.Linear(emb_dim, num_classes)\n", " )\n", "\n", " def forward(self, x:torch.Tensor) -> torch.Tensor:\n", " # Input Layer\n", " ## (B, C, H, W) -> (B, N, D)\n", " ## N: トークン数(パッチ数 +1)D: ベクトルの長さ\n", " out = self.input_layer(x)\n", "\n", " # Encoder\n", " ## (B, N, D) -> (B, N, D)\n", " out = self.encoder(out)\n", "\n", " # クラストークンのみ抜き出す\n", " ## (B, N, D) -> (B, D)\n", " cls_token = out[:, 0]\n", "\n", " # MLP Head\n", " ## (B, D) -> (B, M)\n", " pred = self.mlp_head(cls_token)\n", "\n", " return pred" ] }, { "cell_type": "code", "execution_count": 10, "metadata": { "gather": { "logged": 1667634928590 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "torch.Size([1, 10])\n" ] } ], "source": [ "# check\n", "x = torch.randn(1, 3, 32, 32)\n", "vit = Vit()\n", "pred = vit(x)\n", "print(pred.shape)" ] }, { "cell_type": "markdown", 
"metadata": {}, "source": [ "## 犬猫画像分類を試し" ] }, { "cell_type": "code", "execution_count": 11, "metadata": { "gather": { "logged": 1667634928891 } }, "outputs": [], "source": [ "import os\n", "import random\n", "import numpy as np\n", "import pandas as pd\n", "\n", "# torch\n", "import torch\n", "from torch import nn\n", "from torch.optim import Adam\n", "from torch.optim.optimizer import Optimizer\n", "from torch.utils import data\n", "\n", "# torchvision\n", "from torchvision import transforms as T\n", "\n", "# scikit-learn\n", "# from sklearn.metrics import mean_squared_error\n", "from sklearn.metrics import accuracy_score" ] }, { "cell_type": "code", "execution_count": 12, "metadata": { "gather": { "logged": 1667634929027 } }, "outputs": [], "source": [ "def seed_torch(seed=0):\n", " random.seed(seed)\n", " os.environ['PYTHONHASHSEED'] = str(seed)\n", " np.random.seed(seed)\n", " torch.manual_seed(seed)\n", " torch.cuda.manual_seed(seed)\n", " torch.backends.cudnn.deterministic = True" ] }, { "cell_type": "code", "execution_count": 13, "metadata": { "gather": { "logged": 1667634929163 } }, "outputs": [], "source": [ "DEVICE = torch.device('cuda' if torch.cuda.is_available() else 'cpu')" ] }, { "cell_type": "code", "execution_count": 14, "metadata": { "gather": { "logged": 1667634929331 }, "jupyter": { "outputs_hidden": false, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [], "source": [ "# !unzip -q ./dog_cat_data.zip" ] }, { "cell_type": "code", "execution_count": 15, "metadata": { "gather": { "logged": 1667634929498 } }, "outputs": [], "source": [ "from glob import glob\n", "from PIL import Image" ] }, { "cell_type": "code", "execution_count": 16, "metadata": { "gather": { "logged": 1667634929638 } }, "outputs": [ { "data": { "text/plain": [ "300" ] }, "execution_count": 16, "metadata": {}, "output_type": "execute_result" } ], "source": [ "dog_filepaths = sorted(glob('./dog_cat_data/train/dog/*.jpg'))\n", 
"cat_filepaths = sorted(glob('./dog_cat_data/train/cat/*.jpg'))\n", "paths = dog_filepaths + cat_filepaths\n", "len(paths)" ] }, { "cell_type": "code", "execution_count": 17, "metadata": { "gather": { "logged": 1667634929789 } }, "outputs": [], "source": [ "class MyDataset(data.Dataset):\n", " def __init__(self, paths):\n", " self.paths = paths\n", " self.transform = T.Compose([\n", " T.Resize(256),\n", " T.CenterCrop(224),\n", " T.ToTensor(),\n", " T.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])])\n", " self.labels = [0 if p.split('/')[-2] == 'cat' else 1 for p in self.paths]\n", " def __getitem__(self, idx):\n", " path = self.paths[idx]\n", " img = Image.open(path).convert('RGB')\n", " img_transformed = self.transform(img)\n", " label = self.labels[idx]\n", " return img_transformed, label\n", " def __len__(self):\n", " return len(self.paths)" ] }, { "cell_type": "code", "execution_count": 18, "metadata": { "gather": { "logged": 1667634929932 } }, "outputs": [ { "data": { "text/plain": [ "(210, 90)" ] }, "execution_count": 18, "metadata": {}, "output_type": "execute_result" } ], "source": [ "dataset = MyDataset(paths=paths)\n", "n_train = int(len(dataset) * 0.7)\n", "n_val = len(dataset) - n_train\n", "train, val = data.random_split(dataset, [n_train, n_val])\n", "len(train), len(val)" ] }, { "cell_type": "code", "execution_count": 19, "metadata": { "gather": { "logged": 1667634930150 } }, "outputs": [], "source": [ "batch_size = 32\n", "train_loader = data.DataLoader(train, batch_size, shuffle=True, drop_last=True)\n", "val_loader = data.DataLoader(train, batch_size)" ] }, { "cell_type": "code", "execution_count": 20, "metadata": { "gather": { "logged": 1667634930304 } }, "outputs": [ { "data": { "text/plain": [ "(torch.Size([32, 3, 224, 224]), torch.Size([32]))" ] }, "execution_count": 20, "metadata": {}, "output_type": "execute_result" } ], "source": [ "# check\n", "x, t = next(iter(train_loader))\n", "x.shape, t.shape" ] }, { "cell_type": 
"code", "execution_count": 21, "metadata": { "gather": { "logged": 1667634931055 } }, "outputs": [], "source": [ "def train(\n", " model: nn.Module,\n", " optimizer: Optimizer,\n", " train_loader: data.DataLoader):\n", "\n", " model.train()\n", " # criterion = nn.MSELoss() # 回帰\n", " criterion = nn.CrossEntropyLoss() # 分類\n", " epoch_loss = 0.0\n", " epoch_accuracy = 0.0\n", "\n", " for i, (x_i, y_i) in enumerate(train_loader):\n", " x_i = x_i.to(DEVICE, dtype=torch.float32)\n", " # y_i = y_i.to(DEVICE, dtype=torch.float32).reshape(-1, 1) # 回帰\n", " y_i = y_i.to(DEVICE, dtype=torch.int64) # 分類\n", " output = model(x_i)\n", " loss = criterion(output, y_i)\n", " optimizer.zero_grad()\n", " loss.backward()\n", " accuracy = (output.argmax(dim=1) == y_i).float().mean()\n", " epoch_loss += loss\n", " epoch_accuracy += accuracy\n", " optimizer.step()\n", " return epoch_loss / len(train_loader), epoch_accuracy / len(train_loader)\n", "\n", "def valid(model: nn.Module, valid_loader: data.DataLoader):\n", " model.eval()\n", " criterion = nn.CrossEntropyLoss()\n", " valid_loss = 0.0\n", " valid_accuracy = 0.0\n", " for x_i, y_i in valid_loader:\n", " x_i = x_i.to(DEVICE, dtype=torch.float32)\n", " y_i = y_i.to(DEVICE, dtype=torch.int64)\n", " with torch.no_grad():\n", " output = model(x_i)\n", " loss = criterion(output, y_i)\n", " accuracy = (output.argmax(dim=1) == y_i).float().mean()\n", " valid_loss += loss\n", " valid_accuracy += accuracy\n", " return valid_loss / len(valid_loader), valid_accuracy / len(valid_loader)" ] }, { "cell_type": "markdown", "metadata": { "nteract": { "transient": { "deleting": false } } }, "source": [ "補足:学習率のスケジューラ" ] }, { "cell_type": "code", "execution_count": 22, "metadata": { "gather": { "logged": 1667634931964 }, "jupyter": { "outputs_hidden": false, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [ { "name": "stderr", "output_type": "stream", "text": [ 
"/anaconda/envs/azureml_py38/lib/python3.8/site-packages/torch/optim/lr_scheduler.py:129: UserWarning: Detected call of `lr_scheduler.step()` before `optimizer.step()`. In PyTorch 1.1.0 and later, you should call them in the opposite order: `optimizer.step()` before `lr_scheduler.step()`. Failure to do this will result in PyTorch skipping the first value of the learning rate schedule. See more details at https://pytorch.org/docs/stable/optim.html#how-to-adjust-learning-rate\n", " warnings.warn(\"Detected call of `lr_scheduler.step()` before `optimizer.step()`. \"\n" ] }, { "data": { "image/png": "iVBORw0KGgoAAAANSUhEUgAAAlMAAAFlCAYAAADPim3FAAAABHNCSVQICAgIfAhkiAAAAAlwSFlzAAALEgAACxIB0t1+/AAAADh0RVh0U29mdHdhcmUAbWF0cGxvdGxpYiB2ZXJzaW9uMy4yLjEsIGh0dHA6Ly9tYXRwbG90bGliLm9yZy+j8jraAAAgAElEQVR4nOzde1yUZf7/8dcNgWgompmhVuo3gZDDoAhimqCmrLZgpl/LLA+ZtvstXdc1O5ip6+7a1rbabr9W19La3Vw2O2it61IZq7UWnsYTpljiCTpooYkSivfvD2RimAEGZmAGeD8fDx46N/fhM/dnbvhw3dd9XYZpmoiIiIhI3fh5OwARERGRxkzFlIiIiIgbVEyJiIiIuEHFlIiIiIgbVEyJiIiIuEHFlIiIiIgbrvDWga+++mqza9eu9XqMoqIirrzyyno9htSNcuOblBffpdz4JuXFd3k6N9u3bz9pmmYHZ9/zWjHVtWtXtm3bVq/HyMrKIjk5uV6PIXWj3Pgm5cV3KTe+SXnxXZ7OjWEYR6r6nm7ziYiIiLhBxZSIiIiIG1RMiYiIiLhBxZSIiIiIG1RMiYiIiLhBxZSIiIiIG1RMiYiIiLhBxZSIiIiIG1RMiYiIiLihxmLKMIyXDMP4yjCMvVV83zAM4znDMA4ZhrHbMIxeng9TRERExDe50jK1Ckit5vs/Anpc/poKvOB+WCIiIiKNQ41z85mmuckwjK7VrJIOvGKapgl8bBhGW8MwQk3TLPBQjHXysxW3ElR8iuTDod4Mo9n78rtiTp793mF5V79SOHyVFyJyU/RoiJ/k7Sjc9uonR1lrPeGw/KZWF0hu+HCkAuXGNykvUh1PTHTcGThW4fXxy8sciinDMKZS1npFx44dycrK8sDhnfus9AxBAaUUFhbW2zGkZl98V8r3pdDC/4dl35fCRT8aXW6Czx7mbGEh1rPdvB2K217+5DxHv7vE9a1/aJw++t0lTrUy6/W6lJopN75JeWl8zp4922C58UQx5TLTNJcDywHi4+PN+pxpe87hLly8eJG2M9fW2zGkZtOWbQEgY1qSbdnYZVsoLCzk3zN/5K2w6mblCNpCk5gh/oUDW2jb1nlemsL7a8yUG9+kvDQ+WVlZDZYbTzzNdwK4rsLrLpeXiYiIiDR5niim1gH3Xn6qry9w2tv9pUREREQaSo23+QzDWA0kA1cbhnEceBIIADBN80/AemA4cAg4BzT+3rkiIiIiLnLlab67avi+CfyfxyISERERaUQ0ArqIiIiIGxr0ab6G9v1Fk7GXnyarKN3SmXGJ13shoqbN2TgsOQVniAxt47Du0e8uOeRGeakf7uYFlJv6omvGNykvvquq8b4iO7XhyR/39EJEZZps
y9TVV7agxRWGw/KcgjNOEyHuW2s9QU7BGbtlkaFtSLd0tluWbulsN1YLKC/1yZ28gHJTn3TN+CblxXc5y40vaLItU9e0aUHgpfNk3Jlkt9zZX93iOZGhbezGYXFmXOL1dDr/OcnJ9uO1SP2pa15AualvumZ8k/Liu1zJTUNrsi1TIiIiIg1BxZSIiIiIG1RMiYiIiLhBxZSIiIiIG5psB3SAEyUnmLTBfkD2vMAzhJQmAL7VeU1EREQapyZbTA3vPpzCwkKH5cXGMfD3QkAiIiLSJDXZYmpM2Bg65HcgOTnZbnniyju8E5CIiIg0SeozJSIiIuIGFVMiIiIiblAxJSIiIuIGFVMiIiIiblAxJSIiIuIGFVMiIiIiblAxJSIiIuIGFVMiIiIiblAxJSIiIuIGFVMiIiIiblAxJSIiIuKGJjs3X3XOmUcd5ui7pdNQnh42zUsRNT6vfnKUtdYTdstyCs4QGdqmzvvMKTjD2GVb7JalWzozLvH6Ou+zuamPvJTvQ7lxj64Z39RQeQHlpjac5QU88/OsPjS7lqlbOg2llWH/YT5nHmVTfqaXImqc1lpPkFNwxm5ZZGgb0i2d67S/dEtnhwskp+CM04tJqubpvIBy4ym6ZnxTQ+QFlJvacpYXcP/nWX1pdi1TZa1P9i1QlVupxDWRoW3ImJbkkX2NS7ze4S82Z3/ZSc08mRdQbjxJ14xvqu+8gHJTF57+WVafml3LlIiIiIgnqZgSERERcYOKKRERERE3qJgSERERcYOKKRERERE3qJgSERERcYOKKRERERE3qJgSERERcYOKKRERERE3qJgSERERcYOKKRERERE3NLu5+apSbBxj0oZJdsuGdx/OmLAxXopIREREGgMVU0BIaQL42y878M0BABVTIiIiUi0VU0C70ltoV3oLK1N/mJ26ciuViIiIiDPqMyUiIiLiBhVTIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBo2AXo0D3xzQfH0iIiJSLRVTVRjefbjDMs3XJyIiIpW5VEwZhpEKLKVsOuAVpmkurvT964GXgbaX13nENM31Ho61QY0JG+NQNGm+PhEREamsxj5ThmH4A88DPwIigbsMw4istNpc4B+macYBdwL/z9OBioiIiPgiVzqgJwCHTNP83DTNEuDvQHqldUygzeX/hwD5ngtRRERExHe5cpuvM3CswuvjQGKldeYDmYZhPARcCQzxSHQiIiIiPs5THdDvAlaZpvk7wzCSgL8YhhFlmualiisZhjEVmArQsWNHsrKyPHR4586ePevSMQoLzwPUuG5hYaFL6zUHrp6zqriSG3eP4WmWy/m3+kg8zjREXjxxnOaoOV4zjYGuGd/kifPlam48wZVi6gRwXYXXXS4vq+g+IBXANM0thmEEAVcDX1VcyTTN5cBygPj4eDM5ObluUbsoKysLV47xwoEt5BSc4YUDLeyWp1s6My7xetvrlze8DODSPpuKVz85ylpr5XRD/vnviQxtQ3JyUp3260puXM1LgzncFvCd/DvLTUPkBXwwNz5E14zv0jXjm+ojL+B6bjzBlT5TW4EehmF0MwwjkLIO5usqrXMUGAxgGMZNQBDwtScDrU/pls5EhraxW5ZTcMbpD8TmZq31BDkFZxyWR4a2Id3SuV6PrbxUz1luGiIvoNxUR9eM79I145u8mRdPqbFlyjTNi4ZhPAj8m7JhD14yTXOfYRgLgW2maa4DZgF/NgxjJmWd0SeapmnWZ+CeNC7xeoe/DMYu2+KlaHxPZGgbMqbV/a+DulJeaqbc+CblxXcpN77JW3nxFJf6TF0eM2p9pWXzKvw/B7jZs6GJiIiI+D7NzSciIiLiBhVTIiIiIm7Q3Hy1pMmPRUREpCIVU7WgyY9FRESkMhVTtaDJj0VERKQy9ZkSERERcYOKKRERERE3qJgSERERcYOKKRERERE3qJgSERER
cYOKKRERERE3qJgSERERcYOKKRERERE3aNBOD9AUMyIiIs2Xiik3aYoZERGR5k3FlJs0xYyIiEjzpj5TIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBhVTIiIiIm5QMSUiIiLiBg3aWU80xYyIiEjzoGKqHmiKGRERkeZDxVQ90BQzIiIizYf6TImIiIi4QcWUiIiIiBtUTImIiIi4QcWUiIiIiBtUTImIiIi4QcWUiIiIiBtUTImIiIi4QeNMNSCNii4iItL0qJhqIBoVXUREpGlSMdVANCq6iIhI06Riqho5BWcYu2yL3bJ0S2fGJV7vpYjq16ufHGWt9YTdspyCM0SGtvFSRM41t7yAcuOrlBffpdz4psaSl9pSB/QqpFs6OyQ3p+CMw4egKVlrPUFOwRm7ZZGhbUi3dPZSRI6aY15AufFVyovvUm58U2PIS12oZaoK4xKvd/jLoPJfD01RZGgbMqYleTuMKjXXvIBy46uUF9+l3PgmX89LXahlSkRERMQNKqZERERE3KBiSkRERMQNKqZERERE3KAO6F7mbFR00MjoIiIijYWKKS9yNio6aGR0ERGRxkTFlBc5GxUdNDK6iIhIY6I+UyIiIiJuUDElIiIi4gYVUyIiIiJuUDElIiIi4gaXiinDMFINwzhgGMYhwzAeqWKd/zUMI8cwjH2GYbzq2TBFREREfFONT/MZhuEPPA/cChwHthqGsc40zZwK6/QAHgVuNk3zW8MwrqmvgEVERER8iStDIyQAh0zT/BzAMIy/A+lAToV17geeN03zWwDTNL/ydKDNjbPBPDWQp4iIiO9xpZjqDByr8Po4kFhpnTAAwzA+AvyB+aZpbqi8I8MwpgJTATp27EhWVlYdQnbd2bNnPXqMwsLzAPUe940XbqTQr5DCwkLbshMlJygsLKRDfod6O25DvT/wbG4aKm7L5XxYG+D8VNZQ77GxXjPeomvGd+ma8U2N9ZqpiacG7bwC6AEkA12ATYZhRJumWVhxJdM0lwPLAeLj483k5GQPHd65rKwsPHmMFw5sASA5Oclj+3QmmWSHZeWtVPV5zhrq/YFnc9NgcR9ue/k4yfV7HCca6j021mvGW3TN+C5dM76psV4zNXGlA/oJ4LoKr7tcXlbRcWCdaZoXTNM8DBykrLgSERERadJcKaa2Aj0Mw+hmGEYgcCewrtI6b1HWKoVhGFdTdtvvcw/GKSIiIuKTarzNZ5rmRcMwHgT+TVl/qJdM09xnGMZCYJtpmusuf2+oYRg5QCkw2zTNU/UZuIj4ngsXLnD8+HGKi4sB+L+4lgDs37/fm2HVm4Z8fyEhIR47TlPPCzTce/RkXqDp56YxXDNBQUF06dKFgIAAl7dxqc+UaZrrgfWVls2r8H8T+PnlLxFppo4fP07r1q3p2rUrhmEQ+PVZAP6nQ7CXI6sfDfn+vvvuO1q3bu2RfTX1vEDDvUdP5gWafm58/ZoxTZNTp05x/PhxunXr5vJ2GgFdRDymuLiY9u3bYxiGt0MREak1wzBo3769rXXdVSqmRMSjVEiJSGNWl59hnhoaQRqABvIUERHxPSqmGonh3Yc7LDvwzQEAFVMiIiJepNt8jcSYsDGsTF1p9xV+Vbi3wxJpErKysrjtttvqZd8TJ060jcK8ZMkSzp0757H9rlmzxiP7qmjVqlXk5+fbXk+ZMoWcnJxqtqhecHD9dDRetWoVDz74YJ23dzXn7sb/yiuvEBUVRXR0NHFxcTzzzDO13se6detYvHixW3G89dZbGIbBp59+6tZ+aisvL4+oqCgAtm3bxsLHZtd5X84+83l5ebRs2RKLxUJkZCT33nsvFy5cAGDz5s1MnDixzserDRVTItLklZSUUFRU5O0w6lRMlZaW1lM0zlUuplasWEFkZKRHj3Hx4kWP7s9XmKbJpUuXbK//9a9/sWTJEjIzM9mzZw8ff/wxISEhtd5vWloajzzyiFuxrV69mv79
+7N69Wq39uOO+Ph45v36aY/v93/+53+wWq3s2bOH48eP849//MNhndOnT9vlxtNUTIlIvVjw9j5+nmHl5xlWxi7b4pGvBW/vq1UM+/fvZ9asWYSHh3Pw4EEAtm7dSr9+/YiNjSUhIYHvvvvObpvs7GySkpKIi4ujX79+HDhQdjt93759JCQkYLFYiImJITc3l3NFRUwZdwexsbFERUWRkZHhEENISAiBgYE899xz5Ofnk5KSQkpKClD2Cy46OpqoqCjmzJlj2yY4OJhZs2YRGxvLli1beOWVV0hKSiI2NpZ77rnHtt6mTZvo168f3bt3r7KVqqpjxHS9lpkzZ9KzZ08GDx7M119/zZo1a9i2bRt33303FouF8+fPk5yczLZt22xxzZ49m549ezJkyBCys7NJTk6me/furFtXeSxne1lZWQwYMIC0tLQqi7OioiJGjBjhcD6ryll+fj6pqan06NGDhx9+2LafzMxMRv9oEGmD+zNmzBjOni17HH/Dhg1ERETQq1cv3njjDdv68+fPt2sxioqKIi8vzyG+p59+mj59+hATE8OTTz4JwJEjRwgPD+fee+8lKiqKY8d+mMr2N7/5Dc888wydOnUCoEWLFtx///0AWK1W+vbtS0xMDLfffjvffvstAC//+QWG9Y8nJiaGO++8E7BvhZs4cSLTp093mndn8UHZHHUffvghL774In//+9/tcpKcnMzo0aOJiIjg7rvvpmykI+jatStPPvkkvXr1Ijo62taiVVRUxOTJk0lISCAuLo61a9cCZS1EAwYMoFevXvTq1Yv//ve/DucvKyuL++8ebTvnkydPtn1+nnvuOdt6v/zlLwkPD6d///7cddddLrfm+fv7k5CQwIkTZZO0BAYG2orXDz/8kPDwcObPn8/Ro0dd2l9tqJgSkSalqKiIlStX0r9/f+6//34iIyPZvXs3cXFxlJSUMHbsWJYuXcquXbt47733aNmypd32ERERbN68mZ07d7Jw4UIee+wxAP70pz8xY8YMrFYr27Zto0uXLmza+B7XXBvKrl272Lt3L6mpqQ7xLF26lH79+jF9+nQ6derEBx98wAcffEB+fj5z5sxh48aNWK1Wtm7dyltvvWV7D4mJiezatYt27dqxaNEi3nnnHXbt2sXSpUtt+y4oKODDDz/knXfecdpyUd0xzp0rIj4+nn379jFw4EAWLFjA6NGjiY+P529/+xtWq9Xh3BQVFTFo0CD27dtH69atmTt3Lu+++y5vvvkm8+bNczh+ZTt27GDp0qW2wrayDRs20KlTJ7vzWV3OrFYrGRkZ7Nmzh4yMDI4dO8bJkydZtGgRr6x5m3Xvf0h8fDzPPvssxcXF3H///bz99tts376dL774osZ4K8rMzCQ3N5fs7GysVivbt29n06ZNAOTm5vLTn/6Uffv2ccMNN9i22bt3L71793a6v3vvvZennnqK3bt3Ex0dzYIFCwBY9tyzrHv/I3bv3s2f/vQnp9s6y3t18a1du5bU1FTCwsJo374927dvt+1r586dLFmyhJycHD7//HM++ugj2/euvvpqduzYwU9+8hNbQfOrX/2KQYMGkZ2dzQcffMDs2bMpKirimmuu4d1332XHjh1kZGQwffr0Gs/pp59+yr///W+ys7NZsGABFy5cYOvWrbz++uvs2rWLf/3rX7ZC3hXFxcV88skntuswMTHRdr2MGDGCLVu2EBISQlpaGqmpqbz22muUlJS4vP/qqAO6iNSLJ3/ck8+8MABhaGgoMTExrFixgoiICLvvHThwgNDQUPr06QNAmzZtHLY/ffo0EyZMIDc3F8MwbP0vkpKS+NWvfsXx48cZNWoUPXr0IDwykt/Mf4w5c+Zw2223MWDAAJfj3Lp1K8nJyXTo0AGAu+++m02bNjFy5Ej8/f254447ANi4cSNjxoyhffv2AFx11VW2fYwcORI/Pz8iIyP58ssvXT5G9M1D8PPzY+zYsQCMHz+eUaNG1RhzYGCg7RdV
dHQ0LVq0ICAggOjoaKctOZUlJCRUOxBidHQ0s2bNsjufe/bsqTJngwcPtrU8REZGcuTIEQoLC8nJyWHsbbeWrXTpIklJSXz66ad069aNHj162N7z8uXLa4y5XGZmJpmZmcTFxQFlrT25ubkkJiZyww030LdvX5f3dfr0aQoLCxk4cCAAEyZMYMyYsgeJwiOj+PlP7mP82NGMHDnS6fbO8l5VfLfccgurV69mxowZANx5552sXr3aVuQlJCTQpUsXACwWC3l5efTv3x/A9pno3bu3rSUvMzOTdevW2Yqr4uJijh49SqdOnXjwwQexWq34+/tXWTBXNGLECFq0aEGLFi245ppr+PLLL/noo49IT08nKCiIoKAgfvzjH9e4n88++wyLxcLhw4cZMWIEMTExTte7+uqrmTlzJjNnzmTLli1MnjyZX/7yl+zevbvGY9RExZSINClr1qzhxRdfZNSoUdx5551MmDDBrrWgJk888QQpKSm8+eab5OXl2WadHzduHImJifzzn/9k+PDhLFu2jG7RCax9bzP7szcxd+5cBg8e7FILTU2CgoLw9/evcb0WLVrY/l9+e6auXBlbJyAgwLaen5+f7fh+fn4u9YO68sorq/1+WFgYO3bsYP369bbzefvtt1e5fsX37+/vz8WLFzFNk1tvvZVFz/0Z+KGQt1qtVe7niiuusOtP42zARtM0efTRR5k2bZrd8r1791b5vnr27Mn27dsZNGhQlceubMWra8je8hE7Nr/Hr371K/bs2eOwjrO8VxXfN998w8aNG9mzZw+GYVBaWophGDz99NMO+yo/h5WPU3G5aZq8/vrrhIfbPwA1f/58OnbsyK5du7h06RJBQUE1vtfqjl0b5X2mTp48yc0338y6detIS0tzum5OTg4rV67krbfeYuDAgbbbru7SbT4RaVKGDh1KRkYGmzdvJiQkhPT0dIYMGUJeXh7h4eEUFBSwdetWoGy6ico/wE+fPk3nzp2Bsr4q5T7//HO6d+/O9OnTSU9PZ/fu3Xz5RQEtW7Zi/PjxzJ49mx07dlQbW+vWrW39fRISEvjPf/7DyZMnKS0tZfXq1baWiooGDRrEa6+9xqlTZdOdfvPNNy6fi+qOcenSJVt/m1dffdXWGlExxoaWn59Pq1b259OVnFXUt29fPvroI/I+/wwouzV58OBBIiIiyMvL47PPypZX7IjdtWtXW+527NjB4cOHHfY7bNgwXnrpJVv/qxMnTvDVV19V+34effRRZs+ebbulWFJSwooVKwgJCaFdu3Zs3rwZgL/85S8MHDiQS5cuUXDiOEn9b+Gpp57i9OnTtuPVpKr41qxZwz333MORI0fIy8vj2LFjdOvWzXbs2ho2bBh/+MMfbEXczp07gbLrJjQ0FD8/P/7yl7/U+cGJm2++mbfffpvi4mLOnj3LO++84/K2V199NYsXL+Y3v/mNw/d27NhB3759mTJlChEREezcuZMVK1aQmJhYpzgrU8tUI6eBPEWca9++PTNmzGDGjBlkZ2fj7+9PYGAgGRkZPPTQQ5w/f56WLVvy3nvv2W338MMPM2HCBBYtWsSIESNsy//xj3/wl7/8hYCAAK699loee+wx1r67iacWzCUo8AoCAgJ44YUXqo1p6tSppKam2vpOLV68mJSUFEzTZMSIEaSnpzts07NnTx5//HGGDx9OQEAAcXFxdkWeMxaLBavVSmhoqNNjfPb1WVq1upLs7GwWLVrENddcY+vsPXHiRB544AFatmzJli1bXDzbP8jPz2fKlCmsX7++5pUr2bNnD7Nnz8bPz892Pl3JWUUdOnRg1apV/OyByZR8/z2BV/ixaNEiwsLCWL58OSNGjKBVq1YMGDDAVjTecccdvPLKK/Ts2ZPExETCwsIc9jt06FD2799PUlISUNYZ/69//avDevPmzSM+Pp60tDSGDx/Ol19+yZAhQzBNE8MwmDx5MgAvv/wyDzzwAOfOnaN79+6sXLmS0tJSZv10Ct99
d4YAP4Pp06fTtm1bl85dVfGtXr3a7sGD8ve7evVq223e2njiiSf42c9+RkxMDJcuXaJbt2688847/PSnP7Wdx9TU1BpbIavSp08f0tLSiImJoWPHjkRHR9s9ATlt2jR+9rOfAXDdddc5PJ04cuRI5s+fz+bNm7FYLLblLVu2ZOXKldx00011iqsmhrtNw3UVHx9v1qZjWV2UP6ngKWOXlf1gyZiW5LF9uuO1g6+x/nP7H1gHvjlA+FXhrExdWev9NeT782RuGizulZd/sU76Z/0ex4mGeo/u5mX//v12P6y80WeqITXk+/PkhLqffX2WmK7XUlTkWqtHY9RQufH0RMe6Zsr6ewUHB3Pu3DluueUWli9fTq9evWp9LHdyU/lnGYBhGNtN04x3tr5aphqxMWFjHFqgKrdSiYiINCZTp04lJyeH4uJiJkyYUKdCqqGpmBIRaYZ259VuaABPOnXqFIMHD3ZY/v7779ueWpTm69VXX/V2CLWmYkpERBpU+/btq326TqSx0dN8IiIiIm5QMSUiIiLiBhVTIiIiIm5Qn6kmSGNPiYiINBy1TDUxw7sPJ/wq+2H+D3xzwGE8KhH5QVZWFrfddlu97HvixIlkZWUBsGTJEs6dO+ex/ZaPYO5Jq1atIj8/3/Z6ypQp5OTk1Hl/wcH1M17SqlWrePDBB+u8vas5dyf+AwcOkJycjMVi4aabbmLq1KlA2dQ2dRnUtKa45s+fT+fOnbFYLERGRtoNaFnxcyiep5apJkZjT4k4Kikp4cKFC3UeldlTlixZwvjx42nVqpXL25SWlro0T5+nrFq1iqioKDp16gTAihUrPH6MixcvcsUVTe/Xj2mamKaJn19ZO8X06dOZOXOmbWT78nn2rFYr27ZtY/jw4R6PYebMmfziF78gNzeX3r17M3r0aAICAuzW+fbbb2nXrp3Hj92cqWVKROrHvx6h05uj6fTm6LLR4z3x9a9HahXC/v37mTVrFuHh4bZZ7Ldu3Uq/fv2IjY0lISHBYR667OxskpKSiIuLo1+/fhw4cACAffv2kZCQgMViISYmhtzcXM4VFTFl3B3ExsYSFRVlm5KlopCQEAIDA3nuuefIz88nJSWFlJQUoGx+uOjoaKKiouym/AgODmbWrFnExsayZcsWXnnlFZKSkoiNjeWee+6xrbdp0yb69etH9+7dq2ylquoYMV2vZebMmfTs2ZPBgwfz9ddfs2bNGrZt28bdd9+NxWLh/PnzJCcnUz5bRXBwMLNnz6Znz54MGTKE7OxskpOT6d69O+vWras2F1lZWQwYMIC0tDQiIyOdrlNUVMSIESMczmdVOcvPzyc1NZUePXrw8MMP2/aTmZnJ6B8NIm1wf8aMGWObr27Dhg1ERETQq1cv3njjDdv68+fP55lnnrG9joqKIi8vzyG+p59+mj59+hATE8OTTz4JwJEjRwgPD+fee+8lKiqKY8eO2dYvKCigS5cuttfR0dGUlJQwb948MjIysFgsZGRkUFRUxOTJk0lISODHg27m3X+VzUe3atUq0tPTSU5OpkePHixYsKDac1xRjx49aNWqFd9++y3ww+cQ4KGHHmLQoEH87W9/czqps9Re0/vToJ7lFJyxTe1RLt3SmXGJ13sporp59ZOjrLWesFuWU3CGyNA2XorIPc7yAsqNL7hkmpy/YD/p6RV+BgH+9fO3XFFREf/4xz948cUXAZg0aRLz58+nddJA91gAACAASURBVOvWlJSUMHbsWDIyMujTpw9nzpyhZcuWdttHRESwefNmrrjiCt577z0ee+wxXn/9dZb84XnunDSN9NFjKSkpoaS0lPf//S+uuTaUD979N1A22WtlS5cuBaBfv348++yzfPDBB1x99dXk5+czZ84ctm/fTrt27Rg6dChvvfUWI0eOpKioiMTERH73u9+xb98+Fi1aRGZmJl27drWb6LigoIAPP/yQTz/9lLS0NEaPHm137KqOEX3zEM6dK+K6sCjWPfZL/vDM
Yn7+yFyWLH2O+Ph4nnnmGeLjHWfNKCoqYtCgQTz99NPcfvvtzJ07l3fffZecnBwmTJhAWlpatbnZsWMHe/fupVu3bk6/v2HDBjp16sQ///lP2/msLmdWq5WdO3dy9gIkxEXz43GTCQoKYu6TC1j+97e4KqQNa156nmeffZaHH36Y+++/n40bN3LjjTfWel66zMxMcnNzyc7OxjRN0tLS2LRpE1dddRW5ubm8/PLL9O3b126bmTNnMmjQIPr168fQoUOZNGkSbdu2ZeHChWzbto0//vGPADz22GMMGjSIl156iZ2HjnP70GRuviWFr74rZsvHn7B+0ydce1UIqSn9GTFihNPcODvXPXr04JprrgF++BwC/PWvf2X79u289NJLzJs3j+HDhzNlyhRiY2NrdU5qcurs9xSev2C3rPhCKUEBDdfS2lBUTNVCuqWzw7KcgjMAje4X9lrrCYdf0JGhbZy+R19XVczKjZf9aDHnqvlhWl9zj4WGhhITE8OKFSuIiIiw+96BAwcIDQ2lT58+ALRp41ignj59mgkTJpCbm4thGFy4UBZ/pCWe53//NF/kn2DYbWl07X4jUdHRPLvoCebMmcNtt93GgAEDXI5z69atJCcn06FDBwDuvvtuNm3axMiRI/H39+eOO+4AYOPGjYwZM8Y2MvhVV11l28fIkSPx8/MjMjKSL7/80uVjDBjyI/z8/BgxsuwY6aPH8pOJ4xxyVVlgYCCpqalAWStLixYtCAgIIDo62mlLTmUJCQlVFlLl+5w1a5bd+dyzZ0+VORs8eDAhISGc/Pos3XuEk3/sKGfOnObQwU+ZMDIVfz+D0osXSEpK4tNPP6Vbt2706NEDgPHjx7N8+fIaYy6XmZlJZmYmcXFxQNn8cbm5uSQmJnLDDTc4FFJQVsgPGzaMDRs2sHbtWpYtW8auXbuc7nvdunU888wzlF4yKSn5nvwTZS1cNw8cRMvWbfkef0aNGsWHH35YbTH1+9//npUrV3Lw4EHefvvtKtfr3bs3vXv3pri4mGXLlpGQkMBvfvMbfv7zn7t8TmpSeP6CQ/EUFOBP25YB1WzVOKmYqoVxidc7/GJ21hrSWESGtvGZSZvd4SwvoNz4gvbBLWgf3MJuWflEp/VlzZo1vPjii4waNYo777yTCRMmcMMNN7i8/RNPPEFKSgpvvvkmeXl5tomf0+74X2J7xbP34yweGD+GZcuWMWTQIKw7d7J+/Xrmzp3L4MGDmTdvntvvISgoyKV+Ui1a/HBuazNpfXlO/qdDMFdccQXGd1fa+vlUJyAgAMMwAPDz87Md38/Pj4sXL9a4fU191sLCwtixY4fd+bz99turXL/i+w+4wp+OrQNpRRDDhg6163wNVDvi+hVXXMGlS5dsr53d+jJNk0cffZRp06bZLd+7d2+176tTp05MnjyZyZMnExUVxd69e53u+/XXXyc83P7hoYLcvbRpGWBXjJSf/6qU95lat24d9913H5999hlBQUEO6128eJH169fz0ksvcejQIRYuXMj48eOr3Xdd1OcfTr5EfaaaifLhEsq/8gKf4Vv/Td4OS8Tjhg4dSkZGBps3byYkJIT09HSGDBlCXl4e4eHhFBQUsHXrVqBsVvnKRcDp06fp3LmsFXDVqlW25UfzDnN9125Mnz6d9PR0du/eTX5+Pq1atWL8+PHMnj2bHTt2VBtb69atbf19EhIS+M9//sPJkycpLS1l9erVDBw40GGbQYMG8dprr3Hq1CkAu9t8NanuGJcuXbL1s3r11VeJT0xyiLGhOTufruSsor59+/LRRx9x6NAhoOzW5MGDB4mIiCAvL4/PPvsMwK7Y6tq1qy13O3bs4PDhww77HTZsGC+99JKt/9WJEyf46quvqn0/GzZssLVsfvHFF5w6dYrOnTs7nONhw4bxhz/8wVYQ79y50/a9d999l8Jvv6H4/Hneeustbr755mqPWS4tLY34+Hhefvllh+89++yzhIWF8frrrzNr1iz2
7t3LnDlzbLcEpfbUMtUMDO/u+MRIsXEMmt5taxGb9u3bM2PGDGbMmEF2djb+/v4EBgaSkZHBQw89xPnz52nZsiXvvfee3XYPP/wwEyZMYNGiRYwYMcK2fP26N3jrtb9zZVALrr32Wh577DG2bt3K7Nmz8fPzIyAggBdeeKHamKZOnUpqaiqdOnXigw8+YPHixaSkpGCaJiNGjLA99VVRz549efzxxxk+fDgBAQHExcXZFXnOWCwWrFYroaGhVR7jyiuvJDs7m0WLFnHNNdew+PmyPmYTJ07kgQceoGXLlmzZUvvW3fz8fKZMmVKnR//37NnjcD5dyVlFHTp0YNWqVdx11118//33ACxatIiwsDCWL1/OiBEjaNWqFQMGDLAVNHfccQevvPIKPXv2JDExkbCwMIf9Dh06lP3795OUVFZ0BgcH89e//tVhvXnz5hEfH09aWhqZmZnMmDHD1jL09NNPc+2115KSksLixYuxWCw8+uijPPHEE/zsZz8jJiaGS5cu0a1bN955p6wTekJCAv83eTxf5OczacI9tlt8586ds+vc7uz23Lx58xg3bhz333+/XctjTEwMVqvV6W1uqRujNk3DnhQfH2+WPyFSX7KysmxN9PWl/FZSY7slk7iyrK/EJ5Ne98rxG2VuVl7+xTrpn57ZXxW8+ZlyNy/79+/npptuqnad8tt8ja3p39txf/fdd7Ru3dpj+wsODra1soD33587vBm7p/NS0apVq9i2bRszn1wMNL7cePsz5U5unP0sMwxju2maTjus6TafiIiIiBt0m09EpBmq2CrV0E6dOsXgwYMdlr///vu2pxal7JbrxIkT6/2hDXGfiikREWlQ7du3r/bpOpHGRrf5RERERNygYkpERETEDbrN14wVG8ccJkEe3n24w0TJIiIiUjUVU81USGmCwzhTB74pm9BVxZSIiIjrdJuvmWpXegtdS37BytSVtq/wq8Jr3lCkCcrKyuK2226rl31PnDiRrKwsAJYsWcK5c+c8tt/yEcw9adWqVeTn59teT5kyhZycnDrvLzi4fsYYWrVqFQ8++GCdt3c15+7EP3/+fDp37ozFYrF9FRYW1nl/9clqtdoNtLpu3ToWL15c7TYVczB//nyeeeYZh3XCrg3BYrEQFRXFj3/8Y9v7rzhVU1OgYkpEmrySkhKKioq8HUadiqnS0tJ6isa5ysXUihUriIyM9OgxXJnHrzEyTdNujj8omyvParXavtq2beul6KpXuZhKS0vjkUcecXu/QUEtsVqt7N27l6uuuornn3/eYZ2ioiLbtDuNlYopEakXT2U/ZTcfZPnX4x//lMc//qnT79X09VT2U7WKYf/+/cyaNYvw8HAOHjwIwNatW+nXrx+xsbEkJCQ4zEOXnZ1NUlIScXFx9OvXjwMHym5/H/x0P6OGJWOxWIiJiSE3N5eioiJGjBhBbGwsUVFRZGRkOMQQEhJCYGAgzz33HPn5+aSkpJCSkgKUzQ8XHR1NVFQUc+bMsW0THBzMrFmziI2NZcuWLbzyyiskJSURGxvLPffcY1tv06ZN9OvXj+7du1fZSlXdMWbOnEnPnj0ZPHgwp05+zb/efott27Zx9913Y7FYOH/+PMnJyZTPVhEcHMzs2bPp2bMnQ4YMITs7m+TkZLp37866deuqzUVWVhYDBgwgLS2tyuKsqvNZVc7y8/NJTU1lcKKFpxbMte0nMzOTpKQkevXqxZgxY2xjam3YsIGIiAh69erFG2+8YVu/cqtKVFQUeXl5DvE9/fTT9OnTh5iYGJ588kkAjhw5Qnh4OPfeey9RUVEcO3as2vMA8Pvf/57JkycDZVPoREVFce7cOebPn88999xDUlISPXr04M9//jNQVqQtnv84UVFRREdH285L+YwFo0ePJiIigrvvvts2v9/27dsZOHAgvXv3ZtiwYRQUFACQnJzMnDlzSEhIICwsjM2bN1NSUsK8efPIyMjAYrGQkZFh1+r09ttvk5iYSFxc
HEOGDOHLL7+s8T06k5SUxIkTJwDw9/fnqquuAuDgwYOEhYXxi1/8gv3799dp396mYkpEmpSioiJWrlxJ//79uf/++4mMjGT37t3ExcVRUlLC2LFjWbp0Kbt27eK9996jZcuWdttHRESwefNmdu7cycKFC3nssccAWP3yi0y4/ydYrVa2bdtGly5d2LBhA506dWLXrl3s3buX1NRUh3iWLl1Kv379mD59um1Ovg8++ID8/HzmzJnDxo0bsVqtbN26lbfeesv2HhITE9m1axft2rVj0aJFvPPOO+zatYulS5fa9l1QUMCHH37IO++847QVoaZjxMfHs2/fPgYOHMgfnlnMj348kvj4eP72t79htVodzk1RURGDBg1i3759tG7dmrlz5/Luu+/y5ptvMm/evBpzs2PHDpYuXWorbCtzdj6ry5nVaiUjI4P1//mYf659g2PHjnHy5EkWLVrEe++9x44dO4iPj+fZZ5+luLiY+++/n7fffpvt27fzxRdf1BhvRZmZmeTm5pKdnY3VamX79u1s2lQ2WXxubi4//elP2bdvHzfccIPddr///e9tt/jKi+gZM2Zw6NAh3nzzTSZNmsSyZcto1aoVALt372bjxo1s2bKFhQsXkp+fz7/fWcf+vXts73/27Nm24mjnzp0sWbKEnJwcPv/8cz766CMuXLjAQw89xJo1a9i+fTuTJ0/m8ccft8V08eJFsrOzWbJkCQsWLCAwMJCFCxcyduxYrFYrY8eOtXsP/fv35+OPP2bnzp3ceeed/Pa3v63VuYOyFtb333+ftLQ0AK677jpbQRsXF8fu3buJiIhgypQp9O/fn5UrV/pEa7Kr1AFd7Bz45oCe8BOPmJMwx+ny+p6vKzQ0lJiYGFasWEFERITd9w4cOEBoaCh9+vQBcDrR6+nTp5kwYQK5ubkYhmG7/RAXn8D/W/I0F86cZNSoUfTo0YPo6GhmzZrFnDlzuO222xgwYIDLcW7dupXk5GQ6dOgAwN13382mTZsYOXIk/v7+3HFH2fyZGzduZMyYMbaRwcv/mgcYOXIkfn5+REZGOm0tqO4Yfn5+tl+a48eP57a0kTXGHBgYaCsYo6OjadGiBQEBAURHRzttyaksISGBbt26Vfl9Z+dzz549VeZs8ODBhISEcLLkLDeGRXDkyBEKCwvJycnh5ptvBspu8SYlJfHpp5/SrVs3evToYXvPy5cvrzHmcpmZmWRmZhIXFweUjSCfm5tLYmIiN9xwA3379nW63cyZM/nFL35ht8zPz49Vq1YRExPDtGnTbLECpKen07JlS1q2bElKSgrZ2dlsy97CbaNG4+/vT8eOHRk4cCBbt26lTZs2JCQk2CY8tlgs5OXl0bZtW/bu3cutt94KlBUyoaGhtmOMGjUKgN69e7uUt+PHjzN27FgKCgooKSmpNoeVFRefx2KxcOLECW666SZbTJW1bt2aKVOmMGXKFPbv3899993HjBkzOHPmjMvH8ia1TInN8O7DHTqhH/jmAOs/r/3s7yLesmbNGjp37syoUaNYuHAhR44cqdX2TzzxBCkpKezdu5e3336b4uJiANLu+F+WvZJBy5YtGT58OBs3biQsLIwdO3YQHR3N3LlzWbhwoUfeQ1BQEP7+/jWu16JFC9v/3Z603jBqXCUgIADj8np+fn624/v5+bnUD+rKK6+s9vu1PZ8V37+/f1kMpmly66232voo5eTk8OKLL1a7nyuuuMKur1N5zisyTZNHH33Utt9Dhw5x3333ufS+nMnNzSU4ONiufxpgO79Vva7M/hz4285Bz549bbHu2bOHzMxMh23K16/JQw89xIMPPsiePXtYtmyZ0/NTlfI+U0eOHME0Tad9psrl5eWxYMECbr/9dq677rp6ecCivqiYEpsxYWPsnu7TE37SGA0dOpSMjAw2b95MSEgI6enpDBkyhLy8PMLDwykoKGDr1q1A2azylX+ZnD59ms6dOwNlnbHLHc07zPVduzF9+nTS09PZ
vXs3+fn5tGrVivHjxzN79mx27NhRbWytW7e29fdJSEjgP//5DydPnqS0tJTVq1czcOBAh20GDRrEa6+9xqlTpwD45ptvXD4X1R3j0qVLtl9Wr776KvGJSQ4xNjRn59OVnFXUt29fPvroIw4dOgSU3Zo8ePAgERER5OXl8dlnnwFlfcnKde3a1Za7HTt2cPjwYYf9Dhs2jJdeesnW/+rEiRN89dVXdXqfp0+fZvr06WzatIlTp07ZFQ1r166luLiYU6dOkZWVRZ8+feiTmMQ/33qD0tJSvv76azZt2kRCQkKV+w8PD+frr79my5YtAFy4cIF9+/ZVG1N1ea94Tbz88su1fbsAtGrViueee47f/e53DvnLy8tjyJAhjBw5krZt2/LRRx+RkZHB0KFD63Qsb1AxJSJNUvv27ZkxYwZWq5Vf//rX+Pv7ExgYSEZGBg899BCxsbHceuutDn9lP/zwwzz66KPExcXZ/dBfv+4NfnRLAhaLhb1793LvvfeyZ88eEhLKli1YsIC5c+dWDsPO1KlTSU1NJSUlhdDQUBYvXkxKSgqxsbH07t2b9PR0h2169uzJ448/zvDhw4mNjeXnP/95je/dYrEAVHuMK6+8kuzsbKKioti4cSMPziq7LTtx4kQeeOABWwf02srPz2f48OG13g5wej5dyVlFHTp0YNWqVdx1113ExMTYbvEFBQWxfPlyRowYQa9evbjmmmts29xxxx1888039OzZkz/+8Y+EhYU57Hfo0KGMGzeOpKQkoqOjGT16tNPiY968eXad8Sv2mSq/DTdz5kz+7//+j7CwMF588UUeeeQRW2EWExNDSkoKffv25YknnqBTp04MHZFGRGRPYmNjGTRoEL/97W+59tprqzwHgYGBrFmzhjlz5hAbG4vFYuG///1vtec+JSWFnJwcWwf0iubPn8+YMWPo3bs3V199dZX7WLRoEV26dLF9VRYXF0dMTIxdIQtlLWS//vWvsVqtzJgxo1FOdm243TRcR/Hx8Wb5EyL1pfxJh/o0dllZ5Z8xLalej+NprsZd3n9qZepKjx6/UeZm5Yiyfyf90zP7q4I3P1Pu5mX//v3cdNNN1a5T332m6ou34/7uu+9o3bq1x/YXHBxsa2UB778/d3gzdk/mZf78+QQHBzv0sWqsufF23O7kxtnPMsMwtpumGe9sfbVMiYiIiLhBT/OJiDRDFVulGtqpU6cYPHiww/L333+/Ud7i8ZT58+d7OwSpIxVTUiNnwyWAhkwQ50zTrPEJJGne2rdvj9Vq9XYYIk7VpfuTiimp1vDuzjuSalJkcSYoKIhTp07Rvn17FVQi0uiYpsmpU6cICgqq1XYqpqRaY8LGOC2YnLVUiXTp0oXjx4/z9ddfV7nO1999D0DJyRZVruOLvB13cXFxrX/A14a33587vBl7fecFGm9uvB13XXMTFBTk9GnE6rhUTBmGkQosBfyBFaZpOp1K2jCMO4A1QB/TNOv3UT0R8TkBAQE1jo483/a0oqUhQvIYb8edlZVlG327Pnj7/bnDm7HXd16g8ebG23E3RG7K1fg0n2EY/sDzwI+ASOAuwzAcZqk0DKM1MAP4xNNBioiIiPgqV4ZGSAAOmab5uWmaJcDfAceR5eCXwFOA6+PMi4iIiDRyrtzm6wwcq/D6OJBYcQXDMHoB15mm+U/DMGZXtSPDMKYCUwE6duxIVlZWrQOujbNnz9b7MQoLy0YIru/jeJq7cRcWFnKi5AS3//12u+XxV8Zzc+ubq9jqB40xN5bCQgCsjSzu2miMeWko3o67vnPj7ffnDl0zvsnbcTdEbsq53QHdMAw/4FlgYk3rmqa5HFgOZSOg1/cI2A0xyvYLB8ruCScnN64R0N2N++uDXztMgHzgmwMcCjjE48mP17h9o8zN4baX95fsmf1VwZufqUaZlwbi7bjrOzfefn/u0DXjm7wdd0PkppwrxdQJ4LoKr7tcXlauNRAFZF1+FPpaYJ1hGGnqhN50OXvK
T0/4iYhIc+RKn6mtQA/DMLoZhhEI3AnYZnE0TfO0aZpXm6bZ1TTNrsDHgAopERERaRZqLKZM07wIPAj8G9gP/MM0zX2GYSw0DCOtvgMUERER8WUu9ZkyTXM9sL7SsnlVrJvsflgiIiIijYNGQBePcjaPn+bwExGRpkzFlHiMs3n8NIefiIg0dSqmxGP0hJ+IiDRHrjzNJyIiIiJVUDElIiIi4gYVUyIiIiJuUJ8pqXfOnvC78cKNJJPsnYBEREQ8SMWU1KuqnvAr9Cv0QjQiIiKep2JK6lVVT/gVFqqYEhGRpkF9pkRERETcoJYpD8gpOMPYZVvslqVbOjMu8XovRWTv1U+OstZ6wm5ZTsEZIkPbeCmihqPc+CblxTc5ywv4Tm6c5QWab258JS/QfK+Zciqm3JRu6eywLKfgDIDPfMjXWk84fKgjQ9s4jb2hnCg5Ue/Tzig3vkl58U1VvTdfyo2zvEDzzI0v5QWa5zVTkYopN41LvN7hw+zsLztviwxtQ8a0JG+HAZQVTZX7TNXHtDPKjW9SXnyTs7yA7+WmueUFdM00BiqmpMGNCRtDh/wOJCcn25Zp2hkREWms1AFdRERExA0qpkRERETcoNt84jOcjZTu6U7pIiIinqZiSnxCVSOlg2c7pYuIiHiaiinxCVWNlC4iIuLr1GdKRERExA0qpkRERETcoNt84tPUKV1ERHydiinxWeqULiIijYGKKfFZ6pQuIiKNgfpMiYiIiLhBLVPS6KgflYiI+BIVU9KoqB+ViIj4GhVT0qioH5WIiPga9ZkSERERcYOKKRERERE36DafNAnOOqWDOqaLiEj9UzEljZ6zTumgjukiItIwVExJo+esUzqoY7qIiDQM9ZkSERERcYNapqRJq9yXKi/wDCGlCUCS94ISEZEmRcWUNFnO+lIVG8fA3wvBiIhIk6ViSposZ32pElfe4aVoRESkqVKfKRERERE3qGVKmp1i45gmShYREY9RMSXNSkhpgkOfKY1HJSIi7lAxJc1Ku9JbaFd6CytTf3iaT+NRiYiIO1RMieB8Ohrd+hMREVeomJJmz9kQCrr1JyIirlIxJc2esyEUdOtPRERcpaERRERERNyglimRKjj0ozK+ZLh5JbrxJyIiFamYEnHCaT8qSsBAxZSIiNhRMSXihNN+VKvivRSNiIj4MhVTIrVwgBINoSAiInZUTIm4aLh5JRj2yzSEgoiIuFRMGYaRCiylbCKOFaZpLq70/Z8DU4CLwNfAZNM0j3g4VhGvGkMwY8xgSF1pW6YhFEREpMZiyjAMf+B54FbgOLDVMIx1pmnmVFhtJxBvmuY5wzB+AvwWGFsfATcWOQVnGLtsi92ydEtnxiVeX6/HffWTo6y1nnCIJTK0Tb0etzHxRm6c5aU8FuWmjK4Z3+Ur14zyYk/XjO9wpWUqAThkmubnAIZh/B1IB2zFlGmaH1RY/2NgvCeDbGzSLZ0dluUUnAGo9w/5WusJhw91ZGgbpzE1R/WRG1emonGWF1Buyuma8V3eyo3yUj1dM77FlWKqM3CswuvjQGI1698H/MudoBq7cYnXO3yYK//1UJ8iQ9uQMS2p5hWbIbdz88UeWDnC9nI4Z8EoKVt+2QHKXo/5aJVt2bxTpyEQegaGOO4zhwp/mniXpbAQDrdt8OOOA8YF2i/bF3gaTgErnZwzD6oyN97MS/RoiPeNW8je/Hmmn2VV0+8Z3+LRDuiGYYwH4oGBVXx/KjAVoGPHjmRlZXny8A7Onj1b78dwVWHheYB6j6ehjuOuxpib0BYxdAwqhMJC27JbgVtpYbfeQ60vcpGLFH73w3oXL5ZePlYhvqy0tNRnYmyoc+ZruQk+e5izhYVYz3azW94YrxlfP4YnNLe8NORx3NWQuXGlmDoBXFfhdZfLy+wYhjEEeBwYaJrm9852ZJrmcmA5QHx8vJmcnFzbeGslKyuL+j6Gq144UPYXQ3Jy/VbyDXUcdzXO3CS7tL8rNkziwDcHmNkl
zLYsp+AMIaUJZE6ZU8coG4Yv5WXa5b+y6/uv34Y6jstWjqAtOOTBl3LTED9n9LOs9vR7xl5D5saVYmor0MMwjG6UFVF3UtYqb2MYRhywDEg1TfMrj0cp0og4Gz292DhW9iysiIg0OTUWU6ZpXjQM40Hg35T9OnjJNM19hmEsBLaZprkOeBoIBl4zDAPgqGmaafUYt4jPcjZ6euLKOyg2jjkdSkGDfoqING4u9ZkyTXM9sL7SsnkV/j/Ew3GJNCkhpQlOW6Y06KeISOOnEdBFGkC70ltoV3oLK1Pt+xho0E8RkcZPxZSIl7kyTpWIiPguFVMiXuSss7pu/YmINC4qpkS8yFlndd36ExFpXFRMifgg3foTEWk8VEyJ+Bjd+hMRaVxUTIn4mKpu/am1SkTEN6mYEmkE1FolIuK7VEyJNALqqC4i4rtUTIk0Yrr1JyLifSqmRBop3foTEfENKqZEGil1VBcR8Q0qpkSaELVWiYg0PBVTIk2IWqtERBqeiimRJk6tVSIi9UvFlEgTV5vWqhsv3EgyyQ0YnYhI46diSqQZqqq1qtCv0AvRiIg0biqmRJqhqlqr9n21z+lgoOpfJSJSNT9vByAivmF49+F0DuzssPzANwdY//l6L0QkItI4qGVKRICy1qoO+R1ITk62E856egAACy5JREFUW66nAUVEqqdiSkSq5ax/1bYvt7Hty20OLVYqsESkOVIxJSLVcta/6rWDrzkUUhpuQUSaKxVTIlJrGhxUROQHKqZExCN0O1BEmisVUyLiEbodKCLNlYqpBpRTcIaxy7bYLUu3dGZc4vV12t+rnxxlrfWEwzEiQ9vUOcbmypO5UV5+4O7tQF0zvkvXjG/SNeMdKqYaSLrFcfyenIIzAHX+kK+1nnD4UEeGtnF6LKmap3OjvFTP1duBZ9t9T6h/LJTeYluma8Y36JrxTfo94z0qphrIuMTrHT7Mlf96qIvI0DZkTEtyez/NWX3kRnmpmqu3A7+9mEd4lxasTJ1jW6ZrxjfomvFN+j3jPSqmRMTrXL0dmBd4hpDSBEA/2EXEd6iYEhGf5Ox24Dm/g5zzO8ikDZ86XV+d2kXEG1RMiYhPctZaNXTFU5z2z3ZYV0MwiIg3qZgSkUajXekttCu9hZWp9rf5nPW5clZg6TahiNQHFVMi0ui52qm9qtuEasESEXeomBKRJsnV24S6RSgi7lIxJSLNhrPbhK7eIgQVWCLinIopEWnWXL1FqAJLRKqiYkpEpBIVWCJSGyqmRERc4G6BBSqyRJoqFVMiInXkaoEFasUSacpUTImIeJCzAgtq0YplfFn2b4VpdABuvHAjySR7OlwR8QAVUyIiDaA2rViVbftyG9vYxqENh+yWq1VLxDeomBIR8RKnrVgrR8AXe8D8yrboNa5iXenpsuWXbTO+L2vV+u9TNR5nuHklYwj2WNzOzDt1uuw/K0MadFtvsxQWwuG23g6jSlWe2+jRED/JcQOpExVTIiK+JHq0w6IxBHPrdxdp2/aHX9qvmWdZbxTVuLttxvdsM75nvVnzug1RdIkPKC/KVUx5jIopERFfEj/J6S85a1YWycnJttdjLn/VpFa3Eo3vWd+xu0thVr7FuHDZFgAyJtV+3kN3tvW2ynnxNU7P7coRXoqm6VIxJSLShFXVIb4yV4suqHoSaYBJG9rYrat+XdIcqJgSERGXiy6oZWtXFWNuVZQXeIaQ0gSg8bVMiYCKKa/LKTjD2MvNsBWlWzozLvF62+tXPznKWusJh20jQ9tU3lQ8xFlulBfv0zXjfc4Kr7HLtpBTcIZzJT+c31D/CALa7LJb76sz33Oy6Hu7Zef8DnLO7yCTNnxa55jUAla1ytfMvFOnuTq4BR0rradrpu5UTHlRuqWz0+U5BWXN5RV/May1nnD4UEeGtqlyH+IeZ+dVefE+XTO+y9l5LThuITL0Flb+7w8tTmOXbeFopbx867/JoeiqDVdbwKrSlAsxZ3k5V1LKybPfOxRT
umbqTsWUF41LvN7uh385Z391Q9mHOmOamsEbgrPcKC/ep2vGd7l3zbiXo9r096rM3UKssLCQlze8bLfMl4ozZ3nZ92v/KtfXNVM3KqZERKRRq01/r8rcKcSccbc4qw1fKtqaOxVTIiLSbLlTiAFkVRoawdPFWVXcKdqKrj4LwJWVpiyq6olMVzXn4s6lYsowjFRgKeAPrDBNc3Gl77cAXgF6A6eAsaZp5nk2VBEREd/mbnHmKneLtiCz2G5EfYAbLlws+88XtW9nqc2I/PUholUoc/73ba8cG1wopgzD8AeeB24FjgNbDcNYZ5pmToXV7gO+/f/t3V2IHmcZxvH/ZbZB00BjrSxtEm2kQYmCti41pUU21YO0FteDYCN+hGLJicUqilTBT/CgINaPSiE00SrSVWLRRYoiaUo90NDEgOZDMUbbJKZNQm00fqXBy4N5im/e7Nqlszsz+77XD8LO88yQvcPNvXvnfZ6ZsX2VpI3A3cCt8xFwRETEsKvTtG2559Nc/8+dvP7y818xs/9U9eqZ/vnZmO0T+QfVbNrPa4FDtg8DSJoEJoDeZmoC+Fw53g7cK0m2PYexDpX+W1lze2o3JC/dldx0U/LSPTuW3MzXT9/AmrPn5+HA2So3L+ZJ9LN9Iv+g0gv1O5I2AOtt317G7wfeYvuOnmv2lWuOlvEfyjWn+v6uzcBmgNHR0TdPTk7O5b/lAmfOnGHp0oX3nqlHjzzHL/587oL5664YYXzlRS1ENPcWYm6Sl+5KbropeemmmfICyc3/s27duj22x6Y71+gGdNtbgC0AY2Njnu/3GfVvDFwoxtsOoAELMTfjbQfQgIWYF0huumq87QAakLx0V5O5ecksrjkGrOwZryhz014jaQS4hGojekRERMRAm00z9TiwWtIqSYuBjcBU3zVTwKZyvAF4JPulIiIiYhi84DKf7XOS7gB+SvVohG2290v6ArDb9hSwFfiOpEPAM1QNV0RERMTAm9WeKdsPAw/3zX2m5/hfDPdG/oiIiBhSs1nmi4iIiIgZpJmKiIiIqCHNVEREREQNaaYiIiIiakgzFREREVFDmqmIiIiIGtJMRURERNSQZioiIiKihjRTERERETWorVfoSToJPDHP3+Yy4NQ8f494cZKbbkpeuiu56abkpbvmOjevtv3K6U601kw1QdJu22NtxxEXSm66KXnpruSmm5KX7moyN1nmi4iIiKghzVREREREDYPeTG1pO4CYUXLTTclLdyU33ZS8dFdjuRnoPVMRERER823QP5mKiIiImFcD20xJWi/pd5IOSbqr7XiGlaSVknZKOiBpv6Q7y/ylkn4m6ffl68vbjnVYSVokaa+kH5fxKkm7Su18T9LitmMcNpKWSdou6beSDkq6LjXTDZI+Wn6W7ZP0oKSXpmbaIWmbpBOS9vXMTVsnqnyt5OjXkq6Zy1gGspmStAj4BnATsAZ4j6Q17UY1tM4BH7O9BlgLfKjk4i5gh+3VwI4yjnbcCRzsGd8N3GP7KuAvwAdbiWq4fRX4ie3XAW+kyk9qpmWSlgMfBsZsvwFYBGwkNdOWbwHr++ZmqpObgNXlz2bgvrkMZCCbKeBa4JDtw7bPApPARMsxDSXbx23/qhz/jeqXwnKqfDxQLnsAeFc7EQ43SSuAdwD3l7GAG4Ht5ZLkpmGSLgHeCmwFsH3W9rOkZrpiBHiZpBFgCXCc1EwrbD8GPNM3PVOdTADfduWXwDJJl89VLIPaTC0HjvSMj5a5aJGkK4GrgV3AqO3j5dRTwGhLYQ27rwCfAP5Txq8AnrV9roxTO81bBZwEvlmWX++XdDGpmdbZPgZ8CXiSqok6DewhNdMlM9XJvPYFg9pMRcdIWgr8APiI7b/2nnN1S2luK22YpFuAE7b3tB1LnGcEuAa4z/bVwN/pW9JLzbSj7L+ZoGp4rwAu5sJlpuiIJutkUJupY8DKnvGKMhctkHQRVSP1XdsPlemnn/+ItXw90VZ8Q+x64J2S/kS1
FH4j1V6dZWUJA1I7bTgKHLW9q4y3UzVXqZn2vR34o+2Ttp8DHqKqo9RMd8xUJ/PaFwxqM/U4sLrcYbGYaoPgVMsxDaWyB2crcND2l3tOTQGbyvEm4EdNxzbsbH/S9grbV1LVyCO23wvsBDaUy5Kbhtl+Cjgi6bVl6m3AAVIzXfAksFbSkvKz7fncpGa6Y6Y6mQI+UO7qWwuc7lkOrG1gH9op6Waq/SCLgG22v9hySENJ0g3Az4Hf8L99OZ+i2jf1feBVwBPAu233bySMhkgaBz5u+xZJr6H6pOpSYC/wPtv/bjO+YSPpTVQ3BSwGDgO3Uf3nNzXTMkmfB26lulN5L3A71d6b1EzDJD0IjAOXAU8DnwV+yDR1Uprfe6mWZf8B3GZ795zFMqjNVEREREQTBnWZLyIiIqIRaaYiIiIiakgzFREREVFDmqmIiIiIGtJMRURERNSQZioiIiKihjRTERERETWkmYqIiIio4b/1OIai8IpDZQAAAABJRU5ErkJggg==", "text/plain": [ "
" ] }, "metadata": { "needs_background": "light" }, "output_type": "display_data" } ], "source": [ "import matplotlib.pyplot as plt\n", "from torch.optim.lr_scheduler import StepLR, CosineAnnealingLR, ExponentialLR\n", "model = Vit(\n", " in_channels=3,\n", " num_classes=2,\n", " num_patch_row=8,\n", " image_size=224,\n", " dropout=.1)\n", "optimizer = Adam(model.parameters(), lr=1)\n", "\n", "schedulers = [\n", " lambda optim: CosineAnnealingLR(optim, T_max=10), # 半周期 10 の cosine\n", " lambda optim: StepLR(optim, step_size=30, gamma=.2), # 30 epoch ごとに学習率を 0.05 倍\n", " lambda optim: ExponentialLR(optim, gamma=.95) # 毎 epoch ごとに 0.95 倍\n", "]\n", "\n", "epochs = list(range(100))\n", "fig, ax = plt.subplots(figsize=(10, 6))\n", "\n", "for get_scheduler in schedulers:\n", " rates = []\n", " sche = get_scheduler(Adam(model.parameters(), lr=1.))\n", "\n", " for i in epochs:\n", " rates.append(sche.get_last_lr()[0])\n", " sche.step()\n", "\n", " ax.step(epochs, rates, label=type(sche))\n", "\n", "# ax.set_yscale('log')\n", "ax.grid()\n", "ax.legend();" ] }, { "cell_type": "code", "execution_count": 23, "metadata": { "gather": { "logged": 1667634932116 } }, "outputs": [], "source": [ "def run_fold(\n", " model: nn.Module,\n", " train_loader: data.DataLoader,\n", " valid_loader: data.DataLoader,\n", " n_epochs=50) -> np.ndarray:\n", "\n", " optimizer = Adam(model.parameters(), lr=1e-2)\n", " scheduler = CosineAnnealingLR(optimizer, T_max=5, eta_min=1e-4)\n", "\n", " for epoch in range(1, n_epochs + 1):\n", " print(f'epoch: {epoch} lr: {scheduler.get_last_lr()[0]:.4f}')\n", " train_loss, train_acc = train(model, optimizer, train_loader)\n", " valid_loss, valid_acc = valid(model=model, valid_loader=valid_loader)\n", " scheduler.step()\n", " print(f'score: train_loss {train_loss:.3f} train_acc {train_acc:.3f} valid_loss {valid_loss:.3f} valid_acc {valid_acc:.3f}')" ] }, { "cell_type": "code", "execution_count": 24, "metadata": { "gather": { "logged": 1667634933535 } }, 
"outputs": [], "source": [ "seed_torch(0)\n", "vit = Vit(\n", " in_channels=3,\n", " num_classes=2,\n", " num_patch_row=8,\n", " image_size=224,\n", " dropout=.1)\n", "vit = vit.to(DEVICE)" ] }, { "cell_type": "code", "execution_count": 25, "metadata": { "gather": { "logged": 1667635159384 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "epoch: 1 lr: 0.0100\n", "score: train_loss 2.258 train_acc 0.484 valid_loss 0.861 valid_acc 0.470\n", "epoch: 2 lr: 0.0091\n", "score: train_loss 0.777 train_acc 0.536 valid_loss 0.694 valid_acc 0.522\n", "epoch: 3 lr: 0.0066\n", "score: train_loss 0.701 train_acc 0.531 valid_loss 0.680 valid_acc 0.539\n", "epoch: 4 lr: 0.0035\n", "score: train_loss 0.695 train_acc 0.531 valid_loss 0.681 valid_acc 0.568\n", "epoch: 5 lr: 0.0010\n", "score: train_loss 0.707 train_acc 0.531 valid_loss 0.687 valid_acc 0.535\n", "epoch: 6 lr: 0.0001\n", "score: train_loss 0.676 train_acc 0.573 valid_loss 0.679 valid_acc 0.548\n", "epoch: 7 lr: 0.0010\n", "score: train_loss 0.697 train_acc 0.521 valid_loss 0.675 valid_acc 0.583\n", "epoch: 8 lr: 0.0035\n", "score: train_loss 0.809 train_acc 0.547 valid_loss 0.966 valid_acc 0.470\n", "epoch: 9 lr: 0.0066\n", "score: train_loss 0.803 train_acc 0.536 valid_loss 0.816 valid_acc 0.470\n", "epoch: 10 lr: 0.0091\n", "score: train_loss 0.840 train_acc 0.516 valid_loss 0.679 valid_acc 0.602\n", "epoch: 11 lr: 0.0100\n", "score: train_loss 0.794 train_acc 0.516 valid_loss 0.705 valid_acc 0.530\n", "epoch: 12 lr: 0.0091\n", "score: train_loss 0.846 train_acc 0.484 valid_loss 1.001 valid_acc 0.530\n", "epoch: 13 lr: 0.0066\n", "score: train_loss 0.798 train_acc 0.547 valid_loss 0.914 valid_acc 0.470\n", "epoch: 14 lr: 0.0035\n", "score: train_loss 0.810 train_acc 0.505 valid_loss 0.771 valid_acc 0.530\n", "epoch: 15 lr: 0.0010\n", "score: train_loss 0.723 train_acc 0.547 valid_loss 0.696 valid_acc 0.530\n", "epoch: 16 lr: 0.0001\n", "score: train_loss 0.691 train_acc 0.521 valid_loss 0.691 
valid_acc 0.530\n", "epoch: 17 lr: 0.0010\n", "score: train_loss 0.697 train_acc 0.490 valid_loss 0.704 valid_acc 0.470\n", "epoch: 18 lr: 0.0035\n", "score: train_loss 0.702 train_acc 0.500 valid_loss 0.693 valid_acc 0.530\n", "epoch: 19 lr: 0.0066\n", "score: train_loss 0.705 train_acc 0.490 valid_loss 0.687 valid_acc 0.549\n", "epoch: 20 lr: 0.0091\n", "score: train_loss 0.701 train_acc 0.516 valid_loss 0.690 valid_acc 0.488\n" ] } ], "source": [ "seed_torch()\n", "run_fold(\n", " model=vit,\n", " train_loader=train_loader,\n", " valid_loader=val_loader,\n", " n_epochs=20)" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "学習はできるが、性能改善には DataAugmentation や事前学習済みモデルのファインチューニングが必要" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "---" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "## 事前学習済みモデルの活用" ] }, { "cell_type": "markdown", "metadata": {}, "source": [ "- PyTorch Image MOdels\n", "- https://github.com/rwightman/pytorch-image-models\n", "\n", "※ torchvisionmodels にもある" ] }, { "cell_type": "code", "execution_count": 26, "metadata": { "gather": { "logged": 1667635159604 } }, "outputs": [], "source": [ "# !pip3 -q install timm" ] }, { "cell_type": "code", "execution_count": 27, "metadata": { "gather": { "logged": 1667635159830 } }, "outputs": [], "source": [ "import timm" ] }, { "cell_type": "code", "execution_count": 28, "metadata": { "gather": { "logged": 1667635160040 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Available Vision Transformer Models: \n" ] }, { "data": { "text/plain": [ "['vit_base_patch8_224',\n", " 'vit_base_patch8_224_dino',\n", " 'vit_base_patch8_224_in21k',\n", " 'vit_base_patch16_18x2_224',\n", " 'vit_base_patch16_224',\n", " 'vit_base_patch16_224_dino',\n", " 'vit_base_patch16_224_in21k',\n", " 'vit_base_patch16_224_miil',\n", " 'vit_base_patch16_224_miil_in21k',\n", " 'vit_base_patch16_224_sam',\n", " 'vit_base_patch16_384',\n", " 'vit_base_patch16_plus_240',\n", " 
'vit_base_patch16_rpn_224',\n", " 'vit_base_patch32_224',\n", " 'vit_base_patch32_224_clip_laion2b',\n", " 'vit_base_patch32_224_in21k',\n", " 'vit_base_patch32_224_sam',\n", " 'vit_base_patch32_384',\n", " 'vit_base_patch32_plus_256',\n", " 'vit_base_r26_s32_224',\n", " 'vit_base_r50_s16_224',\n", " 'vit_base_r50_s16_224_in21k',\n", " 'vit_base_r50_s16_384',\n", " 'vit_base_resnet26d_224',\n", " 'vit_base_resnet50_224_in21k',\n", " 'vit_base_resnet50_384',\n", " 'vit_base_resnet50d_224',\n", " 'vit_giant_patch14_224',\n", " 'vit_giant_patch14_224_clip_laion2b',\n", " 'vit_gigantic_patch14_224',\n", " 'vit_huge_patch14_224',\n", " 'vit_huge_patch14_224_clip_laion2b',\n", " 'vit_huge_patch14_224_in21k',\n", " 'vit_large_patch14_224',\n", " 'vit_large_patch14_224_clip_laion2b',\n", " 'vit_large_patch16_224',\n", " 'vit_large_patch16_224_in21k',\n", " 'vit_large_patch16_384',\n", " 'vit_large_patch32_224',\n", " 'vit_large_patch32_224_in21k',\n", " 'vit_large_patch32_384',\n", " 'vit_large_r50_s32_224',\n", " 'vit_large_r50_s32_224_in21k',\n", " 'vit_large_r50_s32_384',\n", " 'vit_relpos_base_patch16_224',\n", " 'vit_relpos_base_patch16_cls_224',\n", " 'vit_relpos_base_patch16_clsgap_224',\n", " 'vit_relpos_base_patch16_plus_240',\n", " 'vit_relpos_base_patch16_rpn_224',\n", " 'vit_relpos_base_patch32_plus_rpn_256',\n", " 'vit_relpos_medium_patch16_224',\n", " 'vit_relpos_medium_patch16_cls_224',\n", " 'vit_relpos_medium_patch16_rpn_224',\n", " 'vit_relpos_small_patch16_224',\n", " 'vit_relpos_small_patch16_rpn_224',\n", " 'vit_small_patch8_224_dino',\n", " 'vit_small_patch16_18x2_224',\n", " 'vit_small_patch16_36x1_224',\n", " 'vit_small_patch16_224',\n", " 'vit_small_patch16_224_dino',\n", " 'vit_small_patch16_224_in21k',\n", " 'vit_small_patch16_384',\n", " 'vit_small_patch32_224',\n", " 'vit_small_patch32_224_in21k',\n", " 'vit_small_patch32_384',\n", " 'vit_small_r26_s32_224',\n", " 'vit_small_r26_s32_224_in21k',\n", " 'vit_small_r26_s32_384',\n", " 
'vit_small_resnet26d_224',\n", " 'vit_small_resnet50d_s16_224',\n", " 'vit_srelpos_medium_patch16_224',\n", " 'vit_srelpos_small_patch16_224',\n", " 'vit_tiny_patch16_224',\n", " 'vit_tiny_patch16_224_in21k',\n", " 'vit_tiny_patch16_384',\n", " 'vit_tiny_r_s16_p8_224',\n", " 'vit_tiny_r_s16_p8_224_in21k',\n", " 'vit_tiny_r_s16_p8_384']" ] }, "execution_count": 28, "metadata": {}, "output_type": "execute_result" } ], "source": [ "print('Available Vision Transformer Models: ')\n", "timm.list_models('vit*')" ] }, { "cell_type": "code", "execution_count": 29, "metadata": { "gather": { "logged": 1667635161097 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "Linear(in_features=768, out_features=1000, bias=True)\n", "Linear(in_features=768, out_features=2, bias=True)\n" ] } ], "source": [ "# Get the base model\n", "MODEL_NAME = 'vit_base_patch16_224'\n", "_model = timm.create_model(MODEL_NAME, pretrained=False)\n", "print(_model.head)\n", "# Freeze the pretrained weights\n", "for param in _model.parameters():\n", "    param.requires_grad = False\n", "# Swap the head to match the task; it is replaced after freezing, so the new head stays trainable\n", "_model.head = nn.Linear(_model.head.in_features, 2)\n", "print(_model.head)" ] }, { "cell_type": "code", "execution_count": 30, "metadata": { "gather": { "logged": 1667635161230 }, "jupyter": { "outputs_hidden": false, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [], "source": [ "# Check the architecture and which parameters will be updated\n", "# !pip3 -q install torchsummary\n", "# from torchsummary import summary\n", "# _model = _model.to(DEVICE)\n", "# summary(_model, input_size=(3, 224, 224))" ] }, { "cell_type": "code", "execution_count": 32, "metadata": { "gather": { "logged": 1667635161533 } }, "outputs": [], "source": [ "class ViTBased(nn.Module):\n", "    def __init__(self, n_classes:int=2):\n", "\n", "        super(ViTBased, self).__init__()\n", "\n", "        self.model = timm.create_model(MODEL_NAME, pretrained=True)\n", "        for param in self.model.parameters():\n", "            param.requires_grad = False\n", "        self.model.head = 
nn.Linear(self.model.head.in_features, n_classes)\n", "\n", "    def forward(self, x:torch.Tensor) -> torch.Tensor:\n", "        out = self.model(x)\n", "        return out" ] }, { "cell_type": "code", "execution_count": 33, "metadata": { "gather": { "logged": 1667635163497 } }, "outputs": [], "source": [ "seed_torch()\n", "vit = ViTBased(n_classes=2)\n", "vit = vit.to(DEVICE)" ] }, { "cell_type": "code", "execution_count": 34, "metadata": { "gather": { "logged": 1667635163627 } }, "outputs": [], "source": [ "def run_fold(\n", "    model: nn.Module,\n", "    train_loader: data.DataLoader,\n", "    valid_loader: data.DataLoader,\n", "    n_epochs=50) -> None:\n", "\n", "    optimizer = Adam(model.parameters(), lr=1e-2)\n", "\n", "    for epoch in range(1, n_epochs + 1):\n", "        print(f'epoch: {epoch}')\n", "        train_loss, train_acc = train(model, optimizer, train_loader)\n", "        valid_loss, valid_acc = valid(model=model, valid_loader=valid_loader)\n", "        print(f'score: train_loss {train_loss:.3f} train_acc {train_acc:.3f} valid_loss {valid_loss:.3f} valid_acc {valid_acc:.3f}')" ] }, { "cell_type": "code", "execution_count": 35, "metadata": { "gather": { "logged": 1667635199368 } }, "outputs": [ { "name": "stdout", "output_type": "stream", "text": [ "epoch: 1\n", "score: train_loss 0.381 train_acc 0.891 valid_loss 0.072 valid_acc 0.974\n", "epoch: 2\n", "score: train_loss 0.037 train_acc 0.984 valid_loss 0.000 valid_acc 1.000\n", "epoch: 3\n", "score: train_loss 0.000 train_acc 1.000 valid_loss 0.000 valid_acc 1.000\n" ] } ], "source": [ "run_fold(\n", "    model=vit,\n", "    train_loader=train_loader,\n", "    valid_loader=val_loader,\n", "    n_epochs=3)" ] }, { "cell_type": "code", "execution_count": null, "metadata": { "jupyter": { "outputs_hidden": false, "source_hidden": false }, "nteract": { "transient": { "deleting": false } } }, "outputs": [], "source": [] } ], "metadata": { "kernel_info": { "name": "python3" }, "kernelspec": { "display_name": "Python 3 (ipykernel)", "language": "python", "name": "python3" 
}, "language_info": { "codemirror_mode": { "name": "ipython", "version": 3 }, "file_extension": ".py", "mimetype": "text/x-python", "name": "python", "nbconvert_exporter": "python", "pygments_lexer": "ipython3", "version": "3.8.13" }, "nteract": { "version": "nteract-front-end@1.0.0" }, "orig_nbformat": 4, "vscode": { "interpreter": { "hash": "195d00c3bc2576aa3aa8d34b1ef69c319bc4c5e1d04057dba8a69b2c34c3aaa0" } } }, "nbformat": 4, "nbformat_minor": 2 }