Upload 54 files

This view is limited to 50 files because it contains too many changes.
- .gitattributes +7 -0
- .gitignore +3 -0
- LICENSE +21 -0
- README.md +87 -10
- app.py +146 -0
- assets/example_result.png +3 -0
- assets/teaser.jpg +3 -0
- ckpt/DAPE/DAPE.pth +3 -0
- ckpt/RAM/ram_swin_large_14m_ckpt_here.txt +0 -0
- ckpt/SR_LoRA/model_20001.pkl +3 -0
- ckpt/SR_VAE/vae_encoder_20001.pt +3 -0
- inference_coz.py +364 -0
- lora/lora_layers.py +137 -0
- lora/lora_utils.py +61 -0
- osediff_sd3.py +725 -0
- ram/__init__.py +2 -0
- ram/configs/condition_config.json +3 -0
- ram/configs/med_config.json +21 -0
- ram/configs/q2l_config.json +22 -0
- ram/configs/swin/config_swinB_384.json +9 -0
- ram/configs/swin/config_swinL_384.json +9 -0
- ram/configs/swin/config_swinL_444.json +9 -0
- ram/data/ram_tag_list.txt +4585 -0
- ram/data/ram_tag_list_chinese.txt +4585 -0
- ram/data/ram_tag_list_threshold.txt +4585 -0
- ram/data/tag_list.txt +3429 -0
- ram/inference.py +46 -0
- ram/models/__init__.py +2 -0
- ram/models/bert.py +1035 -0
- ram/models/bert_lora.py +1040 -0
- ram/models/ram.py +317 -0
- ram/models/ram_lora.py +344 -0
- ram/models/swin_transformer.py +696 -0
- ram/models/swin_transformer_lora.py +660 -0
- ram/models/tag2text.py +419 -0
- ram/models/tag2text_lora.py +419 -0
- ram/models/utils.py +365 -0
- ram/models/vit.py +305 -0
- ram/transform.py +13 -0
- ram/utils/__init__.py +2 -0
- ram/utils/metrics.py +102 -0
- ram/utils/openset_utils.py +333 -0
- requirements.txt +52 -0
- samples/0064.png +3 -0
- samples/0245.png +3 -0
- samples/0393.png +3 -0
- samples/0457.png +3 -0
- samples/0479.png +3 -0
- scripts/inference/inference_coz_dapeprompt.sh +17 -0
- scripts/inference/inference_coz_nullprompt.sh +15 -0
.gitattributes
CHANGED
@@ -33,3 +33,10 @@ saved_model/**/* filter=lfs diff=lfs merge=lfs -text
 *.zip filter=lfs diff=lfs merge=lfs -text
 *.zst filter=lfs diff=lfs merge=lfs -text
 *tfevents* filter=lfs diff=lfs merge=lfs -text
+assets/example_result.png filter=lfs diff=lfs merge=lfs -text
+assets/teaser.jpg filter=lfs diff=lfs merge=lfs -text
+samples/0064.png filter=lfs diff=lfs merge=lfs -text
+samples/0245.png filter=lfs diff=lfs merge=lfs -text
+samples/0393.png filter=lfs diff=lfs merge=lfs -text
+samples/0457.png filter=lfs diff=lfs merge=lfs -text
+samples/0479.png filter=lfs diff=lfs merge=lfs -text
.gitignore
ADDED
@@ -0,0 +1,3 @@
+inference_results/
+__pycache__/
+ram_swin_large_14m.pth
LICENSE
ADDED
@@ -0,0 +1,21 @@
+MIT License
+
+Copyright (c) 2025 Bryan Sangwoo Kim
+
+Permission is hereby granted, free of charge, to any person obtaining a copy
+of this software and associated documentation files (the "Software"), to deal
+in the Software without restriction, including without limitation the rights
+to use, copy, modify, merge, publish, distribute, sublicense, and/or sell
+copies of the Software, and to permit persons to whom the Software is
+furnished to do so, subject to the following conditions:
+
+The above copyright notice and this permission notice shall be included in all
+copies or substantial portions of the Software.
+
+THE SOFTWARE IS PROVIDED "AS IS", WITHOUT WARRANTY OF ANY KIND, EXPRESS OR
+IMPLIED, INCLUDING BUT NOT LIMITED TO THE WARRANTIES OF MERCHANTABILITY,
+FITNESS FOR A PARTICULAR PURPOSE AND NONINFRINGEMENT. IN NO EVENT SHALL THE
+AUTHORS OR COPYRIGHT HOLDERS BE LIABLE FOR ANY CLAIM, DAMAGES OR OTHER
+LIABILITY, WHETHER IN AN ACTION OF CONTRACT, TORT OR OTHERWISE, ARISING FROM,
+OUT OF OR IN CONNECTION WITH THE SOFTWARE OR THE USE OR OTHER DEALINGS IN THE
+SOFTWARE.
README.md
CHANGED
@@ -1,12 +1,89 @@
+# Chain-of-Zoom: Extreme Super-Resolution via Scale Autoregression and Preference Alignment
+
+This repository is the official implementation of [Chain-of-Zoom: Extreme Super-Resolution via Scale Autoregression and Preference Alignment](https://arxiv.org/abs/2505.18600), led by
+
+[Bryan Sangwoo Kim](https://scholar.google.com/citations?user=ndWU-84AAAAJ&hl=en), [Jeongsol Kim](https://jeongsol.dev/), [Jong Chul Ye](https://bispl.weebly.com/professor.html)
+
+
+
+[](https://bryanswkim.github.io/chain-of-zoom/)
+[](https://arxiv.org/abs/2505.18600)
+
 ---
-
-
-
-
-
-
-
-
-
 
-
+## 🔥 Summary
+
+Modern single-image super-resolution (SISR) models deliver photo-realistic results at the scale factors on which they are trained, but show notable drawbacks:
+
+1. **Blur and artifacts** when pushed to magnify beyond their training regime
+2. **High computational cost and inefficiency** of retraining models whenever we want to magnify further
+
+This brings us to the fundamental question: \
+_How can we effectively utilize super-resolution models to explore much higher resolutions than they were originally trained for?_
+
+We address this via **Chain-of-Zoom** 🔎, a model-agnostic framework that factorizes SISR into an autoregressive chain of intermediate scale-states with multi-scale-aware prompts.
+CoZ repeatedly re-uses a backbone SR model, decomposing the conditional probability into tractable sub-problems to achieve extreme resolutions without additional training.
+Because visual cues diminish at high magnifications, we augment each zoom step with multi-scale-aware text prompts generated by a prompt-extractor VLM.
+This prompt extractor can be fine-tuned through GRPO with a critic VLM to further align the text guidance with human preference.
+
+## 🗓️ News
+- [May 2025] Code and paper are uploaded.
+
+## 🛠️ Setup
+First, create your environment. We recommend the following commands.
+
+```
+git clone https://github.com/bryanswkim/Chain-of-Zoom.git
+cd Chain-of-Zoom
+
+conda create -n coz python=3.10
+conda activate coz
+pip install -r requirements.txt
+```
+
+## ⏳ Models
+
+|Models|Checkpoints|
+|:---------|:--------|
+|Stable Diffusion v3|[Hugging Face](https://huggingface.co/stabilityai/stable-diffusion-3-medium)|
+|Qwen2.5-VL-3B-Instruct|[Hugging Face](https://huggingface.co/Qwen/Qwen2.5-VL-3B-Instruct)|
+|RAM|[Hugging Face](https://huggingface.co/spaces/xinyu1205/recognize-anything/blob/main/ram_swin_large_14m.pth)|
+
+## 🌄 Example
+You can quickly check the results of **CoZ** with the following example:
+```
+python inference_coz.py \
+-i samples \
+-o inference_results/coz_vlmprompt \
+--rec_type recursive_multiscale \
+--prompt_type vlm \
+--lora_path ckpt/SR_LoRA/model_20001.pkl \
+--vae_path ckpt/SR_VAE/vae_encoder_20001.pt \
+--pretrained_model_name_or_path 'stabilityai/stable-diffusion-3-medium-diffusers' \
+--ram_ft_path ckpt/DAPE/DAPE.pth \
+--ram_path ckpt/RAM/ram_swin_large_14m.pth
+```
+This will give a result like the one below:
+
+
+
+## 🔬 Efficient Memory
+Using `--efficient_memory` allows CoZ to run on a single GPU with 24GB VRAM, but greatly increases inference time due to offloading. \
+We recommend using two GPUs.
+
+## 📝 Citation
+If you find our method useful, please cite it as below or leave a star on this repository.
+
+```
+@article{kim2025chain,
+  title={Chain-of-Zoom: Extreme Super-Resolution via Scale Autoregression and Preference Alignment},
+  author={Kim, Bryan Sangwoo and Kim, Jeongsol and Ye, Jong Chul},
+  journal={arXiv preprint arXiv:2505.18600},
+  year={2025}
+}
+```
+
+## 🤗 Acknowledgements
+We thank the authors of [OSEDiff](https://github.com/cswry/OSEDiff) for sharing their awesome work!
+
+> [!NOTE]
+> This work is currently in the preprint stage, and there may be some changes to the code.
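The Summary above describes factorizing SISR into an autoregressive chain of scale-states. A minimal sketch of that recursion, assuming a generic `sr_model` callable as a placeholder (not the repository's API); the actual per-step logic lives in `inference_coz.py` below:

```python
# Minimal sketch of the Chain-of-Zoom recursion: crop the central 1/upscale
# region, resize it back to full size, and run one SR pass per scale-state.
from PIL import Image

def chain_of_zoom(img: Image.Image, sr_model, upscale: int = 4, steps: int = 4):
    """After N steps, the center of `img` is magnified by upscale**N."""
    outputs = [img]
    for _ in range(steps):
        w, h = img.size
        cw, ch = w // upscale, h // upscale
        # central crop of side 1/upscale, stretched back to the working size
        box = ((w - cw) // 2, (h - ch) // 2, (w + cw) // 2, (h + ch) // 2)
        crop = img.crop(box).resize((w, h), Image.BICUBIC)
        img = sr_model(crop)  # one backbone SR pass per zoom step
        outputs.append(img)
    return outputs
```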
app.py
ADDED
```python
import gradio as gr
import subprocess
import os
import shutil
from pathlib import Path
from PIL import Image
import spaces

# -----------------------------------------------------------------------------
# CONFIGURE THESE PATHS TO MATCH YOUR PROJECT STRUCTURE
# -----------------------------------------------------------------------------

INPUT_DIR = "samples"
OUTPUT_DIR = "inference_results/coz_vlmprompt"

# -----------------------------------------------------------------------------
# HELPER FUNCTION TO RUN INFERENCE AND RETURN THE OUTPUT IMAGE
# -----------------------------------------------------------------------------

@spaces.GPU()
def run_with_upload(uploaded_image_path):
    """
    1) Clear out INPUT_DIR (so old samples don't linger).
    2) Copy the uploaded image into INPUT_DIR.
    3) Run the inference_coz.py command (which reads from -i INPUT_DIR).
    4) After it finishes, find the most recently modified PNG in OUTPUT_DIR.
    5) Return a PIL.Image, which Gradio will display.
    """

    # 1) Make sure INPUT_DIR exists; if it does, delete everything inside.
    os.makedirs(INPUT_DIR, exist_ok=True)
    for fn in os.listdir(INPUT_DIR):
        full_path = os.path.join(INPUT_DIR, fn)
        try:
            if os.path.isfile(full_path) or os.path.islink(full_path):
                os.remove(full_path)
            elif os.path.isdir(full_path):
                shutil.rmtree(full_path)
        except Exception as e:
            print(f"Warning: could not delete {full_path}: {e}")

    # 2) Copy the uploaded image into INPUT_DIR.
    #    Gradio will give us a path like "/tmp/gradio_xyz.png"
    if uploaded_image_path is None:
        return None

    try:
        # Open with PIL (this handles JPEG, BMP, TIFF, etc.)
        pil_img = Image.open(uploaded_image_path).convert("RGB")
    except Exception as e:
        print(f"Error: could not open uploaded image: {e}")
        return None

    # Save it as "input.png" in our INPUT_DIR
    save_path = Path(INPUT_DIR) / "input.png"
    try:
        pil_img.save(save_path, format="PNG")
    except Exception as e:
        print(f"Error: could not save as PNG: {e}")
        return None

    # 3) Build and run the inference_coz.py command.
    #    This will block until it completes.
    cmd = [
        "python", "inference_coz.py",
        "-i", INPUT_DIR,
        "-o", OUTPUT_DIR,
        "--rec_type", "recursive_multiscale",
        "--prompt_type", "vlm",
        "--upscale", "2",
        "--lora_path", "ckpt/SR_LoRA/model_20001.pkl",
        "--vae_path", "ckpt/SR_VAE/vae_encoder_20001.pt",
        "--pretrained_model_name_or_path", "stabilityai/stable-diffusion-3-medium-diffusers",
        "--ram_ft_path", "ckpt/DAPE/DAPE.pth",
        "--ram_path", "ckpt/RAM/ram_swin_large_14m.pth"
    ]
    try:
        subprocess.run(cmd, check=True)
    except subprocess.CalledProcessError as err:
        # If inference_coz.py crashes, we can print/log the error.
        print("Inference failed:", err)
        return None

    # 4) After it finishes, scan OUTPUT_DIR for .png files.
    RECURSIVE_DIR = f'{OUTPUT_DIR}/recursive'

    if not os.path.isdir(RECURSIVE_DIR):
        return None

    png_files = [
        os.path.join(RECURSIVE_DIR, fn)
        for fn in os.listdir(RECURSIVE_DIR)
        if fn.lower().endswith(".png")
    ]
    if not png_files:
        return None

    # 5) Pick the most recently modified PNG
    latest_png = max(png_files, key=os.path.getmtime)

    # 6) Open and return a PIL.Image. Gradio will display it automatically.
    try:
        img = Image.open(latest_png).convert("RGB")
    except Exception as e:
        print(f"Error opening {latest_png}: {e}")
        return None

    return img

# -----------------------------------------------------------------------------
# BUILD THE GRADIO INTERFACE
# -----------------------------------------------------------------------------

with gr.Blocks() as demo:
    gr.Markdown("## Upload an image, then click **Run Inference** to process it.")

    # 1) Image upload component. We set type="filepath" so the callback
    #    (run_with_upload) will receive a local path to the uploaded file.
    upload_image = gr.Image(
        label="Upload your input image",
        type="filepath"
    )

    # 2) A button that the user will click to launch inference.
    run_button = gr.Button("Run Inference")

    # 3) An output <Image> where we will show the final PNG.
    output_image = gr.Image(
        label="Inference Result",
        type="pil"  # because run_with_upload() returns a PIL.Image
    )

    # Wire the button: when clicked, call run_with_upload(upload_image), put
    # its return value into output_image.
    run_button.click(
        fn=run_with_upload,
        inputs=upload_image,
        outputs=output_image
    )

# -----------------------------------------------------------------------------
# START THE GRADIO SERVER
# -----------------------------------------------------------------------------

demo.launch(share=True)
```
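Since the button handler above is a plain Gradio event, the Space can also be driven programmatically. A hypothetical client-side sketch using the official `gradio_client` package (assumptions: the local default URL, the `/predict` endpoint name that Gradio assigns to a single unnamed `.click()` handler, and a `my_photo.png` input file):

```python
# pip install gradio_client  (handle_file requires gradio_client >= 1.0)
from gradio_client import Client, handle_file

client = Client("http://127.0.0.1:7860/")  # or the public share URL
result_path = client.predict(
    handle_file("my_photo.png"),  # maps to the upload_image component
    api_name="/predict",          # assumed default endpoint name
)
print(result_path)  # local path to the returned result image
```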
assets/example_result.png
ADDED
Git LFS Details

assets/teaser.jpg
ADDED
Git LFS Details

ckpt/DAPE/DAPE.pth
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:a7028be2edcbe9ab0bd1c4ab6f2a2a86f4b44d32261a4faa50ae10fdd9b2feba
+size 7194489

ckpt/RAM/ram_swin_large_14m_ckpt_here.txt
ADDED
File without changes

ckpt/SR_LoRA/model_20001.pkl
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:697d3f9ab69a222006ca3ae48503cf057774c8142646301e9bba90e58242e47e
+size 8111108

ckpt/SR_VAE/vae_encoder_20001.pt
ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:ed7f7aa03dfcbce9016d51c5aa8d3920428b3d7c9a678c721cd062d01805ae4a
+size 69346330
inference_coz.py
ADDED
```python
import os
import sys
sys.path.append(os.getcwd())
import glob
import argparse
import torch
from torchvision import transforms
import torchvision.transforms.functional as F
import numpy as np
from PIL import Image

from ram.models.ram_lora import ram
from ram import inference_ram as inference
from utils.wavelet_color_fix import adain_color_fix, wavelet_color_fix

tensor_transforms = transforms.Compose([
    transforms.ToTensor(),
])
ram_transforms = transforms.Compose([
    transforms.Resize((384, 384)),
    transforms.Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
])

def resize_and_center_crop(img: Image.Image, size: int) -> Image.Image:
    w, h = img.size
    scale = size / min(w, h)
    new_w, new_h = int(w * scale), int(h * scale)
    img = img.resize((new_w, new_h), Image.LANCZOS)
    left = (new_w - size) // 2
    top = (new_h - size) // 2
    return img.crop((left, top, left + size, top + size))

def get_validation_prompt(args, image, prompt_image_path, dape_model=None, vlm_model=None, device='cuda'):
    # prepare low-res tensor for SR input
    lq = tensor_transforms(image).unsqueeze(0).to(device)
    # select prompt source
    if args.prompt_type == "null":
        prompt_text = args.prompt or ""
    elif args.prompt_type == "dape":
        lq_ram = ram_transforms(lq).to(dtype=weight_dtype)
        captions = inference(lq_ram, dape_model)
        prompt_text = f"{captions[0]}, {args.prompt}," if args.prompt else captions[0]
    elif args.prompt_type == "vlm":
        message_text = None

        if args.rec_type == "recursive":
            message_text = "What is in this image? Give me a set of words."
            print(f'MESSAGE TEXT: {message_text}')
            messages = [
                {"role": "system", "content": f"{message_text}"},
                {
                    "role": "user",
                    "content": [
                        {"type": "image", "image": prompt_image_path}
                    ]
                }
            ]
            text = vlm_processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
            image_inputs, video_inputs = process_vision_info(messages)
            inputs = vlm_processor(
                text=[text],
                images=image_inputs,
                videos=video_inputs,
                padding=True,
                return_tensors="pt",
            )

        elif args.rec_type == "recursive_multiscale":
            start_image_path = prompt_image_path[0]
            input_image_path = prompt_image_path[1]
            message_text = "The second image is a zoom-in of the first image. Based on this knowledge, what is in the second image? Give me a set of words."
            print(f'START IMAGE PATH: {start_image_path}\nINPUT IMAGE PATH: {input_image_path}\nMESSAGE TEXT: {message_text}')
            messages = [
                {"role": "system", "content": f"{message_text}"},
                {
                    "role": "user",
                    "content": [
                        {"type": "image", "image": start_image_path},
                        {"type": "image", "image": input_image_path}
                    ]
                }
            ]
            print(f'MESSAGES\n{messages}')

            text = vlm_processor.apply_chat_template(messages, tokenize=False, add_generation_prompt=True)
            image_inputs, video_inputs = process_vision_info(messages)
            inputs = vlm_processor(
                text=[text],
                images=image_inputs,
                videos=video_inputs,
                padding=True,
                return_tensors="pt",
            )

        else:
            raise ValueError(f"VLM prompt generation not implemented for rec_type: {args.rec_type}")

        inputs = inputs.to("cuda")

        original_sr_devices = {}
        if args.efficient_memory and 'model' in globals() and hasattr(model, 'text_enc_1'):  # check if SR model is defined
            print("Moving SR model components to CPU for VLM inference.")
            original_sr_devices['text_enc_1'] = model.text_enc_1.device
            original_sr_devices['text_enc_2'] = model.text_enc_2.device
            original_sr_devices['text_enc_3'] = model.text_enc_3.device
            original_sr_devices['transformer'] = model.transformer.device
            original_sr_devices['vae'] = model.vae.device

            model.text_enc_1.to('cpu')
            model.text_enc_2.to('cpu')
            model.text_enc_3.to('cpu')
            model.transformer.to('cpu')
            model.vae.to('cpu')
            vlm_model.to('cuda')  # vlm_model should already be on its device_map="auto" device

        generated_ids = vlm_model.generate(**inputs, max_new_tokens=128)
        generated_ids_trimmed = [
            out_ids[len(in_ids):] for in_ids, out_ids in zip(inputs.input_ids, generated_ids)
        ]
        output_text = vlm_processor.batch_decode(
            generated_ids_trimmed, skip_special_tokens=True, clean_up_tokenization_spaces=False
        )

        prompt_text = f"{output_text[0]}, {args.prompt}," if args.prompt else output_text[0]

        if args.efficient_memory and 'model' in globals() and hasattr(model, 'text_enc_1'):
            print("Restoring SR model components to original devices.")
            vlm_model.to('cpu')  # if vlm_model was moved to a specific cuda device and needs to be offloaded
            model.text_enc_1.to(original_sr_devices['text_enc_1'])
            model.text_enc_2.to(original_sr_devices['text_enc_2'])
            model.text_enc_3.to(original_sr_devices['text_enc_3'])
            model.transformer.to(original_sr_devices['transformer'])
            model.vae.to(original_sr_devices['vae'])
    else:
        raise ValueError(f"Unknown prompt_type: {args.prompt_type}")
    return prompt_text, lq


if __name__ == "__main__":
    parser = argparse.ArgumentParser()
    parser.add_argument('--input_image', '-i', type=str, default='preset/datasets/test_dataset/input', help='path to the input image')
    parser.add_argument('--output_dir', '-o', type=str, default='preset/datasets/test_dataset/output', help='the directory to save the output')
    parser.add_argument('--pretrained_model_name_or_path', type=str, default=None, help='sd model path')
    parser.add_argument('--seed', type=int, default=42, help='random seed to be used')
    parser.add_argument('--process_size', type=int, default=512)
    parser.add_argument('--upscale', type=int, default=4)
    parser.add_argument('--align_method', type=str, choices=['wavelet', 'adain', 'nofix'], default='nofix')
    parser.add_argument('--lora_path', type=str, default=None, help='for LoRA of SR model')
    parser.add_argument('--vae_path', type=str, default=None)
    parser.add_argument('--prompt', type=str, default='', help='user prompts')
    parser.add_argument('--prompt_type', type=str, choices=['null', 'dape', 'vlm'], default='dape', help='type of prompt to use')
    parser.add_argument('--ram_path', type=str, default=None)
    parser.add_argument('--ram_ft_path', type=str, default=None)
    parser.add_argument('--save_prompts', type=bool, default=True)
    parser.add_argument('--mixed_precision', type=str, choices=['fp16', 'fp32'], default='fp16')
    parser.add_argument('--merge_and_unload_lora', action='store_true', help='merge lora weights before inference')
    parser.add_argument('--lora_rank', type=int, default=4)
    parser.add_argument('--vae_decoder_tiled_size', type=int, default=224)
    parser.add_argument('--vae_encoder_tiled_size', type=int, default=1024)
    parser.add_argument('--latent_tiled_size', type=int, default=96)
    parser.add_argument('--latent_tiled_overlap', type=int, default=32)
    parser.add_argument('--rec_type', type=str, choices=['nearest', 'bicubic', 'onestep', 'recursive', 'recursive_multiscale'], default='recursive', help='type of inference to use')
    parser.add_argument('--rec_num', type=int, default=4)
    parser.add_argument('--efficient_memory', default=False, action='store_true')
    args = parser.parse_args()

    global weight_dtype
    weight_dtype = torch.float32
    if args.mixed_precision == "fp16":
        weight_dtype = torch.float16

    # initialize SR model
    model = None
    if args.rec_type not in ('nearest', 'bicubic'):
        if not args.efficient_memory:
            from osediff_sd3 import OSEDiff_SD3_TEST, SD3Euler
            model = SD3Euler()
            model.text_enc_1.to('cuda:0')
            model.text_enc_2.to('cuda:0')
            model.text_enc_3.to('cuda:0')
            model.transformer.to('cuda:1', dtype=torch.float32)
            model.vae.to('cuda:1', dtype=torch.float32)
            for p in [model.text_enc_1, model.text_enc_2, model.text_enc_3, model.transformer, model.vae]:
                p.requires_grad_(False)
            model_test = OSEDiff_SD3_TEST(args, model)
        else:
            # For efficient memory, text encoders are moved to CPU/GPU on demand in get_validation_prompt.
            # Only load transformer and VAE initially if they are always on GPU.
            from osediff_sd3 import OSEDiff_SD3_TEST_efficient, SD3Euler
            model = SD3Euler()
            model.transformer.to('cuda', dtype=torch.float32)
            model.vae.to('cuda', dtype=torch.float32)
            for p in [model.text_enc_1, model.text_enc_2, model.text_enc_3, model.transformer, model.vae]:
                p.requires_grad_(False)
            model_test = OSEDiff_SD3_TEST_efficient(args, model)

    # gather input images
    if os.path.isdir(args.input_image):
        image_names = sorted(glob.glob(f'{args.input_image}/*.png'))
    else:
        image_names = [args.input_image]

    # load DAPE if needed
    DAPE = None
    if args.prompt_type == "dape":
        DAPE = ram(pretrained=args.ram_path,
                   pretrained_condition=args.ram_ft_path,
                   image_size=384,
                   vit='swin_l')
        DAPE.eval().to("cuda")
        DAPE = DAPE.to(dtype=weight_dtype)

    # load VLM pipeline if needed
    vlm_model = None
    global vlm_processor
    global process_vision_info
    vlm_processor = None
    if args.prompt_type == "vlm":
        from transformers import Qwen2_5_VLForConditionalGeneration, AutoProcessor
        from qwen_vl_utils import process_vision_info

        vlm_model_name = "Qwen/Qwen2.5-VL-3B-Instruct"
        print(f"Loading base VLM model: {vlm_model_name}")
        vlm_model = Qwen2_5_VLForConditionalGeneration.from_pretrained(
            vlm_model_name,
            torch_dtype="auto",
            device_map="auto"
        )
        vlm_processor = AutoProcessor.from_pretrained(vlm_model_name)
        print('Base VLM LOADING COMPLETE')

    os.makedirs(args.output_dir, exist_ok=True)
    os.makedirs(os.path.join(args.output_dir, 'per-sample'), exist_ok=True)
    os.makedirs(os.path.join(args.output_dir, 'per-scale'), exist_ok=True)
    os.makedirs(os.path.join(args.output_dir, 'recursive'), exist_ok=True)
    print(f'There are {len(image_names)} images.')
    print(f'Align Method Used: {args.align_method}')
    print(f'Prompt Type: {args.prompt_type}')

    # inference loop
    for image_name in image_names:
        bname = os.path.basename(image_name)
        rec_dir = os.path.join(args.output_dir, 'per-sample', bname[:-4])
        os.makedirs(rec_dir, exist_ok=True)
        if args.save_prompts:
            txt_path = os.path.join(rec_dir, 'txt')
            os.makedirs(txt_path, exist_ok=True)
        print(f'#### IMAGE: {bname}')

        # first image
        os.makedirs(os.path.join(args.output_dir, 'per-scale', 'scale0'), exist_ok=True)
        first_image = Image.open(image_name).convert('RGB')
        first_image = resize_and_center_crop(first_image, args.process_size)
        first_image.save(f'{rec_dir}/0.png')
        first_image.save(os.path.join(args.output_dir, 'per-scale', 'scale0', bname))

        # recursion
        for rec in range(args.rec_num):
            print(f'RECURSION: {rec}')
            os.makedirs(os.path.join(args.output_dir, 'per-scale', f'scale{rec+1}'), exist_ok=True)
            start_image_path = None
            input_image_path = None
            prompt_image_path = None  # this will hold the path(s) for prompt extraction

            current_sr_input_image_pil = None

            if args.rec_type in ('nearest', 'bicubic', 'onestep'):
                start_image_pil_path = f'{rec_dir}/0.png'
                start_image_pil = Image.open(start_image_pil_path).convert('RGB')
                rscale = pow(args.upscale, rec+1)
                w, h = start_image_pil.size
                new_w, new_h = w // rscale, h // rscale

                # crop from the original highest-res image available for this step
                cropped_region = start_image_pil.crop(((w-new_w)//2, (h-new_h)//2, (w+new_w)//2, (h+new_h)//2))

                if args.rec_type == 'onestep':
                    current_sr_input_image_pil = cropped_region.resize((w, h), Image.BICUBIC)
                    prompt_image_path = f'{rec_dir}/0_input_for_{rec+1}.png'
                    current_sr_input_image_pil.save(prompt_image_path)
                elif args.rec_type == 'bicubic':
                    current_sr_input_image_pil = cropped_region.resize((w, h), Image.BICUBIC)
                    current_sr_input_image_pil.save(f'{rec_dir}/{rec+1}.png')
                    current_sr_input_image_pil.save(os.path.join(args.output_dir, 'per-scale', f'scale{rec+1}', bname))
                    continue
                elif args.rec_type == 'nearest':
                    current_sr_input_image_pil = cropped_region.resize((w, h), Image.NEAREST)
                    current_sr_input_image_pil.save(f'{rec_dir}/{rec+1}.png')
                    current_sr_input_image_pil.save(os.path.join(args.output_dir, 'per-scale', f'scale{rec+1}', bname))
                    continue

            elif args.rec_type == 'recursive':
                # input for SR is based on the previous SR output, cropped and resized
                prev_sr_output_path = f'{rec_dir}/{rec}.png'
                prev_sr_output_pil = Image.open(prev_sr_output_path).convert('RGB')
                rscale = args.upscale
                w, h = prev_sr_output_pil.size
                new_w, new_h = w // rscale, h // rscale
                cropped_region = prev_sr_output_pil.crop(((w-new_w)//2, (h-new_h)//2, (w+new_w)//2, (h+new_h)//2))
                current_sr_input_image_pil = cropped_region.resize((w, h), Image.BICUBIC)

                # this resized image is also the input for VLM
                input_image_path = f'{rec_dir}/{rec+1}_input.png'
                current_sr_input_image_pil.save(input_image_path)
                prompt_image_path = input_image_path

            elif args.rec_type == 'recursive_multiscale':
                prev_sr_output_path = f'{rec_dir}/{rec}.png'
                prev_sr_output_pil = Image.open(prev_sr_output_path).convert('RGB')
                rscale = args.upscale
                w, h = prev_sr_output_pil.size
                new_w, new_h = w // rscale, h // rscale
                cropped_region = prev_sr_output_pil.crop(((w-new_w)//2, (h-new_h)//2, (w+new_w)//2, (h+new_h)//2))
                current_sr_input_image_pil = cropped_region.resize((w, h), Image.BICUBIC)

                # save the SR input image (which is the "zoomed-in" image for VLM)
                zoomed_image_path = f'{rec_dir}/{rec+1}_input.png'
                current_sr_input_image_pil.save(zoomed_image_path)
                prompt_image_path = [prev_sr_output_path, zoomed_image_path]

            else:
                raise ValueError(f"Unknown recursion_type: {args.rec_type}")

            # generate prompts
            validation_prompt, lq = get_validation_prompt(args, current_sr_input_image_pil, prompt_image_path, DAPE, vlm_model)
            if args.save_prompts:
                with open(os.path.join(txt_path, f'{rec}.txt'), 'w', encoding='utf-8') as f:
                    f.write(validation_prompt)
            print(f'TAG: {validation_prompt}')

            # super-resolution
            with torch.no_grad():
                lq = lq * 2 - 1

                if args.efficient_memory and model is not None:
                    print("Ensuring SR model components are on CUDA for SR inference.")
                    if not isinstance(model_test, OSEDiff_SD3_TEST_efficient):
                        model.text_enc_1.to('cuda:0')
                        model.text_enc_2.to('cuda:0')
                        model.text_enc_3.to('cuda:0')
                    # transformer and VAE should already be on CUDA per initialization
                    model.transformer.to('cuda', dtype=torch.float32)
                    model.vae.to('cuda', dtype=torch.float32)

                output_image = model_test(lq, prompt=validation_prompt)
                output_image = torch.clamp(output_image[0].cpu(), -1.0, 1.0)
                output_pil = transforms.ToPILImage()(output_image * 0.5 + 0.5)
                if args.align_method == 'adain':
                    output_pil = adain_color_fix(target=output_pil, source=current_sr_input_image_pil)
                elif args.align_method == 'wavelet':
                    output_pil = wavelet_color_fix(target=output_pil, source=current_sr_input_image_pil)

            output_pil.save(f'{rec_dir}/{rec+1}.png')  # this is the SR output
            output_pil.save(os.path.join(args.output_dir, 'per-scale', f'scale{rec+1}', bname))

        # concatenate and save
        imgs = [Image.open(os.path.join(rec_dir, f'{i}.png')).convert('RGB') for i in range(args.rec_num+1)]
        concat = Image.new('RGB', (sum(im.width for im in imgs), max(im.height for im in imgs)))
        x_off = 0
        for im in imgs:
            concat.paste(im, (x_off, 0))
            x_off += im.width
        concat.save(os.path.join(rec_dir, bname))
        concat.save(os.path.join(args.output_dir, 'recursive', bname))
```
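The recursive branches above crop the central 1/`upscale` of the previous output and upscale it back to full size, so the effective magnification compounds multiplicatively across steps. A quick back-of-the-envelope check with the default settings (`upscale=4`, `rec_num=4`, `process_size=512`):

```python
# Effective zoom implied by the recursion in inference_coz.py at the defaults.
upscale, rec_num, process_size = 4, 4, 512

for rec in range(rec_num + 1):
    zoom = upscale ** rec
    # side length (in pixels of the original 512x512 input) still visible
    visible = process_size / zoom
    print(f"scale{rec}: x{zoom} zoom, sees {visible:g}px of the original image")
# scale0: x1 zoom, sees 512px ... scale4: x256 zoom, sees 2px
```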
lora/lora_layers.py
ADDED
```python
from typing import List, Optional, Set, Type, Union

import torch
from torch import nn


class LoraInjectedLinear(nn.Module):
    """
    Linear layer with LoRA injection.
    Taken from https://github.com/cloneofsimo/lora/blob/master/lora_diffusion/lora.py
    """
    def __init__(
        self, in_features, out_features, bias=False, r=4, dropout_p=0.1, scale=1.0
    ):
        super().__init__()

        if r > min(in_features, out_features):
            raise ValueError(
                f"LoRA rank {r} must be less or equal than {min(in_features, out_features)}"
            )
        self.r = r
        self.linear = nn.Linear(in_features, out_features, bias)
        self.lora_down = nn.Linear(in_features, r, bias=False)
        self.dropout = nn.Dropout(dropout_p)
        self.lora_up = nn.Linear(r, out_features, bias=False)
        self.scale = scale
        self.selector = nn.Identity()

        nn.init.normal_(self.lora_down.weight, std=1 / r)
        nn.init.zeros_(self.lora_up.weight)

    def forward(self, input):
        return (
            self.linear(input.float())
            + self.dropout(self.lora_up(self.selector(self.lora_down(input.float()))))
            * self.scale
        ).half()

    def realize_as_lora(self):
        return self.lora_up.weight.data * self.scale, self.lora_down.weight.data

    def set_selector_from_diag(self, diag: torch.Tensor):
        # diag is a 1D tensor of size (r,)
        assert diag.shape == (self.r,)
        self.selector = nn.Linear(self.r, self.r, bias=False)
        self.selector.weight.data = torch.diag(diag)
        self.selector.weight.data = self.selector.weight.data.to(
            self.lora_up.weight.device
        ).to(self.lora_up.weight.dtype)


class LoraInjectedConv2d(nn.Module):
    def __init__(
        self,
        in_channels: int,
        out_channels: int,
        kernel_size,
        stride=1,
        padding=0,
        dilation=1,
        groups: int = 1,
        bias: bool = True,
        r: int = 4,
        dropout_p: float = 0.1,
        scale: float = 1.0,
    ):
        super().__init__()
        if r > min(in_channels, out_channels):
            raise ValueError(
                f"LoRA rank {r} must be less or equal than {min(in_channels, out_channels)}"
            )
        self.r = r
        self.conv = nn.Conv2d(
            in_channels=in_channels,
            out_channels=out_channels,
            kernel_size=kernel_size,
            stride=stride,
            padding=padding,
            dilation=dilation,
            groups=groups,
            bias=bias,
        )

        self.lora_down = nn.Conv2d(
            in_channels=in_channels,
            out_channels=r,
            kernel_size=kernel_size,
            stride=stride,
            padding=padding,
            dilation=dilation,
            groups=groups,
            bias=False,
        )
        self.dropout = nn.Dropout(dropout_p)
        self.lora_up = nn.Conv2d(
            in_channels=r,
            out_channels=out_channels,
            kernel_size=1,
            stride=1,
            padding=0,
            bias=False,
        )
        self.selector = nn.Identity()
        self.scale = scale

        nn.init.normal_(self.lora_down.weight, std=1 / r)
        nn.init.zeros_(self.lora_up.weight)

    def forward(self, input):
        return (
            self.conv(input)
            + self.dropout(self.lora_up(self.selector(self.lora_down(input))))
            * self.scale
        )

    def realize_as_lora(self):
        return self.lora_up.weight.data * self.scale, self.lora_down.weight.data

    def set_selector_from_diag(self, diag: torch.Tensor):
        # diag is a 1D tensor of size (r,)
        assert diag.shape == (self.r,)
        self.selector = nn.Conv2d(
            in_channels=self.r,
            out_channels=self.r,
            kernel_size=1,
            stride=1,
            padding=0,
            bias=False,
        )
        # reshape the diagonal into a 1x1 conv kernel of shape (r, r, 1, 1)
        self.selector.weight.data = torch.diag(diag).view(self.r, self.r, 1, 1)

        # same device + dtype as lora_up
        self.selector.weight.data = self.selector.weight.data.to(
            self.lora_up.weight.device
        ).to(self.lora_up.weight.dtype)
```
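Because `lora_up` is zero-initialized, a freshly injected `LoraInjectedLinear` reproduces its base `nn.Linear` exactly (with the output cast to half by `forward()`). A small standalone sanity check under that assumption (not part of the repository):

```python
import torch
from torch import nn
from lora.lora_layers import LoraInjectedLinear

base = nn.Linear(16, 32)
lora = LoraInjectedLinear(16, 32, bias=True, r=4, dropout_p=0.0, scale=1.0)
lora.linear.load_state_dict(base.state_dict())  # copy base weights in

x = torch.randn(2, 16)
# forward() computes in float32 and returns half precision, so compare in half
assert torch.allclose(lora(x), base(x).half(), atol=1e-3)
```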
lora/lora_utils.py
ADDED
```python
import torch
from torch import nn
from lora.lora_layers import LoraInjectedLinear, LoraInjectedConv2d

def _find_modules(model, ancestor_class=None, search_class=[nn.Linear], exclude_children_of=[LoraInjectedLinear]):
    # Get the targets we should replace all linears under
    if ancestor_class is not None:
        ancestors = (
            module
            for module in model.modules()
            if module.__class__.__name__ in ancestor_class
        )
    else:
        # this, in case you want to naively iterate over all modules.
        ancestors = [module for module in model.modules()]

    for ancestor in ancestors:
        for fullname, module in ancestor.named_modules():
            # if 'norm1_context' in fullname:
            if any([isinstance(module, _class) for _class in search_class]):
                *path, name = fullname.split(".")
                parent = ancestor
                while path:
                    parent = parent.get_submodule(path.pop(0))
                if exclude_children_of and any(
                    [isinstance(parent, _class) for _class in exclude_children_of]
                ):
                    continue
                yield parent, name, module

def extract_lora_ups_down(model, target_replace_module={'AdaLayerNormZero'}):  # Attention for kv_lora
    loras = []

    for _m, _n, _child_module in _find_modules(
        model,
        target_replace_module,
        search_class=[LoraInjectedLinear, LoraInjectedConv2d],
    ):
        loras.append((_child_module.lora_up, _child_module.lora_down))

    if len(loras) == 0:
        raise ValueError("No lora injected.")

    return loras

def save_lora_weight(
    model,
    path="./lora.pt",
    target_replace_module={'AdaLayerNormZero'},  # Attention for kv_lora
    save_half: bool = False
):
    weights = []
    for _up, _down in extract_lora_ups_down(
        model, target_replace_module=target_replace_module
    ):
        dtype = torch.float16 if save_half else torch.float32
        weights.append(_up.weight.to("cpu").to(dtype))
        weights.append(_down.weight.to("cpu").to(dtype))

    torch.save(weights, path)
```
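Note that `save_lora_weight()` serializes a flat `[up_0, down_0, up_1, down_1, ...]` list of raw weight tensors rather than a named state dict; the matching load-side logic lives in `inject_lora()` in `osediff_sd3.py`. A minimal sketch of inspecting such a checkpoint, assuming Linear LoRA pairs and `scale == 1.0`:

```python
import torch

# lora.pt layout written by save_lora_weight(): [up_0, down_0, up_1, down_1, ...]
weights = torch.load("./lora.pt", map_location="cpu", weights_only=True)
for up, down in zip(weights[0::2], weights[1::2]):
    # for Linear LoRA, up is (out_features, r) and down is (r, in_features);
    # the effective weight delta is scale * up @ down (scale assumed 1.0 here)
    print(tuple(up.shape), tuple(down.shape), tuple((up @ down).shape))
```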
osediff_sd3.py
ADDED
|
@@ -0,0 +1,725 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
| 1 |
+
import os
|
| 2 |
+
import sys
|
| 3 |
+
sys.path.append(os.getcwd())
|
| 4 |
+
import yaml
|
| 5 |
+
import copy
|
| 6 |
+
import torch
|
| 7 |
+
import torch.nn as nn
|
| 8 |
+
import torch.nn.functional as F
|
| 9 |
+
from typing import List, Tuple, Optional
|
| 10 |
+
import numpy as np
|
| 11 |
+
import lpips
|
| 12 |
+
from torchvision import transforms
|
| 13 |
+
from PIL import Image
|
| 14 |
+
from peft import LoraConfig, get_peft_model
|
| 15 |
+
|
| 16 |
+
from copy import deepcopy
|
| 17 |
+
from tqdm import tqdm
|
| 18 |
+
|
| 19 |
+
from diffusers import StableDiffusion3Pipeline, FluxPipeline
|
| 20 |
+
from lora.lora_layers import LoraInjectedLinear, LoraInjectedConv2d
|
| 21 |
+
|
| 22 |
+
def inject_lora_vae(vae, lora_rank=4, init_lora_weights="gaussian", verbose=False):
|
| 23 |
+
"""
|
| 24 |
+
Inject LoRA into the VAE's encoder
|
| 25 |
+
"""
|
| 26 |
+
vae.requires_grad_(False)
|
| 27 |
+
vae.train()
|
| 28 |
+
|
| 29 |
+
# Identify modules to LoRA-ify in the encoder
|
| 30 |
+
l_grep = ["conv1", "conv2", "conv_in", "conv_shortcut",
|
| 31 |
+
"conv", "conv_out", "to_k", "to_q", "to_v", "to_out.0"]
|
| 32 |
+
l_target_modules_encoder = []
|
| 33 |
+
for n, p in vae.named_parameters():
|
| 34 |
+
if "bias" in n or "norm" in n:
|
| 35 |
+
continue
|
| 36 |
+
for pattern in l_grep:
|
| 37 |
+
if (pattern in n) and ("encoder" in n):
|
| 38 |
+
l_target_modules_encoder.append(n.replace(".weight", ""))
|
| 39 |
+
elif ("quant_conv" in n) and ("post_quant_conv" not in n):
|
| 40 |
+
l_target_modules_encoder.append(n.replace(".weight", ""))
|
| 41 |
+
|
| 42 |
+
if verbose:
|
| 43 |
+
print("The following VAE parameters will get LoRA:")
|
| 44 |
+
print(l_target_modules_encoder)
|
| 45 |
+
|
| 46 |
+
# Create and add a LoRA adapter
|
| 47 |
+
lora_conf_encoder = LoraConfig(
|
| 48 |
+
r=lora_rank,
|
| 49 |
+
init_lora_weights=init_lora_weights,
|
| 50 |
+
target_modules=l_target_modules_encoder
|
| 51 |
+
)
|
| 52 |
+
|
| 53 |
+
adapter_name = "default_encoder"
|
| 54 |
+
try:
|
| 55 |
+
vae.add_adapter(lora_conf_encoder, adapter_name=adapter_name)
|
| 56 |
+
vae.set_adapter(adapter_name)
|
| 57 |
+
except ValueError as e:
|
| 58 |
+
if "already exists" in str(e):
|
| 59 |
+
print(f"Adapter with name {adapter_name} already exists. Skipping injection.")
|
| 60 |
+
else:
|
| 61 |
+
raise e
|
| 62 |
+
|
| 63 |
+
return vae, l_target_modules_encoder
|
| 64 |
+
|
| 65 |
+
def _find_modules(model, ancestor_class=None, search_class=[nn.Linear], exclude_children_of=[LoraInjectedLinear]):
|
| 66 |
+
# Get the targets we should replace all linears under
|
| 67 |
+
if ancestor_class is not None:
|
| 68 |
+
ancestors = (
|
| 69 |
+
module
|
| 70 |
+
for module in model.modules()
|
| 71 |
+
if module.__class__.__name__ in ancestor_class
|
| 72 |
+
)
|
| 73 |
+
else:
|
| 74 |
+
# this, in case you want to naively iterate over all modules.
|
| 75 |
+
ancestors = [module for module in model.modules()]
|
| 76 |
+
|
| 77 |
+
for ancestor in ancestors:
|
| 78 |
+
for fullname, module in ancestor.named_modules():
|
| 79 |
+
if any([isinstance(module, _class) for _class in search_class]):
|
| 80 |
+
*path, name = fullname.split(".")
|
| 81 |
+
parent = ancestor
|
| 82 |
+
while path:
|
| 83 |
+
parent = parent.get_submodule(path.pop(0))
|
| 84 |
+
if exclude_children_of and any(
|
| 85 |
+
[isinstance(parent, _class) for _class in exclude_children_of]
|
| 86 |
+
):
|
| 87 |
+
continue
|
| 88 |
+
yield parent, name, module
|
| 89 |
+
|
def inject_lora(model, ancestor_class, loras=None, r: int = 4, dropout_p: float = 0.0, scale: float = 1.0, verbose: bool = False):

    model.requires_grad_(False)
    model.train()

    names = []
    require_grad_params = []  # to be updated

    total_lora_params = 0

    if loras is not None:
        loras = torch.load(loras, map_location=model.device, weights_only=True)
        loras = [lora.float() for lora in loras]

    for _module, name, _child_module in _find_modules(model, ancestor_class):  # SiLU + Linear Block
        weight = _child_module.weight
        bias = _child_module.bias

        if verbose:
            print(f'LoRA Injection : injecting lora into {name}')

        _tmp = LoraInjectedLinear(
            _child_module.in_features,
            _child_module.out_features,
            _child_module.bias is not None,
            r=r,
            dropout_p=dropout_p,
            scale=scale,
        )
        _tmp.linear.weight = nn.Parameter(weight.float())
        if bias is not None:
            _tmp.linear.bias = nn.Parameter(bias.float())

        # switch the module
        _tmp.to(device=_child_module.weight.device, dtype=torch.float)  # keep as float / mixed precision
        _module._modules[name] = _tmp

        require_grad_params.append(_module._modules[name].lora_up.parameters())
        require_grad_params.append(_module._modules[name].lora_down.parameters())

        if loras is not None:
            _module._modules[name].lora_up.weight = nn.Parameter(loras.pop(0))
            _module._modules[name].lora_down.weight = nn.Parameter(loras.pop(0))

        _module._modules[name].lora_up.weight.requires_grad = True
        _module._modules[name].lora_down.weight.requires_grad = True
        names.append(name)

        if verbose:
            # -------- Count LoRA parameters just added --------
            lora_up_count = sum(p.numel() for p in _tmp.lora_up.parameters())
            lora_down_count = sum(p.numel() for p in _tmp.lora_down.parameters())
            lora_total_for_this_layer = lora_up_count + lora_down_count
            total_lora_params += lora_total_for_this_layer
            print(f"  Added {lora_total_for_this_layer} params "
                  f"(lora_up={lora_up_count}, lora_down={lora_down_count})")

    if verbose:
        print(f"Total new LoRA parameters added: {total_lora_params}")

    return require_grad_params, names
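
# Hedged training-setup sketch for inject_lora: require_grad_params is a list
# of parameter iterators (one per lora_up/lora_down), so an optimizer can be
# built by chaining them. The learning rate and rank are illustrative
# assumptions, not values prescribed by this file.
def _example_inject_lora_usage(transformer, lora_rank=4):
    import itertools
    require_grad_params, names = inject_lora(
        transformer, {"AdaLayerNormZero"}, r=lora_rank, verbose=True
    )
    optimizer = torch.optim.AdamW(itertools.chain(*require_grad_params), lr=1e-4)
    return optimizer, names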
def add_mp_hook(transformer):
    '''
    For mixed precision of LoRA. (i.e. keep LoRA as float and others as half)
    '''
    def pre_hook(module, input):
        # forward pre-hooks receive the positional args as a tuple,
        # so cast each tensor rather than the tuple itself
        return tuple(x.float() for x in input)

    def post_hook(module, input, output):
        return output.half()

    hooks = []
    for _module, name, _child_module in _find_modules(transformer):
        if isinstance(_child_module, LoraInjectedLinear):
            hook = _child_module.lora_up.register_forward_pre_hook(pre_hook)
            hooks.append(hook)
            hook = _child_module.lora_down.register_forward_hook(post_hook)
            hooks.append(hook)

    return transformer, hooks
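
# Hedged usage sketch: after add_mp_hook the LoRA branches see fp32 inputs
# while the surrounding half-precision transformer keeps feeding fp16
# activations; each returned handle can later be detached with .remove().
def _example_add_mp_hook_usage(transformer):
    transformer, hooks = add_mp_hook(transformer)
    # ... run training steps ...
    for handle in hooks:
        handle.remove()  # restore plain half-precision behaviour
    return transformer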
def compute_density_for_timestep_sampling(
    weighting_scheme: str, batch_size: int, logit_mean: float = 0.0, logit_std: float = 1.0, mode_scale: Optional[float] = None
):
    """
    Compute the density for sampling the timesteps when doing SD3 training.

    Courtesy: This was contributed by Rafie Walker in https://github.com/huggingface/diffusers/pull/8528.

    SD3 paper reference: https://arxiv.org/abs/2403.03206v1.
    """
    if weighting_scheme == "logit_normal":
        # See 3.1 in the SD3 paper ($rf/lognorm(0.00,1.00)$).
        u = torch.normal(mean=logit_mean, std=logit_std, size=(batch_size,), device="cpu")
        u = torch.nn.functional.sigmoid(u)
    elif weighting_scheme == "mode":
        u = torch.rand(size=(batch_size,), device="cpu")
        u = 1 - u - mode_scale * (torch.cos(math.pi * u / 2) ** 2 - 1 + u)
    else:
        u = torch.rand(size=(batch_size,), device="cpu")
    return u
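
# Worked sketch of mapping the sampled density to discrete timesteps, the
# same recipe distribution_matching_loss and diff_loss use below (u in [0, 1)
# is scaled to an index into the scheduler's 1000 training timesteps).
def _example_timestep_sampling():
    u = compute_density_for_timestep_sampling(weighting_scheme="uniform", batch_size=1)
    t_idx = (u * 1000).long()  # e.g. u = 0.437 -> t_idx = 437
    return t_idx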
def compute_loss_weighting_for_sd3(weighting_scheme: str, sigmas):
    """
    Computes loss weighting scheme for SD3 training.

    Courtesy: This was contributed by Rafie Walker in https://github.com/huggingface/diffusers/pull/8528.

    SD3 paper reference: https://arxiv.org/abs/2403.03206v1.
    """
    if weighting_scheme == "sigma_sqrt":
        weighting = (sigmas**-2.0).float()
    elif weighting_scheme == "cosmap":
        bot = 1 - 2 * sigmas + 2 * sigmas**2
        weighting = 2 / (math.pi * bot)
    else:
        weighting = torch.ones_like(sigmas)
    return weighting
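
# Worked example for the "cosmap" branch above: at sigma = 0.5,
# bot = 1 - 2*0.5 + 2*0.5**2 = 0.5, so weighting = 2 / (pi * 0.5) ~= 1.273;
# the fallback branch ("uniform" or any unrecognized scheme) weights every
# sigma equally via ones_like(sigmas).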
class StableDiffusion3Base():
    def __init__(self, model_key: str = 'stabilityai/stable-diffusion-3-medium-diffusers', device='cuda', dtype=torch.float16):
        self.device = device
        self.dtype = dtype

        pipe = StableDiffusion3Pipeline.from_pretrained(model_key, torch_dtype=self.dtype)

        self.scheduler = pipe.scheduler

        self.tokenizer_1 = pipe.tokenizer
        self.tokenizer_2 = pipe.tokenizer_2
        self.tokenizer_3 = pipe.tokenizer_3
        self.text_enc_1 = pipe.text_encoder.to(device)
        self.text_enc_2 = pipe.text_encoder_2.to(device)
        self.text_enc_3 = pipe.text_encoder_3.to(device)

        self.vae = pipe.vae.to(device)

        self.transformer = pipe.transformer.to(device)
        self.transformer.eval()
        self.transformer.requires_grad_(False)

        self.vae_scale_factor = (
            2 ** (len(self.vae.config.block_out_channels) - 1) if hasattr(self, "vae") and self.vae is not None else 8
        )

        del pipe
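
    # Worked example: the SD3-medium VAE config has four entries in
    # block_out_channels (an assumption based on the standard released config),
    # giving vae_scale_factor = 2 ** (4 - 1) = 8, so a 512x512 image maps to a
    # 64x64 latent grid.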
    def encode_prompt(self, prompt: List[str], batch_size: int = 1) -> List[torch.Tensor]:
        '''
        We assume that
        1. number of tokens < max_length
        2. one prompt for one image
        '''
        # CLIP encode (used for modulation of adaLN-zero)
        # now, we have two CLIPs
        text_clip1_ids = self.tokenizer_1(prompt,
                                          padding="max_length",
                                          max_length=77,
                                          truncation=True,
                                          return_tensors='pt').input_ids
        text_clip1_emb = self.text_enc_1(text_clip1_ids.to(self.device), output_hidden_states=True)
        pool_clip1_emb = text_clip1_emb[0].to(dtype=self.dtype, device=self.device)
        text_clip1_emb = text_clip1_emb.hidden_states[-2].to(dtype=self.dtype, device=self.device)

        text_clip2_ids = self.tokenizer_2(prompt,
                                          padding="max_length",
                                          max_length=77,
                                          truncation=True,
                                          return_tensors='pt').input_ids
        text_clip2_emb = self.text_enc_2(text_clip2_ids.to(self.device), output_hidden_states=True)
        pool_clip2_emb = text_clip2_emb[0].to(dtype=self.dtype, device=self.device)
        text_clip2_emb = text_clip2_emb.hidden_states[-2].to(dtype=self.dtype, device=self.device)

        # T5 encode (used for text condition)
        text_t5_ids = self.tokenizer_3(prompt,
                                       padding="max_length",
                                       max_length=512,
                                       truncation=True,
                                       add_special_tokens=True,
                                       return_tensors='pt').input_ids
        text_t5_emb = self.text_enc_3(text_t5_ids.to(self.device))[0]
        text_t5_emb = text_t5_emb.to(dtype=self.dtype, device=self.device)

        # Merge
        clip_prompt_emb = torch.cat([text_clip1_emb, text_clip2_emb], dim=-1)
        clip_prompt_emb = torch.nn.functional.pad(
            clip_prompt_emb, (0, text_t5_emb.shape[-1] - clip_prompt_emb.shape[-1])
        )
        prompt_emb = torch.cat([clip_prompt_emb, text_t5_emb], dim=-2)
        pooled_prompt_emb = torch.cat([pool_clip1_emb, pool_clip2_emb], dim=-1)

        return prompt_emb, pooled_prompt_emb
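
    # Shape sketch for encode_prompt, assuming the standard SD3-medium text
    # encoders (CLIP-L hidden size 768, OpenCLIP-bigG 1280, T5 4096):
    #   text_clip1_emb: (B, 77, 768),  text_clip2_emb: (B, 77, 1280)
    #   cat(dim=-1) -> (B, 77, 2048),  pad last dim -> (B, 77, 4096)
    #   text_t5_emb: (B, 512, 4096)
    #   prompt_emb = cat(dim=-2) -> (B, 589, 4096)
    #   pooled_prompt_emb = cat of the two pooled CLIP outputs -> (B, 2048)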
    def initialize_latent(self, img_size: Tuple[int], batch_size: int = 1, **kwargs):
        H, W = img_size
        lH, lW = H // self.vae_scale_factor, W // self.vae_scale_factor
        lC = self.transformer.config.in_channels
        latent_shape = (batch_size, lC, lH, lW)

        z = torch.randn(latent_shape, device=self.device, dtype=self.dtype)

        return z

    def encode(self, image: torch.Tensor) -> torch.Tensor:
        z = self.vae.encode(image).latent_dist.sample()
        z = (z - self.vae.config.shift_factor) * self.vae.config.scaling_factor
        return z

    def decode(self, z: torch.Tensor) -> torch.Tensor:
        z = (z / self.vae.config.scaling_factor) + self.vae.config.shift_factor
        return self.vae.decode(z, return_dict=False)[0]
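
# Round-trip sketch: encode applies z = (z_raw - shift_factor) * scaling_factor
# and decode inverts it, so decode(encode(x)) ~= x up to VAE reconstruction
# error. `image` is an illustrative (B, 3, H, W) tensor scaled to [-1, 1];
# the 16-channel latent assumes the standard SD3 VAE.
def _example_vae_round_trip(base, image):
    with torch.no_grad():
        z = base.encode(image.to(device=base.device, dtype=base.dtype))  # (B, 16, H/8, W/8)
        recon = base.decode(z)  # back to (B, 3, H, W)
    return recon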
class SD3Euler(StableDiffusion3Base):
    def __init__(self, model_key: str = 'stabilityai/stable-diffusion-3-medium-diffusers', device='cuda'):
        super().__init__(model_key=model_key, device=device)

    def inversion(self, src_img, prompts: List[str], NFE: int, cfg_scale: float = 1.0, batch_size: int = 1):

        # encode text prompts
        prompt_emb, pooled_emb = self.encode_prompt(prompts, batch_size)
        null_prompt_emb, null_pooled_emb = self.encode_prompt([""], batch_size)

        # initialize latent
        src_img = src_img.to(device=self.device, dtype=self.dtype)
        with torch.no_grad():
            z = self.encode(src_img)
        z0 = z.clone()

        # timesteps (default option. You can make your custom here.)
        self.scheduler.set_timesteps(NFE, device=self.device)
        timesteps = self.scheduler.timesteps
        timesteps = torch.cat([timesteps, torch.zeros(1, device=self.device)])
        timesteps = reversed(timesteps)
        sigmas = timesteps / self.scheduler.config.num_train_timesteps

        # Solve ODE
        pbar = tqdm(timesteps[:-1], total=NFE, desc='SD3 Euler Inversion')
        for i, t in enumerate(pbar):
            timestep = t.expand(z.shape[0]).to(self.device)
            pred_v = self.predict_vector(z, timestep, prompt_emb, pooled_emb)
            if cfg_scale != 1.0:
                pred_null_v = self.predict_vector(z, timestep, null_prompt_emb, null_pooled_emb)
            else:
                pred_null_v = 0.0

            sigma = sigmas[i]
            sigma_next = sigmas[i+1]

            z = z + (sigma_next - sigma) * (pred_null_v + cfg_scale * (pred_v - pred_null_v))

        return z
    def sample(self, prompts: List[str], NFE: int, img_shape: Optional[Tuple[int]] = None, cfg_scale: float = 1.0, batch_size: int = 1, latent: Optional[torch.Tensor] = None):
        imgH, imgW = img_shape if img_shape is not None else (512, 512)

        # encode text prompts
        with torch.no_grad():
            prompt_emb, pooled_emb = self.encode_prompt(prompts, batch_size)
            null_prompt_emb, null_pooled_emb = self.encode_prompt([""], batch_size)

        # initialize latent
        if latent is None:
            z = self.initialize_latent((imgH, imgW), batch_size)
        else:
            z = latent

        # timesteps (default option. You can make your custom here.)
        self.scheduler.set_timesteps(NFE, device=self.device)
        timesteps = self.scheduler.timesteps
        sigmas = timesteps / self.scheduler.config.num_train_timesteps

        # Solve ODE
        pbar = tqdm(timesteps, total=NFE, desc='SD3 Euler')
        for i, t in enumerate(pbar):
            timestep = t.expand(z.shape[0]).to(self.device)
            pred_v = self.predict_vector(z, timestep, prompt_emb, pooled_emb)
            if cfg_scale != 1.0:
                pred_null_v = self.predict_vector(z, timestep, null_prompt_emb, null_pooled_emb)
            else:
                pred_null_v = 0.0

            sigma = sigmas[i]
            sigma_next = sigmas[i+1] if i+1 < NFE else 0.0

            z = z + (sigma_next - sigma) * (pred_null_v + cfg_scale * (pred_v - pred_null_v))

        # decode
        with torch.no_grad():
            img = self.decode(z)
        return img
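
# inversion() and sample() above rely on a predict_vector method that is not
# defined in this excerpt; a minimal sketch consistent with the OSEDiff_*
# classes below (an assumption, not necessarily the original implementation)
# would be:
#
#     def predict_vector(self, z, t, prompt_emb, pooled_emb):
#         return self.transformer(hidden_states=z, timestep=t,
#                                 pooled_projections=pooled_emb,
#                                 encoder_hidden_states=prompt_emb,
#                                 return_dict=False)[0]
#
# With that in place, text-to-image sampling looks like:
#
#     solver = SD3Euler()
#     img = solver.sample(["a photo of a cat"], NFE=28, cfg_scale=4.0)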
class OSEDiff_SD3_GEN(torch.nn.Module):
    def __init__(self, args, base_model):
        super().__init__()

        self.args = args
        self.model = base_model

        # Add lora to transformer
        print('Adding Lora to OSEDiff_SD3_GEN')
        self.transformer_gen = copy.deepcopy(self.model.transformer)
        self.transformer_gen.to('cuda:1')
        # self.transformer_gen = self.transformer_gen.float()

        self.transformer_gen.requires_grad_(False)
        self.transformer_gen.train()
        self.transformer_gen, hooks = add_mp_hook(self.transformer_gen)
        self.hooks = hooks

        lora_params, _ = inject_lora(self.transformer_gen, {"AdaLayerNormZero"}, r=args.lora_rank, verbose=True)
        # self.lora_params = lora_params
        for name, param in self.transformer_gen.named_parameters():
            if "lora_" in name:
                param.requires_grad = True   # LoRA up/down
            else:
                param.requires_grad = False  # everything else

        # Insert LoRA into VAE
        print("Adding Lora to VAE")
        self.model.vae, self.lora_vae_modules_encoder = inject_lora_vae(self.model.vae, lora_rank=args.lora_rank, verbose=True)

    def predict_vector(self, z, t, prompt_emb, pooled_emb):
        v = self.transformer_gen(hidden_states=z,
                                 timestep=t,
                                 pooled_projections=pooled_emb,
                                 encoder_hidden_states=prompt_emb,
                                 return_dict=False)[0]
        return v

    def forward(self, x_src, batch=None, args=None):

        z_src = self.model.encode(x_src.to(dtype=torch.float32, device=self.model.vae.device))
        z_src = z_src.to(self.transformer_gen.device)

        # calculate prompt_embeddings and neg_prompt_embeddings
        batch_size, _, _, _ = x_src.shape
        with torch.no_grad():
            prompt_embeds, pooled_embeds = self.model.encode_prompt(batch["prompt"], batch_size)
            neg_prompt_embeds, neg_pooled_embeds = self.model.encode_prompt(batch["neg_prompt"], batch_size)

        NFE = 1
        self.model.scheduler.set_timesteps(NFE, device=self.model.device)
        timesteps = self.model.scheduler.timesteps
        sigmas = timesteps / self.model.scheduler.config.num_train_timesteps
        sigmas = sigmas.to(self.transformer_gen.device)

        # Solve ODE
        i = 0
        t = timesteps[0]

        timestep = t.expand(z_src.shape[0]).to(self.transformer_gen.device)
        prompt_embeds = prompt_embeds.to(self.transformer_gen.device, dtype=torch.float32)
        pooled_embeds = pooled_embeds.to(self.transformer_gen.device, dtype=torch.float32)
        pred_v = self.predict_vector(z_src, timestep, prompt_embeds, pooled_embeds)
        pred_null_v = 0.0

        sigma = sigmas[i]
        sigma_next = sigmas[i+1] if i+1 < NFE else 0.0

        z_src = z_src + (sigma_next - sigma) * (pred_null_v + 1 * (pred_v - pred_null_v))

        output_image = self.model.decode(z_src.to(dtype=torch.float32, device=self.model.vae.device))

        return output_image, z_src, prompt_embeds, pooled_embeds
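
    # Why a single step suffices in forward(): with NFE = 1 the flow-matching
    # scheduler yields one timestep (t = 1000, so sigma = 1 and sigma_next = 0,
    # assuming the default sigma range), and the Euler update collapses to
    #     z_out = z_src + (0 - 1) * v = z_src - v(z_src, t),
    # i.e. the LoRA-tuned transformer learns to carry the degraded latent to
    # the restored latent in one step of the rectified flow.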
class OSEDiff_SD3_REG(torch.nn.Module):
    def __init__(self, args, base_model):
        super().__init__()

        self.args = args
        self.model = base_model
        self.transformer_org = self.model.transformer

        # Add lora to transformer
        print('Adding Lora to OSEDiff_SD3_REG')
        self.transformer_reg = copy.deepcopy(self.transformer_org)
        self.transformer_reg.to('cuda:1')

        self.transformer_reg.requires_grad_(False)
        self.transformer_reg.train()
        self.transformer_reg, hooks = add_mp_hook(self.transformer_reg)
        self.hooks = hooks

        lora_params, _ = inject_lora(self.transformer_reg, {"AdaLayerNormZero"}, r=args.lora_rank, verbose=True)
        for name, param in self.transformer_reg.named_parameters():
            if "lora_" in name:
                param.requires_grad = True   # LoRA up/down
            else:
                param.requires_grad = False  # everything else

    def predict_vector_reg(self, z, t, prompt_emb, pooled_emb):
        v = self.transformer_reg(hidden_states=z,
                                 timestep=t,
                                 pooled_projections=pooled_emb,
                                 encoder_hidden_states=prompt_emb,
                                 return_dict=False)[0]
        return v

    def predict_vector_org(self, z, t, prompt_emb, pooled_emb):
        v = self.transformer_org(hidden_states=z,
                                 timestep=t,
                                 pooled_projections=pooled_emb,
                                 encoder_hidden_states=prompt_emb,
                                 return_dict=False)[0]
        return v
    def distribution_matching_loss(self, z0, prompt_embeds, pooled_embeds, global_step, args):

        with torch.no_grad():
            device = self.transformer_reg.device
            # get timesteps and sigma
            u = compute_density_for_timestep_sampling(
                weighting_scheme="uniform",
                batch_size=1,
                logit_mean=0.0,
                logit_std=1.0,
                mode_scale=1.29,
            )

            t_idx = (u * 1000).long().to(device)
            self.model.scheduler.set_timesteps(1000, device=device)
            times = self.model.scheduler.timesteps
            t = times[t_idx]
            sigma = t / 1000

            # get noise and xt
            z0 = z0.to(device)
            noise = torch.randn_like(z0)
            sigma = sigma.half()
            zt = (1 - sigma) * z0 + sigma * noise

            # Get x0_prediction of transformer_reg
            v_pred_reg = self.predict_vector_reg(zt, t, prompt_embeds.to(device), pooled_embeds.to(device))
            reg_model_pred = v_pred_reg * (-sigma) + zt  # this is x0_prediction for reg

            # Get x0_prediction of transformer_org
            org_device = self.transformer_org.device
            v_pred_org = self.predict_vector_org(zt.to(org_device), t.to(org_device), prompt_embeds.to(org_device), pooled_embeds.to(org_device))
            org_model_pred = v_pred_org * (-sigma.to(org_device)) + zt.to(org_device)  # this is x0_prediction for org

            # Visualization
            if global_step % 100 == 1:
                self.vsd_visualization(z0, noise, zt, reg_model_pred, org_model_pred, global_step, args)

            weighting_factor = torch.abs(z0 - org_model_pred.to(device)).mean(dim=[1, 2, 3], keepdim=True)

            grad = (reg_model_pred - org_model_pred.to(device)) / weighting_factor

        loss = F.mse_loss(z0, (z0 - grad).detach())

        return loss
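
    # The mse(z0, (z0 - grad).detach()) construction above is the usual
    # distribution-matching (VSD/DMD-style) gradient trick: the target is a
    # constant, so
    #     d/dz0 [ mean((z0 - (z0 - grad))**2) ] = 2 * grad / numel(z0),
    # which pushes the no_grad difference between the LoRA regularizer's x0
    # prediction and the frozen teacher's x0 prediction back into the
    # generator without differentiating through either transformer.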
    def vsd_visualization(self, z0, noise, zt, reg_model_pred, org_model_pred, global_step, args):
        #-------- Visualization --------#
        # 1. Visualize latents, noise, zt
        z0_img = self.model.decode(z0.to(dtype=torch.float32, device=self.model.vae.device))
        ns_img = self.model.decode(noise.to(dtype=torch.float32, device=self.model.vae.device))
        zt_img = self.model.decode(zt.to(dtype=torch.float32, device=self.model.vae.device))

        z0_img_pil = transforms.ToPILImage()(torch.clamp(z0_img[0].cpu(), -1.0, 1.0) * 0.5 + 0.5)
        ns_img_pil = transforms.ToPILImage()(torch.clamp(ns_img[0].cpu(), -1.0, 1.0) * 0.5 + 0.5)
        zt_img_pil = transforms.ToPILImage()(torch.clamp(zt_img[0].cpu(), -1.0, 1.0) * 0.5 + 0.5)

        # 2. Visualize reg_img, org_img
        reg_img = self.model.decode(reg_model_pred.to(dtype=torch.float32, device=self.model.vae.device))
        org_img = self.model.decode(org_model_pred.to(dtype=torch.float32, device=self.model.vae.device))

        reg_img_pil = transforms.ToPILImage()(torch.clamp(reg_img[0].cpu(), -1.0, 1.0) * 0.5 + 0.5)
        org_img_pil = transforms.ToPILImage()(torch.clamp(org_img[0].cpu(), -1.0, 1.0) * 0.5 + 0.5)

        # Concatenate images side by side
        w, h = z0_img_pil.width, z0_img_pil.height
        combined_image = Image.new('RGB', (w * 5, h))
        combined_image.paste(z0_img_pil, (0, 0))
        combined_image.paste(ns_img_pil, (w, 0))
        combined_image.paste(zt_img_pil, (w * 2, 0))
        combined_image.paste(reg_img_pil, (w * 3, 0))
        combined_image.paste(org_img_pil, (w * 4, 0))
        combined_image.save(os.path.join(args.output_dir, f'visualization/vsd/{global_step}.png'))
        #-------- Visualization --------#
    def diff_loss(self, z0, prompt_embeds, pooled_embeds, net_lpips, args):

        device = self.transformer_reg.device
        u = compute_density_for_timestep_sampling(
            weighting_scheme="uniform",
            batch_size=1,
            logit_mean=0.0,
            logit_std=1.0,
            mode_scale=1.29,
        )

        t_idx = (u * 1000).long().to(device)
        self.model.scheduler.set_timesteps(1000, device=device)
        times = self.model.scheduler.timesteps
        t = times[t_idx]
        sigma = t / 1000

        z0 = z0.to(device)
        z0, prompt_embeds = z0.detach(), prompt_embeds.detach()
        noise = torch.randn_like(z0)
        sigma = sigma.half()
        zt = (1 - sigma) * z0 + sigma * noise  # noisy latents

        # v-prediction
        v_pred = self.predict_vector_reg(zt, t, prompt_embeds.to(device), pooled_embeds.to(device))
        model_pred = v_pred * (-sigma) + zt
        target = z0

        loss_weight = compute_loss_weighting_for_sd3("logit_normal", sigma)
        diffusion_loss = loss_weight.float() * F.mse_loss(model_pred.float(), target.float())

        loss_d = diffusion_loss

        return loss_d.mean()
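
    # Identity behind model_pred = v_pred * (-sigma) + zt: with the
    # rectified-flow parameterization zt = (1 - sigma) * z0 + sigma * noise
    # and the network regressing v = noise - z0, substitution gives
    #     zt - sigma * v = (1 - sigma) * z0 + sigma * noise - sigma * (noise - z0) = z0,
    # so scaling the v-prediction by -sigma and adding zt recovers the x0
    # estimate used by both diff_loss and distribution_matching_loss.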
class OSEDiff_SD3_TEST(torch.nn.Module):
    def __init__(self, args, base_model):
        super().__init__()

        self.args = args
        self.model = base_model
        self.lora_path = args.lora_path
        self.vae_path = args.vae_path

        # Add lora to transformer
        print(f'Loading LoRA to Transformer from {self.lora_path}')
        self.model.transformer.requires_grad_(False)
        lora_params, _ = inject_lora(self.model.transformer, {"AdaLayerNormZero"}, loras=self.lora_path, r=args.lora_rank, verbose=False)
        for name, param in self.model.transformer.named_parameters():
            param.requires_grad = False

        # Insert LoRA into VAE
        print(f"Loading LoRA to VAE from {self.vae_path}")
        self.model.vae, self.lora_vae_modules_encoder = inject_lora_vae(self.model.vae, lora_rank=args.lora_rank, verbose=False)
        encoder_state_dict_fp16 = torch.load(self.vae_path, map_location="cpu")
        self.model.vae.encoder.load_state_dict(encoder_state_dict_fp16)

    def predict_vector(self, z, t, prompt_emb, pooled_emb):
        v = self.model.transformer(hidden_states=z,
                                   timestep=t,
                                   pooled_projections=pooled_emb,
                                   encoder_hidden_states=prompt_emb,
                                   return_dict=False)[0]
        return v

    @torch.no_grad()
    def forward(self, x_src, prompt):

        z_src = self.model.vae.encode(x_src.to(dtype=torch.float32, device=self.model.vae.device)).latent_dist.sample() * self.model.vae.config.scaling_factor

        z_src = z_src.to(self.model.transformer.device)

        # calculate prompt_embeddings
        batch_size, _, _, _ = x_src.shape
        with torch.no_grad():
            prompt_embeds, pooled_embeds = self.model.encode_prompt([prompt], batch_size)

        self.model.scheduler.set_timesteps(1, device=self.model.device)
        timesteps = self.model.scheduler.timesteps

        # Solve ODE
        t = timesteps[0]
        timestep = t.expand(z_src.shape[0]).to(self.model.transformer.device)
        prompt_embeds = prompt_embeds.to(self.model.transformer.device, dtype=torch.float32)
        pooled_embeds = pooled_embeds.to(self.model.transformer.device, dtype=torch.float32)
        pred_v = self.predict_vector(z_src, timestep, prompt_embeds, pooled_embeds)

        z_src = z_src - pred_v

        with torch.no_grad():
            output_image = self.model.decode(z_src.to(dtype=torch.float32, device=self.model.vae.device))

        return output_image
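
# Hedged inference sketch for the test-time wrapper; the argparse fields
# mirror what the constructor reads, while the paths, rank, and prompt are
# illustrative placeholders rather than values taken from this repository.
def _example_test_inference(lr_image):
    import argparse
    args = argparse.Namespace(lora_path='path/to/transformer_lora.pkl',
                              vae_path='path/to/vae_encoder.pt',
                              lora_rank=4)
    model_test = OSEDiff_SD3_TEST(args, SD3Euler())
    sr_image = model_test(lr_image, prompt='a high-resolution photo')
    return sr_image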
class OSEDiff_SD3_TEST_efficient(torch.nn.Module):
    def __init__(self, args, base_model):
        super().__init__()

        self.args = args
        self.model = base_model
        self.lora_path = args.lora_path
        self.vae_path = args.vae_path

        # Add lora to transformer
        print(f'Loading LoRA to Transformer from {self.lora_path}')
        self.model.transformer.requires_grad_(False)
        lora_params, _ = inject_lora(self.model.transformer, {"AdaLayerNormZero"}, loras=self.lora_path, r=args.lora_rank, verbose=False)
        for name, param in self.model.transformer.named_parameters():
            param.requires_grad = False

        # Insert LoRA into VAE
        print(f"Loading LoRA to VAE from {self.vae_path}")
        self.model.vae, self.lora_vae_modules_encoder = inject_lora_vae(self.model.vae, lora_rank=args.lora_rank, verbose=False)
        encoder_state_dict_fp16 = torch.load(self.vae_path, map_location="cpu")
        self.model.vae.encoder.load_state_dict(encoder_state_dict_fp16)

    def predict_vector(self, z, t, prompt_emb, pooled_emb):
        v = self.model.transformer(hidden_states=z,
                                   timestep=t,
                                   pooled_projections=pooled_emb,
                                   encoder_hidden_states=prompt_emb,
                                   return_dict=False)[0]
        return v

    @torch.no_grad()
    def forward(self, x_src, prompt):

        z_src = self.model.vae.encode(x_src.to(dtype=torch.float32, device=self.model.vae.device)).latent_dist.sample() * self.model.vae.config.scaling_factor

        z_src = z_src.to(self.model.transformer.device)

        # calculate prompt_embeddings
        batch_size, _, _, _ = x_src.shape
        prompt_embeds, pooled_embeds = self.model.encode_prompt([prompt], batch_size)

        self.model.scheduler.set_timesteps(1, device=self.model.device)
        timesteps = self.model.scheduler.timesteps

        # Solve ODE
        t = timesteps[0]
        timestep = t.expand(z_src.shape[0]).to(self.model.transformer.device)
        prompt_embeds = prompt_embeds.to(self.model.transformer.device, dtype=torch.float32)
        pooled_embeds = pooled_embeds.to(self.model.transformer.device, dtype=torch.float32)
        pred_v = self.predict_vector(z_src, timestep, prompt_embeds, pooled_embeds)
        z_src = z_src - pred_v

        output_image = self.model.decode(z_src.to(dtype=torch.float32, device=self.model.vae.device))

        return output_image
ram/__init__.py
ADDED
@@ -0,0 +1,2 @@
from .inference import inference_tag2text, inference_ram, inference_ram_openset
from .transform import get_transform
ram/configs/condition_config.json
ADDED
@@ -0,0 +1,3 @@
{
  "nf": 64
}
ram/configs/med_config.json
ADDED
@@ -0,0 +1,21 @@
{
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 12,
  "num_hidden_layers": 12,
  "pad_token_id": 0,
  "type_vocab_size": 2,
  "vocab_size": 30524,
  "encoder_width": 768,
  "add_cross_attention": true
}
ram/configs/q2l_config.json
ADDED
@@ -0,0 +1,22 @@
{
  "architectures": [
    "BertModel"
  ],
  "attention_probs_dropout_prob": 0.1,
  "hidden_act": "gelu",
  "hidden_dropout_prob": 0.1,
  "hidden_size": 768,
  "initializer_range": 0.02,
  "intermediate_size": 3072,
  "layer_norm_eps": 1e-12,
  "max_position_embeddings": 512,
  "model_type": "bert",
  "num_attention_heads": 4,
  "num_hidden_layers": 2,
  "pad_token_id": 0,
  "type_vocab_size": 2,
  "vocab_size": 30522,
  "encoder_width": 768,
  "add_cross_attention": true,
  "add_tag_cross_attention": false
}
ram/configs/swin/config_swinB_384.json
ADDED
@@ -0,0 +1,9 @@
{
  "ckpt": "pretrain_model/swin_base_patch4_window7_224_22k.pth",
  "vision_width": 1024,
  "image_res": 384,
  "window_size": 12,
  "embed_dim": 128,
  "depths": [ 2, 2, 18, 2 ],
  "num_heads": [ 4, 8, 16, 32 ]
}
ram/configs/swin/config_swinL_384.json
ADDED
@@ -0,0 +1,9 @@
{
  "ckpt": "pretrain_model/swin_large_patch4_window12_384_22k.pth",
  "vision_width": 1536,
  "image_res": 384,
  "window_size": 12,
  "embed_dim": 192,
  "depths": [ 2, 2, 18, 2 ],
  "num_heads": [ 6, 12, 24, 48 ]
}
ram/configs/swin/config_swinL_444.json
ADDED
@@ -0,0 +1,9 @@
{
  "ckpt": "pretrain_model/swin_large_patch4_window12_384_22k.pth",
  "vision_width": 1536,
  "image_res": 444,
  "window_size": 12,
  "embed_dim": 192,
  "depths": [ 2, 2, 18, 2 ],
  "num_heads": [ 6, 12, 24, 48 ]
}
ram/data/ram_tag_list.txt
ADDED
|
@@ -0,0 +1,4585 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
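What follows is the added tag vocabulary: a plain newline-delimited list, one tag per line, with the diff gutter showing each tag's line number. As a minimal sketch of how such a file could be consumed downstream (an illustrative helper, not the repository's own loader, which is not shown in this section; the path ram/data/ram_tag_list.txt and the function name load_tag_list are assumptions for illustration):

# Minimal sketch: read a newline-delimited tag vocabulary into a Python list.
# NOTE: load_tag_list and the default path are illustrative assumptions,
# not code taken from this repository.
from pathlib import Path

def load_tag_list(path: str = "ram/data/ram_tag_list.txt") -> list[str]:
    # One tag per line; strip whitespace and drop any empty lines.
    lines = Path(path).read_text(encoding="utf-8").splitlines()
    return [tag.strip() for tag in lines if tag.strip()]

tags = load_tag_list()
print(tags[0], "|", tags[1])  # 3D CG rendering | 3D glasses

Indexing the resulting list by (line number - 1) recovers any entry shown in the gutter below, e.g. tags[0] for entry 1.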
| 1 |
+
3D CG rendering
|
| 2 |
+
3D glasses
|
| 3 |
+
abacus
|
| 4 |
+
abalone
|
| 5 |
+
monastery
|
| 6 |
+
belly
|
| 7 |
+
academy
|
| 8 |
+
accessory
|
| 9 |
+
accident
|
| 10 |
+
accordion
|
| 11 |
+
acorn
|
| 12 |
+
acrylic paint
|
| 13 |
+
act
|
| 14 |
+
action
|
| 15 |
+
action film
|
| 16 |
+
activity
|
| 17 |
+
actor
|
| 18 |
+
adaptation
|
| 19 |
+
add
|
| 20 |
+
adhesive tape
|
| 21 |
+
adjust
|
| 22 |
+
adult
|
| 23 |
+
adventure
|
| 24 |
+
advertisement
|
| 25 |
+
antenna
|
| 26 |
+
aerobics
|
| 27 |
+
spray can
|
| 28 |
+
afro
|
| 29 |
+
agriculture
|
| 30 |
+
aid
|
| 31 |
+
air conditioner
|
| 32 |
+
air conditioning
|
| 33 |
+
air sock
|
| 34 |
+
aircraft cabin
|
| 35 |
+
aircraft model
|
| 36 |
+
air field
|
| 37 |
+
air line
|
| 38 |
+
airliner
|
| 39 |
+
airman
|
| 40 |
+
plane
|
| 41 |
+
airplane window
|
| 42 |
+
airport
|
| 43 |
+
airport runway
|
| 44 |
+
airport terminal
|
| 45 |
+
airship
|
| 46 |
+
airshow
|
| 47 |
+
aisle
|
| 48 |
+
alarm
|
| 49 |
+
alarm clock
|
| 50 |
+
mollymawk
|
| 51 |
+
album
|
| 52 |
+
album cover
|
| 53 |
+
alcohol
|
| 54 |
+
alcove
|
| 55 |
+
algae
|
| 56 |
+
alley
|
| 57 |
+
almond
|
| 58 |
+
aloe vera
|
| 59 |
+
alp
|
| 60 |
+
alpaca
|
| 61 |
+
alphabet
|
| 62 |
+
german shepherd
|
| 63 |
+
altar
|
| 64 |
+
amber
|
| 65 |
+
ambulance
|
| 66 |
+
bald eagle
|
| 67 |
+
American shorthair
|
| 68 |
+
amethyst
|
| 69 |
+
amphitheater
|
| 70 |
+
amplifier
|
| 71 |
+
amusement park
|
| 72 |
+
amusement ride
|
| 73 |
+
anchor
|
| 74 |
+
ancient
|
| 75 |
+
anemone
|
| 76 |
+
angel
|
| 77 |
+
angle
|
| 78 |
+
animal
|
| 79 |
+
animal sculpture
|
| 80 |
+
animal shelter
|
| 81 |
+
animation
|
| 82 |
+
animation film
|
| 83 |
+
animator
|
| 84 |
+
anime
|
| 85 |
+
ankle
|
| 86 |
+
anklet
|
| 87 |
+
anniversary
|
| 88 |
+
trench coat
|
| 89 |
+
ant
|
| 90 |
+
antelope
|
| 91 |
+
antique
|
| 92 |
+
antler
|
| 93 |
+
anvil
|
| 94 |
+
apartment
|
| 95 |
+
ape
|
| 96 |
+
app
|
| 97 |
+
app icon
|
| 98 |
+
appear
|
| 99 |
+
appearance
|
| 100 |
+
appetizer
|
| 101 |
+
applause
|
| 102 |
+
apple
|
| 103 |
+
apple juice
|
| 104 |
+
apple pie
|
| 105 |
+
apple tree
|
| 106 |
+
applesauce
|
| 107 |
+
appliance
|
| 108 |
+
appointment
|
| 109 |
+
approach
|
| 110 |
+
apricot
|
| 111 |
+
apron
|
| 112 |
+
aqua
|
| 113 |
+
aquarium
|
| 114 |
+
aquarium fish
|
| 115 |
+
aqueduct
|
| 116 |
+
arcade
|
| 117 |
+
arcade machine
|
| 118 |
+
arch
|
| 119 |
+
arch bridge
|
| 120 |
+
archaelogical excavation
|
| 121 |
+
archery
|
| 122 |
+
archipelago
|
| 123 |
+
architect
|
| 124 |
+
architecture
|
| 125 |
+
archive
|
| 126 |
+
archway
|
| 127 |
+
area
|
| 128 |
+
arena
|
| 129 |
+
argument
|
| 130 |
+
arm
|
| 131 |
+
armadillo
|
| 132 |
+
armband
|
| 133 |
+
armchair
|
| 134 |
+
armoire
|
| 135 |
+
armor
|
| 136 |
+
army
|
| 137 |
+
army base
|
| 138 |
+
army tank
|
| 139 |
+
array
|
| 140 |
+
arrest
|
| 141 |
+
arrow
|
| 142 |
+
art
|
| 143 |
+
art exhibition
|
| 144 |
+
art gallery
|
| 145 |
+
art print
|
| 146 |
+
art school
|
| 147 |
+
art studio
|
| 148 |
+
art vector illustration
|
| 149 |
+
artichoke
|
| 150 |
+
article
|
| 151 |
+
artifact
|
| 152 |
+
artist
|
| 153 |
+
artists loft
|
| 154 |
+
ash
|
| 155 |
+
ashtray
|
| 156 |
+
asia temple
|
| 157 |
+
asparagus
|
| 158 |
+
asphalt road
|
| 159 |
+
assemble
|
| 160 |
+
assembly
|
| 161 |
+
assembly line
|
| 162 |
+
association
|
| 163 |
+
astronaut
|
| 164 |
+
astronomer
|
| 165 |
+
athlete
|
| 166 |
+
athletic
|
| 167 |
+
atlas
|
| 168 |
+
atm
|
| 169 |
+
atmosphere
|
| 170 |
+
atrium
|
| 171 |
+
attach
|
| 172 |
+
fighter jet
|
| 173 |
+
attend
|
| 174 |
+
attraction
|
| 175 |
+
atv
|
| 176 |
+
eggplant
|
| 177 |
+
auction
|
| 178 |
+
audi
|
| 179 |
+
audio
|
| 180 |
+
auditorium
|
| 181 |
+
aurora
|
| 182 |
+
author
|
| 183 |
+
auto factory
|
| 184 |
+
auto mechanic
|
| 185 |
+
auto part
|
| 186 |
+
auto show
|
| 187 |
+
auto showroom
|
| 188 |
+
car battery
|
| 189 |
+
automobile make
|
| 190 |
+
automobile model
|
| 191 |
+
motor vehicle
|
| 192 |
+
autumn
|
| 193 |
+
autumn forest
|
| 194 |
+
autumn leave
|
| 195 |
+
autumn park
|
| 196 |
+
autumn tree
|
| 197 |
+
avatar
|
| 198 |
+
avenue
|
| 199 |
+
aviator sunglasses
|
| 200 |
+
avocado
|
| 201 |
+
award
|
| 202 |
+
award ceremony
|
| 203 |
+
award winner
|
| 204 |
+
shed
|
| 205 |
+
ax
|
| 206 |
+
azalea
|
| 207 |
+
baboon
|
| 208 |
+
baby
|
| 209 |
+
baby bottle
|
| 210 |
+
baby carriage
|
| 211 |
+
baby clothe
|
| 212 |
+
baby elephant
|
| 213 |
+
baby food
|
| 214 |
+
baby seat
|
| 215 |
+
baby shower
|
| 216 |
+
back
|
| 217 |
+
backdrop
|
| 218 |
+
backlight
|
| 219 |
+
backpack
|
| 220 |
+
backyard
|
| 221 |
+
bacon
|
| 222 |
+
badge
|
| 223 |
+
badger
|
| 224 |
+
badlands
|
| 225 |
+
badminton
|
| 226 |
+
badminton racket
|
| 227 |
+
bag
|
| 228 |
+
bagel
|
| 229 |
+
bagpipe
|
| 230 |
+
baguette
|
| 231 |
+
bait
|
| 232 |
+
baked goods
|
| 233 |
+
baker
|
| 234 |
+
bakery
|
| 235 |
+
baking
|
| 236 |
+
baking sheet
|
| 237 |
+
balance
|
| 238 |
+
balance car
|
| 239 |
+
balcony
|
| 240 |
+
ball
|
| 241 |
+
ball pit
|
| 242 |
+
ballerina
|
| 243 |
+
ballet
|
| 244 |
+
ballet dancer
|
| 245 |
+
ballet skirt
|
| 246 |
+
balloon
|
| 247 |
+
balloon arch
|
| 248 |
+
baseball player
|
| 249 |
+
ballroom
|
| 250 |
+
bamboo
|
| 251 |
+
bamboo forest
|
| 252 |
+
banana
|
| 253 |
+
banana bread
|
| 254 |
+
banana leaf
|
| 255 |
+
banana tree
|
| 256 |
+
band
|
| 257 |
+
band aid
|
| 258 |
+
bandage
|
| 259 |
+
headscarf
|
| 260 |
+
bandeau
|
| 261 |
+
bangs
|
| 262 |
+
bracelet
|
| 263 |
+
balustrade
|
| 264 |
+
banjo
|
| 265 |
+
bank
|
| 266 |
+
bank card
|
| 267 |
+
bank vault
|
| 268 |
+
banknote
|
| 269 |
+
banner
|
| 270 |
+
banquet
|
| 271 |
+
banquet hall
|
| 272 |
+
banyan tree
|
| 273 |
+
baozi
|
| 274 |
+
baptism
|
| 275 |
+
bar
|
| 276 |
+
bar code
|
| 277 |
+
bar stool
|
| 278 |
+
barbecue
|
| 279 |
+
barbecue grill
|
| 280 |
+
barbell
|
| 281 |
+
barber
|
| 282 |
+
barber shop
|
| 283 |
+
barbie
|
| 284 |
+
barge
|
| 285 |
+
barista
|
| 286 |
+
bark
|
| 287 |
+
barley
|
| 288 |
+
barn
|
| 289 |
+
barn owl
|
| 290 |
+
barn door
|
| 291 |
+
barrel
|
| 292 |
+
barricade
|
| 293 |
+
barrier
|
| 294 |
+
handcart
|
| 295 |
+
bartender
|
| 296 |
+
baseball
|
| 297 |
+
baseball base
|
| 298 |
+
baseball bat
|
| 299 |
+
baseball hat
|
| 300 |
+
baseball stadium
|
| 301 |
+
baseball game
|
| 302 |
+
baseball glove
|
| 303 |
+
baseball pitcher
|
| 304 |
+
baseball team
|
| 305 |
+
baseball uniform
|
| 306 |
+
basement
|
| 307 |
+
basil
|
| 308 |
+
basin
|
| 309 |
+
basket
|
| 310 |
+
basket container
|
| 311 |
+
basketball
|
| 312 |
+
basketball backboard
|
| 313 |
+
basketball coach
|
| 314 |
+
basketball court
|
| 315 |
+
basketball game
|
| 316 |
+
basketball hoop
|
| 317 |
+
basketball player
|
| 318 |
+
basketball stadium
|
| 319 |
+
basketball team
|
| 320 |
+
bass
|
| 321 |
+
bass guitar
|
| 322 |
+
bass horn
|
| 323 |
+
bassist
|
| 324 |
+
bat
|
| 325 |
+
bath
|
| 326 |
+
bath heater
|
| 327 |
+
bath mat
|
| 328 |
+
bath towel
|
| 329 |
+
swimwear
|
| 330 |
+
bathrobe
|
| 331 |
+
bathroom
|
| 332 |
+
bathroom accessory
|
| 333 |
+
bathroom cabinet
|
| 334 |
+
bathroom door
|
| 335 |
+
bathroom mirror
|
| 336 |
+
bathroom sink
|
| 337 |
+
toilet paper
|
| 338 |
+
bathroom window
|
| 339 |
+
batman
|
| 340 |
+
wand
|
| 341 |
+
batter
|
| 342 |
+
battery
|
| 343 |
+
battle
|
| 344 |
+
battle rope
|
| 345 |
+
battleship
|
| 346 |
+
bay
|
| 347 |
+
bay bridge
|
| 348 |
+
bay window
|
| 349 |
+
bayberry
|
| 350 |
+
bazaar
|
| 351 |
+
beach
|
| 352 |
+
beach ball
|
| 353 |
+
beach chair
|
| 354 |
+
beach house
|
| 355 |
+
beach hut
|
| 356 |
+
beach towel
|
| 357 |
+
beach volleyball
|
| 358 |
+
lighthouse
|
| 359 |
+
bead
|
| 360 |
+
beagle
|
| 361 |
+
beak
|
| 362 |
+
beaker
|
| 363 |
+
beam
|
| 364 |
+
bean
|
| 365 |
+
bean bag chair
|
| 366 |
+
beanbag
|
| 367 |
+
bear
|
| 368 |
+
bear cub
|
| 369 |
+
beard
|
| 370 |
+
beast
|
| 371 |
+
beat
|
| 372 |
+
beautiful
|
| 373 |
+
beauty
|
| 374 |
+
beauty salon
|
| 375 |
+
beaver
|
| 376 |
+
bed
|
| 377 |
+
bedcover
|
| 378 |
+
bed frame
|
| 379 |
+
bedroom
|
| 380 |
+
bedding
|
| 381 |
+
bedpan
|
| 382 |
+
bedroom window
|
| 383 |
+
bedside lamp
|
| 384 |
+
bee
|
| 385 |
+
beech tree
|
| 386 |
+
beef
|
| 387 |
+
beekeeper
|
| 388 |
+
beeper
|
| 389 |
+
beer
|
| 390 |
+
beer bottle
|
| 391 |
+
beer can
|
| 392 |
+
beer garden
|
| 393 |
+
beer glass
|
| 394 |
+
beer hall
|
| 395 |
+
beet
|
| 396 |
+
beetle
|
| 397 |
+
beige
|
| 398 |
+
clock
|
| 399 |
+
bell pepper
|
| 400 |
+
bell tower
|
| 401 |
+
belt
|
| 402 |
+
belt buckle
|
| 403 |
+
bench
|
| 404 |
+
bend
|
| 405 |
+
bengal tiger
|
| 406 |
+
bento
|
| 407 |
+
beret
|
| 408 |
+
berry
|
| 409 |
+
berth
|
| 410 |
+
beverage
|
| 411 |
+
bib
|
| 412 |
+
bibimbap
|
| 413 |
+
bible
|
| 414 |
+
bichon
|
| 415 |
+
bicycle
|
| 416 |
+
bicycle helmet
|
| 417 |
+
bicycle wheel
|
| 418 |
+
biker
|
| 419 |
+
bidet
|
| 420 |
+
big ben
|
| 421 |
+
bike lane
|
| 422 |
+
bike path
|
| 423 |
+
bike racing
|
| 424 |
+
bike ride
|
| 425 |
+
bikini
|
| 426 |
+
bikini top
|
| 427 |
+
bill
|
| 428 |
+
billard
|
| 429 |
+
billboard
|
| 430 |
+
billiard table
|
| 431 |
+
bin
|
| 432 |
+
binder
|
| 433 |
+
binocular
|
| 434 |
+
biology laboratory
|
| 435 |
+
biplane
|
| 436 |
+
birch
|
| 437 |
+
birch tree
|
| 438 |
+
bird
|
| 439 |
+
bird bath
|
| 440 |
+
bird feeder
|
| 441 |
+
bird house
|
| 442 |
+
bird nest
|
| 443 |
+
birdbath
|
| 444 |
+
bird cage
|
| 445 |
+
birth
|
| 446 |
+
birthday
|
| 447 |
+
birthday cake
|
| 448 |
+
birthday candle
|
| 449 |
+
birthday card
|
| 450 |
+
birthday party
|
| 451 |
+
biscuit
|
| 452 |
+
bishop
|
| 453 |
+
bison
|
| 454 |
+
bit
|
| 455 |
+
bite
|
| 456 |
+
black
|
| 457 |
+
black sheep
|
| 458 |
+
blackberry
|
| 459 |
+
blackbird
|
| 460 |
+
blackboard
|
| 461 |
+
blacksmith
|
| 462 |
+
blade
|
| 463 |
+
blanket
|
| 464 |
+
sports coat
|
| 465 |
+
bleacher
|
| 466 |
+
blender
|
| 467 |
+
blessing
|
| 468 |
+
blind
|
| 469 |
+
eye mask
|
| 470 |
+
flasher
|
| 471 |
+
snowstorm
|
| 472 |
+
block
|
| 473 |
+
blog
|
| 474 |
+
blood
|
| 475 |
+
bloom
|
| 476 |
+
blossom
|
| 477 |
+
blouse
|
| 478 |
+
blow
|
| 479 |
+
hair drier
|
| 480 |
+
blowfish
|
| 481 |
+
blue
|
| 482 |
+
blue artist
|
| 483 |
+
blue jay
|
| 484 |
+
blue sky
|
| 485 |
+
blueberry
|
| 486 |
+
bluebird
|
| 487 |
+
pig
|
| 488 |
+
board
|
| 489 |
+
board eraser
|
| 490 |
+
board game
|
| 491 |
+
boardwalk
|
| 492 |
+
boat
|
| 493 |
+
boat deck
|
| 494 |
+
boat house
|
| 495 |
+
paddle
|
| 496 |
+
boat ride
|
| 497 |
+
bobfloat
|
| 498 |
+
bobcat
|
| 499 |
+
body
|
| 500 |
+
bodyboard
|
| 501 |
+
bodybuilder
|
| 502 |
+
boiled egg
|
| 503 |
+
boiler
|
| 504 |
+
bolo tie
|
| 505 |
+
bolt
|
| 506 |
+
bomb
|
| 507 |
+
bomber
|
| 508 |
+
bonasa umbellu
|
| 509 |
+
bone
|
| 510 |
+
bonfire
|
| 511 |
+
bonnet
|
| 512 |
+
bonsai
|
| 513 |
+
book
|
| 514 |
+
book cover
|
| 515 |
+
bookcase
|
| 516 |
+
folder
|
| 517 |
+
bookmark
|
| 518 |
+
bookshelf
|
| 519 |
+
bookstore
|
| 520 |
+
boom microphone
|
| 521 |
+
boost
|
| 522 |
+
boot
|
| 523 |
+
border
|
| 524 |
+
Border collie
|
| 525 |
+
botanical garden
|
| 526 |
+
bottle
|
| 527 |
+
bottle cap
|
| 528 |
+
bottle opener
|
| 529 |
+
bottle screw
|
| 530 |
+
bougainvillea
|
| 531 |
+
boulder
|
| 532 |
+
bouquet
|
| 533 |
+
boutique
|
| 534 |
+
boutique hotel
|
| 535 |
+
bow
|
| 536 |
+
bow tie
|
| 537 |
+
bow window
|
| 538 |
+
bowl
|
| 539 |
+
bowling
|
| 540 |
+
bowling alley
|
| 541 |
+
bowling ball
|
| 542 |
+
bowling equipment
|
| 543 |
+
box
|
| 544 |
+
box girder bridge
|
| 545 |
+
box turtle
|
| 546 |
+
boxer
|
| 547 |
+
underdrawers
|
| 548 |
+
boxing
|
| 549 |
+
boxing glove
|
| 550 |
+
boxing ring
|
| 551 |
+
boy
|
| 552 |
+
brace
|
| 553 |
+
bracket
|
| 554 |
+
braid
|
| 555 |
+
brain
|
| 556 |
+
brake
|
| 557 |
+
brake light
|
| 558 |
+
branch
|
| 559 |
+
brand
|
| 560 |
+
brandy
|
| 561 |
+
brass
|
| 562 |
+
brass plaque
|
| 563 |
+
bread
|
| 564 |
+
breadbox
|
| 565 |
+
break
|
| 566 |
+
breakfast
|
| 567 |
+
seawall
|
| 568 |
+
chest
|
| 569 |
+
brewery
|
| 570 |
+
brick
|
| 571 |
+
brick building
|
| 572 |
+
wall
|
| 573 |
+
brickwork
|
| 574 |
+
wedding dress
|
| 575 |
+
bride
|
| 576 |
+
groom
|
| 577 |
+
bridesmaid
|
| 578 |
+
bridge
|
| 579 |
+
bridle
|
| 580 |
+
briefcase
|
| 581 |
+
bright
|
| 582 |
+
brim
|
| 583 |
+
broach
|
| 584 |
+
broadcasting
|
| 585 |
+
broccoli
|
| 586 |
+
bronze
|
| 587 |
+
bronze medal
|
| 588 |
+
bronze sculpture
|
| 589 |
+
bronze statue
|
| 590 |
+
brooch
|
| 591 |
+
creek
|
| 592 |
+
broom
|
| 593 |
+
broth
|
| 594 |
+
brown
|
| 595 |
+
brown bear
|
| 596 |
+
brownie
|
| 597 |
+
brunch
|
| 598 |
+
brunette
|
| 599 |
+
brush
|
| 600 |
+
coyote
|
| 601 |
+
brussels sprout
|
| 602 |
+
bubble
|
| 603 |
+
bubble gum
|
| 604 |
+
bubble tea
|
| 605 |
+
bucket cabinet
|
| 606 |
+
shield
|
| 607 |
+
bud
|
| 608 |
+
buddha
|
| 609 |
+
buffalo
|
| 610 |
+
buffet
|
| 611 |
+
bug
|
| 612 |
+
build
|
| 613 |
+
builder
|
| 614 |
+
building
|
| 615 |
+
building block
|
| 616 |
+
building facade
|
| 617 |
+
building material
|
| 618 |
+
lamp
|
| 619 |
+
bull
|
| 620 |
+
bulldog
|
| 621 |
+
bullet
|
| 622 |
+
bullet train
|
| 623 |
+
bulletin board
|
| 624 |
+
bulletproof vest
|
| 625 |
+
bullfighting
|
| 626 |
+
megaphone
|
| 627 |
+
bullring
|
| 628 |
+
bumblebee
|
| 629 |
+
bumper
|
| 630 |
+
roll
|
| 631 |
+
bundle
|
| 632 |
+
bungee
|
| 633 |
+
bunk bed
|
| 634 |
+
bunker
|
| 635 |
+
bunny
|
| 636 |
+
buoy
|
| 637 |
+
bureau
|
| 638 |
+
burial chamber
|
| 639 |
+
burn
|
| 640 |
+
burrito
|
| 641 |
+
bus
|
| 642 |
+
bus driver
|
| 643 |
+
bus interior
|
| 644 |
+
bus station
|
| 645 |
+
bus stop
|
| 646 |
+
bus window
|
| 647 |
+
bush
|
| 648 |
+
business
|
| 649 |
+
business card
|
| 650 |
+
business executive
|
| 651 |
+
business suit
|
| 652 |
+
business team
|
| 653 |
+
business woman
|
| 654 |
+
businessman
|
| 655 |
+
bust
|
| 656 |
+
butcher
|
| 657 |
+
butchers shop
|
| 658 |
+
butte
|
| 659 |
+
butter
|
| 660 |
+
cream
|
| 661 |
+
butterfly
|
| 662 |
+
butterfly house
|
| 663 |
+
button
|
| 664 |
+
buttonwood
|
| 665 |
+
buy
|
| 666 |
+
taxi
|
| 667 |
+
cabana
|
| 668 |
+
cabbage
|
| 669 |
+
cabin
|
| 670 |
+
cabin car
|
| 671 |
+
cabinet
|
| 672 |
+
cabinetry
|
| 673 |
+
cable
|
| 674 |
+
cable car
|
| 675 |
+
cactus
|
| 676 |
+
cafe
|
| 677 |
+
canteen
|
| 678 |
+
cage
|
| 679 |
+
cake
|
| 680 |
+
cake stand
|
| 681 |
+
calculator
|
| 682 |
+
caldron
|
| 683 |
+
calendar
|
| 684 |
+
calf
|
| 685 |
+
call
|
| 686 |
+
phone box
|
| 687 |
+
calligraphy
|
| 688 |
+
calm
|
| 689 |
+
camcorder
|
| 690 |
+
camel
|
| 691 |
+
camera
|
| 692 |
+
camera lens
|
| 693 |
+
camouflage
|
| 694 |
+
camp
|
| 695 |
+
camper
|
| 696 |
+
campfire
|
| 697 |
+
camping
|
| 698 |
+
campsite
|
| 699 |
+
campus
|
| 700 |
+
can
|
| 701 |
+
can opener
|
| 702 |
+
canal
|
| 703 |
+
canary
|
| 704 |
+
cancer
|
| 705 |
+
candle
|
| 706 |
+
candle holder
|
| 707 |
+
candy
|
| 708 |
+
candy bar
|
| 709 |
+
candy cane
|
| 710 |
+
candy store
|
| 711 |
+
cane
|
| 712 |
+
jar
|
| 713 |
+
cannon
|
| 714 |
+
canopy
|
| 715 |
+
canopy bed
|
| 716 |
+
cantaloupe
|
| 717 |
+
cantilever bridge
|
| 718 |
+
canvas
|
| 719 |
+
canyon
|
| 720 |
+
cap
|
| 721 |
+
cape
|
| 722 |
+
cape cod
|
| 723 |
+
cappuccino
|
| 724 |
+
capsule
|
| 725 |
+
captain
|
| 726 |
+
capture
|
| 727 |
+
car
|
| 728 |
+
car dealership
|
| 729 |
+
car door
|
| 730 |
+
car interior
|
| 731 |
+
car logo
|
| 732 |
+
car mirror
|
| 733 |
+
parking lot
|
| 734 |
+
car seat
|
| 735 |
+
car show
|
| 736 |
+
car wash
|
| 737 |
+
car window
|
| 738 |
+
caramel
|
| 739 |
+
card
|
| 740 |
+
card game
|
| 741 |
+
cardboard
|
| 742 |
+
cardboard box
|
| 743 |
+
cardigan
|
| 744 |
+
cardinal
|
| 745 |
+
cargo
|
| 746 |
+
cargo aircraft
|
| 747 |
+
cargo ship
|
| 748 |
+
caribbean
|
| 749 |
+
carnation
|
| 750 |
+
carnival
|
| 751 |
+
carnivore
|
| 752 |
+
carousel
|
| 753 |
+
carp
|
| 754 |
+
carpenter
|
| 755 |
+
carpet
|
| 756 |
+
slipper
|
| 757 |
+
house finch
|
| 758 |
+
coach
|
| 759 |
+
dalmatian
|
| 760 |
+
aircraft carrier
|
| 761 |
+
carrot
|
| 762 |
+
carrot cake
|
| 763 |
+
carry
|
| 764 |
+
cart
|
| 765 |
+
carton
|
| 766 |
+
cartoon
|
| 767 |
+
cartoon character
|
| 768 |
+
cartoon illustration
|
| 769 |
+
cartoon style
|
| 770 |
+
carve
|
| 771 |
+
case
|
| 772 |
+
cash
|
| 773 |
+
cashew
|
| 774 |
+
casino
|
| 775 |
+
casserole
|
| 776 |
+
cassette
|
| 777 |
+
cassette deck
|
| 778 |
+
plaster bandage
|
| 779 |
+
casting
|
| 780 |
+
castle
|
| 781 |
+
cat
|
| 782 |
+
cat bed
|
| 783 |
+
cat food
|
| 784 |
+
cat furniture
|
| 785 |
+
cat tree
|
| 786 |
+
catacomb
|
| 787 |
+
catamaran
|
| 788 |
+
catamount
|
| 789 |
+
catch
|
| 790 |
+
catcher
|
| 791 |
+
caterpillar
|
| 792 |
+
catfish
|
| 793 |
+
cathedral
|
| 794 |
+
cattle
|
| 795 |
+
catwalk
|
| 796 |
+
catwalk show
|
| 797 |
+
cauliflower
|
| 798 |
+
cave
|
| 799 |
+
caviar
|
| 800 |
+
CD
|
| 801 |
+
CD player
|
| 802 |
+
cedar
|
| 803 |
+
ceiling
|
| 804 |
+
ceiling fan
|
| 805 |
+
celebrate
|
| 806 |
+
celebration
|
| 807 |
+
celebrity
|
| 808 |
+
celery
|
| 809 |
+
cello
|
| 810 |
+
smartphone
|
| 811 |
+
cement
|
| 812 |
+
graveyard
|
| 813 |
+
centerpiece
|
| 814 |
+
centipede
|
| 815 |
+
ceramic
|
| 816 |
+
ceramic tile
|
| 817 |
+
cereal
|
| 818 |
+
ceremony
|
| 819 |
+
certificate
|
| 820 |
+
chain
|
| 821 |
+
chain saw
|
| 822 |
+
chair
|
| 823 |
+
chairlift
|
| 824 |
+
daybed
|
| 825 |
+
chalet
|
| 826 |
+
chalice
|
| 827 |
+
chalk
|
| 828 |
+
chamber
|
| 829 |
+
chameleon
|
| 830 |
+
champagne
|
| 831 |
+
champagne flute
|
| 832 |
+
champion
|
| 833 |
+
championship
|
| 834 |
+
chandelier
|
| 835 |
+
changing table
|
| 836 |
+
channel
|
| 837 |
+
chap
|
| 838 |
+
chapel
|
| 839 |
+
character sculpture
|
| 840 |
+
charcoal
|
| 841 |
+
charge
|
| 842 |
+
charger
|
| 843 |
+
chariot
|
| 844 |
+
charity
|
| 845 |
+
charity event
|
| 846 |
+
charm
|
| 847 |
+
graph
|
| 848 |
+
chase
|
| 849 |
+
chassis
|
| 850 |
+
check
|
| 851 |
+
checkbook
|
| 852 |
+
chessboard
|
| 853 |
+
checklist
|
| 854 |
+
cheer
|
| 855 |
+
cheerlead
|
| 856 |
+
cheese
|
| 857 |
+
cheeseburger
|
| 858 |
+
cheesecake
|
| 859 |
+
cheetah
|
| 860 |
+
chef
|
| 861 |
+
chemical compound
|
| 862 |
+
chemist
|
| 863 |
+
chemistry
|
| 864 |
+
chemistry lab
|
| 865 |
+
cheongsam
|
| 866 |
+
cherry
|
| 867 |
+
cherry blossom
|
| 868 |
+
cherry tomato
|
| 869 |
+
cherry tree
|
| 870 |
+
chess
|
| 871 |
+
chestnut
|
| 872 |
+
chicken
|
| 873 |
+
chicken breast
|
| 874 |
+
chicken coop
|
| 875 |
+
chicken salad
|
| 876 |
+
chicken wing
|
| 877 |
+
garbanzo
|
| 878 |
+
chiffonier
|
| 879 |
+
chihuahua
|
| 880 |
+
child
|
| 881 |
+
child actor
|
| 882 |
+
childs room
|
| 883 |
+
chile
|
| 884 |
+
chili dog
|
| 885 |
+
chimney
|
| 886 |
+
chimpanzee
|
| 887 |
+
chinaware
|
| 888 |
+
chinese cabbage
|
| 889 |
+
chinese garden
|
| 890 |
+
chinese knot
|
| 891 |
+
chinese rose
|
| 892 |
+
chinese tower
|
| 893 |
+
chip
|
| 894 |
+
chipmunk
|
| 895 |
+
chisel
|
| 896 |
+
chocolate
|
| 897 |
+
chocolate bar
|
| 898 |
+
chocolate cake
|
| 899 |
+
chocolate chip
|
| 900 |
+
chocolate chip cookie
|
| 901 |
+
chocolate milk
|
| 902 |
+
chocolate mousse
|
| 903 |
+
truffle
|
| 904 |
+
choir
|
| 905 |
+
kitchen knife
|
| 906 |
+
cutting board
|
| 907 |
+
chopstick
|
| 908 |
+
christmas
|
| 909 |
+
christmas ball
|
| 910 |
+
christmas card
|
| 911 |
+
christmas decoration
|
| 912 |
+
christmas dinner
|
| 913 |
+
christmas eve
|
| 914 |
+
christmas hat
|
| 915 |
+
christmas light
|
| 916 |
+
christmas market
|
| 917 |
+
christmas ornament
|
| 918 |
+
christmas tree
|
| 919 |
+
chrysanthemum
|
| 920 |
+
church
|
| 921 |
+
church tower
|
| 922 |
+
cider
|
| 923 |
+
cigar
|
| 924 |
+
cigar box
|
| 925 |
+
cigarette
|
| 926 |
+
cigarette case
|
| 927 |
+
waistband
|
| 928 |
+
cinema
|
| 929 |
+
photographer
|
| 930 |
+
cinnamon
|
| 931 |
+
circle
|
| 932 |
+
circuit
|
| 933 |
+
circuit board
|
| 934 |
+
circus
|
| 935 |
+
water tank
|
| 936 |
+
citrus fruit
|
| 937 |
+
city
|
| 938 |
+
city bus
|
| 939 |
+
city hall
|
| 940 |
+
city nightview
|
| 941 |
+
city park
|
| 942 |
+
city skyline
|
| 943 |
+
city square
|
| 944 |
+
city street
|
| 945 |
+
city wall
|
| 946 |
+
city view
|
| 947 |
+
clam
|
| 948 |
+
clarinet
|
| 949 |
+
clasp
|
| 950 |
+
class
|
| 951 |
+
classic
|
| 952 |
+
classroom
|
| 953 |
+
clavicle
|
| 954 |
+
claw
|
| 955 |
+
clay
|
| 956 |
+
pottery
|
| 957 |
+
clean
|
| 958 |
+
clean room
|
| 959 |
+
cleaner
|
| 960 |
+
cleaning product
|
| 961 |
+
clear
|
| 962 |
+
cleat
|
| 963 |
+
clementine
|
| 964 |
+
client
|
| 965 |
+
cliff
|
| 966 |
+
climb
|
| 967 |
+
climb mountain
|
| 968 |
+
climber
|
| 969 |
+
clinic
|
| 970 |
+
clip
|
| 971 |
+
clip art
|
| 972 |
+
clipboard
|
| 973 |
+
clipper
|
| 974 |
+
clivia
|
| 975 |
+
cloak
|
| 976 |
+
clogs
|
| 977 |
+
close-up
|
| 978 |
+
closet
|
| 979 |
+
cloth
|
| 980 |
+
clothe
|
| 981 |
+
clothing
|
| 982 |
+
clothespin
|
| 983 |
+
clothesline
|
| 984 |
+
clothing store
|
| 985 |
+
cloud
|
| 986 |
+
cloud forest
|
| 987 |
+
cloudy
|
| 988 |
+
clover
|
| 989 |
+
joker
|
| 990 |
+
clown fish
|
| 991 |
+
club
|
| 992 |
+
clutch
|
| 993 |
+
clutch bag
|
| 994 |
+
coal
|
| 995 |
+
coast
|
| 996 |
+
coat
|
| 997 |
+
coatrack
|
| 998 |
+
cob
|
| 999 |
+
cock
|
| 1000 |
+
cockatoo
|
| 1001 |
+
cocker
|
| 1002 |
+
cockpit
|
| 1003 |
+
roach
|
| 1004 |
+
cocktail
|
| 1005 |
+
cocktail dress
|
| 1006 |
+
cocktail shaker
|
| 1007 |
+
cocktail table
|
| 1008 |
+
cocoa
|
| 1009 |
+
coconut
|
| 1010 |
+
coconut tree
|
| 1011 |
+
coffee
|
| 1012 |
+
coffee bean
|
| 1013 |
+
coffee cup
|
| 1014 |
+
coffee machine
|
| 1015 |
+
coffee shop
|
| 1016 |
+
coffeepot
|
| 1017 |
+
coffin
|
| 1018 |
+
cognac
|
| 1019 |
+
spiral
|
| 1020 |
+
coin
|
| 1021 |
+
coke
|
| 1022 |
+
colander
|
| 1023 |
+
cold
|
| 1024 |
+
slaw
|
| 1025 |
+
collaboration
|
| 1026 |
+
collage
|
| 1027 |
+
collection
|
| 1028 |
+
college student
|
| 1029 |
+
sheepdog
|
| 1030 |
+
crash
|
| 1031 |
+
color
|
| 1032 |
+
coloring book
|
| 1033 |
+
coloring material
|
| 1034 |
+
pony
|
| 1035 |
+
pillar
|
| 1036 |
+
comb
|
| 1037 |
+
combination lock
|
| 1038 |
+
comic
|
| 1039 |
+
comedy
|
| 1040 |
+
comedy film
|
| 1041 |
+
comet
|
| 1042 |
+
comfort
|
| 1043 |
+
comfort food
|
| 1044 |
+
comic book
|
| 1045 |
+
comic book character
|
| 1046 |
+
comic strip
|
| 1047 |
+
commander
|
| 1048 |
+
commentator
|
| 1049 |
+
community
|
| 1050 |
+
commuter
|
| 1051 |
+
company
|
| 1052 |
+
compass
|
| 1053 |
+
compete
|
| 1054 |
+
contest
|
| 1055 |
+
competitor
|
| 1056 |
+
composer
|
| 1057 |
+
composition
|
| 1058 |
+
compost
|
| 1059 |
+
computer
|
| 1060 |
+
computer box
|
| 1061 |
+
computer chair
|
| 1062 |
+
computer desk
|
| 1063 |
+
keyboard
|
| 1064 |
+
computer monitor
|
| 1065 |
+
computer room
|
| 1066 |
+
computer screen
|
| 1067 |
+
computer tower
|
| 1068 |
+
concept car
|
| 1069 |
+
concert
|
| 1070 |
+
concert hall
|
| 1071 |
+
conch
|
| 1072 |
+
concrete
|
| 1073 |
+
condiment
|
| 1074 |
+
condom
|
| 1075 |
+
condominium
|
| 1076 |
+
conductor
|
| 1077 |
+
cone
|
| 1078 |
+
meeting
|
| 1079 |
+
conference center
|
| 1080 |
+
conference hall
|
| 1081 |
+
meeting room
|
| 1082 |
+
confetti
|
| 1083 |
+
conflict
|
| 1084 |
+
confluence
|
| 1085 |
+
connect
|
| 1086 |
+
connector
|
| 1087 |
+
conservatory
|
| 1088 |
+
constellation
|
| 1089 |
+
construction site
|
| 1090 |
+
construction worker
|
| 1091 |
+
contain
|
| 1092 |
+
container
|
| 1093 |
+
container ship
|
| 1094 |
+
continent
|
| 1095 |
+
profile
|
| 1096 |
+
contract
|
| 1097 |
+
control
|
| 1098 |
+
control tower
|
| 1099 |
+
convenience store
|
| 1100 |
+
convention
|
| 1101 |
+
conversation
|
| 1102 |
+
converter
|
| 1103 |
+
convertible
|
| 1104 |
+
transporter
|
| 1105 |
+
cook
|
| 1106 |
+
cooking
|
| 1107 |
+
cooking spray
|
| 1108 |
+
cooker
|
| 1109 |
+
cool
|
| 1110 |
+
cooler
|
| 1111 |
+
copper
|
| 1112 |
+
copy
|
| 1113 |
+
coral
|
| 1114 |
+
coral reef
|
| 1115 |
+
rope
|
| 1116 |
+
corded phone
|
| 1117 |
+
liquor
|
| 1118 |
+
corgi
|
| 1119 |
+
cork
|
| 1120 |
+
corkboard
|
| 1121 |
+
cormorant
|
| 1122 |
+
corn
|
| 1123 |
+
corn field
|
| 1124 |
+
cornbread
|
| 1125 |
+
corner
|
| 1126 |
+
trumpet
|
| 1127 |
+
cornice
|
| 1128 |
+
cornmeal
|
| 1129 |
+
corral
|
| 1130 |
+
corridor
|
| 1131 |
+
corset
|
| 1132 |
+
cosmetic
|
| 1133 |
+
cosmetics brush
|
| 1134 |
+
cosmetics mirror
|
| 1135 |
+
cosplay
|
| 1136 |
+
costume
|
| 1137 |
+
costumer film designer
|
| 1138 |
+
infant bed
|
| 1139 |
+
cottage
|
| 1140 |
+
cotton
|
| 1141 |
+
cotton candy
|
| 1142 |
+
couch
|
| 1143 |
+
countdown
|
| 1144 |
+
counter
|
| 1145 |
+
counter top
|
| 1146 |
+
country artist
|
| 1147 |
+
country house
|
| 1148 |
+
country lane
|
| 1149 |
+
country pop artist
|
| 1150 |
+
countryside
|
| 1151 |
+
coupe
|
| 1152 |
+
couple
|
| 1153 |
+
couple photo
|
| 1154 |
+
courgette
|
| 1155 |
+
course
|
| 1156 |
+
court
|
| 1157 |
+
courthouse
|
| 1158 |
+
courtyard
|
| 1159 |
+
cousin
|
| 1160 |
+
coverall
|
| 1161 |
+
cow
|
| 1162 |
+
cowbell
|
| 1163 |
+
cowboy
|
| 1164 |
+
cowboy boot
|
| 1165 |
+
cowboy hat
|
| 1166 |
+
crab
|
| 1167 |
+
crabmeat
|
| 1168 |
+
crack
|
| 1169 |
+
cradle
|
| 1170 |
+
craft
|
| 1171 |
+
craftsman
|
| 1172 |
+
cranberry
|
| 1173 |
+
crane
|
| 1174 |
+
crape
|
| 1175 |
+
crapper
|
| 1176 |
+
crate
|
| 1177 |
+
crater lake
|
| 1178 |
+
lobster
|
| 1179 |
+
crayon
|
| 1180 |
+
cream cheese
|
| 1181 |
+
cream pitcher
|
| 1182 |
+
create
|
| 1183 |
+
creature
|
| 1184 |
+
credit card
|
| 1185 |
+
crescent
|
| 1186 |
+
croissant
|
| 1187 |
+
crest
|
| 1188 |
+
crew
|
| 1189 |
+
cricket
|
| 1190 |
+
cricket ball
|
| 1191 |
+
cricket team
|
| 1192 |
+
cricketer
|
| 1193 |
+
crochet
|
| 1194 |
+
crock pot
|
| 1195 |
+
crocodile
|
| 1196 |
+
crop
|
| 1197 |
+
crop top
|
| 1198 |
+
cross
|
| 1199 |
+
crossbar
|
| 1200 |
+
crossroad
|
| 1201 |
+
crosstalk
|
| 1202 |
+
crosswalk
|
| 1203 |
+
crouton
|
| 1204 |
+
crow
|
| 1205 |
+
crowbar
|
| 1206 |
+
crowd
|
| 1207 |
+
crowded
|
| 1208 |
+
crown
|
| 1209 |
+
crt screen
|
| 1210 |
+
crucifix
|
| 1211 |
+
cruise
|
| 1212 |
+
cruise ship
|
| 1213 |
+
cruiser
|
| 1214 |
+
crumb
|
| 1215 |
+
crush
|
| 1216 |
+
crutch
|
| 1217 |
+
crystal
|
| 1218 |
+
cub
|
| 1219 |
+
cube
|
| 1220 |
+
cucumber
|
| 1221 |
+
cue
|
| 1222 |
+
cuff
|
| 1223 |
+
cufflink
|
| 1224 |
+
cuisine
|
| 1225 |
+
farmland
|
| 1226 |
+
cup
|
| 1227 |
+
cupcake
|
| 1228 |
+
cupid
|
| 1229 |
+
curb
|
| 1230 |
+
curl
|
| 1231 |
+
hair roller
|
| 1232 |
+
currant
|
| 1233 |
+
currency
|
| 1234 |
+
curry
|
| 1235 |
+
curtain
|
| 1236 |
+
curve
|
| 1237 |
+
pad
|
| 1238 |
+
customer
|
| 1239 |
+
cut
|
| 1240 |
+
cutlery
|
| 1241 |
+
cycle
|
| 1242 |
+
cycling
|
| 1243 |
+
cyclone
|
| 1244 |
+
cylinder
|
| 1245 |
+
cymbal
|
| 1246 |
+
cypress
|
| 1247 |
+
cypress tree
|
| 1248 |
+
dachshund
|
| 1249 |
+
daffodil
|
| 1250 |
+
dagger
|
| 1251 |
+
dahlia
|
| 1252 |
+
daikon
|
| 1253 |
+
dairy
|
| 1254 |
+
daisy
|
| 1255 |
+
dam
|
| 1256 |
+
damage
|
| 1257 |
+
damp
|
| 1258 |
+
dance
|
| 1259 |
+
dance floor
|
| 1260 |
+
dance room
|
| 1261 |
+
dancer
|
| 1262 |
+
dandelion
|
| 1263 |
+
dark
|
| 1264 |
+
darkness
|
| 1265 |
+
dart
|
| 1266 |
+
dartboard
|
| 1267 |
+
dashboard
|
| 1268 |
+
date
|
| 1269 |
+
daughter
|
| 1270 |
+
dawn
|
| 1271 |
+
day bed
|
| 1272 |
+
daylight
|
| 1273 |
+
deadbolt
|
| 1274 |
+
death
|
| 1275 |
+
debate
|
| 1276 |
+
debris
|
| 1277 |
+
decanter
|
| 1278 |
+
deck
|
| 1279 |
+
decker bus
|
| 1280 |
+
decor
|
| 1281 |
+
decorate
|
| 1282 |
+
decorative picture
|
| 1283 |
+
deer
|
| 1284 |
+
defender
|
| 1285 |
+
deity
|
| 1286 |
+
delicatessen
|
| 1287 |
+
deliver
|
| 1288 |
+
demolition
|
| 1289 |
+
monster
|
| 1290 |
+
demonstration
|
| 1291 |
+
den
|
| 1292 |
+
denim jacket
|
| 1293 |
+
dentist
|
| 1294 |
+
department store
|
| 1295 |
+
depression
|
| 1296 |
+
derby
|
| 1297 |
+
dermopathy
|
| 1298 |
+
desert
|
| 1299 |
+
desert road
|
| 1300 |
+
design
|
| 1301 |
+
designer
|
| 1302 |
+
table
|
| 1303 |
+
table lamp
|
| 1304 |
+
desktop
|
| 1305 |
+
desktop computer
|
| 1306 |
+
dessert
|
| 1307 |
+
destruction
|
| 1308 |
+
detective
|
| 1309 |
+
detergent
|
| 1310 |
+
dew
|
| 1311 |
+
dial
|
| 1312 |
+
diamond
|
| 1313 |
+
diaper
|
| 1314 |
+
diaper bag
|
| 1315 |
+
journal
|
| 1316 |
+
die
|
| 1317 |
+
diet
|
| 1318 |
+
excavator
|
| 1319 |
+
number
|
| 1320 |
+
digital clock
|
| 1321 |
+
dill
|
| 1322 |
+
dinner
|
| 1323 |
+
rowboat
|
| 1324 |
+
dining room
|
| 1325 |
+
dinner party
|
| 1326 |
+
dinning table
|
| 1327 |
+
dinosaur
|
| 1328 |
+
dip
|
| 1329 |
+
diploma
|
| 1330 |
+
direct
|
| 1331 |
+
director
|
| 1332 |
+
dirt
|
| 1333 |
+
dirt bike
|
| 1334 |
+
dirt field
|
| 1335 |
+
dirt road
|
| 1336 |
+
dirt track
|
| 1337 |
+
disaster
|
| 1338 |
+
disciple
|
| 1339 |
+
disco
|
| 1340 |
+
disco ball
|
| 1341 |
+
discotheque
|
| 1342 |
+
disease
|
| 1343 |
+
plate
|
| 1344 |
+
dish antenna
|
| 1345 |
+
dish washer
|
| 1346 |
+
dishrag
|
| 1347 |
+
dishes
|
| 1348 |
+
dishsoap
|
| 1349 |
+
Disneyland
|
| 1350 |
+
dispenser
|
| 1351 |
+
display
|
| 1352 |
+
display window
|
| 1353 |
+
trench
|
| 1354 |
+
dive
|
| 1355 |
+
diver
|
| 1356 |
+
diving board
|
| 1357 |
+
paper cup
|
| 1358 |
+
dj
|
| 1359 |
+
doberman
|
| 1360 |
+
dock
|
| 1361 |
+
doctor
|
| 1362 |
+
document
|
| 1363 |
+
documentary
|
| 1364 |
+
dog
|
| 1365 |
+
dog bed
|
| 1366 |
+
dog breed
|
| 1367 |
+
dog collar
|
| 1368 |
+
dog food
|
| 1369 |
+
dog house
|
| 1370 |
+
doll
|
| 1371 |
+
dollar
|
| 1372 |
+
dollhouse
|
| 1373 |
+
dolly
|
| 1374 |
+
dolphin
|
| 1375 |
+
dome
|
| 1376 |
+
domicile
|
| 1377 |
+
domino
|
| 1378 |
+
donkey
|
| 1379 |
+
donut
|
| 1380 |
+
doodle
|
| 1381 |
+
door
|
| 1382 |
+
door handle
|
| 1383 |
+
doormat
|
| 1384 |
+
doorplate
|
| 1385 |
+
doorway
|
| 1386 |
+
dormitory
|
| 1387 |
+
dough
|
| 1388 |
+
downtown
|
| 1389 |
+
dozer
|
| 1390 |
+
drag
|
| 1391 |
+
dragon
|
| 1392 |
+
dragonfly
|
| 1393 |
+
drain
|
| 1394 |
+
drama
|
| 1395 |
+
drama film
|
| 1396 |
+
draw
|
| 1397 |
+
drawer
|
| 1398 |
+
drawing
|
| 1399 |
+
drawing pin
|
| 1400 |
+
pigtail
|
| 1401 |
+
dress
|
| 1402 |
+
dress hat
|
| 1403 |
+
dress shirt
|
| 1404 |
+
dress shoe
|
| 1405 |
+
dress suit
|
| 1406 |
+
dresser
|
| 1407 |
+
dressing room
|
| 1408 |
+
dribble
|
| 1409 |
+
drift
|
| 1410 |
+
driftwood
|
| 1411 |
+
drill
|
| 1412 |
+
drink
|
| 1413 |
+
drinking water
|
| 1414 |
+
drive
|
| 1415 |
+
driver
|
| 1416 |
+
driveway
|
| 1417 |
+
drone
|
| 1418 |
+
drop
|
| 1419 |
+
droplight
|
| 1420 |
+
dropper
|
| 1421 |
+
drought
|
| 1422 |
+
medicine
|
| 1423 |
+
pharmacy
|
| 1424 |
+
drum
|
| 1425 |
+
drummer
|
| 1426 |
+
drumstick
|
| 1427 |
+
dry
|
| 1428 |
+
duchess
|
| 1429 |
+
duck
|
| 1430 |
+
duckbill
|
| 1431 |
+
duckling
|
| 1432 |
+
duct tape
|
| 1433 |
+
dude
|
| 1434 |
+
duet
|
| 1435 |
+
duffel
|
| 1436 |
+
canoe
|
| 1437 |
+
dumbbell
|
| 1438 |
+
dumpling
|
| 1439 |
+
dune
|
| 1440 |
+
dunk
|
| 1441 |
+
durian
|
| 1442 |
+
dusk
|
| 1443 |
+
dust
|
| 1444 |
+
garbage truck
|
| 1445 |
+
dustpan
|
| 1446 |
+
duvet
|
| 1447 |
+
DVD
|
| 1448 |
+
dye
|
| 1449 |
+
eagle
|
| 1450 |
+
ear
|
| 1451 |
+
earmuff
|
| 1452 |
+
earphone
|
| 1453 |
+
earplug
|
| 1454 |
+
earring
|
| 1455 |
+
earthquake
|
| 1456 |
+
easel
|
| 1457 |
+
easter
|
| 1458 |
+
easter bunny
|
| 1459 |
+
easter egg
|
| 1460 |
+
eat
|
| 1461 |
+
restaurant
|
| 1462 |
+
eclair
|
| 1463 |
+
eclipse
|
| 1464 |
+
ecosystem
|
| 1465 |
+
edit
|
| 1466 |
+
education
|
| 1467 |
+
educator
|
| 1468 |
+
eel
|
| 1469 |
+
egg
|
| 1470 |
+
egg roll
|
| 1471 |
+
egg tart
|
| 1472 |
+
eggbeater
|
| 1473 |
+
egret
|
| 1474 |
+
Eiffel tower
|
| 1475 |
+
elastic band
|
| 1476 |
+
senior
|
| 1477 |
+
electric chair
|
| 1478 |
+
electric drill
|
| 1479 |
+
electrician
|
| 1480 |
+
electricity
|
| 1481 |
+
electron
|
| 1482 |
+
electronic
|
| 1483 |
+
elephant
|
| 1484 |
+
elevation map
|
| 1485 |
+
elevator
|
| 1486 |
+
elevator car
|
| 1487 |
+
elevator door
|
| 1488 |
+
elevator lobby
|
| 1489 |
+
elevator shaft
|
| 1490 |
+
embankment
|
| 1491 |
+
embassy
|
| 1492 |
+
embellishment
|
| 1493 |
+
ember
|
| 1494 |
+
emblem
|
| 1495 |
+
embroidery
|
| 1496 |
+
emerald
|
| 1497 |
+
emergency
|
| 1498 |
+
emergency service
|
| 1499 |
+
emergency vehicle
|
| 1500 |
+
emotion
|
| 1501 |
+
Empire State Building
|
| 1502 |
+
enamel
|
| 1503 |
+
enclosure
|
| 1504 |
+
side table
|
| 1505 |
+
energy
|
| 1506 |
+
engagement
|
| 1507 |
+
engagement ring
|
| 1508 |
+
engine
|
| 1509 |
+
engine room
|
| 1510 |
+
engineer
|
| 1511 |
+
engineering
|
| 1512 |
+
english shorthair
|
| 1513 |
+
ensemble
|
| 1514 |
+
enter
|
| 1515 |
+
entertainer
|
| 1516 |
+
entertainment
|
| 1517 |
+
entertainment center
|
| 1518 |
+
entrance
|
| 1519 |
+
entrance hall
|
| 1520 |
+
envelope
|
| 1521 |
+
equestrian
|
| 1522 |
+
equipment
|
| 1523 |
+
eraser
|
| 1524 |
+
erhu
|
| 1525 |
+
erosion
|
| 1526 |
+
escalator
|
| 1527 |
+
escargot
|
| 1528 |
+
espresso
|
| 1529 |
+
estate
|
| 1530 |
+
estuary
|
| 1531 |
+
eucalyptus tree
|
| 1532 |
+
evening
|
| 1533 |
+
evening dress
|
| 1534 |
+
evening light
|
| 1535 |
+
evening sky
|
| 1536 |
+
evening sun
|
| 1537 |
+
event
|
| 1538 |
+
evergreen
|
| 1539 |
+
ewe
|
| 1540 |
+
excavation
|
| 1541 |
+
exercise
|
| 1542 |
+
exhaust hood
|
| 1543 |
+
exhibition
|
| 1544 |
+
exit
|
| 1545 |
+
explorer
|
| 1546 |
+
explosion
|
| 1547 |
+
extension cord
|
| 1548 |
+
extinguisher
|
| 1549 |
+
extractor
|
| 1550 |
+
extrude
|
| 1551 |
+
eye
|
| 1552 |
+
eye shadow
|
| 1553 |
+
eyebrow
|
| 1554 |
+
eyeliner
|
| 1555 |
+
fabric
|
| 1556 |
+
fabric store
|
| 1557 |
+
facade
|
| 1558 |
+
face
|
| 1559 |
+
face close-up
|
| 1560 |
+
face powder
|
| 1561 |
+
face towel
|
| 1562 |
+
facial tissue holder
|
| 1563 |
+
facility
|
| 1564 |
+
factory
|
| 1565 |
+
factory workshop
|
| 1566 |
+
fair
|
| 1567 |
+
fairground
|
| 1568 |
+
fairy
|
| 1569 |
+
falcon
|
| 1570 |
+
fall
|
| 1571 |
+
family
|
| 1572 |
+
family car
|
| 1573 |
+
family photo
|
| 1574 |
+
family room
|
| 1575 |
+
fan
|
| 1576 |
+
fang
|
| 1577 |
+
farm
|
| 1578 |
+
farmer
|
| 1579 |
+
farmer market
|
| 1580 |
+
farmhouse
|
| 1581 |
+
fashion
|
| 1582 |
+
fashion accessory
|
| 1583 |
+
fashion designer
|
| 1584 |
+
fashion girl
|
| 1585 |
+
fashion illustration
|
| 1586 |
+
fashion look
|
| 1587 |
+
fashion model
|
| 1588 |
+
fashion show
|
| 1589 |
+
fast food
|
| 1590 |
+
fastfood restaurant
|
| 1591 |
+
father
|
| 1592 |
+
faucet
|
| 1593 |
+
fault
|
| 1594 |
+
fauna
|
| 1595 |
+
fawn
|
| 1596 |
+
fax
|
| 1597 |
+
feast
|
| 1598 |
+
feather
|
| 1599 |
+
fedora
|
| 1600 |
+
feed
|
| 1601 |
+
feedbag
|
| 1602 |
+
feeding
|
| 1603 |
+
feeding chair
|
| 1604 |
+
feline
|
| 1605 |
+
mountain lion
|
| 1606 |
+
fence
|
| 1607 |
+
fender
|
| 1608 |
+
fern
|
| 1609 |
+
ferret
|
| 1610 |
+
ferris wheel
|
| 1611 |
+
ferry
|
| 1612 |
+
fertilizer
|
| 1613 |
+
festival
|
| 1614 |
+
fiber
|
| 1615 |
+
fiction
|
| 1616 |
+
fiction book
|
| 1617 |
+
field
|
| 1618 |
+
field road
|
| 1619 |
+
fig
|
| 1620 |
+
fight
|
| 1621 |
+
figure skater
|
| 1622 |
+
figurine
|
| 1623 |
+
file
|
| 1624 |
+
file photo
|
| 1625 |
+
file cabinet
|
| 1626 |
+
fill
|
| 1627 |
+
film camera
|
| 1628 |
+
film director
|
| 1629 |
+
film format
|
| 1630 |
+
film premiere
|
| 1631 |
+
film producer
|
| 1632 |
+
filming
|
| 1633 |
+
filter
|
| 1634 |
+
fin
|
| 1635 |
+
hand
|
| 1636 |
+
finish line
|
| 1637 |
+
fir
|
| 1638 |
+
fir tree
|
| 1639 |
+
fire
|
| 1640 |
+
fire alarm
|
| 1641 |
+
fire department
|
| 1642 |
+
fire truck
|
| 1643 |
+
fire escape
|
| 1644 |
+
fire hose
|
| 1645 |
+
fire pit
|
| 1646 |
+
fire station
|
| 1647 |
+
firecracker
|
| 1648 |
+
fireman
|
| 1649 |
+
fireplace
|
| 1650 |
+
firework
|
| 1651 |
+
firework display
|
| 1652 |
+
first-aid kit
|
| 1653 |
+
fish
|
| 1654 |
+
fish boat
|
| 1655 |
+
fish market
|
| 1656 |
+
fish pond
|
| 1657 |
+
fishbowl
|
| 1658 |
+
fisherman
|
| 1659 |
+
fishing
|
| 1660 |
+
fishing boat
|
| 1661 |
+
fishing net
|
| 1662 |
+
fishing pole
|
| 1663 |
+
fishing village
|
| 1664 |
+
fitness
|
| 1665 |
+
fitness course
|
| 1666 |
+
five
|
| 1667 |
+
fixture
|
| 1668 |
+
fjord
|
| 1669 |
+
flag
|
| 1670 |
+
flag pole
|
| 1671 |
+
flake
|
| 1672 |
+
flame
|
| 1673 |
+
flamingo
|
| 1674 |
+
flannel
|
| 1675 |
+
flap
|
| 1676 |
+
flare
|
| 1677 |
+
flash
|
| 1678 |
+
flask
|
| 1679 |
+
flat
|
| 1680 |
+
flatfish
|
| 1681 |
+
flavor
|
| 1682 |
+
flea
|
| 1683 |
+
flea market
|
| 1684 |
+
fleet
|
| 1685 |
+
flight
|
| 1686 |
+
flight attendant
|
| 1687 |
+
flip
|
| 1688 |
+
flip-flop
|
| 1689 |
+
flipchart
|
| 1690 |
+
float
|
| 1691 |
+
flock
|
| 1692 |
+
flood
|
| 1693 |
+
floor
|
| 1694 |
+
floor fan
|
| 1695 |
+
floor mat
|
| 1696 |
+
floor plan
|
| 1697 |
+
floor window
|
| 1698 |
+
floral arrangement
|
| 1699 |
+
florist
|
| 1700 |
+
floss
|
| 1701 |
+
flour
|
| 1702 |
+
flow
|
| 1703 |
+
flower
|
| 1704 |
+
flower basket
|
| 1705 |
+
flower bed
|
| 1706 |
+
flower box
|
| 1707 |
+
flower field
|
| 1708 |
+
flower girl
|
| 1709 |
+
flower market
|
| 1710 |
+
fluid
|
| 1711 |
+
flush
|
| 1712 |
+
flute
|
| 1713 |
+
fly
|
| 1714 |
+
fly fishing
|
| 1715 |
+
flyer
|
| 1716 |
+
horse
|
| 1717 |
+
foam
|
| 1718 |
+
fog
|
| 1719 |
+
foggy
|
| 1720 |
+
foie gra
|
| 1721 |
+
foil
|
| 1722 |
+
folding chair
|
| 1723 |
+
leaf
|
| 1724 |
+
folk artist
|
| 1725 |
+
folk dance
|
| 1726 |
+
folk rock artist
|
| 1727 |
+
fondant
|
| 1728 |
+
hotpot
|
| 1729 |
+
font
|
| 1730 |
+
food
|
| 1731 |
+
food coloring
|
| 1732 |
+
food court
|
| 1733 |
+
food processor
|
| 1734 |
+
food stand
|
| 1735 |
+
food truck
|
| 1736 |
+
foosball
|
| 1737 |
+
foot
|
| 1738 |
+
foot bridge
|
| 1739 |
+
football
|
| 1740 |
+
football coach
|
| 1741 |
+
football college game
|
| 1742 |
+
football match
|
| 1743 |
+
football field
|
| 1744 |
+
football game
|
| 1745 |
+
football helmet
|
| 1746 |
+
football player
|
| 1747 |
+
football stadium
|
| 1748 |
+
football team
|
| 1749 |
+
path
|
| 1750 |
+
footprint
|
| 1751 |
+
footrest
|
| 1752 |
+
footstall
|
| 1753 |
+
footwear
|
| 1754 |
+
forbidden city
|
| 1755 |
+
ford
|
| 1756 |
+
forehead
|
| 1757 |
+
forest
|
| 1758 |
+
forest fire
|
| 1759 |
+
forest floor
|
| 1760 |
+
forest path
|
| 1761 |
+
forest road
|
| 1762 |
+
forge
|
| 1763 |
+
fork
|
| 1764 |
+
forklift
|
| 1765 |
+
form
|
| 1766 |
+
formal garden
|
| 1767 |
+
formation
|
| 1768 |
+
formula 1
|
| 1769 |
+
fort
|
| 1770 |
+
fortification
|
| 1771 |
+
forward
|
| 1772 |
+
fossil
|
| 1773 |
+
foundation
|
| 1774 |
+
fountain
|
| 1775 |
+
fountain pen
|
| 1776 |
+
fox
|
| 1777 |
+
frame
|
| 1778 |
+
freckle
|
| 1779 |
+
highway
|
| 1780 |
+
lorry
|
| 1781 |
+
French
|
| 1782 |
+
French bulldog
|
| 1783 |
+
French fries
|
| 1784 |
+
French toast
|
| 1785 |
+
freshener
|
| 1786 |
+
fridge
|
| 1787 |
+
fried chicken
|
| 1788 |
+
fried egg
|
| 1789 |
+
fried rice
|
| 1790 |
+
friendship
|
| 1791 |
+
frisbee
|
| 1792 |
+
frog
|
| 1793 |
+
frost
|
| 1794 |
+
frosting
|
| 1795 |
+
frosty
|
| 1796 |
+
frozen
|
| 1797 |
+
fruit
|
| 1798 |
+
fruit cake
|
| 1799 |
+
fruit dish
|
| 1800 |
+
fruit market
|
| 1801 |
+
fruit salad
|
| 1802 |
+
fruit stand
|
| 1803 |
+
fruit tree
|
| 1804 |
+
fruits shop
|
| 1805 |
+
fry
|
| 1806 |
+
frying pan
|
| 1807 |
+
fudge
|
| 1808 |
+
fuel
|
| 1809 |
+
fume hood
|
| 1810 |
+
fun
|
| 1811 |
+
funeral
|
| 1812 |
+
fungi
|
| 1813 |
+
funnel
|
| 1814 |
+
fur
|
| 1815 |
+
fur coat
|
| 1816 |
+
furniture
|
| 1817 |
+
futon
|
| 1818 |
+
gadget
|
| 1819 |
+
muzzle
|
| 1820 |
+
galaxy
|
| 1821 |
+
gallery
|
| 1822 |
+
game
|
| 1823 |
+
game board
|
| 1824 |
+
game controller
|
| 1825 |
+
ham
|
| 1826 |
+
gang
|
| 1827 |
+
garage
|
| 1828 |
+
garage door
|
| 1829 |
+
garage kit
|
| 1830 |
+
garbage
|
| 1831 |
+
garden
|
| 1832 |
+
garden asparagus
|
| 1833 |
+
garden hose
|
| 1834 |
+
garden spider
|
| 1835 |
+
gardener
|
| 1836 |
+
gardening
|
| 1837 |
+
garfield
|
| 1838 |
+
gargoyle
|
| 1839 |
+
wreath
|
| 1840 |
+
garlic
|
| 1841 |
+
garment
|
| 1842 |
+
gas
|
| 1843 |
+
gas station
|
| 1844 |
+
gas stove
|
| 1845 |
+
gasmask
|
| 1846 |
+
collect
|
| 1847 |
+
gathering
|
| 1848 |
+
gauge
|
| 1849 |
+
gazebo
|
| 1850 |
+
gear
|
| 1851 |
+
gecko
|
| 1852 |
+
geisha
|
| 1853 |
+
gel
|
| 1854 |
+
general store
|
| 1855 |
+
generator
|
| 1856 |
+
geranium
|
| 1857 |
+
ghost
|
| 1858 |
+
gift
|
| 1859 |
+
gift bag
|
| 1860 |
+
gift basket
|
| 1861 |
+
gift box
|
| 1862 |
+
gift card
|
| 1863 |
+
gift shop
|
| 1864 |
+
gift wrap
|
| 1865 |
+
gig
|
| 1866 |
+
gin
|
| 1867 |
+
ginger
|
| 1868 |
+
gingerbread
|
| 1869 |
+
gingerbread house
|
| 1870 |
+
ginkgo tree
|
| 1871 |
+
giraffe
|
| 1872 |
+
girl
|
| 1873 |
+
give
|
| 1874 |
+
glacier
|
| 1875 |
+
gladiator
|
| 1876 |
+
glass bead
|
| 1877 |
+
glass bottle
|
| 1878 |
+
glass bowl
|
| 1879 |
+
glass box
|
| 1880 |
+
glass building
|
| 1881 |
+
glass door
|
| 1882 |
+
glass floor
|
| 1883 |
+
glass house
|
| 1884 |
+
glass jar
|
| 1885 |
+
glass plate
|
| 1886 |
+
glass table
|
| 1887 |
+
glass vase
|
| 1888 |
+
glass wall
|
| 1889 |
+
glass window
|
| 1890 |
+
glasses
|
| 1891 |
+
glaze
|
| 1892 |
+
glider
|
| 1893 |
+
earth
|
| 1894 |
+
glove
|
| 1895 |
+
glow
|
| 1896 |
+
glue pudding
|
| 1897 |
+
go
|
| 1898 |
+
go for
|
| 1899 |
+
goal
|
| 1900 |
+
goalkeeper
|
| 1901 |
+
goat
|
| 1902 |
+
goat cheese
|
| 1903 |
+
gobi
|
| 1904 |
+
goggles
|
| 1905 |
+
gold
|
| 1906 |
+
gold medal
|
| 1907 |
+
Golden Gate Bridge
|
| 1908 |
+
golden retriever
|
| 1909 |
+
goldfish
|
| 1910 |
+
golf
|
| 1911 |
+
golf cap
|
| 1912 |
+
golf cart
|
| 1913 |
+
golf club
|
| 1914 |
+
golf course
|
| 1915 |
+
golfer
|
| 1916 |
+
goose
|
| 1917 |
+
gorilla
|
| 1918 |
+
gothic
|
| 1919 |
+
gourd
|
| 1920 |
+
government
|
| 1921 |
+
government agency
|
| 1922 |
+
gown
|
| 1923 |
+
graduate
|
| 1924 |
+
graduation
|
| 1925 |
+
grain
|
| 1926 |
+
grampus
|
| 1927 |
+
grand prix
|
| 1928 |
+
grandfather
|
| 1929 |
+
grandmother
|
| 1930 |
+
grandparent
|
| 1931 |
+
granite
|
| 1932 |
+
granola
|
| 1933 |
+
grape
|
| 1934 |
+
grapefruit
|
| 1935 |
+
wine
|
| 1936 |
+
grass
|
| 1937 |
+
grasshopper
|
| 1938 |
+
grassland
|
| 1939 |
+
grassy
|
| 1940 |
+
grater
|
| 1941 |
+
grave
|
| 1942 |
+
gravel
|
| 1943 |
+
gravestone
|
| 1944 |
+
gravy
|
| 1945 |
+
gravy boat
|
| 1946 |
+
gray
|
| 1947 |
+
graze
|
| 1948 |
+
grazing
|
| 1949 |
+
green
|
| 1950 |
+
greenery
|
| 1951 |
+
greet
|
| 1952 |
+
greeting
|
| 1953 |
+
greeting card
|
| 1954 |
+
greyhound
|
| 1955 |
+
grid
|
| 1956 |
+
griddle
|
| 1957 |
+
grill
|
| 1958 |
+
grille
|
| 1959 |
+
grilled eel
|
| 1960 |
+
grind
|
| 1961 |
+
grinder
|
| 1962 |
+
grits
|
| 1963 |
+
grocery bag
|
| 1964 |
+
grotto
|
| 1965 |
+
ground squirrel
|
| 1966 |
+
group
|
| 1967 |
+
group photo
|
| 1968 |
+
grove
|
| 1969 |
+
grow
|
| 1970 |
+
guacamole
|
| 1971 |
+
guard
|
| 1972 |
+
guard dog
|
| 1973 |
+
guest house
|
| 1974 |
+
guest room
|
| 1975 |
+
guide
|
| 1976 |
+
guinea pig
|
| 1977 |
+
guitar
|
| 1978 |
+
guitarist
|
| 1979 |
+
gulf
|
| 1980 |
+
gull
|
| 1981 |
+
gun
|
| 1982 |
+
gundam
|
| 1983 |
+
gurdwara
|
| 1984 |
+
guzheng
|
| 1985 |
+
gym
|
| 1986 |
+
gymnast
|
| 1987 |
+
habitat
|
| 1988 |
+
hacker
|
| 1989 |
+
hail
|
| 1990 |
+
hair
|
| 1991 |
+
hair color
|
| 1992 |
+
hair spray
|
| 1993 |
+
hairbrush
|
| 1994 |
+
haircut
|
| 1995 |
+
hairgrip
|
| 1996 |
+
hairnet
|
| 1997 |
+
hairpin
|
| 1998 |
+
hairstyle
|
| 1999 |
+
half
|
| 2000 |
+
hall
|
| 2001 |
+
halloween
|
| 2002 |
+
halloween costume
|
| 2003 |
+
halloween pumpkin
|
| 2004 |
+
halter top
|
| 2005 |
+
hamburg
|
| 2006 |
+
hamburger
|
| 2007 |
+
hami melon
|
| 2008 |
+
hammer
|
| 2009 |
+
hammock
|
| 2010 |
+
hamper
|
| 2011 |
+
hamster
|
| 2012 |
+
hand dryer
|
| 2013 |
+
hand glass
|
| 2014 |
+
hand towel
|
| 2015 |
+
handbag
|
| 2016 |
+
handball
|
| 2017 |
+
handcuff
|
| 2018 |
+
handgun
|
| 2019 |
+
handkerchief
|
| 2020 |
+
handle
|
| 2021 |
+
handsaw
|
| 2022 |
+
handshake
|
| 2023 |
+
handstand
|
| 2024 |
+
handwriting
|
| 2025 |
+
hanfu
|
| 2026 |
+
hang
|
| 2027 |
+
hangar
|
| 2028 |
+
hanger
|
| 2029 |
+
happiness
|
| 2030 |
+
harbor
|
| 2031 |
+
harbor seal
|
| 2032 |
+
hard rock artist
|
| 2033 |
+
hardback book
|
| 2034 |
+
safety helmet
|
| 2035 |
+
hardware
|
| 2036 |
+
hardware store
|
| 2037 |
+
hardwood
|
| 2038 |
+
hardwood floor
|
| 2039 |
+
mouth organ
|
| 2040 |
+
pipe organ
|
| 2041 |
+
harpsichord
|
| 2042 |
+
harvest
|
| 2043 |
+
harvester
|
| 2044 |
+
hassock
|
| 2045 |
+
hat
|
| 2046 |
+
hatbox
|
| 2047 |
+
hautboy
|
| 2048 |
+
hawthorn
|
| 2049 |
+
hay
|
| 2050 |
+
hayfield
|
| 2051 |
+
hazelnut
|
| 2052 |
+
head
|
| 2053 |
+
head coach
|
| 2054 |
+
headlight
|
| 2055 |
+
headboard
|
| 2056 |
+
headdress
|
| 2057 |
+
headland
|
| 2058 |
+
headquarter
|
| 2059 |
+
hearing
|
| 2060 |
+
heart
|
| 2061 |
+
heart shape
|
| 2062 |
+
heat
|
| 2063 |
+
heater
|
| 2064 |
+
heather
|
| 2065 |
+
hedge
|
| 2066 |
+
hedgehog
|
| 2067 |
+
heel
|
| 2068 |
+
helicopter
|
| 2069 |
+
heliport
|
| 2070 |
+
helmet
|
| 2071 |
+
help
|
| 2072 |
+
hen
|
| 2073 |
+
henna
|
| 2074 |
+
herb
|
| 2075 |
+
herd
|
| 2076 |
+
hermit crab
|
| 2077 |
+
hero
|
| 2078 |
+
heron
|
| 2079 |
+
hibiscus
|
| 2080 |
+
hibiscus flower
|
| 2081 |
+
hide
|
| 2082 |
+
high bar
|
| 2083 |
+
high heel
|
| 2084 |
+
highland
|
| 2085 |
+
highlight
|
| 2086 |
+
hike
|
| 2087 |
+
hiker
|
| 2088 |
+
hiking boot
|
| 2089 |
+
hiking equipment
|
| 2090 |
+
hill
|
| 2091 |
+
hill country
|
| 2092 |
+
hill station
|
| 2093 |
+
hillside
|
| 2094 |
+
hindu temple
|
| 2095 |
+
hinge
|
| 2096 |
+
hip
|
| 2097 |
+
hip hop artist
|
| 2098 |
+
hippo
|
| 2099 |
+
historian
|
| 2100 |
+
historic
|
| 2101 |
+
history
|
| 2102 |
+
hockey
|
| 2103 |
+
hockey arena
|
| 2104 |
+
hockey game
|
| 2105 |
+
hockey player
|
| 2106 |
+
hockey stick
|
| 2107 |
+
hoe
|
| 2108 |
+
hole
|
| 2109 |
+
vacation
|
| 2110 |
+
holly
|
| 2111 |
+
holothurian
|
| 2112 |
+
home
|
| 2113 |
+
home appliance
|
| 2114 |
+
home base
|
| 2115 |
+
home decor
|
| 2116 |
+
home interior
|
| 2117 |
+
home office
|
| 2118 |
+
home theater
|
| 2119 |
+
homework
|
| 2120 |
+
hummus
|
| 2121 |
+
honey
|
| 2122 |
+
beehive
|
| 2123 |
+
honeymoon
|
| 2124 |
+
hood
|
| 2125 |
+
hoodie
|
| 2126 |
+
hook
|
| 2127 |
+
jump
|
| 2128 |
+
horizon
|
| 2129 |
+
hornbill
|
| 2130 |
+
horned cow
|
| 2131 |
+
hornet
|
| 2132 |
+
horror
|
| 2133 |
+
horror film
|
| 2134 |
+
horse blanket
|
| 2135 |
+
horse cart
|
| 2136 |
+
horse farm
|
| 2137 |
+
horse ride
|
| 2138 |
+
horseback
|
| 2139 |
+
horseshoe
|
| 2140 |
+
hose
|
| 2141 |
+
hospital
|
| 2142 |
+
hospital bed
|
| 2143 |
+
hospital room
|
| 2144 |
+
host
|
| 2145 |
+
inn
|
| 2146 |
+
hot
|
| 2147 |
+
hot air balloon
|
| 2148 |
+
hot dog
|
| 2149 |
+
hot sauce
|
| 2150 |
+
hot spring
|
| 2151 |
+
hotel
|
| 2152 |
+
hotel lobby
|
| 2153 |
+
hotel room
|
| 2154 |
+
hotplate
|
| 2155 |
+
hourglass
|
| 2156 |
+
house
|
| 2157 |
+
house exterior
|
| 2158 |
+
houseplant
|
| 2159 |
+
hoverboard
|
| 2160 |
+
howler
|
| 2161 |
+
huddle
|
| 2162 |
+
hug
|
| 2163 |
+
hula hoop
|
| 2164 |
+
person
|
| 2165 |
+
humidifier
|
| 2166 |
+
hummingbird
|
| 2167 |
+
humpback whale
|
| 2168 |
+
hunt
|
| 2169 |
+
hunting lodge
|
| 2170 |
+
hurdle
|
| 2171 |
+
hurricane
|
| 2172 |
+
husky
|
| 2173 |
+
hut
|
| 2174 |
+
hyaena
|
| 2175 |
+
hybrid
|
| 2176 |
+
hydrangea
|
| 2177 |
+
hydrant
|
| 2178 |
+
seaplane
|
| 2179 |
+
ice
|
| 2180 |
+
ice bag
|
| 2181 |
+
polar bear
|
| 2182 |
+
ice cave
|
| 2183 |
+
icecream
|
| 2184 |
+
ice cream cone
|
| 2185 |
+
ice cream parlor
|
| 2186 |
+
ice cube
|
| 2187 |
+
ice floe
|
| 2188 |
+
ice hockey player
|
| 2189 |
+
ice hockey team
|
| 2190 |
+
lollipop
|
| 2191 |
+
ice maker
|
| 2192 |
+
rink
|
| 2193 |
+
ice sculpture
|
| 2194 |
+
ice shelf
|
| 2195 |
+
skate
|
| 2196 |
+
ice skating
|
| 2197 |
+
iceberg
|
| 2198 |
+
icicle
|
| 2199 |
+
icing
|
| 2200 |
+
icon
|
| 2201 |
+
id photo
|
| 2202 |
+
identity card
|
| 2203 |
+
igloo
|
| 2204 |
+
light
|
| 2205 |
+
iguana
|
| 2206 |
+
illuminate
|
| 2207 |
+
illustration
|
| 2208 |
+
image
|
| 2209 |
+
impala
|
| 2210 |
+
incense
|
| 2211 |
+
independence day
|
| 2212 |
+
individual
|
| 2213 |
+
indoor
|
| 2214 |
+
indoor rower
|
| 2215 |
+
induction cooker
|
| 2216 |
+
industrial area
|
| 2217 |
+
industry
|
| 2218 |
+
infantry
|
| 2219 |
+
inflatable boat
|
| 2220 |
+
information desk
|
| 2221 |
+
infrastructure
|
| 2222 |
+
ingredient
|
| 2223 |
+
inhalator
|
| 2224 |
+
injection
|
| 2225 |
+
injury
|
| 2226 |
+
ink
|
| 2227 |
+
inking pad
|
| 2228 |
+
inlet
|
| 2229 |
+
inscription
|
| 2230 |
+
insect
|
| 2231 |
+
install
|
| 2232 |
+
instrument
|
| 2233 |
+
insulated cup
|
| 2234 |
+
interaction
|
| 2235 |
+
interior design
|
| 2236 |
+
website
|
| 2237 |
+
intersection
|
| 2238 |
+
interview
|
| 2239 |
+
invertebrate
|
| 2240 |
+
invitation
|
| 2241 |
+
ipad
|
| 2242 |
+
iphone
|
| 2243 |
+
ipod
|
| 2244 |
+
iris
|
| 2245 |
+
iron
|
| 2246 |
+
ironing board
|
| 2247 |
+
irrigation system
|
| 2248 |
+
island
|
| 2249 |
+
islet
|
| 2250 |
+
isopod
|
| 2251 |
+
ivory
|
| 2252 |
+
ivy
|
| 2253 |
+
izakaya
|
| 2254 |
+
jack
|
| 2255 |
+
jackcrab
|
| 2256 |
+
jacket
|
| 2257 |
+
jacuzzi
|
| 2258 |
+
jade
|
| 2259 |
+
jaguar
|
| 2260 |
+
jail cell
|
| 2261 |
+
jam
|
| 2262 |
+
japanese garden
|
| 2263 |
+
jasmine
|
| 2264 |
+
jaw
|
| 2265 |
+
jay
|
| 2266 |
+
jazz
|
| 2267 |
+
jazz artist
|
| 2268 |
+
jazz fusion artist
|
| 2269 |
+
jeans
|
| 2270 |
+
jeep
|
| 2271 |
+
jelly
|
| 2272 |
+
jelly bean
|
| 2273 |
+
jellyfish
|
| 2274 |
+
jet
|
| 2275 |
+
motorboat
|
| 2276 |
+
jewel
|
| 2277 |
+
jewellery
|
| 2278 |
+
jewelry shop
|
| 2279 |
+
jigsaw puzzle
|
| 2280 |
+
rickshaw
|
| 2281 |
+
jockey
|
| 2282 |
+
jockey cap
|
| 2283 |
+
jog
|
| 2284 |
+
joint
|
| 2285 |
+
journalist
|
| 2286 |
+
joystick
|
| 2287 |
+
judge
|
| 2288 |
+
jug
|
| 2289 |
+
juggle
|
| 2290 |
+
juice
|
| 2291 |
+
juicer
|
| 2292 |
+
jujube
|
| 2293 |
+
jump rope
|
| 2294 |
+
jumpsuit
|
| 2295 |
+
jungle
|
| 2296 |
+
junkyard
|
| 2297 |
+
kale
|
| 2298 |
+
kaleidoscope
|
| 2299 |
+
kangaroo
|
| 2300 |
+
karaoke
|
| 2301 |
+
karate
|
| 2302 |
+
karting
|
| 2303 |
+
kasbah
|
| 2304 |
+
kayak
|
| 2305 |
+
kebab
|
| 2306 |
+
key
|
| 2307 |
+
keycard
|
| 2308 |
+
khaki
|
| 2309 |
+
kick
|
| 2310 |
+
kilt
|
| 2311 |
+
kimono
|
| 2312 |
+
kindergarden classroom
|
| 2313 |
+
kindergarten
|
| 2314 |
+
king
|
| 2315 |
+
king crab
|
| 2316 |
+
kiss
|
| 2317 |
+
kit
|
| 2318 |
+
kitchen
|
| 2319 |
+
kitchen cabinet
|
| 2320 |
+
kitchen counter
|
| 2321 |
+
kitchen floor
|
| 2322 |
+
kitchen hood
|
| 2323 |
+
kitchen island
|
| 2324 |
+
kitchen sink
|
| 2325 |
+
kitchen table
|
| 2326 |
+
kitchen utensil
|
| 2327 |
+
kitchen window
|
| 2328 |
+
kitchenware
|
| 2329 |
+
kite
|
| 2330 |
+
kiwi
|
| 2331 |
+
knee pad
|
| 2332 |
+
kneel
|
| 2333 |
+
knife
|
| 2334 |
+
rider
|
| 2335 |
+
knit
|
| 2336 |
+
knitting needle
|
| 2337 |
+
knob
|
| 2338 |
+
knocker
|
| 2339 |
+
knot
|
| 2340 |
+
koala
|
| 2341 |
+
koi
|
| 2342 |
+
ktv
|
| 2343 |
+
laboratory
|
| 2344 |
+
lab coat
|
| 2345 |
+
label
|
| 2346 |
+
labrador
|
| 2347 |
+
maze
|
| 2348 |
+
lace
|
| 2349 |
+
lace dress
|
| 2350 |
+
ladder
|
| 2351 |
+
ladle
|
| 2352 |
+
ladybird
|
| 2353 |
+
lagoon
|
| 2354 |
+
lake
|
| 2355 |
+
lake district
|
| 2356 |
+
lake house
|
| 2357 |
+
lakeshore
|
| 2358 |
+
lamb
|
| 2359 |
+
lamb chop
|
| 2360 |
+
lamp post
|
| 2361 |
+
lamp shade
|
| 2362 |
+
spear
|
| 2363 |
+
land
|
| 2364 |
+
land vehicle
|
| 2365 |
+
landfill
|
| 2366 |
+
landing
|
| 2367 |
+
landing deck
|
| 2368 |
+
landmark
|
| 2369 |
+
landscape
|
| 2370 |
+
landslide
|
| 2371 |
+
lanyard
|
| 2372 |
+
lantern
|
| 2373 |
+
lap
|
| 2374 |
+
laptop
|
| 2375 |
+
laptop keyboard
|
| 2376 |
+
larva
|
| 2377 |
+
lasagne
|
| 2378 |
+
laser
|
| 2379 |
+
lash
|
| 2380 |
+
lasso
|
| 2381 |
+
latch
|
| 2382 |
+
latex
|
| 2383 |
+
latte
|
| 2384 |
+
laugh
|
| 2385 |
+
launch
|
| 2386 |
+
launch event
|
| 2387 |
+
launch party
|
| 2388 |
+
laundromat
|
| 2389 |
+
laundry
|
| 2390 |
+
laundry basket
|
| 2391 |
+
laundry room
|
| 2392 |
+
lava
|
| 2393 |
+lavender
+lawn
+lawn wedding
+lawyer
+lay
+lead
+lead singer
+lead to
+leader
+leak
+lean
+learn
+leash
+leather
+leather jacket
+leather shoe
+speech
+lecture hall
+lecture room
+ledge
+leftover
+leg
+legend
+legging
+legislative chamber
+lego
+legume
+lemon
+lemon juice
+lemonade
+lemur
+lens
+lens flare
+lentil
+leopard
+leotard
+tights
+leprechaun
+lesson
+letter
+mailbox
+letter logo
+lettering
+lettuce
+level
+library
+license
+license plate
+lichen
+lick
+lid
+lie
+life belt
+life jacket
+lifeboat
+lifeguard
+lift
+light fixture
+light show
+light switch
+lighting
+lightning
+lightning rod
+lilac
+lily
+limb
+lime
+limestone
+limo
+line
+line art
+line up
+linen
+liner
+lion
+lip balm
+lipstick
+liquid
+liquor store
+list
+litchi
+live
+livestock
+living room
+living space
+lizard
+load
+loading dock
+loafer
+hallway
+locate
+lock
+lock chamber
+locker
+loft
+log
+log cabin
+logo
+loki
+long hair
+longboard
+loom
+loop
+lose
+lottery
+lotus
+love
+loveseat
+luggage
+lumber
+lumberjack
+lunch
+lunch box
+lush
+luxury
+luxury yacht
+mac
+macadamia
+macaque
+macaroni
+macaw
+machete
+machine
+machine gun
+magazine
+magic
+magician
+magnet
+magnifying glass
+magnolia
+magpie
+mahjong
+mahout
+maid
+chain mail
+mail slot
+make
+makeover
+makeup artist
+makeup tool
+mallard
+mallard duck
+mallet
+mammal
+mammoth
+man
+management
+manager
+manatee
+mandala
+mandarin orange
+mandarine
+mane
+manga
+manger
+mango
+mangosteen
+mangrove
+manhattan
+manhole
+manhole cover
+manicure
+mannequin
+manor house
+mansion
+mantid
+mantle
+manufactured home
+manufacturing
+manuscript
+map
+maple
+maple leaf
+maple syrup
+maraca
+marathon
+marble
+march
+marching band
+mare
+marigold
+marine
+marine invertebrate
+marine mammal
+puppet
+mark
+market
+market square
+market stall
+marriage
+martial
+martial artist
+martial arts gym
+martini
+martini glass
+mascara
+mascot
+mashed potato
+masher
+mask
+massage
+mast
+mat
+matador
+match
+matchbox
+material
+mattress
+mausoleum
+maxi dress
+meal
+measuring cup
+measuring tape
+meat
+meatball
+mechanic
+mechanical fan
+medal
+media
+medical equipment
+medical image
+medical staff
+medicine cabinet
+medieval
+medina
+meditation
+meerkat
+meet
+melon
+monument
+menu
+mermaid
+net
+mess
+messenger bag
+metal
+metal artist
+metal detector
+meter
+mezzanine
+microphone
+microscope
+microwave
+midnight
+milestone
+military uniform
+milk
+milk can
+milk tea
+milkshake
+mill
+mine
+miner
+mineral
+mineral water
+miniskirt
+miniature
+minibus
+minister
+minivan
+mint
+mint candy
+mirror
+miss
+missile
+mission
+mistletoe
+mix
+mixer
+mixing bowl
+mixture
+moat
+mobility scooter
+model
+model car
+modern
+modern tower
+moisture
+mold
+molding
+mole
+monarch
+money
+monitor
+monk
+monkey
+monkey wrench
+monochrome
+monocycle
+monster truck
+moon
+moon cake
+moonlight
+moor
+moose
+swab
+moped
+morning
+morning fog
+morning light
+morning sun
+mortar
+mosaic
+mosque
+mosquito
+moss
+motel
+moth
+mother
+motherboard
+motif
+sport
+motor
+motorbike
+motorcycle
+motorcycle helmet
+motorcycle racer
+motorcyclist
+motorsport
+mound
+mountain
+mountain bike
+mountain biker
+mountain biking
+mountain gorilla
+mountain lake
+mountain landscape
+mountain pass
+mountain path
+mountain range
+mountain river
+mountain snowy
+mountain stream
+mountain view
+mountain village
+mountaineer
+mountaineering bag
+mouse
+mousepad
+mousetrap
+mouth
+mouthwash
+move
+movie poster
+movie ticket
+mower
+mp3 player
+mr
+mud
+muffin
+mug
+mulberry
+mulch
+mule
+municipality
+mural
+muscle
+muscle car
+museum
+mushroom
+music
+music festival
+music stool
+music studio
+music video performer
+musical keyboard
+musician
+mussel
+mustard
+mythology
+nacho
+nail polish
+nailfile
+nanny
+napkin
+narrow
+national flag
+nativity scene
+natural history museum
+nature
+nature reserve
+navigation
+navratri
+navy
+nebula
+neck
+neckband
+necklace
+neckline
+nectar
+nectarine
+needle
+neighbor
+neighbourhood
+neon
+neon light
+nerve
+nest
+new year
+newborn
+newfoundland
+newlywed
+news
+news conference
+newsstand
+night
+night market
+night sky
+night view
+nightclub
+nightstand
+noodle
+nose
+noseband
+note
+notebook
+notepad
+notepaper
+notice
+number icon
+nun
+nurse
+nursery
+nursing home
+nut
+nutcracker
+oak
+oak tree
+oar
+oasis
+oast house
+oatmeal
+oats
+obelisk
+observation tower
+observatory
+obstacle course
+sea
+octopus
+offer
+office
+office building
+office chair
+office cubicle
+office desk
+office supply
+office window
+officer
+official
+oil
+oil lamp
+oil painting
+oilrig
+okra
+old photo
+olive
+olive oil
+olive tree
+omelet
+onion
+onion ring
+opal
+open
+opening
+opening ceremony
+opera
+opera house
+operate
+operating room
+operation
+optical shop
+orangutan
+orange
+orange juice
+orange tree
+orangery
+orbit
+orchard
+orchestra pit
+orchid
+order
+organization
+origami
+ornament
+osprey
+ostrich
+otter
+out
+outcrop
+outdoor
+outhouse
+electric outlet
+outline
+oval
+oven
+overall
+overcoat
+overpass
+owl
+oyster
+teething ring
+pack
+package
+paddock
+police van
+padlock
+paella
+pagoda
+pain
+paint brush
+painter
+paisley bandanna
+palace
+palette
+paling
+pall
+palm tree
+pan
+pancake
+panda
+panel
+panorama
+pansy
+pant
+pantry
+pants
+pantyhose
+papaya
+paper
+paper bag
+paper cutter
+paper lantern
+paper plate
+paper towel
+paperback book
+paperweight
+parachute
+parade
+paradise
+parrot
+paramedic
+paraquet
+parasail
+paratrooper
+parchment
+parish
+park
+park bench
+parking
+parking garage
+parking meter
+parking sign
+parliament
+parsley
+participant
+partner
+partridge
+party
+party hat
+pass
+passage
+passbook
+passenger
+passenger ship
+passenger train
+passion fruit
+passport
+pasta
+paste
+pastry
+pasture
+patch
+patient
+pattern
+pavement
+pavilion
+paw
+pay
+payphone
+pea
+peace
+peach
+peacock
+peak
+peanut
+peanut butter
+pear
+pearl
+pebble
+pecan
+pedestrian
+pedestrian bridge
+pedestrian street
+peel
+peeler
+pegboard
+pegleg
+pelican
+pen
+penalty kick
+pencil
+pencil case
+pencil sharpener
+pencil skirt
+pendant
+pendulum
+penguin
+peninsula
+pennant
+penny
+piggy bank
+peony
+pepper
+pepper grinder
+peppercorn
+pepperoni
+perch
+perform
+performance
+performance arena
+perfume
+pergola
+persian cat
+persimmon
+personal care
+personal flotation device
+pest
+pet
+pet shop
+pet store
+petal
+petunia
+church bench
+pheasant
+phenomenon
+philosopher
+phone
+phonebook
+record player
+photo
+photo booth
+photo frame
+photography
+physicist
+physics laboratory
+pianist
+piano
+plectrum
+pick up
+pickle
+picnic
+picnic area
+picnic basket
+picnic table
+picture
+picture frame
+pie
+pigeon
+pilgrim
+tablet
+pillow
+pilot
+pilot boat
+pin
+pine
+pine cone
+pine forest
+pine nut
+pineapple
+table tennis table
+table tennis
+pink
+pint
+pipa
+pipe
+pipe bowl
+pirate
+pirate flag
+pirate ship
+pistachio
+ski slope
+pocket bread
+pitaya
+pitbull
+pitch
+pitcher
+pitcher plant
+pitchfork
+pizza
+pizza cutter
+pizza pan
+pizzeria
+placard
+place
+place mat
+plaid
+plain
+plan
+planet
+planet earth
+plank
+plant
+plantation
+planting
+plaque
+plaster
+plastic
+plasticine
+plateau
+platform
+platinum
+platter
+play
+play badminton
+play baseball
+play basketball
+play billiard
+play football
+play pong
+play tennis
+play volleyball
+player
+playground
+playhouse
+playing card
+playing chess
+playing golf
+playing mahjong
+playingfield
+playpen
+playroom
+plaza
+plier
+plot
+plow
+plug
+plug hat
+plum
+plumber
+plumbing fixture
+plume
+plywood
+pocket
+pocket watch
+pocketknife
+pod
+podium
+poetry
+poinsettia
+point
+pointer
+poker card
+poker chip
+poker table
+pole
+polecat
+police
+police car
+police dog
+police station
+politician
+polka dot
+pollen
+pollution
+polo
+polo neck
+polo shirt
+pomegranate
+pomeranian
+poncho
+pond
+ponytail
+poodle
+pool
+pop
+pop artist
+popcorn
+pope
+poppy
+porcelain
+porch
+pork
+porridge
+portable battery
+portal
+portfolio
+porthole
+portrait
+portrait session
+pose
+possum
+post
+post office
+stamp
+postcard
+poster
+poster page
+pot
+potato
+potato chip
+potato salad
+potholder
+potty
+pouch
+poultry
+pound
+pour
+powder
+power line
+power plugs and sockets
+power see
+power station
+practice
+Prague Castle
+prayer
+preacher
+premiere
+prescription
+show
+presentation
+president
+press room
+pressure cooker
+pretzel
+prince
+princess
+print
+printed page
+printer
+printing
+prison
+produce
+product
+profession
+professional
+professor
+project picture
+projection screen
+projector
+prom
+promenade
+propeller
+prophet
+proposal
+protective suit
+protest
+protester
+publication
+publicity portrait
+ice hockey
+pudding
+puddle
+puff
+puffin
+pug
+pull
+pulpit
+pulse
+pump
+pumpkin
+pumpkin pie
+pumpkin seed
+punch bag
+punch
+student
+purple
+push
+putt
+puzzle
+tower
+pyramid
+python
+qr code
+quail
+quarry
+quarter
+quartz
+queen
+quesadilla
+queue
+quiche
+quilt
+quilting
+quote
+rabbit
+raccoon
+race
+race track
+raceway
+race car
+racket
+radar
+radiator
+radio
+raft
+rag doll
+rail
+railcar
+railroad
+railroad bridge
+railway line
+railway station
+rain
+rain boot
+rainbow
+rainbow trout
+raincoat
+rainforest
+rainy
+raisin
+rake
+ram
+ramp
+rapeseed
+rapid
+rapper
+raspberry
+rat
+ratchet
+raven
+ravine
+ray
+razor
+razor blade
+read
+reading
+reamer
+rear
+rear light
+rear view
+rearview mirror
+receipt
+receive
+reception
+recipe
+record
+record producer
+recorder
+recording studio
+recreation room
+recreational vehicle
+rectangle
+recycling
+recycling bin
+red
+red carpet
+red flag
+red panda
+red wine
+redwood
+reed
+reef
+reel
+referee
+reflect
+reflection
+reflector
+register
+rein
+reindeer
+relax
+release
+relief
+religion
+religious
+relish
+remain
+remodel
+remote
+remove
+repair
+repair shop
+reptile
+rescue
+rescuer
+research
+researcher
+reservoir
+residence
+residential neighborhood
+resin
+resort
+resort town
+restaurant kitchen
+restaurant patio
+restroom
+retail
+retriever
+retro
+reveal
+rhinoceros
+rhododendron
+rib
+ribbon
+rice
+rice cooker
+rice field
+ride
+ridge
+riding
+rifle
+rim
+ring
+riot
+ripple
+rise
+rise building
+river
+river bank
+river boat
+river valley
+riverbed
+road
+road sign
+road trip
+roadside
+roast chicken
+robe
+robin
+robot
+stone
+rock arch
+rock artist
+rock band
+rock climber
+rock climbing
+rock concert
+rock face
+rock formation
+rocker
+rocket
+rocking chair
+rocky
+rodent
+rodeo
+rodeo arena
+roe
+roe deer
+roller
+coaster
+roller skate
+roller skates
+rolling pin
+romance
+romantic
+roof
+roof garden
+room
+room divider
+root
+root beer
+rope bridge
+rosary
+rose
+rosemary
+rosy cloud
+rottweiler
+round table
+router
+row
+rowan
+royal
+rubber stamp
+rubble
+rubik's cube
+ruby
+ruffle
+rugby
+rugby ball
+rugby player
+ruins
+ruler
+rum
+run
+runner
+running shoe
+rural
+rust
+rustic
+rye
+sack
+saddle
+saddlebag
+safari
+safe
+safety vest
+sage
+sail
+sailboat
+sailing
+sailor
+squirrel monkey
+sake
+salad
+salad bowl
+salamander
+salami
+sale
+salmon
+salon
+salsa
+salt
+salt and pepper shakers
+salt lake
+salt marsh
+salt shaker
+salute
+samoyed
+samurai
+sand
+sand bar
+sand box
+sand castle
+sand sculpture
+sandal
+sandwich
+sanitary napkin
+santa claus
+sapphire
+sardine
+sari
+sashimi
+satay
+satchel
+satellite
+satin
+sauce
+saucer
+sauna
+sausage
+savanna
+saw
+sawbuck
+sax
+saxophonist
+scaffold
+scale
+scale model
+scallop
+scar
+strawman
+scarf
+scene
+scenery
+schnauzer
+school
+school bus
+school uniform
+schoolhouse
+schooner
+science
+science fiction film
+science museum
+scientist
+scissors
+wall lamp
+scone
+scoop
+scooter
+score
+scoreboard
+scorpion
+scout
+scrambled egg
+scrap
+scraper
+scratch
+screen
+screen door
+screenshot
+screw
+screwdriver
+scroll
+scrub
+scrubbing brush
+sculptor
+sculpture
+sea cave
+sea ice
+sea lion
+sea turtle
+sea urchin
+seabass
+seabed
+seabird
+seafood
+seahorse
+seal
+sea view
+seashell
+seaside resort
+season
+seat
+seat belt
+seaweed
+secretary
+security
+sedan
+see
+seed
+seesaw
+segway
+selfie
+sell
+seminar
+sense
+sensor
+server
+server room
+service
+set
+sewing machine
+shadow
+shake
+shaker
+shampoo
+shape
+share
+shark
+sharpener
+sharpie
+shaver
+shaving cream
+shawl
+shear
+shears
+sheep
+sheet
+sheet music
+shelf
+shell
+shellfish
+shelter
+shelve
+shepherd
+sherbert
+shiba inu
+shine
+shipping
+shipping container
+shipwreck
+shipyard
+shirt
+shirtless
+shoal
+shoe
+shoe box
+shoe shop
+shoe tree
+shoot
+shooting basketball guard
+shop window
+shopfront
+shopper
+shopping
+shopping bag
+shopping basket
+shopping cart
+mall
+shopping street
+shore
+shoreline
+short
+short hair
+shorts
+shot glass
+shotgun
+shoulder
+shoulder bag
+shovel
+showcase
+shower
+shower cap
+shower curtain
+shower door
+shower head
+shredder
+shrew
+shrimp
+shrine
+shrub
+shutter
+siamese
+siberia
+sibling
+side
+side cabinet
+side dish
+sidecar
+sideline
+siding
+sign
+signage
+signal
+signature
+silk
+silk stocking
+silo
+silver
+silver medal
+silverware
+sing
+singe
+singer
+sink
+sip
+sit
+sitting
+skate park
+skateboard
+skateboarder
+skater
+skating rink
+skeleton
+sketch
+skewer
+ski
+ski boot
+ski equipment
+ski jacket
+ski lift
+ski pole
+ski resort
+snowboard
+skier
+skiing shoes
+skin
+skull
+skullcap
+sky
+sky tower
+skylight
+skyline
+skyscraper
+slalom
+slate
+sleigh
+sleep
+sleeping bag
+sleepwear
+sleeve
+slice
+slide
+slider
+sling
+slope
+slot
+slot machine
+sloth
+slow cooker
+slug
+slum
+smell
+smile
+smoke
+snack
+snail
+snake
+snapper
+snapshot
+snorkel
+snout
+snow
+snow leopard
+snow mountain
+snowball
+snowboarder
+snowfield
+snowflake
+snowman
+snowmobile
+snowplow
+snowshoe
+snowy
+soap
+soap bubble
+soap dispenser
+soccer goalkeeper
+socialite
+sock
+socket
+soda
+softball
+software
+solar battery
+soldier
+solo
+solution
+sombrero
+song
+sound
+soup
+soup bowl
+soupspoon
+sour cream
+souvenir
+soybean milk
+spa
+space
+space shuttle
+space station
+spacecraft
+spaghetti
+span
+wrench
+spark
+sparkle
+sparkler
+sparkling wine
+sparrow
+spatula
+speaker
+spectator
+speech bubble
+speed limit
+speed limit sign
+speedboat
+speedometer
+sphere
+spice
+spice rack
+spider
+spider web
+spike
+spin
+spinach
+spire
+splash
+sponge
+spoon
+sport association
+sport equipment
+sport team
+sports ball
+sports equipment
+sports meet
+sportswear
+dot
+spray
+spread
+spring
+spring roll
+sprinkle
+sprinkler
+sprout
+spruce
+spruce forest
+squad
+square
+squash
+squat
+squeeze
+squid
+squirrel
+water gun
+stab
+stable
+stack
+stadium
+staff
+stage
+stage light
+stagecoach
+stain
+stainless steel
+stair
+stairs
+stairwell
+stall
+stallion
+stand
+standing
+staple
+stapler
+star
+stare
+starfish
+starfruit
+starling
+state park
+state school
+station
+stationary bicycle
+stationery
+statue
+steak
+steak knife
+steam
+steam engine
+steam locomotive
+steam train
+steamed bread
+steel
+steering wheel
+stem
+stencil
+step stool
+stereo
+stethoscope
+stew
+stick
+stick insect
+sticker
+still life
+stilt
+stingray
+stir
+stirrer
+stirrup
+sew
+stock
+stocking
+stomach
+stone building
+stone carving
+stone house
+stone mill
+stool
+stop
+stop at
+stop light
+stop sign
+stop watch
+traffic light
+storage box
+storage room
+tank
+store
+storefront
+stork
+storm
+storm cloud
+stormy
+stove
+poker
+straddle
+strainer
+strait
+strap
+straw
+straw hat
+strawberry
+stream
+street art
+street artist
+street corner
+street dog
+street food
+street light
+street market
+street photography
+street scene
+street sign
+street vendor
+stretch
+stretcher
+strike
+striker
+string
+string cheese
+strip
+stripe
+stroll
+structure
+studio
+studio shot
+stuff
+stuffed animal
+stuffed toy
+stuffing
+stump
+stunning
+stunt
+stupa
+style
+stylus
+submarine
+submarine sandwich
+submarine water
+suburb
+subway
+subway station
+subwoofer
+succulent
+suede
+sugar
+sugar bowl
+sugar cane
+sugar cube
+suit
+suite
+summer
+summer evening
+summit
+sun
+sun hat
+sunbathe
+sunday
+sundial
+sunflower
+sunflower field
+sunflower seed
+sunglasses
+sunny
+sunrise
+sunset
+sunshade
+sunshine
+super bowl
+sports car
+superhero
+supermarket
+supermarket shelf
+supermodel
+supporter
+surf
+surface
+surfboard
+surfer
+surgeon
+surgery
+surround
+sushi
+sushi bar
+suspenders
+suspension
+suspension bridge
+suv
+swallow
+swallowtail butterfly
+swamp
+swan
+swan boat
+sweat pant
+sweatband
+sweater
+sweatshirt
+sweet
+sweet potato
+swim
+swim cap
+swimmer
+swimming hole
+swimming pool
+swing
+swing bridge
+swinge
+swirl
+switch
+swivel chair
+sword
+swordfish
+symbol
+symmetry
+synagogue
+syringe
+syrup
+system
+t shirt
+t-shirt
+tabasco sauce
+tabby
+table tennis racket
+table top
+tablecloth
+tablet computer
+tableware
+tachometer
+tackle
+taco
+tae kwon do
+tai chi
+tail
+tailor
+take
+takeoff
+talk
+tambourine
+tan
+tangerine
+tape
+tapestry
+tarmac
+taro
+tarp
+tart
+tassel
+taste
+tatami
+tattoo
+tattoo artist
+tavern
+tea
+tea bag
+tea party
+tea plantation
+tea pot
+tea set
+teach
+teacher
+teacup
+teal
+team photo
+team presentation
+tear
+technician
+technology
+teddy
+tee
+teenager
+telegraph pole
+zoom lens
+telescope
+television
+television camera
+television room
+television studio
+temperature
+temple
+tempura
+tennis
+tennis court
+tennis match
+tennis net
+tennis player
+tennis racket
+tent
+tequila
+terminal
+terrace
+terrain
+terrarium
+territory
+test
+test match
+test tube
+text
+text message
+textile
+texture
+thanksgiving
+thanksgiving dinner
+theater
+theatre actor
+therapy
+thermometer
+thermos
+thermos bottle
+thermostat
+thicket
+thimble
+thing
+thinking
+thistle
+throne
+throne room
+throw
+throw pillow
+thunder
+thunderstorm
+thyme
+tiara
+tick
+ticket
+ticket booth
+tide pool
+tie
+tiger
+tight
+tile
+tile flooring
+tile roof
+tile wall
+tin
+tinfoil
+tinsel
+tiramisu
+tire
+tissue
+toast
+toaster
+tobacco
+tobacco pipe
+toddler
+toe
+tofu
+toilet bowl
+toilet seat
+toiletry
+tokyo tower
+tomato
+tomato sauce
+tomato soup
+tomb
+tong
+tongs
+tool
+toolbox
+toothbrush
+toothpaste
+toothpick
+topiary garden
+topping
+torch
+tornado
+tortilla
+tortoise
+tote bag
+totem pole
+totoro
+toucan
+touch
+touchdown
+tour
+tour bus
+tour guide
+tourist
+tourist attraction
+tournament
+tow truck
+towel
+towel bar
+tower block
+tower bridge
+town
+town square
+toy
+toy car
+toy gun
+toyshop
+track
+tractor
+trade
+tradition
+traditional
+traffic
+traffic cone
+traffic congestion
+traffic jam
+traffic sign
+trail
+trailer
+trailer truck
+train
+train bridge
+train car
+train interior
+train track
+train window
+trainer
+training
+training bench
+training ground
+trolley
+trampoline
+transformer
+transparency
+travel
+tray
+treadmill
+treat
+tree
+tree branch
+tree farm
+tree frog
+tree house
+tree root
+tree trunk
+trial
+triangle
+triathlon
+tribe
+tributary
+trick
+tricycle
+trim
+trio
+tripod
+trombone
+troop
+trophy
+trophy cup
+tropic
+trout
+truck
+truck driver
+tub
+tube
+tugboat
+tulip
+tuna
+tundra
+tunnel
+turbine
+turkey
+turn
+turnip
+turquoise
+turret
+turtle
+tusk
+tv actor
+tv cabinet
+tv drama
+tv genre
+tv personality
+tv show
+tv sitcom
+tv tower
+twig
+twilight
+twin
+twine
+twist
+type
+type on
+typewriter
+ukulele
+ultraman
+umbrella
+underclothes
+underwater
+unicorn
+uniform
+universe
+university
+up
+urban
+urinal
+urn
+use
+utensil
+utility room
+vacuum
+valley
+valve
+vampire
+van
+vanilla
+vanity
+variety
+vase
+vault
+vector cartoon illustration
+vector icon
+vegetable
+vegetable garden
+vegetable market
+vegetation
+vehicle
+veil
+vein
+velvet
+vending machine
+vendor
+vent
+vespa
+vessel
+vest
+vet
+veteran
+veterinarians office
+viaduct
+video
+video camera
+video game
+videotape
+view mirror
+vigil
+villa
+village
+vine
+vinegar
+vineyard
+violence
+violet
+violin
+violinist
+violist
+vision
+visor
+vodka
+volcano
+volleyball
+volleyball court
+volleyball player
+volunteer
+voyage
+vulture
+waffle
+waffle iron
+wagon
+wagon wheel
+waist
+waiter
+waiting hall
+waiting room
+walk
+walking
+walking cane
+wall clock
+wallpaper
+walnut
+walrus
+war
+warehouse
+warm
+warning sign
+warrior
+warship
+warthog
+wash
+washer
+washing
+washing machine
+wasp
+waste
+waste container
+watch
+water
+water bird
+water buffalo
+water cooler
+water drop
+water feature
+water heater
+water level
+water lily
+water park
+water pipe
+water purifier
+water ski
+water sport
+water surface
+water tower
+watercolor
+watercolor illustration
+watercolor painting
+waterfall
+watering can
+watermark overlay stamp
+watermelon
+waterproof jacket
+waterway
+wave
+wax
+weapon
+wear
+weather
+vane
+web
+webcam
+wedding
+wedding ring
+wedding bouquet
+wedding cake
+wedding couple
+wedding invitation
+wedding party
+wedding photo
+wedding photographer
+wedding photography
+wedding reception
+wedge
+weed
+weight
+weight scale
+welder
+well
+western food
+western restaurant
+wet
+wet bar
+wet suit
+wetland
+wetsuit
+whale
+whale shark
+wheat
+wheat field
+wheel
+wheelchair
+wheelie
+whipped cream
+whisk
+whisker
+whiskey
+whistle
+white
+white house
+white wine
+whiteboard
+wicket
+wide
+wield
+wig
+Wii
+Wii controller
+wild
+wildebeest
+wildfire
+wildflower
+wildlife
+willow
+wind
+wind chime
+wind farm
+wind turbine
+windmill
+window
+window box
+window display
+window frame
+window screen
+window seat
+window sill
+wiper
+windshield
+windy
+wine bottle
+wine cooler
+wine cabinet
+wine cellar
+wine glass
+wine rack
+wine tasting
+winery
+wing
+winter
+winter melon
+winter morning
+winter scene
+winter sport
+winter storm
+wire
+wisteria
+witch
+witch hat
+wok
+wolf
+woman
+wood
+wood duck
+wood floor
+wood wall
+wood-burning stove
+wooden spoon
+woodland
+woodpecker
+woodworking plane
+wool
+job
+work card
+workbench
+worker
+workplace
+workshop
+world
+worm
+worship
+wound
+wrap
+wrap dress
+wrapping paper
+wrestle
+wrestler
+wrinkle
+wristband
+write
+writer
+writing
+writing brush
+writing desk
+yacht
+yak
+yard
+yellow
+yoga
+yoga mat
+yoghurt
+yoke
+yolk
+youth
+youth hostel
+yurt
+zebra
+zebra crossing
+zen garden
+zip
+zipper
+zombie
+zongzi
+zoo
ram/data/ram_tag_list_chinese.txt
ADDED
@@ -0,0 +1,4585 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
ram/data/ram_tag_list_chinese.txt ADDED

One Chinese tag per line, index-aligned with the English tags in ram/data/ram_tag_list.txt (the English list is alphabetical, and the Chinese entries follow the same order). The file opens as follows (English glosses added in parentheses; a loading sketch follows the excerpt):

+三维CG渲染 (3D CG rendering)
+3d眼镜 (3d glasses)
+算盘 (abacus)
+鲍鱼 (abalone)
+修道院 (abbey)
+肚子 (belly/abdomen)
+学院 (academy)
+附件 (accessory)
+事故 (accident)
+手风琴 (accordion)

…and continues in the same one-tag-per-line format; entries through 2753 (音乐, "music") appear before this diff view truncates mid-file at entry 2754.
+
音乐节
|
| 2755 |
+
音乐凳子
|
| 2756 |
+
音乐工作室
|
| 2757 |
+
音乐录影带表演者
|
| 2758 |
+
音乐键盘
|
| 2759 |
+
音乐家
|
| 2760 |
+
贻贝
|
| 2761 |
+
芥末
|
| 2762 |
+
神话
|
| 2763 |
+
烤干酪辣味玉米片
|
| 2764 |
+
指甲油
|
| 2765 |
+
指甲锉
|
| 2766 |
+
保姆
|
| 2767 |
+
餐巾
|
| 2768 |
+
狭窄的
|
| 2769 |
+
国旗
|
| 2770 |
+
基督诞生的场景
|
| 2771 |
+
自然历史博物馆
|
| 2772 |
+
自然
|
| 2773 |
+
自然保护区
|
| 2774 |
+
导航
|
| 2775 |
+
九夜节
|
| 2776 |
+
海军
|
| 2777 |
+
星云
|
| 2778 |
+
脖子
|
| 2779 |
+
围颈带/领口
|
| 2780 |
+
项链
|
| 2781 |
+
领口
|
| 2782 |
+
花蜜
|
| 2783 |
+
油桃
|
| 2784 |
+
针状物
|
| 2785 |
+
邻居
|
| 2786 |
+
与某处邻近的地区
|
| 2787 |
+
霓虹灯
|
| 2788 |
+
霓虹灯
|
| 2789 |
+
神经
|
| 2790 |
+
巢
|
| 2791 |
+
新年
|
| 2792 |
+
新生的
|
| 2793 |
+
纽芬兰
|
| 2794 |
+
新婚
|
| 2795 |
+
新闻
|
| 2796 |
+
记者招待会
|
| 2797 |
+
报摊
|
| 2798 |
+
晚上
|
| 2799 |
+
夜市
|
| 2800 |
+
夜空
|
| 2801 |
+
夜景
|
| 2802 |
+
夜总会
|
| 2803 |
+
床头柜
|
| 2804 |
+
面条
|
| 2805 |
+
鼻子
|
| 2806 |
+
鼻羁
|
| 2807 |
+
注解
|
| 2808 |
+
笔记本
|
| 2809 |
+
记事本
|
| 2810 |
+
信纸
|
| 2811 |
+
公告
|
| 2812 |
+
数字图标
|
| 2813 |
+
修女
|
| 2814 |
+
护士
|
| 2815 |
+
托儿所
|
| 2816 |
+
养老院
|
| 2817 |
+
螺母
|
| 2818 |
+
胡桃夹子
|
| 2819 |
+
橡木
|
| 2820 |
+
橡树
|
| 2821 |
+
桨
|
| 2822 |
+
绿洲
|
| 2823 |
+
烘干室
|
| 2824 |
+
燕麦片
|
| 2825 |
+
燕麦
|
| 2826 |
+
方尖塔
|
| 2827 |
+
观察塔
|
| 2828 |
+
天文台
|
| 2829 |
+
超越障碍训练场
|
| 2830 |
+
海洋
|
| 2831 |
+
章鱼
|
| 2832 |
+
提供
|
| 2833 |
+
办公室
|
| 2834 |
+
办公大楼
|
| 2835 |
+
办公椅
|
| 2836 |
+
办公室隔间
|
| 2837 |
+
办公桌
|
| 2838 |
+
办公用品
|
| 2839 |
+
办公室的窗户
|
| 2840 |
+
军官
|
| 2841 |
+
行政官员
|
| 2842 |
+
石油
|
| 2843 |
+
油灯
|
| 2844 |
+
油画
|
| 2845 |
+
石油钻台
|
| 2846 |
+
秋葵
|
| 2847 |
+
老照片
|
| 2848 |
+
橄榄
|
| 2849 |
+
橄榄油
|
| 2850 |
+
橄榄树
|
| 2851 |
+
煎蛋卷
|
| 2852 |
+
洋葱
|
| 2853 |
+
洋葱圈
|
| 2854 |
+
蛋白石
|
| 2855 |
+
开阔的/张开
|
| 2856 |
+
开始
|
| 2857 |
+
开幕式
|
| 2858 |
+
歌剧
|
| 2859 |
+
歌剧院
|
| 2860 |
+
操作
|
| 2861 |
+
手术室
|
| 2862 |
+
操作
|
| 2863 |
+
眼镜店
|
| 2864 |
+
猩猩
|
| 2865 |
+
橙子/橙色
|
| 2866 |
+
橙汁
|
| 2867 |
+
橙树
|
| 2868 |
+
橘园
|
| 2869 |
+
轨道
|
| 2870 |
+
果园
|
| 2871 |
+
乐池
|
| 2872 |
+
兰花
|
| 2873 |
+
订单
|
| 2874 |
+
组织
|
| 2875 |
+
折纸
|
| 2876 |
+
点缀
|
| 2877 |
+
鱼鹰
|
| 2878 |
+
鸵鸟
|
| 2879 |
+
水獭
|
| 2880 |
+
外面的
|
| 2881 |
+
露头
|
| 2882 |
+
户外
|
| 2883 |
+
厕所
|
| 2884 |
+
电源插头
|
| 2885 |
+
大纲
|
| 2886 |
+
��圆形
|
| 2887 |
+
烤箱
|
| 2888 |
+
整体
|
| 2889 |
+
大衣
|
| 2890 |
+
天桥
|
| 2891 |
+
猫头鹰
|
| 2892 |
+
牡蛎
|
| 2893 |
+
橡皮环
|
| 2894 |
+
包裹
|
| 2895 |
+
包/包装/包裹
|
| 2896 |
+
围场
|
| 2897 |
+
警车
|
| 2898 |
+
挂锁
|
| 2899 |
+
肉菜饭
|
| 2900 |
+
宝塔
|
| 2901 |
+
疼痛
|
| 2902 |
+
油漆刷
|
| 2903 |
+
画家
|
| 2904 |
+
佩斯利印花大手帕
|
| 2905 |
+
宫殿
|
| 2906 |
+
调色板
|
| 2907 |
+
栅栏
|
| 2908 |
+
棺罩
|
| 2909 |
+
棕榈树
|
| 2910 |
+
平底锅
|
| 2911 |
+
煎饼
|
| 2912 |
+
熊猫
|
| 2913 |
+
面板
|
| 2914 |
+
全景
|
| 2915 |
+
三色堇
|
| 2916 |
+
喘息
|
| 2917 |
+
储藏室
|
| 2918 |
+
裤子
|
| 2919 |
+
连裤袜
|
| 2920 |
+
木瓜
|
| 2921 |
+
纸
|
| 2922 |
+
纸袋
|
| 2923 |
+
切纸机
|
| 2924 |
+
纸灯笼
|
| 2925 |
+
纸盘子
|
| 2926 |
+
纸巾
|
| 2927 |
+
平装书
|
| 2928 |
+
压纸器
|
| 2929 |
+
降落伞
|
| 2930 |
+
游行
|
| 2931 |
+
天堂
|
| 2932 |
+
鹦鹉
|
| 2933 |
+
护理人员
|
| 2934 |
+
长尾小鹦鹉
|
| 2935 |
+
滑翔伞
|
| 2936 |
+
伞兵
|
| 2937 |
+
羊皮纸
|
| 2938 |
+
教区
|
| 2939 |
+
公园
|
| 2940 |
+
公园长椅
|
| 2941 |
+
停车
|
| 2942 |
+
停车场
|
| 2943 |
+
停车费
|
| 2944 |
+
停车标志
|
| 2945 |
+
议会
|
| 2946 |
+
欧芹/香菜
|
| 2947 |
+
参与者
|
| 2948 |
+
合作伙伴
|
| 2949 |
+
帕特里奇
|
| 2950 |
+
聚会
|
| 2951 |
+
派对帽
|
| 2952 |
+
通过
|
| 2953 |
+
通道
|
| 2954 |
+
存折
|
| 2955 |
+
乘客
|
| 2956 |
+
客船
|
| 2957 |
+
旅客列车
|
| 2958 |
+
百香果
|
| 2959 |
+
护照
|
| 2960 |
+
面食
|
| 2961 |
+
粘贴
|
| 2962 |
+
糕点
|
| 2963 |
+
牧场
|
| 2964 |
+
补丁
|
| 2965 |
+
病人
|
| 2966 |
+
图案/款式
|
| 2967 |
+
人行道/硬路面
|
| 2968 |
+
大帐篷
|
| 2969 |
+
爪子
|
| 2970 |
+
支付
|
| 2971 |
+
付费电话
|
| 2972 |
+
豌豆
|
| 2973 |
+
和平
|
| 2974 |
+
桃子
|
| 2975 |
+
孔雀
|
| 2976 |
+
山峰/尖顶
|
| 2977 |
+
花生
|
| 2978 |
+
花生酱
|
| 2979 |
+
梨
|
| 2980 |
+
珍珠
|
| 2981 |
+
卵石
|
| 2982 |
+
山核桃
|
| 2983 |
+
行人
|
| 2984 |
+
人行天桥
|
| 2985 |
+
步行街
|
| 2986 |
+
果皮
|
| 2987 |
+
削皮器
|
| 2988 |
+
小钉板
|
| 2989 |
+
木质腿
|
| 2990 |
+
鹈鹕
|
| 2991 |
+
笔/围栏
|
| 2992 |
+
点球
|
| 2993 |
+
铅笔
|
| 2994 |
+
铅笔盒
|
| 2995 |
+
卷笔刀
|
| 2996 |
+
铅笔裙
|
| 2997 |
+
吊坠
|
| 2998 |
+
钟摆
|
| 2999 |
+
企鹅
|
| 3000 |
+
半岛
|
| 3001 |
+
锦标旗
|
| 3002 |
+
便士
|
| 3003 |
+
储蓄罐
|
| 3004 |
+
牡丹
|
| 3005 |
+
胡椒/辣椒
|
| 3006 |
+
胡椒研磨机
|
| 3007 |
+
胡椒子
|
| 3008 |
+
意大利辣香肠
|
| 3009 |
+
栖息/鲈鱼
|
| 3010 |
+
表演
|
| 3011 |
+
表演
|
| 3012 |
+
表演舞台
|
| 3013 |
+
香水
|
| 3014 |
+
绿廊
|
| 3015 |
+
波斯猫
|
| 3016 |
+
柿子
|
| 3017 |
+
个人护理
|
| 3018 |
+
个人漂浮装置
|
| 3019 |
+
害虫
|
| 3020 |
+
宠物
|
| 3021 |
+
宠物店
|
| 3022 |
+
宠物店
|
| 3023 |
+
花瓣
|
| 3024 |
+
佩妮
|
| 3025 |
+
教堂的长椅
|
| 3026 |
+
野鸡
|
| 3027 |
+
现象
|
| 3028 |
+
哲学家
|
| 3029 |
+
电话
|
| 3030 |
+
电话簿
|
| 3031 |
+
留声机
|
| 3032 |
+
照片
|
| 3033 |
+
照相亭
|
| 3034 |
+
相框
|
| 3035 |
+
摄影
|
| 3036 |
+
物理学家
|
| 3037 |
+
物理实验室
|
| 3038 |
+
钢琴家
|
| 3039 |
+
钢琴
|
| 3040 |
+
选择
|
| 3041 |
+
捡起
|
| 3042 |
+
泡菜
|
| 3043 |
+
野餐
|
| 3044 |
+
野餐区
|
| 3045 |
+
野餐篮
|
| 3046 |
+
野餐桌
|
| 3047 |
+
图片
|
| 3048 |
+
相框
|
| 3049 |
+
馅饼
|
| 3050 |
+
鸽子
|
| 3051 |
+
朝圣者
|
| 3052 |
+
药片
|
| 3053 |
+
枕头
|
| 3054 |
+
飞行员
|
| 3055 |
+
领航艇
|
| 3056 |
+
别针
|
| 3057 |
+
松树
|
| 3058 |
+
松果
|
| 3059 |
+
松林
|
| 3060 |
+
松子
|
| 3061 |
+
菠萝
|
| 3062 |
+
乒乓球桌
|
| 3063 |
+
乒乓球
|
| 3064 |
+
粉色
|
| 3065 |
+
一品脱的量
|
| 3066 |
+
琵琶
|
| 3067 |
+
管子
|
| 3068 |
+
管碗
|
| 3069 |
+
海盗
|
| 3070 |
+
海盗旗
|
| 3071 |
+
海盗船
|
| 3072 |
+
阿月浑子
|
| 3073 |
+
滑雪场
|
| 3074 |
+
口袋里的面包
|
| 3075 |
+
火龙果
|
| 3076 |
+
斗牛犬
|
| 3077 |
+
球场
|
| 3078 |
+
大水罐
|
| 3079 |
+
猪笼草
|
| 3080 |
+
干草叉
|
| 3081 |
+
披萨
|
| 3082 |
+
披萨刀
|
| 3083 |
+
比萨锅
|
| 3084 |
+
披萨店
|
| 3085 |
+
招牌
|
| 3086 |
+
地方
|
| 3087 |
+
餐具垫
|
| 3088 |
+
格子
|
| 3089 |
+
平原
|
| 3090 |
+
示意图
|
| 3091 |
+
行星
|
| 3092 |
+
行星地球
|
| 3093 |
+
厚木板
|
| 3094 |
+
植物
|
| 3095 |
+
种植园
|
| 3096 |
+
种植
|
| 3097 |
+
匾额
|
| 3098 |
+
石膏
|
| 3099 |
+
塑料
|
| 3100 |
+
橡皮泥
|
| 3101 |
+
高原
|
| 3102 |
+
平台
|
| 3103 |
+
白金
|
| 3104 |
+
大浅盘
|
| 3105 |
+
玩/演奏/运动
|
| 3106 |
+
打羽毛球
|
| 3107 |
+
打棒球
|
| 3108 |
+
打篮球
|
| 3109 |
+
玩台球
|
| 3110 |
+
踢足球
|
| 3111 |
+
玩乒乓球
|
| 3112 |
+
打网球
|
| 3113 |
+
打排球
|
| 3114 |
+
选手/运动员
|
| 3115 |
+
操场
|
| 3116 |
+
剧场
|
| 3117 |
+
扑克牌
|
| 3118 |
+
下棋
|
| 3119 |
+
打高尔夫球
|
| 3120 |
+
打麻将
|
| 3121 |
+
运动场
|
| 3122 |
+
护栏
|
| 3123 |
+
游戏室
|
| 3124 |
+
广场
|
| 3125 |
+
钳子
|
| 3126 |
+
故事情节
|
| 3127 |
+
犁
|
| 3128 |
+
插头
|
| 3129 |
+
插头帽
|
| 3130 |
+
李子
|
| 3131 |
+
水管工
|
| 3132 |
+
卫生洁具
|
| 3133 |
+
羽毛
|
| 3134 |
+
夹板
|
| 3135 |
+
口袋
|
| 3136 |
+
怀表
|
| 3137 |
+
随身小折刀
|
| 3138 |
+
圆荚体
|
| 3139 |
+
乐队指挥台
|
| 3140 |
+
诗歌
|
| 3141 |
+
一品红
|
| 3142 |
+
指/朝向
|
| 3143 |
+
指针
|
| 3144 |
+
扑克卡
|
| 3145 |
+
筹码
|
| 3146 |
+
扑克表
|
| 3147 |
+
杆/柱
|
| 3148 |
+
臭猫
|
| 3149 |
+
警察
|
| 3150 |
+
警车
|
| 3151 |
+
警犬
|
| 3152 |
+
警察局
|
| 3153 |
+
政治家
|
| 3154 |
+
圆点
|
| 3155 |
+
花粉
|
| 3156 |
+
污染
|
| 3157 |
+
马球
|
| 3158 |
+
马球领
|
| 3159 |
+
马球衬衫
|
| 3160 |
+
石榴
|
| 3161 |
+
波美拉尼亚的
|
| 3162 |
+
雨披
|
| 3163 |
+
池塘
|
| 3164 |
+
马尾辫
|
| 3165 |
+
贵宾犬
|
| 3166 |
+
池
|
| 3167 |
+
流行
|
| 3168 |
+
流行艺术家
|
| 3169 |
+
爆米花
|
| 3170 |
+
教皇
|
| 3171 |
+
罂粟
|
| 3172 |
+
瓷
|
| 3173 |
+
玄关
|
| 3174 |
+
猪肉
|
| 3175 |
+
粥
|
| 3176 |
+
便携式电池
|
| 3177 |
+
门户网站
|
| 3178 |
+
投资组合
|
| 3179 |
+
汽门
|
| 3180 |
+
肖像
|
| 3181 |
+
肖像会话
|
| 3182 |
+
摆姿势拍照
|
| 3183 |
+
负鼠
|
| 3184 |
+
帖子
|
| 3185 |
+
邮局
|
| 3186 |
+
邮票
|
| 3187 |
+
明信片
|
| 3188 |
+
海报
|
| 3189 |
+
海报页
|
| 3190 |
+
锅/罐/陶盆
|
| 3191 |
+
土豆
|
| 3192 |
+
土豆片
|
| 3193 |
+
土豆沙拉
|
| 3194 |
+
布垫子
|
| 3195 |
+
便壶
|
| 3196 |
+
袋
|
| 3197 |
+
家禽
|
| 3198 |
+
英镑
|
| 3199 |
+
倾泻
|
| 3200 |
+
粉末
|
| 3201 |
+
电源线
|
| 3202 |
+
电源插头及插座
|
| 3203 |
+
权力看
|
| 3204 |
+
电站
|
| 3205 |
+
练习
|
| 3206 |
+
布拉格城堡
|
| 3207 |
+
祈祷
|
| 3208 |
+
牧师
|
| 3209 |
+
首映
|
| 3210 |
+
处方
|
| 3211 |
+
显示
|
| 3212 |
+
演讲
|
| 3213 |
+
总统
|
| 3214 |
+
新闻发布室
|
| 3215 |
+
高压锅
|
| 3216 |
+
椒盐卷饼
|
| 3217 |
+
王子
|
| 3218 |
+
公主
|
| 3219 |
+
打印
|
| 3220 |
+
打印页面
|
| 3221 |
+
打印机
|
| 3222 |
+
印刷
|
| 3223 |
+
监狱
|
| 3224 |
+
农产品/生产
|
| 3225 |
+
产品
|
| 3226 |
+
职业
|
| 3227 |
+
专业的
|
| 3228 |
+
教授
|
| 3229 |
+
项目图片
|
| 3230 |
+
投影屏幕
|
| 3231 |
+
投影仪
|
| 3232 |
+
毕业舞会
|
| 3233 |
+
散步
|
| 3234 |
+
螺旋桨
|
| 3235 |
+
先知
|
| 3236 |
+
建议
|
| 3237 |
+
防护服
|
| 3238 |
+
抗议
|
| 3239 |
+
抗议者
|
| 3240 |
+
出版
|
| 3241 |
+
宣传画像
|
| 3242 |
+
冰上曲棍球
|
| 3243 |
+
布丁
|
| 3244 |
+
水坑
|
| 3245 |
+
泡芙
|
| 3246 |
+
角嘴海雀
|
| 3247 |
+
哈巴狗
|
| 3248 |
+
拉
|
| 3249 |
+
讲坛
|
| 3250 |
+
脉冲
|
| 3251 |
+
泵
|
| 3252 |
+
南瓜
|
| 3253 |
+
南瓜饼
|
| 3254 |
+
南瓜种子
|
| 3255 |
+
拳击吊袋
|
| 3256 |
+
拳头猛击/穿孔
|
| 3257 |
+
学生
|
| 3258 |
+
紫色
|
| 3259 |
+
推
|
| 3260 |
+
轻轻一击
|
| 3261 |
+
谜题
|
| 3262 |
+
塔
|
| 3263 |
+
金字塔
|
| 3264 |
+
大蟒
|
| 3265 |
+
二维码
|
| 3266 |
+
鹌鹑
|
| 3267 |
+
采石场
|
| 3268 |
+
季度
|
| 3269 |
+
石英
|
| 3270 |
+
女王
|
| 3271 |
+
油炸玉米粉饼
|
| 3272 |
+
队列
|
| 3273 |
+
乳蛋饼
|
| 3274 |
+
被子
|
| 3275 |
+
绗缝
|
| 3276 |
+
引用
|
| 3277 |
+
兔子
|
| 3278 |
+
浣熊
|
| 3279 |
+
比赛
|
| 3280 |
+
赛道
|
| 3281 |
+
水沟/跑道
|
| 3282 |
+
赛车
|
| 3283 |
+
球拍
|
| 3284 |
+
雷达
|
| 3285 |
+
散热器
|
| 3286 |
+
广播
|
| 3287 |
+
木筏/橡皮艇
|
| 3288 |
+
布娃娃
|
| 3289 |
+
栏杆/铁轨
|
| 3290 |
+
轨道车
|
| 3291 |
+
铁道
|
| 3292 |
+
铁路桥梁
|
| 3293 |
+
轨道线
|
| 3294 |
+
火车站
|
| 3295 |
+
雨
|
| 3296 |
+
雨靴
|
| 3297 |
+
彩虹
|
| 3298 |
+
虹鳟鱼
|
| 3299 |
+
雨衣
|
| 3300 |
+
热带雨林
|
| 3301 |
+
多雨的
|
| 3302 |
+
葡萄干
|
| 3303 |
+
耙子
|
| 3304 |
+
公羊
|
| 3305 |
+
斜坡
|
| 3306 |
+
油菜籽
|
| 3307 |
+
快速
|
| 3308 |
+
说唱歌手
|
| 3309 |
+
树莓
|
| 3310 |
+
老鼠
|
| 3311 |
+
棘轮
|
| 3312 |
+
乌鸦
|
| 3313 |
+
峡谷
|
| 3314 |
+
雷
|
| 3315 |
+
剃须刀
|
| 3316 |
+
锋利的
|
| 3317 |
+
阅读
|
| 3318 |
+
阅读材料
|
| 3319 |
+
钻孔器
|
| 3320 |
+
后面
|
| 3321 |
+
尾灯
|
| 3322 |
+
后视图
|
| 3323 |
+
后视镜
|
| 3324 |
+
收据
|
| 3325 |
+
收到
|
| 3326 |
+
接待
|
| 3327 |
+
配方
|
| 3328 |
+
记录
|
| 3329 |
+
唱片制作人
|
| 3330 |
+
记录器/竖笛
|
| 3331 |
+
录音室
|
| 3332 |
+
娱乐室
|
| 3333 |
+
休闲车
|
| 3334 |
+
矩形
|
| 3335 |
+
回收
|
| 3336 |
+
回收站
|
| 3337 |
+
红色
|
| 3338 |
+
红地毯
|
| 3339 |
+
红旗
|
| 3340 |
+
红熊猫
|
| 3341 |
+
红酒
|
| 3342 |
+
红木
|
| 3343 |
+
芦苇
|
| 3344 |
+
礁石
|
| 3345 |
+
卷轴
|
| 3346 |
+
裁判
|
| 3347 |
+
倒影
|
| 3348 |
+
倒影
|
| 3349 |
+
反射器
|
| 3350 |
+
注册
|
| 3351 |
+
控制
|
| 3352 |
+
驯鹿
|
| 3353 |
+
放松
|
| 3354 |
+
释放
|
| 3355 |
+
救援
|
| 3356 |
+
宗教
|
| 3357 |
+
宗教的
|
| 3358 |
+
享受
|
| 3359 |
+
保持
|
| 3360 |
+
改造
|
| 3361 |
+
遥控器
|
| 3362 |
+
移除
|
| 3363 |
+
修复
|
| 3364 |
+
维修店
|
| 3365 |
+
爬行动物
|
| 3366 |
+
救援
|
| 3367 |
+
救助者
|
| 3368 |
+
研究
|
| 3369 |
+
研究员
|
| 3370 |
+
储层
|
| 3371 |
+
住宅
|
| 3372 |
+
居民区
|
| 3373 |
+
树脂
|
| 3374 |
+
度假胜地
|
| 3375 |
+
度假小镇
|
| 3376 |
+
餐厅的厨房
|
| 3377 |
+
餐厅的露台
|
| 3378 |
+
厕所
|
| 3379 |
+
零售
|
| 3380 |
+
寻回犬
|
| 3381 |
+
制动火箭
|
| 3382 |
+
揭示
|
| 3383 |
+
犀牛
|
| 3384 |
+
杜鹃
|
| 3385 |
+
肋骨
|
| 3386 |
+
丝带
|
| 3387 |
+
大米
|
| 3388 |
+
电饭煲
|
| 3389 |
+
稻田
|
| 3390 |
+
骑/搭乘
|
| 3391 |
+
脊
|
| 3392 |
+
骑马
|
| 3393 |
+
步枪
|
| 3394 |
+
边缘
|
| 3395 |
+
环/戒指
|
| 3396 |
+
暴乱
|
| 3397 |
+
涟漪
|
| 3398 |
+
上升
|
| 3399 |
+
高层建筑
|
| 3400 |
+
河
|
| 3401 |
+
河岸
|
| 3402 |
+
河船
|
| 3403 |
+
河谷
|
| 3404 |
+
河床
|
| 3405 |
+
路
|
| 3406 |
+
路标
|
| 3407 |
+
公路旅行
|
| 3408 |
+
路边
|
| 3409 |
+
烤鸡
|
| 3410 |
+
长袍
|
| 3411 |
+
罗宾
|
| 3412 |
+
机器人
|
| 3413 |
+
石头
|
| 3414 |
+
岩石拱
|
| 3415 |
+
摇滚艺术家
|
| 3416 |
+
摇滚乐队
|
| 3417 |
+
攀岩者
|
| 3418 |
+
攀岩
|
| 3419 |
+
摇滚音乐会
|
| 3420 |
+
岩石表面
|
| 3421 |
+
岩层
|
| 3422 |
+
摇滚歌手
|
| 3423 |
+
火箭
|
| 3424 |
+
摇椅
|
| 3425 |
+
岩石
|
| 3426 |
+
啮齿动物
|
| 3427 |
+
牛仔竞技表演
|
| 3428 |
+
竞技舞台
|
| 3429 |
+
罗伊
|
| 3430 |
+
狍子
|
| 3431 |
+
辊
|
| 3432 |
+
过山车
|
| 3433 |
+
轮式溜冰鞋
|
| 3434 |
+
溜冰鞋
|
| 3435 |
+
擀面杖
|
| 3436 |
+
浪漫
|
| 3437 |
+
浪漫的
|
| 3438 |
+
屋顶
|
| 3439 |
+
屋顶花园
|
| 3440 |
+
房间
|
| 3441 |
+
房间分频器
|
| 3442 |
+
根
|
| 3443 |
+
根啤酒
|
| 3444 |
+
绳索桥
|
| 3445 |
+
念珠
|
| 3446 |
+
玫瑰
|
| 3447 |
+
迷迭香
|
| 3448 |
+
玫瑰色的云
|
| 3449 |
+
罗特韦尔犬
|
| 3450 |
+
圆桌
|
| 3451 |
+
路由器
|
| 3452 |
+
行
|
| 3453 |
+
罗文
|
| 3454 |
+
皇家
|
| 3455 |
+
橡皮图章
|
| 3456 |
+
废墟
|
| 3457 |
+
魔方
|
| 3458 |
+
红宝石
|
| 3459 |
+
莱夫
|
| 3460 |
+
橄榄球
|
| 3461 |
+
橄榄球
|
| 3462 |
+
橄榄球运动员
|
| 3463 |
+
毁坏
|
| 3464 |
+
尺
|
| 3465 |
+
朗姆酒
|
| 3466 |
+
跑
|
| 3467 |
+
跑步者
|
| 3468 |
+
跑步鞋
|
| 3469 |
+
农村的
|
| 3470 |
+
锈
|
| 3471 |
+
乡村的
|
| 3472 |
+
黑麦
|
| 3473 |
+
袋
|
| 3474 |
+
鞍
|
| 3475 |
+
鞍囊
|
| 3476 |
+
旅行
|
| 3477 |
+
安全
|
| 3478 |
+
安全背心
|
| 3479 |
+
圣人
|
| 3480 |
+
帆
|
| 3481 |
+
帆船
|
| 3482 |
+
航行
|
| 3483 |
+
水手
|
| 3484 |
+
松鼠猴
|
| 3485 |
+
缘故
|
| 3486 |
+
沙拉
|
| 3487 |
+
沙拉碗
|
| 3488 |
+
火蜥蜴
|
| 3489 |
+
意大利蒜味腊肠
|
| 3490 |
+
出售
|
| 3491 |
+
三文鱼
|
| 3492 |
+
沙龙
|
| 3493 |
+
萨尔萨舞
|
| 3494 |
+
盐
|
| 3495 |
+
盐和胡椒瓶
|
| 3496 |
+
盐湖
|
| 3497 |
+
盐沼
|
| 3498 |
+
盐瓶
|
| 3499 |
+
敬礼
|
| 3500 |
+
萨莫耶德人
|
| 3501 |
+
武士
|
| 3502 |
+
沙子
|
| 3503 |
+
沙洲
|
| 3504 |
+
砂箱
|
| 3505 |
+
沙堡
|
| 3506 |
+
沙雕
|
| 3507 |
+
凉鞋
|
| 3508 |
+
三明治
|
| 3509 |
+
卫生巾
|
| 3510 |
+
圣诞老人
|
| 3511 |
+
蓝宝石
|
| 3512 |
+
沙丁鱼
|
| 3513 |
+
莎丽
|
| 3514 |
+
生鱼片
|
| 3515 |
+
沙爹
|
| 3516 |
+
书包
|
| 3517 |
+
卫星
|
| 3518 |
+
缎
|
| 3519 |
+
酱汁
|
| 3520 |
+
碟子
|
| 3521 |
+
桑拿
|
| 3522 |
+
香肠
|
| 3523 |
+
稀树大草原
|
| 3524 |
+
锯
|
| 3525 |
+
锯木架
|
| 3526 |
+
萨克斯管
|
| 3527 |
+
萨克斯手
|
| 3528 |
+
脚手架
|
| 3529 |
+
秤/标尺
|
| 3530 |
+
比例模型
|
| 3531 |
+
扇贝
|
| 3532 |
+
疤痕
|
| 3533 |
+
稻草人
|
| 3534 |
+
围巾
|
| 3535 |
+
场景
|
| 3536 |
+
风景
|
| 3537 |
+
雪纳瑞犬
|
| 3538 |
+
学校
|
| 3539 |
+
校车
|
| 3540 |
+
校服
|
| 3541 |
+
校舍
|
| 3542 |
+
纵帆船
|
| 3543 |
+
科学
|
| 3544 |
+
科幻电影
|
| 3545 |
+
科学博物馆
|
| 3546 |
+
科学家
|
| 3547 |
+
剪刀
|
| 3548 |
+
壁灯
|
| 3549 |
+
司康饼
|
| 3550 |
+
勺子
|
| 3551 |
+
踏板车/摩托车
|
| 3552 |
+
分数
|
| 3553 |
+
记分板
|
| 3554 |
+
蝎子
|
| 3555 |
+
童子军
|
| 3556 |
+
炒蛋
|
| 3557 |
+
废弃
|
| 3558 |
+
刮板
|
| 3559 |
+
刮伤
|
| 3560 |
+
屏幕
|
| 3561 |
+
纱门
|
| 3562 |
+
截图
|
| 3563 |
+
螺杆
|
| 3564 |
+
螺丝刀
|
| 3565 |
+
长卷纸/卷轴
|
| 3566 |
+
擦洗
|
| 3567 |
+
硬毛刷
|
| 3568 |
+
雕塑家
|
| 3569 |
+
雕塑
|
| 3570 |
+
海洞穴
|
| 3571 |
+
海冰
|
| 3572 |
+
海狮
|
| 3573 |
+
海龟
|
| 3574 |
+
海胆
|
| 3575 |
+
尖吻鲈
|
| 3576 |
+
海底
|
| 3577 |
+
海鸟
|
| 3578 |
+
海鲜
|
| 3579 |
+
海马
|
| 3580 |
+
海豹
|
| 3581 |
+
海景
|
| 3582 |
+
海贝
|
| 3583 |
+
海滨度假胜地
|
| 3584 |
+
季节
|
| 3585 |
+
座位
|
| 3586 |
+
安全带
|
| 3587 |
+
海藻
|
| 3588 |
+
秘书
|
| 3589 |
+
安全
|
| 3590 |
+
小轿车
|
| 3591 |
+
看到
|
| 3592 |
+
种子
|
| 3593 |
+
跷跷板
|
| 3594 |
+
赛格威
|
| 3595 |
+
自拍
|
| 3596 |
+
出售
|
| 3597 |
+
研讨会
|
| 3598 |
+
感觉
|
| 3599 |
+
传感器
|
| 3600 |
+
服务器
|
| 3601 |
+
服务器机房
|
| 3602 |
+
服务
|
| 3603 |
+
集
|
| 3604 |
+
缝纫机
|
| 3605 |
+
影子
|
| 3606 |
+
摇
|
| 3607 |
+
瓶
|
| 3608 |
+
洗发水
|
| 3609 |
+
形状
|
| 3610 |
+
分享
|
| 3611 |
+
鲨鱼
|
| 3612 |
+
卷笔刀
|
| 3613 |
+
记号笔
|
| 3614 |
+
剃须刀
|
| 3615 |
+
剃须膏
|
| 3616 |
+
披肩/围巾
|
| 3617 |
+
剪切
|
| 3618 |
+
剪刀
|
| 3619 |
+
羊
|
| 3620 |
+
床单
|
| 3621 |
+
乐谱
|
| 3622 |
+
架子
|
| 3623 |
+
贝壳
|
| 3624 |
+
贝类
|
| 3625 |
+
避难所
|
| 3626 |
+
搁置
|
| 3627 |
+
牧羊人
|
| 3628 |
+
果子露
|
| 3629 |
+
柴犬
|
| 3630 |
+
发光
|
| 3631 |
+
航运
|
| 3632 |
+
集装箱
|
| 3633 |
+
海难
|
| 3634 |
+
船厂
|
| 3635 |
+
衬衫
|
| 3636 |
+
赤膊的
|
| 3637 |
+
浅滩
|
| 3638 |
+
鞋
|
| 3639 |
+
鞋盒
|
| 3640 |
+
鞋店
|
| 3641 |
+
鞋楦
|
| 3642 |
+
射击
|
| 3643 |
+
得分篮球后卫
|
| 3644 |
+
商店橱窗
|
| 3645 |
+
门面
|
| 3646 |
+
购物者
|
| 3647 |
+
购物
|
| 3648 |
+
购物袋
|
| 3649 |
+
购物篮
|
| 3650 |
+
购物车
|
| 3651 |
+
购物中心
|
| 3652 |
+
购物街
|
| 3653 |
+
海岸
|
| 3654 |
+
海岸线
|
| 3655 |
+
短的
|
| 3656 |
+
短发
|
| 3657 |
+
短裤
|
| 3658 |
+
小酒杯
|
| 3659 |
+
散弹枪
|
| 3660 |
+
肩膀
|
| 3661 |
+
单肩包
|
| 3662 |
+
铲
|
| 3663 |
+
陈列柜
|
| 3664 |
+
淋浴
|
| 3665 |
+
浴帽
|
| 3666 |
+
浴帘
|
| 3667 |
+
淋浴门
|
| 3668 |
+
淋浴头
|
| 3669 |
+
碎纸机
|
| 3670 |
+
泼妇
|
| 3671 |
+
虾
|
| 3672 |
+
神社
|
| 3673 |
+
灌木
|
| 3674 |
+
快门
|
| 3675 |
+
暹罗猫
|
| 3676 |
+
西伯利亚
|
| 3677 |
+
兄弟姐妹
|
| 3678 |
+
侧面
|
| 3679 |
+
边柜
|
| 3680 |
+
配菜
|
| 3681 |
+
边车
|
| 3682 |
+
边线
|
| 3683 |
+
壁板
|
| 3684 |
+
标志
|
| 3685 |
+
指示牌
|
| 3686 |
+
信号
|
| 3687 |
+
签名
|
| 3688 |
+
丝绸
|
| 3689 |
+
丝袜
|
| 3690 |
+
筒仓
|
| 3691 |
+
银
|
| 3692 |
+
银牌
|
| 3693 |
+
银器
|
| 3694 |
+
唱歌
|
| 3695 |
+
烧焦
|
| 3696 |
+
歌手
|
| 3697 |
+
水槽
|
| 3698 |
+
啜
|
| 3699 |
+
坐/放置/坐落
|
| 3700 |
+
坐着
|
| 3701 |
+
滑板公园
|
| 3702 |
+
滑板
|
| 3703 |
+
滑板者
|
| 3704 |
+
溜冰者
|
| 3705 |
+
溜冰场
|
| 3706 |
+
骨架
|
| 3707 |
+
草图
|
| 3708 |
+
串串
|
| 3709 |
+
滑雪
|
| 3710 |
+
滑雪靴
|
| 3711 |
+
滑雪设备
|
| 3712 |
+
滑雪服
|
| 3713 |
+
滑雪缆车
|
| 3714 |
+
滑雪杖
|
| 3715 |
+
滑雪胜地
|
| 3716 |
+
滑雪板
|
| 3717 |
+
滑雪
|
| 3718 |
+
滑雪鞋
|
| 3719 |
+
皮肤
|
| 3720 |
+
头骨
|
| 3721 |
+
无边便帽
|
| 3722 |
+
天空
|
| 3723 |
+
天空塔
|
| 3724 |
+
天窗
|
| 3725 |
+
天际线
|
| 3726 |
+
摩天大楼
|
| 3727 |
+
激流回旋
|
| 3728 |
+
石板
|
| 3729 |
+
雪橇
|
| 3730 |
+
睡眠
|
| 3731 |
+
睡袋
|
| 3732 |
+
睡衣
|
| 3733 |
+
袖子
|
| 3734 |
+
片
|
| 3735 |
+
滑动
|
| 3736 |
+
滑块
|
| 3737 |
+
吊索
|
| 3738 |
+
坡
|
| 3739 |
+
投币口
|
| 3740 |
+
老虎机
|
| 3741 |
+
树懒
|
| 3742 |
+
慢炖锅
|
| 3743 |
+
鼻涕虫
|
| 3744 |
+
贫民窟
|
| 3745 |
+
气味
|
| 3746 |
+
微笑
|
| 3747 |
+
烟雾/抽烟
|
| 3748 |
+
零食
|
| 3749 |
+
蜗牛
|
| 3750 |
+
蛇
|
| 3751 |
+
鲷鱼
|
| 3752 |
+
快照
|
| 3753 |
+
通气管
|
| 3754 |
+
鼻子
|
| 3755 |
+
雪
|
| 3756 |
+
雪豹
|
| 3757 |
+
雪山
|
| 3758 |
+
雪球
|
| 3759 |
+
单板滑雪者
|
| 3760 |
+
雪原
|
| 3761 |
+
雪花
|
| 3762 |
+
雪人
|
| 3763 |
+
雪地摩托
|
| 3764 |
+
雪犁
|
| 3765 |
+
雪鞋
|
| 3766 |
+
雪
|
| 3767 |
+
肥皂
|
| 3768 |
+
肥皂泡
|
| 3769 |
+
给皂器
|
| 3770 |
+
足球守门员
|
| 3771 |
+
社会名流
|
| 3772 |
+
短袜
|
| 3773 |
+
插座
|
| 3774 |
+
苏打水
|
| 3775 |
+
垒球
|
| 3776 |
+
软件
|
| 3777 |
+
太阳能电池阵列
|
| 3778 |
+
士兵
|
| 3779 |
+
独奏
|
| 3780 |
+
解决方案
|
| 3781 |
+
宽边帽
|
| 3782 |
+
歌曲
|
| 3783 |
+
声音
|
| 3784 |
+
汤
|
| 3785 |
+
汤碗
|
| 3786 |
+
汤匙
|
| 3787 |
+
酸奶油
|
| 3788 |
+
纪念品
|
| 3789 |
+
豆浆
|
| 3790 |
+
水疗中心
|
| 3791 |
+
空间
|
| 3792 |
+
航天飞机
|
| 3793 |
+
空间站
|
| 3794 |
+
宇宙飞船
|
| 3795 |
+
意大利面
|
| 3796 |
+
横跨
|
| 3797 |
+
扳手
|
| 3798 |
+
火花
|
| 3799 |
+
闪耀
|
| 3800 |
+
烟火
|
| 3801 |
+
起泡葡萄酒
|
| 3802 |
+
麻雀
|
| 3803 |
+
抹刀
|
| 3804 |
+
扬声器
|
| 3805 |
+
观众
|
| 3806 |
+
会话框
|
| 3807 |
+
速度限制
|
| 3808 |
+
限速标志
|
| 3809 |
+
快艇
|
| 3810 |
+
车速表
|
| 3811 |
+
球
|
| 3812 |
+
香料
|
| 3813 |
+
调料架
|
| 3814 |
+
蜘蛛
|
| 3815 |
+
蜘蛛网
|
| 3816 |
+
扣球
|
| 3817 |
+
旋转
|
| 3818 |
+
菠菜
|
| 3819 |
+
尖塔
|
| 3820 |
+
飞溅
|
| 3821 |
+
海绵
|
| 3822 |
+
勺子
|
| 3823 |
+
体育协会
|
| 3824 |
+
运动器材
|
| 3825 |
+
运动团队
|
| 3826 |
+
体育球
|
| 3827 |
+
体育器材
|
| 3828 |
+
运动会
|
| 3829 |
+
运动服装
|
| 3830 |
+
点
|
| 3831 |
+
喷雾
|
| 3832 |
+
伸展
|
| 3833 |
+
春天
|
| 3834 |
+
春卷
|
| 3835 |
+
撒
|
| 3836 |
+
洒水器
|
| 3837 |
+
发芽
|
| 3838 |
+
云杉
|
| 3839 |
+
云杉森林
|
| 3840 |
+
队
|
| 3841 |
+
广场
|
| 3842 |
+
南瓜
|
| 3843 |
+
蹲
|
| 3844 |
+
挤
|
| 3845 |
+
鱿鱼
|
| 3846 |
+
松鼠
|
| 3847 |
+
水枪
|
| 3848 |
+
刺
|
| 3849 |
+
稳定的
|
| 3850 |
+
(码放整齐的)一叠
|
| 3851 |
+
体育场
|
| 3852 |
+
工作人员
|
| 3853 |
+
舞台
|
| 3854 |
+
舞台灯
|
| 3855 |
+
驿马车
|
| 3856 |
+
弄脏
|
| 3857 |
+
不锈钢
|
| 3858 |
+
楼梯
|
| 3859 |
+
楼梯
|
| 3860 |
+
楼梯间
|
| 3861 |
+
摊位/小隔间
|
| 3862 |
+
种马
|
| 3863 |
+
站/矗立/摊位
|
| 3864 |
+
站
|
| 3865 |
+
主食
|
| 3866 |
+
订书机
|
| 3867 |
+
星星
|
| 3868 |
+
盯着
|
| 3869 |
+
海星
|
| 3870 |
+
杨桃
|
| 3871 |
+
燕八哥
|
| 3872 |
+
州立公园
|
| 3873 |
+
公立学校
|
| 3874 |
+
车站
|
| 3875 |
+
固定自行车
|
| 3876 |
+
文具
|
| 3877 |
+
雕像
|
| 3878 |
+
牛排
|
| 3879 |
+
牛排刀
|
| 3880 |
+
蒸汽
|
| 3881 |
+
蒸汽机
|
| 3882 |
+
蒸汽机车
|
| 3883 |
+
蒸汽火车
|
| 3884 |
+
馒头
|
| 3885 |
+
钢
|
| 3886 |
+
方向盘
|
| 3887 |
+
(花草的)茎
|
| 3888 |
+
模版
|
| 3889 |
+
梯凳
|
| 3890 |
+
立体声
|
| 3891 |
+
听诊器
|
| 3892 |
+
炖
|
| 3893 |
+
戳/条状物
|
| 3894 |
+
竹节虫
|
| 3895 |
+
贴纸
|
| 3896 |
+
静物画
|
| 3897 |
+
高跷
|
| 3898 |
+
黄貂鱼
|
| 3899 |
+
搅拌
|
| 3900 |
+
搅拌器
|
| 3901 |
+
镫
|
| 3902 |
+
缝
|
| 3903 |
+
股票
|
| 3904 |
+
长筒袜
|
| 3905 |
+
腹部
|
| 3906 |
+
石头建筑
|
| 3907 |
+
石雕
|
| 3908 |
+
石屋
|
| 3909 |
+
石磨
|
| 3910 |
+
凳子
|
| 3911 |
+
停止
|
| 3912 |
+
停在
|
| 3913 |
+
红灯
|
| 3914 |
+
停车标志
|
| 3915 |
+
秒表
|
| 3916 |
+
红绿灯
|
| 3917 |
+
存储箱
|
| 3918 |
+
储藏室
|
| 3919 |
+
罐/蓄水池
|
| 3920 |
+
商店
|
| 3921 |
+
店面
|
| 3922 |
+
鹳
|
| 3923 |
+
风暴
|
| 3924 |
+
暴风云
|
| 3925 |
+
狂风暴雨的
|
| 3926 |
+
炉子
|
| 3927 |
+
扑克
|
| 3928 |
+
跨骑
|
| 3929 |
+
过滤器
|
| 3930 |
+
海峡
|
| 3931 |
+
带
|
| 3932 |
+
稻草/吸管
|
| 3933 |
+
草帽
|
| 3934 |
+
草莓
|
| 3935 |
+
溪流
|
| 3936 |
+
街头艺术
|
| 3937 |
+
街头艺术家
|
| 3938 |
+
街角
|
| 3939 |
+
流浪狗
|
| 3940 |
+
街头食品
|
| 3941 |
+
路灯
|
| 3942 |
+
街市场
|
| 3943 |
+
街头摄影
|
| 3944 |
+
街景
|
| 3945 |
+
路标
|
| 3946 |
+
街头小贩
|
| 3947 |
+
拉伸
|
| 3948 |
+
担架
|
| 3949 |
+
罢工
|
| 3950 |
+
前锋
|
| 3951 |
+
细绳
|
| 3952 |
+
芝士条
|
| 3953 |
+
带子
|
| 3954 |
+
条纹
|
| 3955 |
+
漫步
|
| 3956 |
+
结构
|
| 3957 |
+
工作室
|
| 3958 |
+
影棚拍摄
|
| 3959 |
+
材料
|
| 3960 |
+
填充玩具动物
|
| 3961 |
+
毛绒玩具
|
| 3962 |
+
馅
|
| 3963 |
+
树桩
|
| 3964 |
+
惊人的
|
| 3965 |
+
特技
|
| 3966 |
+
佛塔
|
| 3967 |
+
风格
|
| 3968 |
+
手写笔
|
| 3969 |
+
潜艇
|
| 3970 |
+
潜艇形大三明治
|
| 3971 |
+
海底水
|
| 3972 |
+
郊区
|
| 3973 |
+
地铁
|
| 3974 |
+
地铁站
|
| 3975 |
+
低音炮
|
| 3976 |
+
多肉
|
| 3977 |
+
绒面革
|
| 3978 |
+
糖
|
| 3979 |
+
糖碗
|
| 3980 |
+
甘蔗
|
| 3981 |
+
方糖
|
| 3982 |
+
西装
|
| 3983 |
+
套房
|
| 3984 |
+
夏天
|
| 3985 |
+
夏天傍晚
|
| 3986 |
+
峰顶
|
| 3987 |
+
太阳
|
| 3988 |
+
太阳帽
|
| 3989 |
+
日光浴
|
| 3990 |
+
周日
|
| 3991 |
+
日晷
|
| 3992 |
+
向日葵
|
| 3993 |
+
向日葵田
|
| 3994 |
+
葵花籽
|
| 3995 |
+
太阳镜
|
| 3996 |
+
晴天
|
| 3997 |
+
日出
|
| 3998 |
+
日落
|
| 3999 |
+
遮阳伞
|
| 4000 |
+
阳光
|
| 4001 |
+
超级碗
|
| 4002 |
+
跑车
|
| 4003 |
+
超级英雄
|
| 4004 |
+
超市
|
| 4005 |
+
超市货架
|
| 4006 |
+
超模
|
| 4007 |
+
支持者
|
| 4008 |
+
冲浪
|
| 4009 |
+
表面
|
| 4010 |
+
冲浪板
|
| 4011 |
+
冲浪者
|
| 4012 |
+
外科医生
|
| 4013 |
+
外科手术
|
| 4014 |
+
环绕
|
| 4015 |
+
寿司
|
| 4016 |
+
寿司吧
|
| 4017 |
+
背带裤
|
| 4018 |
+
悬架
|
| 4019 |
+
吊桥
|
| 4020 |
+
越野车
|
| 4021 |
+
燕子
|
| 4022 |
+
燕尾蝶
|
| 4023 |
+
沼泽
|
| 4024 |
+
天鹅
|
| 4025 |
+
天鹅游艇
|
| 4026 |
+
运动裤
|
| 4027 |
+
防汗带
|
| 4028 |
+
毛衣
|
| 4029 |
+
运动衫
|
| 4030 |
+
甜的
|
| 4031 |
+
红薯
|
| 4032 |
+
游泳
|
| 4033 |
+
泳帽
|
| 4034 |
+
游泳者
|
| 4035 |
+
游泳洞
|
| 4036 |
+
游泳池
|
| 4037 |
+
摆动
|
| 4038 |
+
平转桥
|
| 4039 |
+
秋千
|
| 4040 |
+
漩涡
|
| 4041 |
+
开关
|
| 4042 |
+
转椅
|
| 4043 |
+
剑
|
| 4044 |
+
旗鱼
|
| 4045 |
+
象征
|
| 4046 |
+
对称
|
| 4047 |
+
犹太教堂
|
| 4048 |
+
注射器
|
| 4049 |
+
糖浆
|
| 4050 |
+
系统
|
| 4051 |
+
t恤
|
| 4052 |
+
t恤
|
| 4053 |
+
塔巴斯科辣椒酱
|
| 4054 |
+
虎斑
|
| 4055 |
+
乒乓球拍
|
| 4056 |
+
桌面
|
| 4057 |
+
桌布
|
| 4058 |
+
平板电脑
|
| 4059 |
+
餐具
|
| 4060 |
+
转速表
|
| 4061 |
+
拦截
|
| 4062 |
+
墨西哥煎玉米卷
|
| 4063 |
+
跆拳道
|
| 4064 |
+
太极
|
| 4065 |
+
尾巴
|
| 4066 |
+
裁缝
|
| 4067 |
+
拍/拿
|
| 4068 |
+
起飞
|
| 4069 |
+
说话/交谈/演讲
|
| 4070 |
+
手鼓
|
| 4071 |
+
棕褐色
|
| 4072 |
+
橘子
|
| 4073 |
+
胶带/磁带/终点线
|
| 4074 |
+
挂毯
|
| 4075 |
+
沥青碎石路面
|
| 4076 |
+
芋头
|
| 4077 |
+
篷布
|
| 4078 |
+
果馅饼
|
| 4079 |
+
流苏
|
| 4080 |
+
味道
|
| 4081 |
+
榻榻米
|
| 4082 |
+
纹身
|
| 4083 |
+
纹身艺术家
|
| 4084 |
+
酒馆
|
| 4085 |
+
茶
|
| 4086 |
+
茶包
|
| 4087 |
+
茶话会
|
| 4088 |
+
茶园
|
| 4089 |
+
茶壶
|
| 4090 |
+
茶具
|
| 4091 |
+
教
|
| 4092 |
+
老师
|
| 4093 |
+
茶杯
|
| 4094 |
+
水鸭
|
| 4095 |
+
团队合影
|
| 4096 |
+
团队介绍
|
| 4097 |
+
眼泪/撕裂/划破
|
| 4098 |
+
技术员
|
| 4099 |
+
技术
|
| 4100 |
+
泰迪熊
|
| 4101 |
+
T字形物
|
| 4102 |
+
青少年
|
| 4103 |
+
电线杆
|
| 4104 |
+
变焦镜头
|
| 4105 |
+
望远镜
|
| 4106 |
+
电视
|
| 4107 |
+
电视摄像机
|
| 4108 |
+
电视室
|
| 4109 |
+
电视演播室
|
| 4110 |
+
温度
|
| 4111 |
+
寺庙
|
| 4112 |
+
天妇罗
|
| 4113 |
+
网球
|
| 4114 |
+
网球场
|
| 4115 |
+
网球比赛
|
| 4116 |
+
网球网
|
| 4117 |
+
网球运动员
|
| 4118 |
+
网球拍
|
| 4119 |
+
帐篷
|
| 4120 |
+
龙舌兰酒
|
| 4121 |
+
终端/航站楼
|
| 4122 |
+
阳台
|
| 4123 |
+
地形
|
| 4124 |
+
玻璃容器
|
| 4125 |
+
领土
|
| 4126 |
+
测试
|
| 4127 |
+
测试赛
|
| 4128 |
+
试管
|
| 4129 |
+
文本
|
| 4130 |
+
短信
|
| 4131 |
+
纺织
|
| 4132 |
+
纹理
|
| 4133 |
+
感恩节
|
| 4134 |
+
感恩节晚餐
|
| 4135 |
+
剧院
|
| 4136 |
+
戏剧演员
|
| 4137 |
+
治疗
|
| 4138 |
+
温度计
|
| 4139 |
+
热水瓶
|
| 4140 |
+
暖瓶
|
| 4141 |
+
恒温器
|
| 4142 |
+
灌木丛
|
| 4143 |
+
顶针
|
| 4144 |
+
东西
|
| 4145 |
+
思考
|
| 4146 |
+
蓟
|
| 4147 |
+
宝座
|
| 4148 |
+
金銮殿
|
| 4149 |
+
扔
|
| 4150 |
+
抱枕
|
| 4151 |
+
雷
|
| 4152 |
+
雷雨
|
| 4153 |
+
百里香
|
| 4154 |
+
皇冠
|
| 4155 |
+
记号
|
| 4156 |
+
票
|
| 4157 |
+
售票亭
|
| 4158 |
+
潮池
|
| 4159 |
+
领带
|
| 4160 |
+
老虎
|
| 4161 |
+
紧
|
| 4162 |
+
瓦
|
| 4163 |
+
瓷砖地板
|
| 4164 |
+
瓦屋顶
|
| 4165 |
+
瓷砖墙
|
| 4166 |
+
锡
|
| 4167 |
+
锡纸
|
| 4168 |
+
箔
|
| 4169 |
+
提拉米苏
|
| 4170 |
+
轮胎
|
| 4171 |
+
纸巾
|
| 4172 |
+
烤面包
|
| 4173 |
+
烤面包机
|
| 4174 |
+
烟草
|
| 4175 |
+
烟斗
|
| 4176 |
+
学步的小孩
|
| 4177 |
+
脚趾
|
| 4178 |
+
豆腐
|
| 4179 |
+
马桶
|
| 4180 |
+
马桶座圈
|
| 4181 |
+
化妆包
|
| 4182 |
+
东京铁塔
|
| 4183 |
+
番茄
|
| 4184 |
+
番茄酱
|
| 4185 |
+
番茄汤
|
| 4186 |
+
墓
|
| 4187 |
+
钳子
|
| 4188 |
+
钳子
|
| 4189 |
+
工具
|
| 4190 |
+
工具箱
|
| 4191 |
+
牙刷
|
| 4192 |
+
牙膏
|
| 4193 |
+
牙签
|
| 4194 |
+
修剪成形的花园
|
| 4195 |
+
配料
|
| 4196 |
+
火炬/光源
|
| 4197 |
+
龙卷风
|
| 4198 |
+
玉米粉圆饼
|
| 4199 |
+
乌龟
|
| 4200 |
+
大手提袋
|
| 4201 |
+
图腾柱
|
| 4202 |
+
龙猫
|
| 4203 |
+
巨嘴鸟
|
| 4204 |
+
触摸
|
| 4205 |
+
触地
|
| 4206 |
+
旅行
|
| 4207 |
+
旅游巴士
|
| 4208 |
+
导游
|
| 4209 |
+
游客
|
| 4210 |
+
旅游景点
|
| 4211 |
+
锦标赛
|
| 4212 |
+
拖车
|
| 4213 |
+
毛巾
|
| 4214 |
+
毛巾杆
|
| 4215 |
+
大厦
|
| 4216 |
+
塔桥
|
| 4217 |
+
小镇
|
| 4218 |
+
城镇广场
|
| 4219 |
+
玩具
|
| 4220 |
+
玩具车
|
| 4221 |
+
玩具枪
|
| 4222 |
+
玩具店
|
| 4223 |
+
跑道
|
| 4224 |
+
拖拉机
|
| 4225 |
+
贸易
|
| 4226 |
+
传统
|
| 4227 |
+
传统的
|
| 4228 |
+
交通
|
| 4229 |
+
锥形交通路标
|
| 4230 |
+
交通拥堵
|
| 4231 |
+
交通堵塞
|
| 4232 |
+
交通标志
|
| 4233 |
+
小道
|
| 4234 |
+
预告片
|
| 4235 |
+
拖车
|
| 4236 |
+
火车
|
| 4237 |
+
火车桥
|
| 4238 |
+
火车车厢
|
| 4239 |
+
火车内部
|
| 4240 |
+
火车轨道
|
| 4241 |
+
火车窗口
|
| 4242 |
+
教练
|
| 4243 |
+
训练
|
| 4244 |
+
训练长椅
|
| 4245 |
+
训练场
|
| 4246 |
+
电车/手推车
|
| 4247 |
+
蹦床
|
| 4248 |
+
变形金刚
|
| 4249 |
+
透明度
|
| 4250 |
+
旅行
|
| 4251 |
+
托盘/碟子
|
| 4252 |
+
跑步机
|
| 4253 |
+
美食
|
| 4254 |
+
树
|
| 4255 |
+
树枝
|
| 4256 |
+
林场
|
| 4257 |
+
树蛙
|
| 4258 |
+
树屋
|
| 4259 |
+
树根
|
| 4260 |
+
树干
|
| 4261 |
+
试验
|
| 4262 |
+
三角形
|
| 4263 |
+
铁人三项
|
| 4264 |
+
部落
|
| 4265 |
+
支流
|
| 4266 |
+
戏法/特技
|
| 4267 |
+
三轮车
|
| 4268 |
+
修剪
|
| 4269 |
+
三人组
|
| 4270 |
+
三脚架
|
| 4271 |
+
长号
|
| 4272 |
+
部队
|
| 4273 |
+
奖杯
|
| 4274 |
+
奖杯
|
| 4275 |
+
热带
|
| 4276 |
+
鳟鱼
|
| 4277 |
+
卡车
|
| 4278 |
+
卡车司机
|
| 4279 |
+
浴缸
|
| 4280 |
+
管子
|
| 4281 |
+
拖船
|
| 4282 |
+
郁金香
|
| 4283 |
+
金枪鱼
|
| 4284 |
+
苔原
|
| 4285 |
+
隧道
|
| 4286 |
+
涡轮
|
| 4287 |
+
火鸡
|
| 4288 |
+
转动
|
| 4289 |
+
芜菁
|
| 4290 |
+
绿松石
|
| 4291 |
+
炮塔
|
| 4292 |
+
乌龟
|
| 4293 |
+
獠牙
|
| 4294 |
+
电视演员
|
| 4295 |
+
电视柜
|
| 4296 |
+
电视剧
|
| 4297 |
+
电视节目类型
|
| 4298 |
+
电视名人
|
| 4299 |
+
电视节目
|
| 4300 |
+
情景喜剧
|
| 4301 |
+
电视塔
|
| 4302 |
+
枝条
|
| 4303 |
+
黄昏
|
| 4304 |
+
双胞胎
|
| 4305 |
+
麻线
|
| 4306 |
+
扭
|
| 4307 |
+
类型
|
| 4308 |
+
键入
|
| 4309 |
+
打字机
|
| 4310 |
+
尤克里里
|
| 4311 |
+
奥特曼
|
| 4312 |
+
伞
|
| 4313 |
+
内衣
|
| 4314 |
+
水下
|
| 4315 |
+
独角兽
|
| 4316 |
+
制服
|
| 4317 |
+
宇宙
|
| 4318 |
+
大学
|
| 4319 |
+
向上
|
| 4320 |
+
城市
|
| 4321 |
+
尿壶
|
| 4322 |
+
瓮
|
| 4323 |
+
使用
|
| 4324 |
+
用具
|
| 4325 |
+
杂物间
|
| 4326 |
+
吸尘器/真空
|
| 4327 |
+
谷
|
| 4328 |
+
阀门
|
| 4329 |
+
吸血鬼
|
| 4330 |
+
货车
|
| 4331 |
+
香草
|
| 4332 |
+
虚荣
|
| 4333 |
+
种类
|
| 4334 |
+
花瓶/瓶
|
| 4335 |
+
金库
|
| 4336 |
+
矢量卡通插图
|
| 4337 |
+
矢量图标
|
| 4338 |
+
蔬菜
|
| 4339 |
+
菜园
|
| 4340 |
+
蔬菜市场
|
| 4341 |
+
植被
|
| 4342 |
+
车辆
|
| 4343 |
+
面纱
|
| 4344 |
+
静脉
|
| 4345 |
+
天鹅绒
|
| 4346 |
+
自动售货机
|
| 4347 |
+
小贩
|
| 4348 |
+
通风孔
|
| 4349 |
+
胡蜂属
|
| 4350 |
+
船
|
| 4351 |
+
背心
|
| 4352 |
+
兽医
|
| 4353 |
+
经验丰富的
|
| 4354 |
+
兽医办公室
|
| 4355 |
+
高架桥
|
| 4356 |
+
视频
|
| 4357 |
+
摄像机
|
| 4358 |
+
电子游戏
|
| 4359 |
+
录像带
|
| 4360 |
+
视镜
|
| 4361 |
+
守夜
|
| 4362 |
+
别墅
|
| 4363 |
+
村庄
|
| 4364 |
+
藤蔓
|
| 4365 |
+
醋
|
| 4366 |
+
葡萄园
|
| 4367 |
+
暴力
|
| 4368 |
+
紫罗兰色
|
| 4369 |
+
小提琴
|
| 4370 |
+
小提琴家
|
| 4371 |
+
中提琴演奏者
|
| 4372 |
+
愿景
|
| 4373 |
+
遮阳板
|
| 4374 |
+
伏特加
|
| 4375 |
+
火山
|
| 4376 |
+
排球
|
| 4377 |
+
排球场
|
| 4378 |
+
排球运动员
|
| 4379 |
+
志愿者
|
| 4380 |
+
航行
|
| 4381 |
+
秃鹰
|
| 4382 |
+
华夫饼干
|
| 4383 |
+
华夫饼机
|
| 4384 |
+
货车
|
| 4385 |
+
马车车轮
|
| 4386 |
+
腰
|
| 4387 |
+
服务员
|
| 4388 |
+
候机室
|
| 4389 |
+
等候室
|
| 4390 |
+
走
|
| 4391 |
+
步行
|
| 4392 |
+
手杖
|
| 4393 |
+
挂钟
|
| 4394 |
+
壁纸
|
| 4395 |
+
核桃
|
| 4396 |
+
海象
|
| 4397 |
+
战争
|
| 4398 |
+
仓库
|
| 4399 |
+
温暖的
|
| 4400 |
+
警告标志
|
| 4401 |
+
战士
|
| 4402 |
+
军舰
|
| 4403 |
+
疣猪
|
| 4404 |
+
洗
|
| 4405 |
+
洗衣机/垫圈
|
| 4406 |
+
洗
|
| 4407 |
+
洗衣机
|
| 4408 |
+
黄蜂
|
| 4409 |
+
浪费
|
| 4410 |
+
废物容器
|
| 4411 |
+
手表
|
| 4412 |
+
水
|
| 4413 |
+
水鸟
|
| 4414 |
+
水牛
|
| 4415 |
+
水冷却器
|
| 4416 |
+
水滴
|
| 4417 |
+
水景
|
| 4418 |
+
热水器
|
| 4419 |
+
水位
|
| 4420 |
+
荷花
|
| 4421 |
+
水上乐园
|
| 4422 |
+
水管
|
| 4423 |
+
净水器
|
| 4424 |
+
滑水板
|
| 4425 |
+
水上运动
|
| 4426 |
+
水面
|
| 4427 |
+
水塔
|
| 4428 |
+
水彩
|
| 4429 |
+
水彩插图
|
| 4430 |
+
水彩画
|
| 4431 |
+
瀑布
|
| 4432 |
+
喷壶
|
| 4433 |
+
水印叠加图章
|
| 4434 |
+
西瓜
|
| 4435 |
+
防水外套
|
| 4436 |
+
水路
|
| 4437 |
+
波浪
|
| 4438 |
+
蜡
|
| 4439 |
+
武器
|
| 4440 |
+
穿着
|
| 4441 |
+
天气
|
| 4442 |
+
叶片
|
| 4443 |
+
网
|
| 4444 |
+
摄像头
|
| 4445 |
+
婚礼
|
| 4446 |
+
结婚戒指
|
| 4447 |
+
婚礼花束
|
| 4448 |
+
结婚蛋糕
|
| 4449 |
+
新婚夫妇
|
| 4450 |
+
婚礼请柬
|
| 4451 |
+
婚礼派对
|
| 4452 |
+
婚纱照
|
| 4453 |
+
婚礼摄影师
|
| 4454 |
+
婚纱摄影
|
| 4455 |
+
婚宴
|
| 4456 |
+
楔
|
| 4457 |
+
杂草
|
| 4458 |
+
重量
|
| 4459 |
+
体重秤
|
| 4460 |
+
焊接工
|
| 4461 |
+
井
|
| 4462 |
+
西餐
|
| 4463 |
+
西餐厅
|
| 4464 |
+
湿
|
| 4465 |
+
吧台
|
| 4466 |
+
潜水衣
|
| 4467 |
+
湿地
|
| 4468 |
+
潜水服
|
| 4469 |
+
鲸鱼
|
| 4470 |
+
鲸鲨
|
| 4471 |
+
小麦
|
| 4472 |
+
麦田
|
| 4473 |
+
车轮
|
| 4474 |
+
轮椅
|
| 4475 |
+
后轮支撑车技
|
| 4476 |
+
生奶油
|
| 4477 |
+
搅拌器
|
| 4478 |
+
胡须
|
| 4479 |
+
威士忌
|
| 4480 |
+
哨子
|
| 4481 |
+
白色
|
| 4482 |
+
白宫
|
| 4483 |
+
白葡萄酒
|
| 4484 |
+
白板
|
| 4485 |
+
便门
|
| 4486 |
+
宽的
|
| 4487 |
+
挥动
|
| 4488 |
+
假发
|
| 4489 |
+
Wii
|
| 4490 |
+
Wii手柄
|
| 4491 |
+
荒野
|
| 4492 |
+
角马
|
| 4493 |
+
野火
|
| 4494 |
+
野花
|
| 4495 |
+
野生动物
|
| 4496 |
+
柳树
|
| 4497 |
+
风
|
| 4498 |
+
风铃
|
| 4499 |
+
风电场
|
| 4500 |
+
风力涡轮机
|
| 4501 |
+
风车
|
| 4502 |
+
窗户
|
| 4503 |
+
窗台花盆箱
|
| 4504 |
+
橱窗展示
|
| 4505 |
+
窗框
|
| 4506 |
+
纱窗
|
| 4507 |
+
靠窗的座位
|
| 4508 |
+
窗台
|
| 4509 |
+
雨刮器
|
| 4510 |
+
挡风玻璃
|
| 4511 |
+
有风的
|
| 4512 |
+
酒瓶
|
| 4513 |
+
冷酒器
|
| 4514 |
+
酒柜
|
| 4515 |
+
酒窖
|
| 4516 |
+
酒杯
|
| 4517 |
+
酒架
|
| 4518 |
+
品酒
|
| 4519 |
+
酒庄
|
| 4520 |
+
翅膀
|
| 4521 |
+
冬天
|
| 4522 |
+
冬瓜
|
| 4523 |
+
冬天的早晨
|
| 4524 |
+
冬季场景
|
| 4525 |
+
冬季运动
|
| 4526 |
+
冬季风暴
|
| 4527 |
+
电线
|
| 4528 |
+
紫藤
|
| 4529 |
+
巫婆
|
| 4530 |
+
女巫帽子
|
| 4531 |
+
炒锅
|
| 4532 |
+
狼
|
| 4533 |
+
女人
|
| 4534 |
+
木头
|
| 4535 |
+
林鸳鸯
|
| 4536 |
+
木地板
|
| 4537 |
+
木墙
|
| 4538 |
+
烧木炉
|
| 4539 |
+
木匙
|
| 4540 |
+
林地
|
| 4541 |
+
啄木鸟
|
| 4542 |
+
木工刨
|
| 4543 |
+
羊毛
|
| 4544 |
+
工作
|
| 4545 |
+
练习卡
|
| 4546 |
+
工作台
|
| 4547 |
+
工人
|
| 4548 |
+
工作场所
|
| 4549 |
+
车间
|
| 4550 |
+
世界
|
| 4551 |
+
蠕虫
|
| 4552 |
+
敬拜
|
| 4553 |
+
伤口
|
| 4554 |
+
包
|
| 4555 |
+
裹身裙
|
| 4556 |
+
包装纸
|
| 4557 |
+
搏斗
|
| 4558 |
+
摔跤手
|
| 4559 |
+
皱纹
|
| 4560 |
+
腕带
|
| 4561 |
+
写
|
| 4562 |
+
作家
|
| 4563 |
+
手写/字迹
|
| 4564 |
+
毛笔
|
| 4565 |
+
写字桌
|
| 4566 |
+
游艇
|
| 4567 |
+
牦牛
|
| 4568 |
+
院子
|
| 4569 |
+
黄色
|
| 4570 |
+
瑜伽
|
| 4571 |
+
瑜伽垫
|
| 4572 |
+
酸奶
|
| 4573 |
+
轭
|
| 4574 |
+
蛋黄
|
| 4575 |
+
青年
|
| 4576 |
+
青年旅馆
|
| 4577 |
+
蒙古包
|
| 4578 |
+
斑马
|
| 4579 |
+
斑马线
|
| 4580 |
+
禅意花园
|
| 4581 |
+
拉链
|
| 4582 |
+
拉链
|
| 4583 |
+
僵尸
|
| 4584 |
+
粽子
|
| 4585 |
+
动物园
|
ram/data/ram_tag_list_threshold.txt
ADDED
@@ -0,0 +1,4585 @@
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
|
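For context on the listing that follows: the values below are evidently a per-tag threshold table (presumably the diff body of `ram/data/ram_tag_list_threshold.txt`), one decision threshold per line, index-aligned with the tagging model's vocabulary so each tag can be accepted at its own confidence level. As a minimal sketch — assuming a plain one-float-per-line file, and using illustrative function names rather than this repo's actual API — per-tag thresholds are typically applied to sigmoid scores like this:

```python
# Minimal sketch (illustrative names, not this repo's API): load a
# one-threshold-per-line file and keep tags whose sigmoid score clears
# their own per-tag threshold.
import torch

def load_thresholds(path: str) -> torch.Tensor:
    with open(path, "r", encoding="utf-8") as f:
        # Line i holds the threshold for tag i of the vocabulary.
        return torch.tensor([float(line) for line in f if line.strip()])

def select_tags(logits: torch.Tensor, thresholds: torch.Tensor) -> torch.Tensor:
    # Per-tag comparison: common or noisy tags can demand higher confidence.
    return torch.sigmoid(logits) > thresholds

# Toy example with a 5-tag vocabulary:
thresholds = torch.tensor([0.65, 0.65, 0.80, 0.71, 0.90])
logits = torch.tensor([2.0, 0.3, 1.6, 0.9, 0.1])
print(select_tags(logits, thresholds))  # tensor([ True, False,  True,  True, False])
```

Keeping one threshold per tag rather than a single global cutoff lets the model trade precision against recall tag by tag, which is why most entries in the list sit at a default while a minority are raised or lowered.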
+0.65
+0.65
+0.65
+0.65
+0.65
+0.65
+0.65
+0.8
+0.71
+0.75
+0.65
+0.65
+0.65
+0.8
+0.65
+0.8
+0.8
+0.65
+0.65
+0.65
[... lines 21–3189 of the listing omitted: one threshold per line in the same "+<value>" form; 0.65 is the default, selected tags are raised (up to 0.92, with 1 at lines 499, 1007, 1781, 1966, and 2444) and a few are lowered, down to a minimum of 0.45 at line 2777 ...]
| 3190 |
+
0.7
|
| 3191 |
+
0.65
|
| 3192 |
+
0.65
|
| 3193 |
+
0.65
|
| 3194 |
+
0.65
|
| 3195 |
+
0.65
|
| 3196 |
+
0.65
|
| 3197 |
+
0.65
|
| 3198 |
+
0.65
|
| 3199 |
+
0.8
|
| 3200 |
+
0.65
|
| 3201 |
+
0.65
|
| 3202 |
+
0.65
|
| 3203 |
+
0.65
|
| 3204 |
+
0.65
|
| 3205 |
+
0.8
|
| 3206 |
+
0.65
|
| 3207 |
+
0.65
|
| 3208 |
+
0.65
|
| 3209 |
+
0.8
|
| 3210 |
+
0.65
|
| 3211 |
+
1
|
| 3212 |
+
0.8
|
| 3213 |
+
0.8
|
| 3214 |
+
0.65
|
| 3215 |
+
0.65
|
| 3216 |
+
0.65
|
| 3217 |
+
0.8
|
| 3218 |
+
0.8
|
| 3219 |
+
0.8
|
| 3220 |
+
0.65
|
| 3221 |
+
0.74
|
| 3222 |
+
0.65
|
| 3223 |
+
0.65
|
| 3224 |
+
0.65
|
| 3225 |
+
0.8
|
| 3226 |
+
0.65
|
| 3227 |
+
0.8
|
| 3228 |
+
0.65
|
| 3229 |
+
0.65
|
| 3230 |
+
0.65
|
| 3231 |
+
0.65
|
| 3232 |
+
0.65
|
| 3233 |
+
0.65
|
| 3234 |
+
0.65
|
| 3235 |
+
0.65
|
| 3236 |
+
0.65
|
| 3237 |
+
0.65
|
| 3238 |
+
0.8
|
| 3239 |
+
0.8
|
| 3240 |
+
0.65
|
| 3241 |
+
0.65
|
| 3242 |
+
0.65
|
| 3243 |
+
0.65
|
| 3244 |
+
0.65
|
| 3245 |
+
0.65
|
| 3246 |
+
0.65
|
| 3247 |
+
0.65
|
| 3248 |
+
0.8
|
| 3249 |
+
0.65
|
| 3250 |
+
0.65
|
| 3251 |
+
0.65
|
| 3252 |
+
0.85
|
| 3253 |
+
0.65
|
| 3254 |
+
0.65
|
| 3255 |
+
0.65
|
| 3256 |
+
0.65
|
| 3257 |
+
0.8
|
| 3258 |
+
0.8
|
| 3259 |
+
0.65
|
| 3260 |
+
0.65
|
| 3261 |
+
0.65
|
| 3262 |
+
0.8
|
| 3263 |
+
0.65
|
| 3264 |
+
0.65
|
| 3265 |
+
0.65
|
| 3266 |
+
0.65
|
| 3267 |
+
0.65
|
| 3268 |
+
0.8
|
| 3269 |
+
0.65
|
| 3270 |
+
0.8
|
| 3271 |
+
0.65
|
| 3272 |
+
0.65
|
| 3273 |
+
0.65
|
| 3274 |
+
0.65
|
| 3275 |
+
0.65
|
| 3276 |
+
0.8
|
| 3277 |
+
0.9
|
| 3278 |
+
0.86
|
| 3279 |
+
0.8
|
| 3280 |
+
0.65
|
| 3281 |
+
0.8
|
| 3282 |
+
0.8
|
| 3283 |
+
0.65
|
| 3284 |
+
0.65
|
| 3285 |
+
0.65
|
| 3286 |
+
0.65
|
| 3287 |
+
0.65
|
| 3288 |
+
0.65
|
| 3289 |
+
0.65
|
| 3290 |
+
0.65
|
| 3291 |
+
0.64
|
| 3292 |
+
0.65
|
| 3293 |
+
0.65
|
| 3294 |
+
0.8
|
| 3295 |
+
0.8
|
| 3296 |
+
0.65
|
| 3297 |
+
0.87
|
| 3298 |
+
0.65
|
| 3299 |
+
0.65
|
| 3300 |
+
0.8
|
| 3301 |
+
0.8
|
| 3302 |
+
0.65
|
| 3303 |
+
0.65
|
| 3304 |
+
0.65
|
| 3305 |
+
0.65
|
| 3306 |
+
0.65
|
| 3307 |
+
0.65
|
| 3308 |
+
0.65
|
| 3309 |
+
0.65
|
| 3310 |
+
0.87
|
| 3311 |
+
0.65
|
| 3312 |
+
0.65
|
| 3313 |
+
0.65
|
| 3314 |
+
0.65
|
| 3315 |
+
0.65
|
| 3316 |
+
0.65
|
| 3317 |
+
0.8
|
| 3318 |
+
0.65
|
| 3319 |
+
0.65
|
| 3320 |
+
0.8
|
| 3321 |
+
0.65
|
| 3322 |
+
0.65
|
| 3323 |
+
0.65
|
| 3324 |
+
0.7
|
| 3325 |
+
0.65
|
| 3326 |
+
0.65
|
| 3327 |
+
0.8
|
| 3328 |
+
0.65
|
| 3329 |
+
0.65
|
| 3330 |
+
0.75
|
| 3331 |
+
0.65
|
| 3332 |
+
0.65
|
| 3333 |
+
0.65
|
| 3334 |
+
0.65
|
| 3335 |
+
0.65
|
| 3336 |
+
0.65
|
| 3337 |
+
0.85
|
| 3338 |
+
0.8
|
| 3339 |
+
0.65
|
| 3340 |
+
0.65
|
| 3341 |
+
0.65
|
| 3342 |
+
0.65
|
| 3343 |
+
0.65
|
| 3344 |
+
0.65
|
| 3345 |
+
0.65
|
| 3346 |
+
0.65
|
| 3347 |
+
0.8
|
| 3348 |
+
0.8
|
| 3349 |
+
0.65
|
| 3350 |
+
0.65
|
| 3351 |
+
0.65
|
| 3352 |
+
0.65
|
| 3353 |
+
0.65
|
| 3354 |
+
0.65
|
| 3355 |
+
0.65
|
| 3356 |
+
0.65
|
| 3357 |
+
0.8
|
| 3358 |
+
0.65
|
| 3359 |
+
0.65
|
| 3360 |
+
0.65
|
| 3361 |
+
0.71
|
| 3362 |
+
0.65
|
| 3363 |
+
0.65
|
| 3364 |
+
0.65
|
| 3365 |
+
0.65
|
| 3366 |
+
0.65
|
| 3367 |
+
0.65
|
| 3368 |
+
0.65
|
| 3369 |
+
0.65
|
| 3370 |
+
0.65
|
| 3371 |
+
0.65
|
| 3372 |
+
0.65
|
| 3373 |
+
0.65
|
| 3374 |
+
0.8
|
| 3375 |
+
0.65
|
| 3376 |
+
0.65
|
| 3377 |
+
0.65
|
| 3378 |
+
0.73
|
| 3379 |
+
0.65
|
| 3380 |
+
0.65
|
| 3381 |
+
0.8
|
| 3382 |
+
0.65
|
| 3383 |
+
0.65
|
| 3384 |
+
0.65
|
| 3385 |
+
0.65
|
| 3386 |
+
0.8
|
| 3387 |
+
0.8
|
| 3388 |
+
0.65
|
| 3389 |
+
0.65
|
| 3390 |
+
0.8
|
| 3391 |
+
0.65
|
| 3392 |
+
0.65
|
| 3393 |
+
0.65
|
| 3394 |
+
0.65
|
| 3395 |
+
0.9
|
| 3396 |
+
0.65
|
| 3397 |
+
0.65
|
| 3398 |
+
0.8
|
| 3399 |
+
0.65
|
| 3400 |
+
0.86
|
| 3401 |
+
0.65
|
| 3402 |
+
0.65
|
| 3403 |
+
0.65
|
| 3404 |
+
0.65
|
| 3405 |
+
0.9
|
| 3406 |
+
0.65
|
| 3407 |
+
0.65
|
| 3408 |
+
0.65
|
| 3409 |
+
0.65
|
| 3410 |
+
0.65
|
| 3411 |
+
0.65
|
| 3412 |
+
0.8
|
| 3413 |
+
0.75
|
| 3414 |
+
0.65
|
| 3415 |
+
0.8
|
| 3416 |
+
0.65
|
| 3417 |
+
0.65
|
| 3418 |
+
0.65
|
| 3419 |
+
0.65
|
| 3420 |
+
0.65
|
| 3421 |
+
0.65
|
| 3422 |
+
0.65
|
| 3423 |
+
0.65
|
| 3424 |
+
0.65
|
| 3425 |
+
0.8
|
| 3426 |
+
0.65
|
| 3427 |
+
0.65
|
| 3428 |
+
0.65
|
| 3429 |
+
0.65
|
| 3430 |
+
0.65
|
| 3431 |
+
0.65
|
| 3432 |
+
0.65
|
| 3433 |
+
0.65
|
| 3434 |
+
0.65
|
| 3435 |
+
0.65
|
| 3436 |
+
0.65
|
| 3437 |
+
0.8
|
| 3438 |
+
0.88
|
| 3439 |
+
0.65
|
| 3440 |
+
0.8
|
| 3441 |
+
0.65
|
| 3442 |
+
0.8
|
| 3443 |
+
0.65
|
| 3444 |
+
0.65
|
| 3445 |
+
0.65
|
| 3446 |
+
0.9
|
| 3447 |
+
0.65
|
| 3448 |
+
0.65
|
| 3449 |
+
0.65
|
| 3450 |
+
0.65
|
| 3451 |
+
0.65
|
| 3452 |
+
0.8
|
| 3453 |
+
0.65
|
| 3454 |
+
0.8
|
| 3455 |
+
0.65
|
| 3456 |
+
0.65
|
| 3457 |
+
0.65
|
| 3458 |
+
0.65
|
| 3459 |
+
0.65
|
| 3460 |
+
0.81
|
| 3461 |
+
0.65
|
| 3462 |
+
0.65
|
| 3463 |
+
0.8
|
| 3464 |
+
0.65
|
| 3465 |
+
0.65
|
| 3466 |
+
0.9
|
| 3467 |
+
0.8
|
| 3468 |
+
0.65
|
| 3469 |
+
0.65
|
| 3470 |
+
0.65
|
| 3471 |
+
0.8
|
| 3472 |
+
0.65
|
| 3473 |
+
0.65
|
| 3474 |
+
0.65
|
| 3475 |
+
0.65
|
| 3476 |
+
0.65
|
| 3477 |
+
0.65
|
| 3478 |
+
0.65
|
| 3479 |
+
0.65
|
| 3480 |
+
0.8
|
| 3481 |
+
0.9
|
| 3482 |
+
0.65
|
| 3483 |
+
0.65
|
| 3484 |
+
0.65
|
| 3485 |
+
0.65
|
| 3486 |
+
0.7
|
| 3487 |
+
0.65
|
| 3488 |
+
0.65
|
| 3489 |
+
0.65
|
| 3490 |
+
0.8
|
| 3491 |
+
0.65
|
| 3492 |
+
0.65
|
| 3493 |
+
0.65
|
| 3494 |
+
0.65
|
| 3495 |
+
0.65
|
| 3496 |
+
0.65
|
| 3497 |
+
0.65
|
| 3498 |
+
0.65
|
| 3499 |
+
0.65
|
| 3500 |
+
0.65
|
| 3501 |
+
0.65
|
| 3502 |
+
0.77
|
| 3503 |
+
0.65
|
| 3504 |
+
0.65
|
| 3505 |
+
0.65
|
| 3506 |
+
0.65
|
| 3507 |
+
0.65
|
| 3508 |
+
0.85
|
| 3509 |
+
0.65
|
| 3510 |
+
0.65
|
| 3511 |
+
0.65
|
| 3512 |
+
0.65
|
| 3513 |
+
0.65
|
| 3514 |
+
0.65
|
| 3515 |
+
0.65
|
| 3516 |
+
0.65
|
| 3517 |
+
0.65
|
| 3518 |
+
0.65
|
| 3519 |
+
0.8
|
| 3520 |
+
0.65
|
| 3521 |
+
0.65
|
| 3522 |
+
0.87
|
| 3523 |
+
0.65
|
| 3524 |
+
0.65
|
| 3525 |
+
0.65
|
| 3526 |
+
0.65
|
| 3527 |
+
0.65
|
| 3528 |
+
0.65
|
| 3529 |
+
0.65
|
| 3530 |
+
0.65
|
| 3531 |
+
0.65
|
| 3532 |
+
0.65
|
| 3533 |
+
0.65
|
| 3534 |
+
0.8
|
| 3535 |
+
0.8
|
| 3536 |
+
0.65
|
| 3537 |
+
0.65
|
| 3538 |
+
0.8
|
| 3539 |
+
0.65
|
| 3540 |
+
0.65
|
| 3541 |
+
0.65
|
| 3542 |
+
0.65
|
| 3543 |
+
0.65
|
| 3544 |
+
0.65
|
| 3545 |
+
0.65
|
| 3546 |
+
0.65
|
| 3547 |
+
0.9
|
| 3548 |
+
0.65
|
| 3549 |
+
0.65
|
| 3550 |
+
0.65
|
| 3551 |
+
0.65
|
| 3552 |
+
0.8
|
| 3553 |
+
0.65
|
| 3554 |
+
0.65
|
| 3555 |
+
0.65
|
| 3556 |
+
0.65
|
| 3557 |
+
0.65
|
| 3558 |
+
0.65
|
| 3559 |
+
0.65
|
| 3560 |
+
0.8
|
| 3561 |
+
0.65
|
| 3562 |
+
0.8
|
| 3563 |
+
0.65
|
| 3564 |
+
0.65
|
| 3565 |
+
0.65
|
| 3566 |
+
0.65
|
| 3567 |
+
0.65
|
| 3568 |
+
0.65
|
| 3569 |
+
0.8
|
| 3570 |
+
0.65
|
| 3571 |
+
0.65
|
| 3572 |
+
0.65
|
| 3573 |
+
0.65
|
| 3574 |
+
0.65
|
| 3575 |
+
0.65
|
| 3576 |
+
0.65
|
| 3577 |
+
0.65
|
| 3578 |
+
0.57
|
| 3579 |
+
0.65
|
| 3580 |
+
0.65
|
| 3581 |
+
0.8
|
| 3582 |
+
0.65
|
| 3583 |
+
0.65
|
| 3584 |
+
0.8
|
| 3585 |
+
0.8
|
| 3586 |
+
0.65
|
| 3587 |
+
0.65
|
| 3588 |
+
0.65
|
| 3589 |
+
0.65
|
| 3590 |
+
0.76
|
| 3591 |
+
1
|
| 3592 |
+
0.8
|
| 3593 |
+
0.65
|
| 3594 |
+
0.65
|
| 3595 |
+
0.58
|
| 3596 |
+
0.8
|
| 3597 |
+
0.65
|
| 3598 |
+
0.65
|
| 3599 |
+
0.65
|
| 3600 |
+
0.65
|
| 3601 |
+
0.65
|
| 3602 |
+
0.8
|
| 3603 |
+
1
|
| 3604 |
+
0.65
|
| 3605 |
+
0.8
|
| 3606 |
+
0.65
|
| 3607 |
+
0.65
|
| 3608 |
+
0.65
|
| 3609 |
+
0.8
|
| 3610 |
+
0.65
|
| 3611 |
+
0.9
|
| 3612 |
+
0.65
|
| 3613 |
+
0.65
|
| 3614 |
+
0.65
|
| 3615 |
+
0.65
|
| 3616 |
+
0.65
|
| 3617 |
+
0.65
|
| 3618 |
+
0.65
|
| 3619 |
+
0.87
|
| 3620 |
+
0.8
|
| 3621 |
+
0.9
|
| 3622 |
+
0.8
|
| 3623 |
+
0.8
|
| 3624 |
+
0.65
|
| 3625 |
+
0.65
|
| 3626 |
+
0.65
|
| 3627 |
+
0.65
|
| 3628 |
+
0.65
|
| 3629 |
+
0.65
|
| 3630 |
+
0.8
|
| 3631 |
+
0.65
|
| 3632 |
+
0.65
|
| 3633 |
+
0.65
|
| 3634 |
+
0.65
|
| 3635 |
+
0.8
|
| 3636 |
+
0.65
|
| 3637 |
+
0.65
|
| 3638 |
+
0.8
|
| 3639 |
+
0.65
|
| 3640 |
+
0.65
|
| 3641 |
+
0.65
|
| 3642 |
+
0.8
|
| 3643 |
+
0.65
|
| 3644 |
+
0.65
|
| 3645 |
+
0.65
|
| 3646 |
+
0.65
|
| 3647 |
+
0.65
|
| 3648 |
+
0.65
|
| 3649 |
+
0.65
|
| 3650 |
+
0.65
|
| 3651 |
+
0.65
|
| 3652 |
+
0.65
|
| 3653 |
+
0.8
|
| 3654 |
+
0.65
|
| 3655 |
+
0.8
|
| 3656 |
+
0.65
|
| 3657 |
+
0.65
|
| 3658 |
+
0.65
|
| 3659 |
+
0.65
|
| 3660 |
+
0.8
|
| 3661 |
+
0.65
|
| 3662 |
+
0.65
|
| 3663 |
+
0.65
|
| 3664 |
+
0.8
|
| 3665 |
+
0.65
|
| 3666 |
+
0.65
|
| 3667 |
+
0.65
|
| 3668 |
+
0.65
|
| 3669 |
+
0.65
|
| 3670 |
+
0.65
|
| 3671 |
+
0.87
|
| 3672 |
+
0.68
|
| 3673 |
+
0.8
|
| 3674 |
+
0.65
|
| 3675 |
+
0.65
|
| 3676 |
+
0.65
|
| 3677 |
+
0.65
|
| 3678 |
+
0.8
|
| 3679 |
+
0.65
|
| 3680 |
+
0.65
|
| 3681 |
+
0.65
|
| 3682 |
+
0.65
|
| 3683 |
+
0.65
|
| 3684 |
+
0.8
|
| 3685 |
+
0.65
|
| 3686 |
+
0.65
|
| 3687 |
+
0.65
|
| 3688 |
+
0.65
|
| 3689 |
+
0.65
|
| 3690 |
+
0.65
|
| 3691 |
+
0.8
|
| 3692 |
+
0.65
|
| 3693 |
+
0.65
|
| 3694 |
+
0.65
|
| 3695 |
+
0.99
|
| 3696 |
+
0.8
|
| 3697 |
+
0.77
|
| 3698 |
+
0.65
|
| 3699 |
+
0.9
|
| 3700 |
+
0.65
|
| 3701 |
+
0.65
|
| 3702 |
+
0.88
|
| 3703 |
+
0.65
|
| 3704 |
+
0.65
|
| 3705 |
+
0.65
|
| 3706 |
+
0.65
|
| 3707 |
+
0.9
|
| 3708 |
+
0.65
|
| 3709 |
+
0.88
|
| 3710 |
+
0.65
|
| 3711 |
+
0.65
|
| 3712 |
+
0.65
|
| 3713 |
+
0.65
|
| 3714 |
+
0.65
|
| 3715 |
+
0.65
|
| 3716 |
+
0.89
|
| 3717 |
+
0.65
|
| 3718 |
+
0.65
|
| 3719 |
+
0.8
|
| 3720 |
+
0.8
|
| 3721 |
+
0.65
|
| 3722 |
+
0.7
|
| 3723 |
+
0.65
|
| 3724 |
+
0.65
|
| 3725 |
+
0.8
|
| 3726 |
+
0.9
|
| 3727 |
+
0.65
|
| 3728 |
+
0.65
|
| 3729 |
+
0.65
|
| 3730 |
+
0.8
|
| 3731 |
+
0.65
|
| 3732 |
+
0.65
|
| 3733 |
+
0.8
|
| 3734 |
+
0.8
|
| 3735 |
+
0.65
|
| 3736 |
+
0.65
|
| 3737 |
+
0.65
|
| 3738 |
+
0.8
|
| 3739 |
+
0.65
|
| 3740 |
+
0.65
|
| 3741 |
+
0.65
|
| 3742 |
+
0.65
|
| 3743 |
+
0.65
|
| 3744 |
+
0.65
|
| 3745 |
+
0.65
|
| 3746 |
+
0.8
|
| 3747 |
+
0.8
|
| 3748 |
+
0.8
|
| 3749 |
+
0.65
|
| 3750 |
+
0.77
|
| 3751 |
+
0.65
|
| 3752 |
+
0.65
|
| 3753 |
+
0.65
|
| 3754 |
+
0.65
|
| 3755 |
+
0.79
|
| 3756 |
+
0.65
|
| 3757 |
+
0.65
|
| 3758 |
+
0.65
|
| 3759 |
+
0.65
|
| 3760 |
+
0.65
|
| 3761 |
+
0.8
|
| 3762 |
+
0.65
|
| 3763 |
+
0.65
|
| 3764 |
+
0.65
|
| 3765 |
+
0.65
|
| 3766 |
+
0.8
|
| 3767 |
+
0.65
|
| 3768 |
+
0.65
|
| 3769 |
+
0.65
|
| 3770 |
+
0.65
|
| 3771 |
+
0.65
|
| 3772 |
+
0.65
|
| 3773 |
+
0.65
|
| 3774 |
+
0.65
|
| 3775 |
+
0.65
|
| 3776 |
+
0.65
|
| 3777 |
+
0.65
|
| 3778 |
+
0.8
|
| 3779 |
+
0.65
|
| 3780 |
+
0.65
|
| 3781 |
+
0.65
|
| 3782 |
+
0.8
|
| 3783 |
+
0.65
|
| 3784 |
+
0.8
|
| 3785 |
+
0.65
|
| 3786 |
+
0.65
|
| 3787 |
+
0.65
|
| 3788 |
+
0.65
|
| 3789 |
+
0.65
|
| 3790 |
+
0.8
|
| 3791 |
+
0.8
|
| 3792 |
+
0.65
|
| 3793 |
+
0.65
|
| 3794 |
+
0.65
|
| 3795 |
+
0.85
|
| 3796 |
+
0.65
|
| 3797 |
+
0.65
|
| 3798 |
+
0.65
|
| 3799 |
+
0.65
|
| 3800 |
+
0.65
|
| 3801 |
+
0.65
|
| 3802 |
+
0.52
|
| 3803 |
+
0.65
|
| 3804 |
+
0.65
|
| 3805 |
+
0.8
|
| 3806 |
+
0.65
|
| 3807 |
+
0.65
|
| 3808 |
+
0.65
|
| 3809 |
+
0.65
|
| 3810 |
+
0.65
|
| 3811 |
+
0.65
|
| 3812 |
+
0.8
|
| 3813 |
+
0.65
|
| 3814 |
+
0.65
|
| 3815 |
+
0.65
|
| 3816 |
+
0.65
|
| 3817 |
+
0.65
|
| 3818 |
+
0.65
|
| 3819 |
+
0.65
|
| 3820 |
+
0.8
|
| 3821 |
+
0.65
|
| 3822 |
+
0.86
|
| 3823 |
+
0.65
|
| 3824 |
+
0.65
|
| 3825 |
+
0.8
|
| 3826 |
+
0.56
|
| 3827 |
+
0.65
|
| 3828 |
+
0.65
|
| 3829 |
+
0.65
|
| 3830 |
+
0.8
|
| 3831 |
+
0.65
|
| 3832 |
+
0.8
|
| 3833 |
+
0.8
|
| 3834 |
+
0.65
|
| 3835 |
+
0.65
|
| 3836 |
+
0.65
|
| 3837 |
+
0.65
|
| 3838 |
+
0.65
|
| 3839 |
+
0.65
|
| 3840 |
+
0.65
|
| 3841 |
+
0.8
|
| 3842 |
+
0.65
|
| 3843 |
+
0.65
|
| 3844 |
+
0.65
|
| 3845 |
+
0.65
|
| 3846 |
+
0.72
|
| 3847 |
+
0.65
|
| 3848 |
+
0.65
|
| 3849 |
+
0.65
|
| 3850 |
+
0.8
|
| 3851 |
+
0.8
|
| 3852 |
+
0.65
|
| 3853 |
+
0.9
|
| 3854 |
+
0.65
|
| 3855 |
+
0.65
|
| 3856 |
+
0.8
|
| 3857 |
+
0.65
|
| 3858 |
+
0.8
|
| 3859 |
+
0.6
|
| 3860 |
+
0.65
|
| 3861 |
+
0.65
|
| 3862 |
+
0.65
|
| 3863 |
+
0.8
|
| 3864 |
+
0.65
|
| 3865 |
+
0.65
|
| 3866 |
+
0.65
|
| 3867 |
+
0.8
|
| 3868 |
+
0.65
|
| 3869 |
+
0.88
|
| 3870 |
+
0.65
|
| 3871 |
+
0.65
|
| 3872 |
+
0.65
|
| 3873 |
+
0.65
|
| 3874 |
+
0.8
|
| 3875 |
+
0.65
|
| 3876 |
+
0.65
|
| 3877 |
+
0.89
|
| 3878 |
+
0.85
|
| 3879 |
+
0.65
|
| 3880 |
+
0.65
|
| 3881 |
+
0.65
|
| 3882 |
+
0.65
|
| 3883 |
+
0.65
|
| 3884 |
+
0.65
|
| 3885 |
+
0.65
|
| 3886 |
+
0.87
|
| 3887 |
+
0.65
|
| 3888 |
+
0.65
|
| 3889 |
+
0.65
|
| 3890 |
+
0.65
|
| 3891 |
+
0.65
|
| 3892 |
+
0.65
|
| 3893 |
+
0.8
|
| 3894 |
+
0.65
|
| 3895 |
+
0.8
|
| 3896 |
+
0.65
|
| 3897 |
+
0.65
|
| 3898 |
+
0.65
|
| 3899 |
+
0.65
|
| 3900 |
+
0.65
|
| 3901 |
+
0.65
|
| 3902 |
+
0.65
|
| 3903 |
+
0.65
|
| 3904 |
+
0.65
|
| 3905 |
+
0.75
|
| 3906 |
+
0.65
|
| 3907 |
+
0.65
|
| 3908 |
+
0.65
|
| 3909 |
+
0.65
|
| 3910 |
+
0.54
|
| 3911 |
+
1
|
| 3912 |
+
0.65
|
| 3913 |
+
0.65
|
| 3914 |
+
0.75
|
| 3915 |
+
0.65
|
| 3916 |
+
0.75
|
| 3917 |
+
0.65
|
| 3918 |
+
0.65
|
| 3919 |
+
0.65
|
| 3920 |
+
0.8
|
| 3921 |
+
0.65
|
| 3922 |
+
0.65
|
| 3923 |
+
0.8
|
| 3924 |
+
0.65
|
| 3925 |
+
0.65
|
| 3926 |
+
0.8
|
| 3927 |
+
0.65
|
| 3928 |
+
0.65
|
| 3929 |
+
0.65
|
| 3930 |
+
0.65
|
| 3931 |
+
0.65
|
| 3932 |
+
0.65
|
| 3933 |
+
0.65
|
| 3934 |
+
0.9
|
| 3935 |
+
0.9
|
| 3936 |
+
0.62
|
| 3937 |
+
0.65
|
| 3938 |
+
0.65
|
| 3939 |
+
0.65
|
| 3940 |
+
0.65
|
| 3941 |
+
0.86
|
| 3942 |
+
0.65
|
| 3943 |
+
0.65
|
| 3944 |
+
0.65
|
| 3945 |
+
0.65
|
| 3946 |
+
0.65
|
| 3947 |
+
0.65
|
| 3948 |
+
0.65
|
| 3949 |
+
0.65
|
| 3950 |
+
0.65
|
| 3951 |
+
0.65
|
| 3952 |
+
0.65
|
| 3953 |
+
0.65
|
| 3954 |
+
0.8
|
| 3955 |
+
0.65
|
| 3956 |
+
0.8
|
| 3957 |
+
0.8
|
| 3958 |
+
0.65
|
| 3959 |
+
0.8
|
| 3960 |
+
0.65
|
| 3961 |
+
0.65
|
| 3962 |
+
0.65
|
| 3963 |
+
0.65
|
| 3964 |
+
0.65
|
| 3965 |
+
0.65
|
| 3966 |
+
0.65
|
| 3967 |
+
0.8
|
| 3968 |
+
0.65
|
| 3969 |
+
0.82
|
| 3970 |
+
0.65
|
| 3971 |
+
0.65
|
| 3972 |
+
0.65
|
| 3973 |
+
0.65
|
| 3974 |
+
0.65
|
| 3975 |
+
0.65
|
| 3976 |
+
0.65
|
| 3977 |
+
0.65
|
| 3978 |
+
0.8
|
| 3979 |
+
0.65
|
| 3980 |
+
0.65
|
| 3981 |
+
0.65
|
| 3982 |
+
0.9
|
| 3983 |
+
0.74
|
| 3984 |
+
0.8
|
| 3985 |
+
0.65
|
| 3986 |
+
0.8
|
| 3987 |
+
0.8
|
| 3988 |
+
0.7
|
| 3989 |
+
0.65
|
| 3990 |
+
0.65
|
| 3991 |
+
0.65
|
| 3992 |
+
0.89
|
| 3993 |
+
0.65
|
| 3994 |
+
0.65
|
| 3995 |
+
0.8
|
| 3996 |
+
0.8
|
| 3997 |
+
0.8
|
| 3998 |
+
0.8
|
| 3999 |
+
0.65
|
| 4000 |
+
0.8
|
| 4001 |
+
0.65
|
| 4002 |
+
0.65
|
| 4003 |
+
0.65
|
| 4004 |
+
0.9
|
| 4005 |
+
0.65
|
| 4006 |
+
0.65
|
| 4007 |
+
0.65
|
| 4008 |
+
0.8
|
| 4009 |
+
0.8
|
| 4010 |
+
0.84
|
| 4011 |
+
0.8
|
| 4012 |
+
0.65
|
| 4013 |
+
0.65
|
| 4014 |
+
0.8
|
| 4015 |
+
0.75
|
| 4016 |
+
0.65
|
| 4017 |
+
0.65
|
| 4018 |
+
0.65
|
| 4019 |
+
0.89
|
| 4020 |
+
0.65
|
| 4021 |
+
0.65
|
| 4022 |
+
0.65
|
| 4023 |
+
0.65
|
| 4024 |
+
0.82
|
| 4025 |
+
0.65
|
| 4026 |
+
0.65
|
| 4027 |
+
0.65
|
| 4028 |
+
0.8
|
| 4029 |
+
0.65
|
| 4030 |
+
0.8
|
| 4031 |
+
0.65
|
| 4032 |
+
0.8
|
| 4033 |
+
0.65
|
| 4034 |
+
0.65
|
| 4035 |
+
0.65
|
| 4036 |
+
0.84
|
| 4037 |
+
0.65
|
| 4038 |
+
0.65
|
| 4039 |
+
0.65
|
| 4040 |
+
0.65
|
| 4041 |
+
0.65
|
| 4042 |
+
0.65
|
| 4043 |
+
0.65
|
| 4044 |
+
0.65
|
| 4045 |
+
0.8
|
| 4046 |
+
0.65
|
| 4047 |
+
0.65
|
| 4048 |
+
0.65
|
| 4049 |
+
0.65
|
| 4050 |
+
0.8
|
| 4051 |
+
0.8
|
| 4052 |
+
0.8
|
| 4053 |
+
0.65
|
| 4054 |
+
0.65
|
| 4055 |
+
0.65
|
| 4056 |
+
0.65
|
| 4057 |
+
0.65
|
| 4058 |
+
0.65
|
| 4059 |
+
0.65
|
| 4060 |
+
0.65
|
| 4061 |
+
0.65
|
| 4062 |
+
0.65
|
| 4063 |
+
0.65
|
| 4064 |
+
0.65
|
| 4065 |
+
0.65
|
| 4066 |
+
0.65
|
| 4067 |
+
0.8
|
| 4068 |
+
0.65
|
| 4069 |
+
0.8
|
| 4070 |
+
0.65
|
| 4071 |
+
0.8
|
| 4072 |
+
0.65
|
| 4073 |
+
0.7
|
| 4074 |
+
0.65
|
| 4075 |
+
0.65
|
| 4076 |
+
0.65
|
| 4077 |
+
0.65
|
| 4078 |
+
0.65
|
| 4079 |
+
0.65
|
| 4080 |
+
0.65
|
| 4081 |
+
0.65
|
| 4082 |
+
0.9
|
| 4083 |
+
0.65
|
| 4084 |
+
0.65
|
| 4085 |
+
0.8
|
| 4086 |
+
0.65
|
| 4087 |
+
0.65
|
| 4088 |
+
0.65
|
| 4089 |
+
0.65
|
| 4090 |
+
0.65
|
| 4091 |
+
0.65
|
| 4092 |
+
0.8
|
| 4093 |
+
0.65
|
| 4094 |
+
0.65
|
| 4095 |
+
0.65
|
| 4096 |
+
0.65
|
| 4097 |
+
0.65
|
| 4098 |
+
0.65
|
| 4099 |
+
0.8
|
| 4100 |
+
0.74
|
| 4101 |
+
0.65
|
| 4102 |
+
0.8
|
| 4103 |
+
0.65
|
| 4104 |
+
0.65
|
| 4105 |
+
0.65
|
| 4106 |
+
0.9
|
| 4107 |
+
0.65
|
| 4108 |
+
0.65
|
| 4109 |
+
0.65
|
| 4110 |
+
0.65
|
| 4111 |
+
0.85
|
| 4112 |
+
0.65
|
| 4113 |
+
0.9
|
| 4114 |
+
0.9
|
| 4115 |
+
0.65
|
| 4116 |
+
0.65
|
| 4117 |
+
0.65
|
| 4118 |
+
0.63
|
| 4119 |
+
0.82
|
| 4120 |
+
0.65
|
| 4121 |
+
0.65
|
| 4122 |
+
0.8
|
| 4123 |
+
0.65
|
| 4124 |
+
0.65
|
| 4125 |
+
0.65
|
| 4126 |
+
0.65
|
| 4127 |
+
0.65
|
| 4128 |
+
0.65
|
| 4129 |
+
0.8
|
| 4130 |
+
0.65
|
| 4131 |
+
0.65
|
| 4132 |
+
0.8
|
| 4133 |
+
0.65
|
| 4134 |
+
0.65
|
| 4135 |
+
0.8
|
| 4136 |
+
0.65
|
| 4137 |
+
0.65
|
| 4138 |
+
0.65
|
| 4139 |
+
0.65
|
| 4140 |
+
0.65
|
| 4141 |
+
0.65
|
| 4142 |
+
0.65
|
| 4143 |
+
0.65
|
| 4144 |
+
0.8
|
| 4145 |
+
0.65
|
| 4146 |
+
0.65
|
| 4147 |
+
0.65
|
| 4148 |
+
0.65
|
| 4149 |
+
0.8
|
| 4150 |
+
0.7
|
| 4151 |
+
0.65
|
| 4152 |
+
0.65
|
| 4153 |
+
0.65
|
| 4154 |
+
0.65
|
| 4155 |
+
0.65
|
| 4156 |
+
0.9
|
| 4157 |
+
0.65
|
| 4158 |
+
0.65
|
| 4159 |
+
0.74
|
| 4160 |
+
0.9
|
| 4161 |
+
0.65
|
| 4162 |
+
0.8
|
| 4163 |
+
0.65
|
| 4164 |
+
0.65
|
| 4165 |
+
0.58
|
| 4166 |
+
0.65
|
| 4167 |
+
0.65
|
| 4168 |
+
0.65
|
| 4169 |
+
0.65
|
| 4170 |
+
0.65
|
| 4171 |
+
0.65
|
| 4172 |
+
0.89
|
| 4173 |
+
0.75
|
| 4174 |
+
0.65
|
| 4175 |
+
0.65
|
| 4176 |
+
0.8
|
| 4177 |
+
0.65
|
| 4178 |
+
0.65
|
| 4179 |
+
0.88
|
| 4180 |
+
0.65
|
| 4181 |
+
0.65
|
| 4182 |
+
0.65
|
| 4183 |
+
0.8
|
| 4184 |
+
0.65
|
| 4185 |
+
0.65
|
| 4186 |
+
0.65
|
| 4187 |
+
0.65
|
| 4188 |
+
0.65
|
| 4189 |
+
0.65
|
| 4190 |
+
0.65
|
| 4191 |
+
0.89
|
| 4192 |
+
0.65
|
| 4193 |
+
0.65
|
| 4194 |
+
0.65
|
| 4195 |
+
0.65
|
| 4196 |
+
0.65
|
| 4197 |
+
0.65
|
| 4198 |
+
0.65
|
| 4199 |
+
0.65
|
| 4200 |
+
0.65
|
| 4201 |
+
0.65
|
| 4202 |
+
0.65
|
| 4203 |
+
0.65
|
| 4204 |
+
0.8
|
| 4205 |
+
0.8
|
| 4206 |
+
0.8
|
| 4207 |
+
0.65
|
| 4208 |
+
0.65
|
| 4209 |
+
0.8
|
| 4210 |
+
0.8
|
| 4211 |
+
0.65
|
| 4212 |
+
0.65
|
| 4213 |
+
0.87
|
| 4214 |
+
0.65
|
| 4215 |
+
0.65
|
| 4216 |
+
0.65
|
| 4217 |
+
0.8
|
| 4218 |
+
0.65
|
| 4219 |
+
0.64
|
| 4220 |
+
0.65
|
| 4221 |
+
0.65
|
| 4222 |
+
0.65
|
| 4223 |
+
0.8
|
| 4224 |
+
0.87
|
| 4225 |
+
0.65
|
| 4226 |
+
0.65
|
| 4227 |
+
0.8
|
| 4228 |
+
0.9
|
| 4229 |
+
0.65
|
| 4230 |
+
0.65
|
| 4231 |
+
0.65
|
| 4232 |
+
0.65
|
| 4233 |
+
0.8
|
| 4234 |
+
0.8
|
| 4235 |
+
0.65
|
| 4236 |
+
0.89
|
| 4237 |
+
0.65
|
| 4238 |
+
0.65
|
| 4239 |
+
0.65
|
| 4240 |
+
0.65
|
| 4241 |
+
0.65
|
| 4242 |
+
0.65
|
| 4243 |
+
0.8
|
| 4244 |
+
0.65
|
| 4245 |
+
0.65
|
| 4246 |
+
0.65
|
| 4247 |
+
0.83
|
| 4248 |
+
0.65
|
| 4249 |
+
0.65
|
| 4250 |
+
0.8
|
| 4251 |
+
0.65
|
| 4252 |
+
0.9
|
| 4253 |
+
0.65
|
| 4254 |
+
0.8
|
| 4255 |
+
0.8
|
| 4256 |
+
0.65
|
| 4257 |
+
0.65
|
| 4258 |
+
0.65
|
| 4259 |
+
0.65
|
| 4260 |
+
0.65
|
| 4261 |
+
0.65
|
| 4262 |
+
0.8
|
| 4263 |
+
0.65
|
| 4264 |
+
0.65
|
| 4265 |
+
0.65
|
| 4266 |
+
0.65
|
| 4267 |
+
0.65
|
| 4268 |
+
0.65
|
| 4269 |
+
0.65
|
| 4270 |
+
0.65
|
| 4271 |
+
0.65
|
| 4272 |
+
0.65
|
| 4273 |
+
0.78
|
| 4274 |
+
0.65
|
| 4275 |
+
0.8
|
| 4276 |
+
0.65
|
| 4277 |
+
0.9
|
| 4278 |
+
0.65
|
| 4279 |
+
0.8
|
| 4280 |
+
0.65
|
| 4281 |
+
0.65
|
| 4282 |
+
0.65
|
| 4283 |
+
0.65
|
| 4284 |
+
0.65
|
| 4285 |
+
0.9
|
| 4286 |
+
0.65
|
| 4287 |
+
0.88
|
| 4288 |
+
0.8
|
| 4289 |
+
0.65
|
| 4290 |
+
0.65
|
| 4291 |
+
0.65
|
| 4292 |
+
0.81
|
| 4293 |
+
0.65
|
| 4294 |
+
0.65
|
| 4295 |
+
0.65
|
| 4296 |
+
0.65
|
| 4297 |
+
0.65
|
| 4298 |
+
0.65
|
| 4299 |
+
0.65
|
| 4300 |
+
0.65
|
| 4301 |
+
0.65
|
| 4302 |
+
0.65
|
| 4303 |
+
0.65
|
| 4304 |
+
0.65
|
| 4305 |
+
0.65
|
| 4306 |
+
0.65
|
| 4307 |
+
0.8
|
| 4308 |
+
0.65
|
| 4309 |
+
0.65
|
| 4310 |
+
0.65
|
| 4311 |
+
0.65
|
| 4312 |
+
0.77
|
| 4313 |
+
0.65
|
| 4314 |
+
0.65
|
| 4315 |
+
0.65
|
| 4316 |
+
0.8
|
| 4317 |
+
0.8
|
| 4318 |
+
0.8
|
| 4319 |
+
0.8
|
| 4320 |
+
0.65
|
| 4321 |
+
0.65
|
| 4322 |
+
0.65
|
| 4323 |
+
1
|
| 4324 |
+
0.65
|
| 4325 |
+
0.65
|
| 4326 |
+
0.65
|
| 4327 |
+
0.8
|
| 4328 |
+
0.65
|
| 4329 |
+
0.65
|
| 4330 |
+
0.8
|
| 4331 |
+
0.65
|
| 4332 |
+
0.65
|
| 4333 |
+
0.8
|
| 4334 |
+
0.85
|
| 4335 |
+
0.65
|
| 4336 |
+
0.65
|
| 4337 |
+
0.8
|
| 4338 |
+
0.8
|
| 4339 |
+
0.65
|
| 4340 |
+
0.65
|
| 4341 |
+
0.65
|
| 4342 |
+
0.8
|
| 4343 |
+
0.65
|
| 4344 |
+
0.65
|
| 4345 |
+
0.65
|
| 4346 |
+
0.88
|
| 4347 |
+
0.65
|
| 4348 |
+
0.65
|
| 4349 |
+
0.65
|
| 4350 |
+
0.65
|
| 4351 |
+
0.8
|
| 4352 |
+
0.65
|
| 4353 |
+
0.65
|
| 4354 |
+
0.65
|
| 4355 |
+
0.65
|
| 4356 |
+
0.8
|
| 4357 |
+
0.65
|
| 4358 |
+
0.8
|
| 4359 |
+
0.65
|
| 4360 |
+
0.65
|
| 4361 |
+
0.65
|
| 4362 |
+
0.8
|
| 4363 |
+
0.8
|
| 4364 |
+
0.8
|
| 4365 |
+
0.65
|
| 4366 |
+
0.65
|
| 4367 |
+
0.65
|
| 4368 |
+
0.65
|
| 4369 |
+
0.68
|
| 4370 |
+
0.65
|
| 4371 |
+
0.65
|
| 4372 |
+
0.65
|
| 4373 |
+
0.65
|
| 4374 |
+
0.65
|
| 4375 |
+
0.65
|
| 4376 |
+
0.89
|
| 4377 |
+
0.65
|
| 4378 |
+
0.65
|
| 4379 |
+
0.65
|
| 4380 |
+
0.65
|
| 4381 |
+
0.65
|
| 4382 |
+
0.65
|
| 4383 |
+
0.65
|
| 4384 |
+
0.65
|
| 4385 |
+
0.65
|
| 4386 |
+
0.65
|
| 4387 |
+
0.65
|
| 4388 |
+
0.65
|
| 4389 |
+
0.65
|
| 4390 |
+
0.8
|
| 4391 |
+
0.65
|
| 4392 |
+
0.65
|
| 4393 |
+
0.65
|
| 4394 |
+
0.8
|
| 4395 |
+
0.9
|
| 4396 |
+
0.65
|
| 4397 |
+
0.8
|
| 4398 |
+
0.65
|
| 4399 |
+
0.8
|
| 4400 |
+
0.65
|
| 4401 |
+
0.65
|
| 4402 |
+
0.65
|
| 4403 |
+
0.65
|
| 4404 |
+
0.65
|
| 4405 |
+
0.65
|
| 4406 |
+
0.65
|
| 4407 |
+
0.81
|
| 4408 |
+
0.65
|
| 4409 |
+
0.65
|
| 4410 |
+
0.65
|
| 4411 |
+
0.8
|
| 4412 |
+
0.85
|
| 4413 |
+
0.65
|
| 4414 |
+
0.77
|
| 4415 |
+
0.65
|
| 4416 |
+
0.8
|
| 4417 |
+
0.65
|
| 4418 |
+
0.65
|
| 4419 |
+
0.65
|
| 4420 |
+
0.65
|
| 4421 |
+
0.65
|
| 4422 |
+
0.65
|
| 4423 |
+
0.65
|
| 4424 |
+
0.65
|
| 4425 |
+
0.65
|
| 4426 |
+
0.65
|
| 4427 |
+
0.65
|
| 4428 |
+
0.8
|
| 4429 |
+
0.8
|
| 4430 |
+
0.8
|
| 4431 |
+
0.9
|
| 4432 |
+
0.65
|
| 4433 |
+
0.65
|
| 4434 |
+
0.89
|
| 4435 |
+
0.65
|
| 4436 |
+
0.65
|
| 4437 |
+
0.8
|
| 4438 |
+
0.65
|
| 4439 |
+
0.65
|
| 4440 |
+
0.8
|
| 4441 |
+
0.8
|
| 4442 |
+
0.65
|
| 4443 |
+
0.65
|
| 4444 |
+
0.65
|
| 4445 |
+
0.88
|
| 4446 |
+
0.8
|
| 4447 |
+
0.65
|
| 4448 |
+
0.8
|
| 4449 |
+
0.65
|
| 4450 |
+
0.65
|
| 4451 |
+
0.65
|
| 4452 |
+
0.65
|
| 4453 |
+
0.65
|
| 4454 |
+
0.65
|
| 4455 |
+
0.8
|
| 4456 |
+
0.65
|
| 4457 |
+
0.65
|
| 4458 |
+
0.8
|
| 4459 |
+
0.65
|
| 4460 |
+
0.65
|
| 4461 |
+
0.65
|
| 4462 |
+
0.65
|
| 4463 |
+
0.65
|
| 4464 |
+
0.8
|
| 4465 |
+
0.65
|
| 4466 |
+
0.65
|
| 4467 |
+
0.65
|
| 4468 |
+
0.65
|
| 4469 |
+
0.65
|
| 4470 |
+
0.65
|
| 4471 |
+
0.82
|
| 4472 |
+
0.65
|
| 4473 |
+
0.8
|
| 4474 |
+
0.74
|
| 4475 |
+
0.65
|
| 4476 |
+
0.65
|
| 4477 |
+
0.65
|
| 4478 |
+
0.65
|
| 4479 |
+
0.65
|
| 4480 |
+
0.65
|
| 4481 |
+
0.85
|
| 4482 |
+
0.65
|
| 4483 |
+
0.65
|
| 4484 |
+
0.85
|
| 4485 |
+
0.65
|
| 4486 |
+
0.65
|
| 4487 |
+
0.65
|
| 4488 |
+
0.65
|
| 4489 |
+
0.7
|
| 4490 |
+
0.7
|
| 4491 |
+
0.8
|
| 4492 |
+
0.65
|
| 4493 |
+
0.65
|
| 4494 |
+
0.65
|
| 4495 |
+
0.65
|
| 4496 |
+
0.87
|
| 4497 |
+
0.8
|
| 4498 |
+
0.65
|
| 4499 |
+
0.65
|
| 4500 |
+
0.65
|
| 4501 |
+
0.89
|
| 4502 |
+
0.85
|
| 4503 |
+
0.65
|
| 4504 |
+
0.65
|
| 4505 |
+
0.65
|
| 4506 |
+
0.8
|
| 4507 |
+
0.65
|
| 4508 |
+
0.65
|
| 4509 |
+
0.65
|
| 4510 |
+
0.65
|
| 4511 |
+
0.65
|
| 4512 |
+
0.65
|
| 4513 |
+
0.65
|
| 4514 |
+
0.65
|
| 4515 |
+
0.65
|
| 4516 |
+
0.65
|
| 4517 |
+
0.65
|
| 4518 |
+
0.65
|
| 4519 |
+
0.65
|
| 4520 |
+
0.8
|
| 4521 |
+
0.7
|
| 4522 |
+
0.65
|
| 4523 |
+
0.65
|
| 4524 |
+
0.65
|
| 4525 |
+
0.65
|
| 4526 |
+
0.65
|
| 4527 |
+
0.8
|
| 4528 |
+
0.65
|
| 4529 |
+
0.65
|
| 4530 |
+
0.65
|
| 4531 |
+
0.65
|
| 4532 |
+
0.9
|
| 4533 |
+
0.8
|
| 4534 |
+
0.8
|
| 4535 |
+
0.65
|
| 4536 |
+
0.66
|
| 4537 |
+
0.57
|
| 4538 |
+
0.65
|
| 4539 |
+
0.65
|
| 4540 |
+
0.65
|
| 4541 |
+
0.49
|
| 4542 |
+
0.65
|
| 4543 |
+
0.65
|
| 4544 |
+
0.8
|
| 4545 |
+
0.65
|
| 4546 |
+
0.65
|
| 4547 |
+
0.8
|
| 4548 |
+
0.65
|
| 4549 |
+
0.65
|
| 4550 |
+
0.8
|
| 4551 |
+
0.65
|
| 4552 |
+
0.65
|
| 4553 |
+
0.65
|
| 4554 |
+
0.8
|
| 4555 |
+
0.65
|
| 4556 |
+
0.65
|
| 4557 |
+
0.65
|
| 4558 |
+
0.65
|
| 4559 |
+
0.65
|
| 4560 |
+
0.65
|
| 4561 |
+
0.8
|
| 4562 |
+
0.65
|
| 4563 |
+
0.65
|
| 4564 |
+
0.65
|
| 4565 |
+
0.65
|
| 4566 |
+
0.8
|
| 4567 |
+
0.65
|
| 4568 |
+
0.8
|
| 4569 |
+
0.8
|
| 4570 |
+
0.86
|
| 4571 |
+
0.65
|
| 4572 |
+
0.65
|
| 4573 |
+
0.65
|
| 4574 |
+
0.65
|
| 4575 |
+
0.65
|
| 4576 |
+
0.65
|
| 4577 |
+
0.65
|
| 4578 |
+
0.89
|
| 4579 |
+
0.65
|
| 4580 |
+
0.65
|
| 4581 |
+
0.65
|
| 4582 |
+
0.65
|
| 4583 |
+
0.65
|
| 4584 |
+
0.65
|
| 4585 |
+
0.76
|
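For orientation (not part of the diff itself): the threshold file holds one float per line, and its 4585 lines appear to align by line index with the 4585-entry ram/data/ram_tag_list.txt, so line i would be the inference-time confidence cutoff for tag i. A minimal loading sketch under that assumption; the paths come from this repo's layout, but the function below is illustrative rather than the repo's actual loader:

```python
# Illustrative sketch: read one float threshold per line, preserving order
# so thresholds[i] lines up with tag i of ram/data/ram_tag_list.txt.
def load_thresholds(path="ram/data/ram_tag_list_threshold.txt"):
    with open(path, encoding="utf-8") as f:
        return [float(line.strip()) for line in f if line.strip()]

thresholds = load_thresholds()
assert len(thresholds) == 4585  # one threshold per RAM tag (assumed alignment)
```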
ram/data/tag_list.txt
ADDED
@@ -0,0 +1,3429 @@
[ram/data/tag_list.txt, lines 1-554 of 3429: one tag name per line, e.g. tennis, bear cub, observatory, bicycle, hillside, judge, watercolor illustration, granite, lobster, livery, ..., pill, cart, pea, van.]
+
album
|
| 555 |
+
football college game
|
| 556 |
+
mountain pass
|
| 557 |
+
doughnut
|
| 558 |
+
ski slope
|
| 559 |
+
match
|
| 560 |
+
official
|
| 561 |
+
shadow
|
| 562 |
+
organ
|
| 563 |
+
celebration
|
| 564 |
+
coin
|
| 565 |
+
log cabin
|
| 566 |
+
firework display
|
| 567 |
+
present
|
| 568 |
+
twig
|
| 569 |
+
chef
|
| 570 |
+
confetti
|
| 571 |
+
footpath
|
| 572 |
+
tour
|
| 573 |
+
ponytail
|
| 574 |
+
artwork
|
| 575 |
+
race car
|
| 576 |
+
club
|
| 577 |
+
season
|
| 578 |
+
hose
|
| 579 |
+
pencil
|
| 580 |
+
aircraft
|
| 581 |
+
rock formation
|
| 582 |
+
wardrobe
|
| 583 |
+
participant
|
| 584 |
+
politician
|
| 585 |
+
engineer
|
| 586 |
+
peace
|
| 587 |
+
filter
|
| 588 |
+
sailing boat
|
| 589 |
+
water bottle
|
| 590 |
+
service dog
|
| 591 |
+
poodle
|
| 592 |
+
loki
|
| 593 |
+
statesman
|
| 594 |
+
sleeping bag
|
| 595 |
+
outskirt
|
| 596 |
+
clock
|
| 597 |
+
factory
|
| 598 |
+
oak tree
|
| 599 |
+
physician
|
| 600 |
+
color
|
| 601 |
+
room
|
| 602 |
+
stairway
|
| 603 |
+
company
|
| 604 |
+
lady
|
| 605 |
+
graph
|
| 606 |
+
faucet
|
| 607 |
+
tablecloth
|
| 608 |
+
subway train
|
| 609 |
+
chocolate chip cookie
|
| 610 |
+
headquarters
|
| 611 |
+
screw
|
| 612 |
+
goggle
|
| 613 |
+
halloween
|
| 614 |
+
city street
|
| 615 |
+
swirl
|
| 616 |
+
cord
|
| 617 |
+
forward
|
| 618 |
+
bone
|
| 619 |
+
bedding
|
| 620 |
+
archway
|
| 621 |
+
wig
|
| 622 |
+
lobby
|
| 623 |
+
mask
|
| 624 |
+
attic
|
| 625 |
+
kitchen table
|
| 626 |
+
skylight
|
| 627 |
+
fire
|
| 628 |
+
exit
|
| 629 |
+
oil painting
|
| 630 |
+
passenger
|
| 631 |
+
meditation
|
| 632 |
+
salmon
|
| 633 |
+
fedora
|
| 634 |
+
rubber stamp
|
| 635 |
+
orange juice
|
| 636 |
+
arch
|
| 637 |
+
scientist
|
| 638 |
+
stroll
|
| 639 |
+
manhattan
|
| 640 |
+
float
|
| 641 |
+
baseball uniform
|
| 642 |
+
circle
|
| 643 |
+
church
|
| 644 |
+
decker bus
|
| 645 |
+
competitor
|
| 646 |
+
zoo
|
| 647 |
+
basketball team
|
| 648 |
+
tourist
|
| 649 |
+
daughter
|
| 650 |
+
silverware
|
| 651 |
+
ceiling fan
|
| 652 |
+
birth
|
| 653 |
+
vase
|
| 654 |
+
jack
|
| 655 |
+
mushroom
|
| 656 |
+
spiral
|
| 657 |
+
cage
|
| 658 |
+
limb
|
| 659 |
+
salad
|
| 660 |
+
ad
|
| 661 |
+
control
|
| 662 |
+
earth
|
| 663 |
+
party
|
| 664 |
+
bolt
|
| 665 |
+
tractor
|
| 666 |
+
barley
|
| 667 |
+
wedding photo
|
| 668 |
+
hawk
|
| 669 |
+
warehouse
|
| 670 |
+
vegetable garden
|
| 671 |
+
chocolate cake
|
| 672 |
+
cabbage
|
| 673 |
+
floor window
|
| 674 |
+
baby shower
|
| 675 |
+
magnifying glass
|
| 676 |
+
table
|
| 677 |
+
stethoscope
|
| 678 |
+
reading
|
| 679 |
+
mission
|
| 680 |
+
croissant
|
| 681 |
+
gift box
|
| 682 |
+
rocket
|
| 683 |
+
forest road
|
| 684 |
+
cooking
|
| 685 |
+
suite
|
| 686 |
+
hill country
|
| 687 |
+
motorcycle
|
| 688 |
+
baseball player
|
| 689 |
+
angle
|
| 690 |
+
drug
|
| 691 |
+
sport association
|
| 692 |
+
championship
|
| 693 |
+
family portrait
|
| 694 |
+
florist
|
| 695 |
+
softball
|
| 696 |
+
egret
|
| 697 |
+
office
|
| 698 |
+
plywood
|
| 699 |
+
jockey
|
| 700 |
+
mosque
|
| 701 |
+
brunch
|
| 702 |
+
beanie
|
| 703 |
+
office building
|
| 704 |
+
pattern
|
| 705 |
+
calendar
|
| 706 |
+
indoor
|
| 707 |
+
pepper
|
| 708 |
+
ledge
|
| 709 |
+
trail
|
| 710 |
+
fuel
|
| 711 |
+
laptop computer
|
| 712 |
+
tennis shoe
|
| 713 |
+
deck chair
|
| 714 |
+
guitarist
|
| 715 |
+
barn
|
| 716 |
+
surgery
|
| 717 |
+
cartoon illustration
|
| 718 |
+
nebula
|
| 719 |
+
railroad
|
| 720 |
+
mountain goat
|
| 721 |
+
goose
|
| 722 |
+
car door
|
| 723 |
+
cheer
|
| 724 |
+
liquid
|
| 725 |
+
hardwood floor
|
| 726 |
+
pathway
|
| 727 |
+
acorn
|
| 728 |
+
gull
|
| 729 |
+
airliner
|
| 730 |
+
couch
|
| 731 |
+
lake house
|
| 732 |
+
spaghetti
|
| 733 |
+
promenade
|
| 734 |
+
collection
|
| 735 |
+
garden
|
| 736 |
+
bank
|
| 737 |
+
robin
|
| 738 |
+
tennis ball
|
| 739 |
+
peony
|
| 740 |
+
gymnast
|
| 741 |
+
lavender
|
| 742 |
+
deck
|
| 743 |
+
test
|
| 744 |
+
riverside
|
| 745 |
+
rapper
|
| 746 |
+
domino
|
| 747 |
+
bride
|
| 748 |
+
mouse
|
| 749 |
+
basil
|
| 750 |
+
wedding couple
|
| 751 |
+
ocean wave
|
| 752 |
+
arm
|
| 753 |
+
kitchen floor
|
| 754 |
+
grove
|
| 755 |
+
family member
|
| 756 |
+
backyard
|
| 757 |
+
raspberry
|
| 758 |
+
forest fire
|
| 759 |
+
officer
|
| 760 |
+
hibiscus
|
| 761 |
+
canyon
|
| 762 |
+
composer
|
| 763 |
+
signature
|
| 764 |
+
olive oil
|
| 765 |
+
hibiscus flower
|
| 766 |
+
rose
|
| 767 |
+
vector icon
|
| 768 |
+
sunrise
|
| 769 |
+
horseback
|
| 770 |
+
motor scooter
|
| 771 |
+
office worker
|
| 772 |
+
tradition
|
| 773 |
+
ingredient
|
| 774 |
+
washing machine
|
| 775 |
+
lighting
|
| 776 |
+
bagel
|
| 777 |
+
sailboat
|
| 778 |
+
policeman
|
| 779 |
+
mare
|
| 780 |
+
graphic
|
| 781 |
+
halloween pumpkin
|
| 782 |
+
stock
|
| 783 |
+
pilot
|
| 784 |
+
education
|
| 785 |
+
team
|
| 786 |
+
body
|
| 787 |
+
horse
|
| 788 |
+
kimono
|
| 789 |
+
bazaar
|
| 790 |
+
bag
|
| 791 |
+
recording studio
|
| 792 |
+
parsley
|
| 793 |
+
entrance
|
| 794 |
+
denim
|
| 795 |
+
vet
|
| 796 |
+
horse farm
|
| 797 |
+
charcoal
|
| 798 |
+
architecture
|
| 799 |
+
glass vase
|
| 800 |
+
puppy
|
| 801 |
+
estuary
|
| 802 |
+
television show host
|
| 803 |
+
city bus
|
| 804 |
+
shoulder
|
| 805 |
+
beast
|
| 806 |
+
balance
|
| 807 |
+
golfer
|
| 808 |
+
roadside
|
| 809 |
+
denim jacket
|
| 810 |
+
stone wall
|
| 811 |
+
counter top
|
| 812 |
+
app icon
|
| 813 |
+
toast
|
| 814 |
+
head coach
|
| 815 |
+
ham
|
| 816 |
+
warrior
|
| 817 |
+
gem
|
| 818 |
+
refrigerator
|
| 819 |
+
snowman
|
| 820 |
+
construction worker
|
| 821 |
+
coal
|
| 822 |
+
website
|
| 823 |
+
morning fog
|
| 824 |
+
mustard
|
| 825 |
+
human
|
| 826 |
+
owl
|
| 827 |
+
puppy dog
|
| 828 |
+
piggy bank
|
| 829 |
+
vegetation
|
| 830 |
+
pirate
|
| 831 |
+
action film
|
| 832 |
+
marshmallow
|
| 833 |
+
thanksgiving
|
| 834 |
+
business
|
| 835 |
+
disease
|
| 836 |
+
signage
|
| 837 |
+
greeting
|
| 838 |
+
skate park
|
| 839 |
+
tile
|
| 840 |
+
mouth
|
| 841 |
+
spinach
|
| 842 |
+
vacation
|
| 843 |
+
leader
|
| 844 |
+
shrine
|
| 845 |
+
walker
|
| 846 |
+
science fiction film
|
| 847 |
+
bill
|
| 848 |
+
rabbit
|
| 849 |
+
motor boat
|
| 850 |
+
bar
|
| 851 |
+
radio
|
| 852 |
+
barge
|
| 853 |
+
tail
|
| 854 |
+
chainsaw
|
| 855 |
+
gallery
|
| 856 |
+
rainbow
|
| 857 |
+
pasta
|
| 858 |
+
padlock
|
| 859 |
+
web
|
| 860 |
+
pastry
|
| 861 |
+
ink
|
| 862 |
+
reef
|
| 863 |
+
school uniform
|
| 864 |
+
shawl
|
| 865 |
+
treasure
|
| 866 |
+
peach
|
| 867 |
+
dinner table
|
| 868 |
+
injury
|
| 869 |
+
harbor
|
| 870 |
+
witch
|
| 871 |
+
car dealership
|
| 872 |
+
litter
|
| 873 |
+
gesture
|
| 874 |
+
documentary
|
| 875 |
+
marriage
|
| 876 |
+
sea shell
|
| 877 |
+
priest
|
| 878 |
+
dome
|
| 879 |
+
kit
|
| 880 |
+
icon
|
| 881 |
+
seaside
|
| 882 |
+
bucket
|
| 883 |
+
entertainment
|
| 884 |
+
stable
|
| 885 |
+
hat
|
| 886 |
+
puddle
|
| 887 |
+
sock
|
| 888 |
+
shopper
|
| 889 |
+
technology
|
| 890 |
+
harbour
|
| 891 |
+
orbit
|
| 892 |
+
antler
|
| 893 |
+
tube
|
| 894 |
+
flag waving
|
| 895 |
+
cook
|
| 896 |
+
tight
|
| 897 |
+
commander
|
| 898 |
+
farmland
|
| 899 |
+
switch
|
| 900 |
+
hiker
|
| 901 |
+
wedding ceremony
|
| 902 |
+
award ceremony
|
| 903 |
+
champion
|
| 904 |
+
chopstick
|
| 905 |
+
farmhouse
|
| 906 |
+
performer
|
| 907 |
+
spike
|
| 908 |
+
accident
|
| 909 |
+
cruise ship
|
| 910 |
+
passenger train
|
| 911 |
+
attraction
|
| 912 |
+
entertainer
|
| 913 |
+
rear view
|
| 914 |
+
sidewalk
|
| 915 |
+
parade
|
| 916 |
+
racing
|
| 917 |
+
plane
|
| 918 |
+
ritual
|
| 919 |
+
peacock
|
| 920 |
+
pocket
|
| 921 |
+
plum
|
| 922 |
+
drop
|
| 923 |
+
carrot
|
| 924 |
+
floor
|
| 925 |
+
sunset
|
| 926 |
+
troop
|
| 927 |
+
architect
|
| 928 |
+
coffee table
|
| 929 |
+
dust
|
| 930 |
+
outline
|
| 931 |
+
leather
|
| 932 |
+
charity event
|
| 933 |
+
heat
|
| 934 |
+
whale
|
| 935 |
+
laundry
|
| 936 |
+
coconut tree
|
| 937 |
+
crosswalk
|
| 938 |
+
pony
|
| 939 |
+
ant
|
| 940 |
+
pipe
|
| 941 |
+
string
|
| 942 |
+
coat
|
| 943 |
+
angel
|
| 944 |
+
beef
|
| 945 |
+
church tower
|
| 946 |
+
dish
|
| 947 |
+
pitch
|
| 948 |
+
cupboard
|
| 949 |
+
thermometer
|
| 950 |
+
dirt field
|
| 951 |
+
fireworks
|
| 952 |
+
minute
|
| 953 |
+
cane
|
| 954 |
+
pajama
|
| 955 |
+
flower garden
|
| 956 |
+
autumn
|
| 957 |
+
trash can
|
| 958 |
+
dachshund
|
| 959 |
+
banana tree
|
| 960 |
+
tray
|
| 961 |
+
moose
|
| 962 |
+
roadway
|
| 963 |
+
carnival
|
| 964 |
+
antenna
|
| 965 |
+
pole
|
| 966 |
+
castle wall
|
| 967 |
+
ram
|
| 968 |
+
cattle
|
| 969 |
+
hay
|
| 970 |
+
cookie
|
| 971 |
+
swimmer
|
| 972 |
+
baseball team
|
| 973 |
+
strait
|
| 974 |
+
hedge
|
| 975 |
+
jet
|
| 976 |
+
fire pit
|
| 977 |
+
octopus
|
| 978 |
+
calf
|
| 979 |
+
cube
|
| 980 |
+
opera
|
| 981 |
+
cardboard box
|
| 982 |
+
tiara
|
| 983 |
+
kitchen sink
|
| 984 |
+
prairie
|
| 985 |
+
bowl
|
| 986 |
+
galaxy
|
| 987 |
+
straw hat
|
| 988 |
+
linen
|
| 989 |
+
ski resort
|
| 990 |
+
stitch
|
| 991 |
+
street lamp
|
| 992 |
+
motorist
|
| 993 |
+
icicle
|
| 994 |
+
stain
|
| 995 |
+
flora
|
| 996 |
+
drain
|
| 997 |
+
kitchen cabinet
|
| 998 |
+
decor
|
| 999 |
+
bouquet
|
| 1000 |
+
pound
|
| 1001 |
+
interior design
|
| 1002 |
+
nail polish
|
| 1003 |
+
figurine
|
| 1004 |
+
tomb
|
| 1005 |
+
disc
|
| 1006 |
+
twist
|
| 1007 |
+
blouse
|
| 1008 |
+
ribbon
|
| 1009 |
+
figure
|
| 1010 |
+
burger
|
| 1011 |
+
cork
|
| 1012 |
+
soccer goalkeeper
|
| 1013 |
+
train bridge
|
| 1014 |
+
drinking water
|
| 1015 |
+
dew
|
| 1016 |
+
baker
|
| 1017 |
+
storm cloud
|
| 1018 |
+
tarmac
|
| 1019 |
+
tv drama
|
| 1020 |
+
sponge
|
| 1021 |
+
magnet
|
| 1022 |
+
sailor
|
| 1023 |
+
entry
|
| 1024 |
+
swan
|
| 1025 |
+
exercise
|
| 1026 |
+
sloth
|
| 1027 |
+
jewel
|
| 1028 |
+
scuba diver
|
| 1029 |
+
bite
|
| 1030 |
+
cat tree
|
| 1031 |
+
tent
|
| 1032 |
+
can
|
| 1033 |
+
tennis match
|
| 1034 |
+
ecosystem
|
| 1035 |
+
picket fence
|
| 1036 |
+
palm
|
| 1037 |
+
train car
|
| 1038 |
+
frying pan
|
| 1039 |
+
rally
|
| 1040 |
+
tablet pc
|
| 1041 |
+
reindeer
|
| 1042 |
+
image
|
| 1043 |
+
wolf
|
| 1044 |
+
chin
|
| 1045 |
+
conservatory
|
| 1046 |
+
flood water
|
| 1047 |
+
cityscape
|
| 1048 |
+
beach sand
|
| 1049 |
+
car park
|
| 1050 |
+
pavement
|
| 1051 |
+
farm field
|
| 1052 |
+
swimming
|
| 1053 |
+
winter storm
|
| 1054 |
+
stem
|
| 1055 |
+
pillow
|
| 1056 |
+
inning
|
| 1057 |
+
gorilla
|
| 1058 |
+
desk
|
| 1059 |
+
avenue
|
| 1060 |
+
fern
|
| 1061 |
+
money
|
| 1062 |
+
pearl
|
| 1063 |
+
train station
|
| 1064 |
+
skillet
|
| 1065 |
+
nap
|
| 1066 |
+
barber
|
| 1067 |
+
library
|
| 1068 |
+
freezer
|
| 1069 |
+
label
|
| 1070 |
+
rainforest
|
| 1071 |
+
parking sign
|
| 1072 |
+
mirror
|
| 1073 |
+
wing
|
| 1074 |
+
noodle
|
| 1075 |
+
press room
|
| 1076 |
+
sculpture
|
| 1077 |
+
tablet
|
| 1078 |
+
viewer
|
| 1079 |
+
prayer
|
| 1080 |
+
mini
|
| 1081 |
+
mechanic
|
| 1082 |
+
laugh
|
| 1083 |
+
rice field
|
| 1084 |
+
hand
|
| 1085 |
+
mustache
|
| 1086 |
+
mountain road
|
| 1087 |
+
catwalk
|
| 1088 |
+
conference
|
| 1089 |
+
cape
|
| 1090 |
+
installation
|
| 1091 |
+
musician
|
| 1092 |
+
stream
|
| 1093 |
+
machine
|
| 1094 |
+
speech
|
| 1095 |
+
crocodile
|
| 1096 |
+
soccer match
|
| 1097 |
+
town square
|
| 1098 |
+
passport
|
| 1099 |
+
post box
|
| 1100 |
+
point
|
| 1101 |
+
stone building
|
| 1102 |
+
motorway
|
| 1103 |
+
mix
|
| 1104 |
+
dentist
|
| 1105 |
+
businessperson
|
| 1106 |
+
happiness
|
| 1107 |
+
boat
|
| 1108 |
+
vineyard
|
| 1109 |
+
treadmill
|
| 1110 |
+
glass wall
|
| 1111 |
+
water droplet
|
| 1112 |
+
coffee mug
|
| 1113 |
+
graduate
|
| 1114 |
+
sunflower
|
| 1115 |
+
parliament
|
| 1116 |
+
shepherd
|
| 1117 |
+
movie
|
| 1118 |
+
wine
|
| 1119 |
+
orchard
|
| 1120 |
+
tulip
|
| 1121 |
+
motherboard
|
| 1122 |
+
cup
|
| 1123 |
+
broom
|
| 1124 |
+
spot
|
| 1125 |
+
drawing
|
| 1126 |
+
polo shirt
|
| 1127 |
+
graduation
|
| 1128 |
+
film producer
|
| 1129 |
+
moonlight
|
| 1130 |
+
glow
|
| 1131 |
+
film format
|
| 1132 |
+
t shirt
|
| 1133 |
+
rock face
|
| 1134 |
+
sword
|
| 1135 |
+
clinic
|
| 1136 |
+
festival day
|
| 1137 |
+
meadow
|
| 1138 |
+
staple
|
| 1139 |
+
pupil
|
| 1140 |
+
training ground
|
| 1141 |
+
rider
|
| 1142 |
+
flower
|
| 1143 |
+
foal
|
| 1144 |
+
wharf
|
| 1145 |
+
foot bridge
|
| 1146 |
+
shooting
|
| 1147 |
+
top
|
| 1148 |
+
mast
|
| 1149 |
+
police car
|
| 1150 |
+
robe
|
| 1151 |
+
wedding bouquet
|
| 1152 |
+
stop sign
|
| 1153 |
+
birthday cake
|
| 1154 |
+
glitter
|
| 1155 |
+
butter
|
| 1156 |
+
scooter
|
| 1157 |
+
tundra
|
| 1158 |
+
superhero
|
| 1159 |
+
pocket watch
|
| 1160 |
+
inscription
|
| 1161 |
+
youngster
|
| 1162 |
+
fruit tree
|
| 1163 |
+
movie poster
|
| 1164 |
+
engine
|
| 1165 |
+
foundation
|
| 1166 |
+
motorcyclist
|
| 1167 |
+
take
|
| 1168 |
+
woman
|
| 1169 |
+
antelope
|
| 1170 |
+
country artist
|
| 1171 |
+
road trip
|
| 1172 |
+
typewriter
|
| 1173 |
+
tuxedo
|
| 1174 |
+
brand
|
| 1175 |
+
pine
|
| 1176 |
+
bathroom
|
| 1177 |
+
paradise
|
| 1178 |
+
texture
|
| 1179 |
+
balloon
|
| 1180 |
+
dining table
|
| 1181 |
+
home
|
| 1182 |
+
computer screen
|
| 1183 |
+
actor
|
| 1184 |
+
clip
|
| 1185 |
+
tv tower
|
| 1186 |
+
panorama
|
| 1187 |
+
summit
|
| 1188 |
+
cat
|
| 1189 |
+
plot
|
| 1190 |
+
eagle
|
| 1191 |
+
dancer
|
| 1192 |
+
pup
|
| 1193 |
+
studio shot
|
| 1194 |
+
tear
|
| 1195 |
+
bird bath
|
| 1196 |
+
classroom
|
| 1197 |
+
bookstore
|
| 1198 |
+
city wall
|
| 1199 |
+
tv programme
|
| 1200 |
+
blade
|
| 1201 |
+
easel
|
| 1202 |
+
buttercream
|
| 1203 |
+
sweet
|
| 1204 |
+
designer
|
| 1205 |
+
diamond
|
| 1206 |
+
handshake
|
| 1207 |
+
herb
|
| 1208 |
+
corn field
|
| 1209 |
+
seafront
|
| 1210 |
+
concrete
|
| 1211 |
+
street artist
|
| 1212 |
+
gas
|
| 1213 |
+
stamp
|
| 1214 |
+
window display
|
| 1215 |
+
paper
|
| 1216 |
+
note
|
| 1217 |
+
pint
|
| 1218 |
+
quarry
|
| 1219 |
+
research
|
| 1220 |
+
fixture
|
| 1221 |
+
manager
|
| 1222 |
+
soil
|
| 1223 |
+
leopard
|
| 1224 |
+
board game
|
| 1225 |
+
ladder
|
| 1226 |
+
stop light
|
| 1227 |
+
island
|
| 1228 |
+
ramp
|
| 1229 |
+
football match
|
| 1230 |
+
icing
|
| 1231 |
+
drill
|
| 1232 |
+
currency
|
| 1233 |
+
summer evening
|
| 1234 |
+
topping
|
| 1235 |
+
pyramid
|
| 1236 |
+
pomegranate
|
| 1237 |
+
cell
|
| 1238 |
+
ivy
|
| 1239 |
+
squad
|
| 1240 |
+
scenery
|
| 1241 |
+
computer
|
| 1242 |
+
locomotive
|
| 1243 |
+
surf
|
| 1244 |
+
mascot
|
| 1245 |
+
dune
|
| 1246 |
+
path
|
| 1247 |
+
duck
|
| 1248 |
+
twilight
|
| 1249 |
+
wire
|
| 1250 |
+
bow tie
|
| 1251 |
+
strike
|
| 1252 |
+
cormorant
|
| 1253 |
+
car wash
|
| 1254 |
+
crane
|
| 1255 |
+
market
|
| 1256 |
+
philosopher
|
| 1257 |
+
alarm clock
|
| 1258 |
+
camera
|
| 1259 |
+
birch
|
| 1260 |
+
greeting card
|
| 1261 |
+
plain
|
| 1262 |
+
clay
|
| 1263 |
+
donut
|
| 1264 |
+
lock
|
| 1265 |
+
moth
|
| 1266 |
+
laboratory
|
| 1267 |
+
fan
|
| 1268 |
+
violin
|
| 1269 |
+
jazz fusion artist
|
| 1270 |
+
mountain biker
|
| 1271 |
+
terrain
|
| 1272 |
+
magazine
|
| 1273 |
+
pickup
|
| 1274 |
+
comedy film
|
| 1275 |
+
smartphone
|
| 1276 |
+
film
|
| 1277 |
+
bed
|
| 1278 |
+
microwave oven
|
| 1279 |
+
tournament
|
| 1280 |
+
lawn
|
| 1281 |
+
car window
|
| 1282 |
+
alligator
|
| 1283 |
+
screen
|
| 1284 |
+
jetty
|
| 1285 |
+
shopping bag
|
| 1286 |
+
landscape view
|
| 1287 |
+
cabinetry
|
| 1288 |
+
friendly match
|
| 1289 |
+
thing
|
| 1290 |
+
petal
|
| 1291 |
+
shopping center
|
| 1292 |
+
transport
|
| 1293 |
+
ballet dancer
|
| 1294 |
+
shoreline
|
| 1295 |
+
princess
|
| 1296 |
+
car seat
|
| 1297 |
+
parking meter
|
| 1298 |
+
green
|
| 1299 |
+
vodka
|
| 1300 |
+
band
|
| 1301 |
+
rock
|
| 1302 |
+
costume
|
| 1303 |
+
warning sign
|
| 1304 |
+
strip
|
| 1305 |
+
plaque
|
| 1306 |
+
wheelchair
|
| 1307 |
+
headband
|
| 1308 |
+
ginger
|
| 1309 |
+
dice
|
| 1310 |
+
media
|
| 1311 |
+
hairdresser
|
| 1312 |
+
press
|
| 1313 |
+
living room
|
| 1314 |
+
stove
|
| 1315 |
+
player
|
| 1316 |
+
cherry
|
| 1317 |
+
workshop
|
| 1318 |
+
carving
|
| 1319 |
+
embroidery
|
| 1320 |
+
doodle
|
| 1321 |
+
adventure
|
| 1322 |
+
rugby player
|
| 1323 |
+
monument
|
| 1324 |
+
brush
|
| 1325 |
+
marker
|
| 1326 |
+
loft
|
| 1327 |
+
postcard
|
| 1328 |
+
collage
|
| 1329 |
+
ball
|
| 1330 |
+
professor
|
| 1331 |
+
dresser
|
| 1332 |
+
gig
|
| 1333 |
+
festival
|
| 1334 |
+
blackbird
|
| 1335 |
+
makeup artist
|
| 1336 |
+
video camera
|
| 1337 |
+
sticker
|
| 1338 |
+
peak
|
| 1339 |
+
wildflower
|
| 1340 |
+
santa hat
|
| 1341 |
+
rodeo
|
| 1342 |
+
wedding photographer
|
| 1343 |
+
guy
|
| 1344 |
+
staff
|
| 1345 |
+
waterfall
|
| 1346 |
+
operation
|
| 1347 |
+
defender
|
| 1348 |
+
falcon
|
| 1349 |
+
haze
|
| 1350 |
+
individual
|
| 1351 |
+
gentleman
|
| 1352 |
+
greyhound
|
| 1353 |
+
rocking chair
|
| 1354 |
+
rice
|
| 1355 |
+
garbage
|
| 1356 |
+
platter
|
| 1357 |
+
chocolate
|
| 1358 |
+
splash
|
| 1359 |
+
business suit
|
| 1360 |
+
cheetah
|
| 1361 |
+
valley
|
| 1362 |
+
maze
|
| 1363 |
+
trampoline
|
| 1364 |
+
garland
|
| 1365 |
+
slalom
|
| 1366 |
+
unicorn
|
| 1367 |
+
tree stump
|
| 1368 |
+
painting
|
| 1369 |
+
romance
|
| 1370 |
+
fight
|
| 1371 |
+
alcohol
|
| 1372 |
+
ghost
|
| 1373 |
+
fondant
|
| 1374 |
+
spa
|
| 1375 |
+
shutter
|
| 1376 |
+
death
|
| 1377 |
+
demonstration
|
| 1378 |
+
cotton
|
| 1379 |
+
pier
|
| 1380 |
+
flea market
|
| 1381 |
+
history
|
| 1382 |
+
savannah
|
| 1383 |
+
fist
|
| 1384 |
+
aisle
|
| 1385 |
+
crew
|
| 1386 |
+
jug
|
| 1387 |
+
pose
|
| 1388 |
+
anchor
|
| 1389 |
+
teapot
|
| 1390 |
+
boat house
|
| 1391 |
+
business team
|
| 1392 |
+
tripod
|
| 1393 |
+
bee
|
| 1394 |
+
pebble
|
| 1395 |
+
mattress
|
| 1396 |
+
canvas
|
| 1397 |
+
hallway
|
| 1398 |
+
campaign
|
| 1399 |
+
pod
|
| 1400 |
+
lake district
|
| 1401 |
+
article
|
| 1402 |
+
white
|
| 1403 |
+
sofa
|
| 1404 |
+
honey
|
| 1405 |
+
marathon
|
| 1406 |
+
pancake
|
| 1407 |
+
tourist attraction
|
| 1408 |
+
wedding gown
|
| 1409 |
+
battle
|
| 1410 |
+
shelving
|
| 1411 |
+
sea
|
| 1412 |
+
sheet music
|
| 1413 |
+
pie
|
| 1414 |
+
yarn
|
| 1415 |
+
construction site
|
| 1416 |
+
flyer
|
| 1417 |
+
tie
|
| 1418 |
+
star
|
| 1419 |
+
lettuce
|
| 1420 |
+
martial artist
|
| 1421 |
+
dart
|
| 1422 |
+
straw
|
| 1423 |
+
reflection
|
| 1424 |
+
conference room
|
| 1425 |
+
temperature
|
| 1426 |
+
rugby
|
| 1427 |
+
mosquito
|
| 1428 |
+
physicist
|
| 1429 |
+
rock climber
|
| 1430 |
+
crash
|
| 1431 |
+
backdrop
|
| 1432 |
+
toilet seat
|
| 1433 |
+
sand castle
|
| 1434 |
+
water park
|
| 1435 |
+
toy car
|
| 1436 |
+
waste
|
| 1437 |
+
luxury
|
| 1438 |
+
hangar
|
| 1439 |
+
rv
|
| 1440 |
+
tree trunk
|
| 1441 |
+
board
|
| 1442 |
+
gold
|
| 1443 |
+
project picture
|
| 1444 |
+
cap
|
| 1445 |
+
cottage
|
| 1446 |
+
relief
|
| 1447 |
+
attire
|
| 1448 |
+
microscope
|
| 1449 |
+
battery
|
| 1450 |
+
roll
|
| 1451 |
+
line
|
| 1452 |
+
parking garage
|
| 1453 |
+
crystal
|
| 1454 |
+
broadcasting
|
| 1455 |
+
brick wall
|
| 1456 |
+
lab
|
| 1457 |
+
flooring
|
| 1458 |
+
meeting
|
| 1459 |
+
3d cg rendering
|
| 1460 |
+
desktop computer
|
| 1461 |
+
cowboy
|
| 1462 |
+
sailing ship
|
| 1463 |
+
junction
|
| 1464 |
+
hairstyle
|
| 1465 |
+
homework
|
| 1466 |
+
profile
|
| 1467 |
+
model
|
| 1468 |
+
flower pot
|
| 1469 |
+
street light
|
| 1470 |
+
salt lake
|
| 1471 |
+
maple
|
| 1472 |
+
space
|
| 1473 |
+
blizzard
|
| 1474 |
+
throw
|
| 1475 |
+
zebras
|
| 1476 |
+
brochure
|
| 1477 |
+
constellation
|
| 1478 |
+
beak
|
| 1479 |
+
kilt
|
| 1480 |
+
pond
|
| 1481 |
+
blue sky
|
| 1482 |
+
sneaker
|
| 1483 |
+
sand dune
|
| 1484 |
+
morning sun
|
| 1485 |
+
almond
|
| 1486 |
+
grill
|
| 1487 |
+
curl
|
| 1488 |
+
basketball girl game
|
| 1489 |
+
chameleon
|
| 1490 |
+
toilet bowl
|
| 1491 |
+
prince
|
| 1492 |
+
keyboard
|
| 1493 |
+
queen
|
| 1494 |
+
computer monitor
|
| 1495 |
+
writing
|
| 1496 |
+
crown
|
| 1497 |
+
basilica
|
| 1498 |
+
kiss
|
| 1499 |
+
house
|
| 1500 |
+
parking
|
| 1501 |
+
football competition
|
| 1502 |
+
shell
|
| 1503 |
+
sport equipment
|
| 1504 |
+
comedy
|
| 1505 |
+
baboon
|
| 1506 |
+
vendor
|
| 1507 |
+
rise building
|
| 1508 |
+
wrap
|
| 1509 |
+
food truck
|
| 1510 |
+
cat bed
|
| 1511 |
+
rickshaw
|
| 1512 |
+
flare
|
| 1513 |
+
teal
|
| 1514 |
+
nectar
|
| 1515 |
+
eclipse
|
| 1516 |
+
vehicle
|
| 1517 |
+
steam locomotive
|
| 1518 |
+
gorge
|
| 1519 |
+
cow
|
| 1520 |
+
christmas card
|
| 1521 |
+
demonstrator
|
| 1522 |
+
memorial
|
| 1523 |
+
towel
|
| 1524 |
+
jewellery
|
| 1525 |
+
train
|
| 1526 |
+
frisbee
|
| 1527 |
+
baseball game
|
| 1528 |
+
fur
|
| 1529 |
+
afternoon sun
|
| 1530 |
+
community
|
| 1531 |
+
sparkler
|
| 1532 |
+
bandage
|
| 1533 |
+
firework
|
| 1534 |
+
dollar
|
| 1535 |
+
pasture
|
| 1536 |
+
video
|
| 1537 |
+
bus
|
| 1538 |
+
tree house
|
| 1539 |
+
seashore
|
| 1540 |
+
field
|
| 1541 |
+
hamburger
|
| 1542 |
+
souvenir
|
| 1543 |
+
hedgehog
|
| 1544 |
+
worm
|
| 1545 |
+
pine cone
|
| 1546 |
+
osprey
|
| 1547 |
+
dinosaur
|
| 1548 |
+
vegetable
|
| 1549 |
+
junk
|
| 1550 |
+
poster
|
| 1551 |
+
army
|
| 1552 |
+
winger
|
| 1553 |
+
bundle
|
| 1554 |
+
stage
|
| 1555 |
+
growth
|
| 1556 |
+
wedding party
|
| 1557 |
+
service
|
| 1558 |
+
blanket
|
| 1559 |
+
ruler
|
| 1560 |
+
eye
|
| 1561 |
+
credit card
|
| 1562 |
+
castle
|
| 1563 |
+
diner
|
| 1564 |
+
hut
|
| 1565 |
+
elk
|
| 1566 |
+
hard rock artist
|
| 1567 |
+
nun
|
| 1568 |
+
dog breed
|
| 1569 |
+
nest
|
| 1570 |
+
drama film
|
| 1571 |
+
number icon
|
| 1572 |
+
water tank
|
| 1573 |
+
giraffe
|
| 1574 |
+
altar
|
| 1575 |
+
pavilion
|
| 1576 |
+
tv personality
|
| 1577 |
+
suv
|
| 1578 |
+
street vendor
|
| 1579 |
+
street sign
|
| 1580 |
+
ditch
|
| 1581 |
+
debris
|
| 1582 |
+
foam
|
| 1583 |
+
takeoff
|
| 1584 |
+
spice
|
| 1585 |
+
mountain lake
|
| 1586 |
+
tea
|
| 1587 |
+
orchestra
|
| 1588 |
+
spacecraft
|
| 1589 |
+
counter
|
| 1590 |
+
abbey
|
| 1591 |
+
mountain
|
| 1592 |
+
hydrangea
|
| 1593 |
+
racer
|
| 1594 |
+
orange tree
|
| 1595 |
+
tide
|
| 1596 |
+
cowboy hat
|
| 1597 |
+
rapid
|
| 1598 |
+
town
|
| 1599 |
+
wild
|
| 1600 |
+
herd
|
| 1601 |
+
vein
|
| 1602 |
+
driveway
|
| 1603 |
+
jar
|
| 1604 |
+
bark
|
| 1605 |
+
illustration
|
| 1606 |
+
horror film
|
| 1607 |
+
corn
|
| 1608 |
+
stroller
|
| 1609 |
+
industry
|
| 1610 |
+
mountain stream
|
| 1611 |
+
gym
|
| 1612 |
+
neckline
|
| 1613 |
+
pan
|
| 1614 |
+
client
|
| 1615 |
+
spectator
|
| 1616 |
+
eggplant
|
| 1617 |
+
camper
|
| 1618 |
+
fawn
|
| 1619 |
+
hoodie
|
| 1620 |
+
meat
|
| 1621 |
+
lemonade
|
| 1622 |
+
food market
|
| 1623 |
+
slum
|
| 1624 |
+
comic book character
|
| 1625 |
+
flower market
|
| 1626 |
+
love
|
| 1627 |
+
palace
|
| 1628 |
+
gun
|
| 1629 |
+
heel
|
| 1630 |
+
shopping street
|
| 1631 |
+
shooting basketball guard
|
| 1632 |
+
family photo
|
| 1633 |
+
rooftop
|
| 1634 |
+
laundry basket
|
| 1635 |
+
airport runway
|
| 1636 |
+
horn
|
| 1637 |
+
face mask
|
| 1638 |
+
flight
|
| 1639 |
+
appetizer
|
| 1640 |
+
violet
|
| 1641 |
+
country lane
|
| 1642 |
+
cement
|
| 1643 |
+
instrument
|
| 1644 |
+
tv actor
|
| 1645 |
+
spark
|
| 1646 |
+
celebrity
|
| 1647 |
+
award
|
| 1648 |
+
country house
|
| 1649 |
+
standing
|
| 1650 |
+
auction
|
| 1651 |
+
date
|
| 1652 |
+
engagement
|
| 1653 |
+
puck
|
| 1654 |
+
advertisement
|
| 1655 |
+
chair
|
| 1656 |
+
zebra
|
| 1657 |
+
driftwood
|
| 1658 |
+
bumblebee
|
| 1659 |
+
maple leaf
|
| 1660 |
+
bonnet
|
| 1661 |
+
orange
|
| 1662 |
+
water tower
|
| 1663 |
+
door
|
| 1664 |
+
singer
|
| 1665 |
+
floor plan
|
| 1666 |
+
discussion
|
| 1667 |
+
theatre
|
| 1668 |
+
pilgrim
|
| 1669 |
+
mug
|
| 1670 |
+
branch
|
| 1671 |
+
window sill
|
| 1672 |
+
baseball pitcher
|
| 1673 |
+
bakery
|
| 1674 |
+
lollipop
|
| 1675 |
+
basketball player
|
| 1676 |
+
toilet paper
|
| 1677 |
+
chalkboard
|
| 1678 |
+
cabin
|
| 1679 |
+
sign
|
| 1680 |
+
night sky
|
| 1681 |
+
cannon
|
| 1682 |
+
fishing net
|
| 1683 |
+
submarine
|
| 1684 |
+
suit
|
| 1685 |
+
fur coat
|
| 1686 |
+
wine bottle
|
| 1687 |
+
folder
|
| 1688 |
+
street art
|
| 1689 |
+
suspension bridge
|
| 1690 |
+
evening sky
|
| 1691 |
+
billboard
|
| 1692 |
+
postage stamp
|
| 1693 |
+
newspaper
|
| 1694 |
+
transportation
|
| 1695 |
+
surgeon
|
| 1696 |
+
light
|
| 1697 |
+
park
|
| 1698 |
+
horizon
|
| 1699 |
+
road
|
| 1700 |
+
sand bar
|
| 1701 |
+
trumpet
|
| 1702 |
+
lounge
|
| 1703 |
+
cloud forest
|
| 1704 |
+
birthday celebration
|
| 1705 |
+
balcony
|
| 1706 |
+
anime
|
| 1707 |
+
beehive
|
| 1708 |
+
umbrella
|
| 1709 |
+
goldfish
|
| 1710 |
+
baseball cap
|
| 1711 |
+
waterhole
|
| 1712 |
+
ceiling
|
| 1713 |
+
carousel
|
| 1714 |
+
backpack
|
| 1715 |
+
plant pot
|
| 1716 |
+
atmosphere
|
| 1717 |
+
sunflower field
|
| 1718 |
+
spire
|
| 1719 |
+
vision
|
| 1720 |
+
woodpecker
|
| 1721 |
+
chip
|
| 1722 |
+
pool table
|
| 1723 |
+
lotus flower
|
| 1724 |
+
cone
|
| 1725 |
+
humpback whale
|
| 1726 |
+
reservoir
|
| 1727 |
+
hunt
|
| 1728 |
+
piano
|
| 1729 |
+
plate
|
| 1730 |
+
dining area
|
| 1731 |
+
luggage
|
| 1732 |
+
skier
|
| 1733 |
+
dance floor
|
| 1734 |
+
crow
|
| 1735 |
+
stair
|
| 1736 |
+
overpass
|
| 1737 |
+
opera house
|
| 1738 |
+
bear
|
| 1739 |
+
jazz artist
|
| 1740 |
+
water
|
| 1741 |
+
vessel
|
| 1742 |
+
cast
|
| 1743 |
+
yard
|
| 1744 |
+
cathedral
|
| 1745 |
+
basketball hoop
|
| 1746 |
+
graveyard
|
| 1747 |
+
sound
|
| 1748 |
+
berry
|
| 1749 |
+
onlooker
|
| 1750 |
+
fauna
|
| 1751 |
+
birch tree
|
| 1752 |
+
retail
|
| 1753 |
+
hill
|
| 1754 |
+
skeleton
|
| 1755 |
+
journalist
|
| 1756 |
+
frost
|
| 1757 |
+
basket
|
| 1758 |
+
nail
|
| 1759 |
+
dusk
|
| 1760 |
+
trash
|
| 1761 |
+
dawn
|
| 1762 |
+
clover
|
| 1763 |
+
hen
|
| 1764 |
+
volcano
|
| 1765 |
+
basketball coach
|
| 1766 |
+
home decor
|
| 1767 |
+
charge
|
| 1768 |
+
haircut
|
| 1769 |
+
sense
|
| 1770 |
+
university
|
| 1771 |
+
lizard
|
| 1772 |
+
daisy
|
| 1773 |
+
tablet computer
|
| 1774 |
+
grass field
|
| 1775 |
+
prison
|
| 1776 |
+
metal artist
|
| 1777 |
+
bathroom mirror
|
| 1778 |
+
window frame
|
| 1779 |
+
chest
|
| 1780 |
+
flavor
|
| 1781 |
+
pop country artist
|
| 1782 |
+
market square
|
| 1783 |
+
monkey
|
| 1784 |
+
blog
|
| 1785 |
+
deer
|
| 1786 |
+
speech bubble
|
| 1787 |
+
dog
|
| 1788 |
+
independence day
|
| 1789 |
+
girl
|
| 1790 |
+
boy
|
| 1791 |
+
tartan
|
| 1792 |
+
furniture
|
| 1793 |
+
appliance
|
| 1794 |
+
office window
|
| 1795 |
+
fish boat
|
| 1796 |
+
sand box
|
| 1797 |
+
tv sitcom
|
| 1798 |
+
drama
|
| 1799 |
+
sleigh
|
| 1800 |
+
depression
|
| 1801 |
+
paper towel
|
| 1802 |
+
baseball
|
| 1803 |
+
protestor
|
| 1804 |
+
grape
|
| 1805 |
+
wedding cake
|
| 1806 |
+
invitation
|
| 1807 |
+
accessory
|
| 1808 |
+
pick
|
| 1809 |
+
grandparent
|
| 1810 |
+
racket
|
| 1811 |
+
tea plantation
|
| 1812 |
+
outdoors
|
| 1813 |
+
egg
|
| 1814 |
+
glass bowl
|
| 1815 |
+
sun
|
| 1816 |
+
organization
|
| 1817 |
+
lion
|
| 1818 |
+
panel
|
| 1819 |
+
station
|
| 1820 |
+
wallpaper
|
| 1821 |
+
helicopter
|
| 1822 |
+
salt
|
| 1823 |
+
vanity
|
| 1824 |
+
patio
|
| 1825 |
+
lunch
|
| 1826 |
+
street performer
|
| 1827 |
+
mountain range
|
| 1828 |
+
soup
|
| 1829 |
+
bacon
|
| 1830 |
+
power station
|
| 1831 |
+
cantilever bridge
|
| 1832 |
+
hummingbird
|
| 1833 |
+
shirt
|
| 1834 |
+
rope
|
| 1835 |
+
hip
|
| 1836 |
+
chalk
|
| 1837 |
+
pendant
|
| 1838 |
+
choir
|
| 1839 |
+
tv
|
| 1840 |
+
lichen
|
| 1841 |
+
railway bridge
|
| 1842 |
+
art gallery
|
| 1843 |
+
bartender
|
| 1844 |
+
wagon
|
| 1845 |
+
baby elephant
|
| 1846 |
+
accordion
|
| 1847 |
+
horseshoe
|
| 1848 |
+
building site
|
| 1849 |
+
clutch
|
| 1850 |
+
harvest
|
| 1851 |
+
savanna
|
| 1852 |
+
geranium
|
| 1853 |
+
business woman
|
| 1854 |
+
paddock
|
| 1855 |
+
patch
|
| 1856 |
+
beech tree
|
| 1857 |
+
war
|
| 1858 |
+
suburbs
|
| 1859 |
+
hospital bed
|
| 1860 |
+
motorcycle racer
|
| 1861 |
+
moss
|
| 1862 |
+
gravel
|
| 1863 |
+
government agency
|
| 1864 |
+
dollar bill
|
| 1865 |
+
father
|
| 1866 |
+
fjord
|
| 1867 |
+
concert
|
| 1868 |
+
nut
|
| 1869 |
+
wedding photography
|
| 1870 |
+
finish line
|
| 1871 |
+
home plate
|
| 1872 |
+
food
|
| 1873 |
+
nose
|
| 1874 |
+
thumb
|
| 1875 |
+
village
|
| 1876 |
+
dining room table
|
| 1877 |
+
bumper
|
| 1878 |
+
monster
|
| 1879 |
+
blackberry
|
| 1880 |
+
lime
|
| 1881 |
+
conflict
|
| 1882 |
+
gala
|
| 1883 |
+
wallet
|
| 1884 |
+
wrist
|
| 1885 |
+
hug
|
| 1886 |
+
mermaid
|
| 1887 |
+
lava
|
| 1888 |
+
lawyer
|
| 1889 |
+
folk rock artist
|
| 1890 |
+
arena
|
| 1891 |
+
onion
|
| 1892 |
+
toothbrush
|
| 1893 |
+
fashion
|
| 1894 |
+
perfume
|
| 1895 |
+
flip
|
| 1896 |
+
triangle
|
| 1897 |
+
woodland
|
| 1898 |
+
mail
|
| 1899 |
+
grasshopper
|
| 1900 |
+
studio
|
| 1901 |
+
wood floor
|
| 1902 |
+
den
|
| 1903 |
+
racquet
|
| 1904 |
+
cello
|
| 1905 |
+
lemur
|
| 1906 |
+
astronaut
|
| 1907 |
+
glass table
|
| 1908 |
+
blood
|
| 1909 |
+
dvd
|
| 1910 |
+
planter
|
| 1911 |
+
silver
|
| 1912 |
+
leash
|
| 1913 |
+
master bedroom
|
| 1914 |
+
forest
|
| 1915 |
+
batter
|
| 1916 |
+
shoe
|
| 1917 |
+
engraving
|
| 1918 |
+
opening
|
| 1919 |
+
product
|
| 1920 |
+
toe
|
| 1921 |
+
cocktail
|
| 1922 |
+
mallard duck
|
| 1923 |
+
bike ride
|
| 1924 |
+
oasis
|
| 1925 |
+
wedding ring
|
| 1926 |
+
cinematographer
|
| 1927 |
+
holly
|
| 1928 |
+
autograph
|
| 1929 |
+
fence
|
| 1930 |
+
ice cube
|
| 1931 |
+
cove
|
| 1932 |
+
pineapple
|
| 1933 |
+
aurora
|
| 1934 |
+
glass bead
|
| 1935 |
+
produce
|
| 1936 |
+
apartment building
|
| 1937 |
+
cob
|
| 1938 |
+
miniature
|
| 1939 |
+
cockpit
|
| 1940 |
+
flashlight
|
| 1941 |
+
frog
|
| 1942 |
+
sheep
|
| 1943 |
+
groom
|
| 1944 |
+
steel
|
| 1945 |
+
watermelon
|
| 1946 |
+
clip art
|
| 1947 |
+
paper plate
|
| 1948 |
+
ostrich
|
| 1949 |
+
contour
|
| 1950 |
+
mural
|
| 1951 |
+
cub
|
| 1952 |
+
paisley bandanna
|
| 1953 |
+
winery
|
| 1954 |
+
turn
|
| 1955 |
+
handle
|
| 1956 |
+
satellite
|
| 1957 |
+
post
|
| 1958 |
+
pork
|
| 1959 |
+
child
|
| 1960 |
+
asphalt
|
| 1961 |
+
grocery store
|
| 1962 |
+
vulture
|
| 1963 |
+
trolley
|
| 1964 |
+
nightclub
|
| 1965 |
+
brick
|
| 1966 |
+
trailer
|
| 1967 |
+
compass
|
| 1968 |
+
cereal
|
| 1969 |
+
cafe
|
| 1970 |
+
cartoon character
|
| 1971 |
+
sugar
|
| 1972 |
+
fiction book
|
| 1973 |
+
glass floor
|
| 1974 |
+
umpire
|
| 1975 |
+
guitar
|
| 1976 |
+
hamster
|
| 1977 |
+
protester
|
| 1978 |
+
airplane
|
| 1979 |
+
garment
|
| 1980 |
+
blazer
|
| 1981 |
+
railway line
|
| 1982 |
+
wedding
|
| 1983 |
+
shoe box
|
| 1984 |
+
parking lot
|
| 1985 |
+
construction
|
| 1986 |
+
graduation ceremony
|
| 1987 |
+
tram
|
| 1988 |
+
telescope
|
| 1989 |
+
copper
|
| 1990 |
+
pain
|
| 1991 |
+
autumn forest
|
| 1992 |
+
guest house
|
| 1993 |
+
partner
|
| 1994 |
+
crayon
|
| 1995 |
+
dip
|
| 1996 |
+
boot
|
| 1997 |
+
corridor
|
| 1998 |
+
computer keyboard
|
| 1999 |
+
hockey player
|
| 2000 |
+
chicken coop
|
| 2001 |
+
bus station
|
| 2002 |
+
gathering
|
| 2003 |
+
ankle
|
| 2004 |
+
bunk bed
|
| 2005 |
+
wood table
|
| 2006 |
+
football coach
|
| 2007 |
+
monarch
|
| 2008 |
+
pharmacy
|
| 2009 |
+
legging
|
| 2010 |
+
mannequin
|
| 2011 |
+
female
|
| 2012 |
+
train track
|
| 2013 |
+
stack
|
| 2014 |
+
canopy
|
| 2015 |
+
design element
|
| 2016 |
+
grandmother
|
| 2017 |
+
symbol
|
| 2018 |
+
beach hut
|
| 2019 |
+
zucchini
|
| 2020 |
+
bomb
|
| 2021 |
+
businessman
|
| 2022 |
+
skyscraper
|
| 2023 |
+
tongue
|
| 2024 |
+
case
|
| 2025 |
+
sparkle
|
| 2026 |
+
highland
|
| 2027 |
+
ballroom
|
| 2028 |
+
prom
|
| 2029 |
+
estate
|
| 2030 |
+
customer
|
| 2031 |
+
archipelago
|
| 2032 |
+
cheese
|
| 2033 |
+
debate
|
| 2034 |
+
carriage
|
| 2035 |
+
bulldozer
|
| 2036 |
+
pumpkin
|
| 2037 |
+
sitting room
|
| 2038 |
+
gas station
|
| 2039 |
+
wedding reception
|
| 2040 |
+
camp
|
| 2041 |
+
dog bed
|
| 2042 |
+
tower
|
| 2043 |
+
property
|
| 2044 |
+
river bed
|
| 2045 |
+
pop latin artist
|
| 2046 |
+
fridge
|
| 2047 |
+
wine glass
|
| 2048 |
+
coast
|
| 2049 |
+
beer
|
| 2050 |
+
tow truck
|
| 2051 |
+
fire truck
|
| 2052 |
+
mountain bike
|
| 2053 |
+
thigh
|
| 2054 |
+
heron
|
| 2055 |
+
boat ride
|
| 2056 |
+
gondola
|
| 2057 |
+
turquoise
|
| 2058 |
+
lake
|
| 2059 |
+
llama
|
| 2060 |
+
kitty
|
| 2061 |
+
tin
|
| 2062 |
+
waiting room
|
| 2063 |
+
coffee cup
|
| 2064 |
+
socialite
|
| 2065 |
+
guard
|
| 2066 |
+
tap
|
| 2067 |
+
waterway
|
| 2068 |
+
forehead
|
| 2069 |
+
list
|
| 2070 |
+
erosion
|
| 2071 |
+
box
|
| 2072 |
+
sea lion
|
| 2073 |
+
pollen
|
| 2074 |
+
dam
|
| 2075 |
+
wasp
|
| 2076 |
+
salon
|
| 2077 |
+
tennis tournament
|
| 2078 |
+
flower box
|
| 2079 |
+
aquarium
|
| 2080 |
+
rain cloud
|
| 2081 |
+
clothing store
|
| 2082 |
+
lead singer
|
| 2083 |
+
cupcake
|
| 2084 |
+
tortoise
|
| 2085 |
+
lettering
|
| 2086 |
+
sport facility
|
| 2087 |
+
dance
|
| 2088 |
+
dog house
|
| 2089 |
+
nature
|
| 2090 |
+
football
|
| 2091 |
+
rooster
|
| 2092 |
+
footballer
|
| 2093 |
+
railway track
|
| 2094 |
+
crowd
|
| 2095 |
+
fishing rod
|
| 2096 |
+
silhouette
|
| 2097 |
+
wind turbine
|
| 2098 |
+
sari
|
| 2099 |
+
bus window
|
| 2100 |
+
cloud
|
| 2101 |
+
charity
|
| 2102 |
+
medal
|
| 2103 |
+
yoga
|
| 2104 |
+
event
|
| 2105 |
+
veil
|
| 2106 |
+
fashion menswear milan week
|
| 2107 |
+
news
|
| 2108 |
+
knife
|
| 2109 |
+
print
|
| 2110 |
+
screen tv
|
| 2111 |
+
walnut
|
| 2112 |
+
fungus
|
| 2113 |
+
ice cream
|
| 2114 |
+
computer mouse
|
| 2115 |
+
play
|
| 2116 |
+
tribe
|
| 2117 |
+
picture
|
| 2118 |
+
video game
|
| 2119 |
+
business card
|
| 2120 |
+
music festival
|
| 2121 |
+
rack
|
| 2122 |
+
envelope
|
| 2123 |
+
shower
|
| 2124 |
+
dirt road
|
| 2125 |
+
mine
|
| 2126 |
+
oyster
|
| 2127 |
+
monarch butterfly
|
| 2128 |
+
dude
|
| 2129 |
+
fruit salad
|
| 2130 |
+
podium
|
| 2131 |
+
fork
|
| 2132 |
+
lace
|
| 2133 |
+
test match
|
| 2134 |
+
boulder
|
| 2135 |
+
cricket player
|
| 2136 |
+
staircase
|
| 2137 |
+
peninsula
|
| 2138 |
+
shopping
|
| 2139 |
+
popcorn
|
| 2140 |
+
oak
|
| 2141 |
+
market stall
|
| 2142 |
+
pine tree
|
| 2143 |
+
mountaineer
|
| 2144 |
+
student
|
| 2145 |
+
closet
|
| 2146 |
+
hood
|
| 2147 |
+
handstand
|
| 2148 |
+
centerpiece
|
| 2149 |
+
insect
|
| 2150 |
+
patient
|
| 2151 |
+
makeover
|
| 2152 |
+
tennis player
|
| 2153 |
+
sheet
|
| 2154 |
+
park bench
|
| 2155 |
+
apple
|
| 2156 |
+
organism
|
| 2157 |
+
hook
|
| 2158 |
+
turkey
|
| 2159 |
+
tangerine
|
| 2160 |
+
sibling
|
| 2161 |
+
shopping mall
|
| 2162 |
+
bird
|
| 2163 |
+
scarf
|
| 2164 |
+
smoothie
|
| 2165 |
+
net
|
| 2166 |
+
grass
|
| 2167 |
+
napkin
|
| 2168 |
+
ray
|
| 2169 |
+
eyebrow
|
| 2170 |
+
laptop keyboard
|
| 2171 |
+
motorbike
|
| 2172 |
+
woman hand
|
| 2173 |
+
oven
|
| 2174 |
+
book cover
|
| 2175 |
+
easter egg
|
| 2176 |
+
microwave
|
| 2177 |
+
sand
|
| 2178 |
+
snapshot
|
| 2179 |
+
soccer ball
|
| 2180 |
+
makeup
|
| 2181 |
+
knight
|
| 2182 |
+
bowling ball
|
| 2183 |
+
shower curtain
|
| 2184 |
+
flame
|
| 2185 |
+
lightning
|
| 2186 |
+
running
|
| 2187 |
+
power plant
|
| 2188 |
+
crib
|
| 2189 |
+
cartoon
|
| 2190 |
+
moat
|
| 2191 |
+
fashion girl
|
| 2192 |
+
wedding invitation
|
| 2193 |
+
bottle
|
| 2194 |
+
cliff
|
| 2195 |
+
monastery
|
| 2196 |
+
file photo
|
| 2197 |
+
apartment
|
| 2198 |
+
casino
|
| 2199 |
+
cream
|
| 2200 |
+
sweatshirt
|
| 2201 |
+
storm
|
| 2202 |
+
cruise
|
| 2203 |
+
teddy bear
|
| 2204 |
+
shovel
|
| 2205 |
+
wind farm
|
| 2206 |
+
writer
|
| 2207 |
+
dock
|
| 2208 |
+
professional
|
| 2209 |
+
hotel room
|
| 2210 |
+
job
|
| 2211 |
+
monitor
|
| 2212 |
+
donkey
|
| 2213 |
+
pass
|
| 2214 |
+
interview
|
| 2215 |
+
duchess
|
| 2216 |
+
mark
|
| 2217 |
+
plank
|
| 2218 |
+
beard
|
| 2219 |
+
zombie
|
| 2220 |
+
trio
|
| 2221 |
+
channel
|
| 2222 |
+
cricket team
|
| 2223 |
+
windmill
|
| 2224 |
+
vest
|
| 2225 |
+
diagram
|
| 2226 |
+
cable
|
| 2227 |
+
winter scene
|
| 2228 |
+
golden gate bridge
|
| 2229 |
+
buffalo
|
| 2230 |
+
studio portrait
|
| 2231 |
+
pagoda
|
| 2232 |
+
whiskey
|
| 2233 |
+
freight train
|
| 2234 |
+
kite
|
| 2235 |
+
future
|
| 2236 |
+
steam train
|
| 2237 |
+
phone box
|
| 2238 |
+
headset
|
| 2239 |
+
wood
|
| 2240 |
+
snowboarder
|
| 2241 |
+
paper bag
|
| 2242 |
+
slide
|
| 2243 |
+
grapefruit
|
| 2244 |
+
seating
|
| 2245 |
+
morning
|
| 2246 |
+
bronze sculpture
|
| 2247 |
+
theatre actor
|
| 2248 |
+
stump
|
| 2249 |
+
jean
|
| 2250 |
+
landmark
|
| 2251 |
+
jam
|
| 2252 |
+
waist
|
| 2253 |
+
watercolor
|
| 2254 |
+
hammock
|
| 2255 |
+
light fixture
|
| 2256 |
+
ice
|
| 2257 |
+
basin
|
| 2258 |
+
beverage
|
| 2259 |
+
shelter
|
| 2260 |
+
premiere
|
| 2261 |
+
mound
|
| 2262 |
+
ear
|
| 2263 |
+
bronze
|
| 2264 |
+
sunlight
|
| 2265 |
+
street
|
| 2266 |
+
energy
|
| 2267 |
+
barn door
|
| 2268 |
+
hike
|
| 2269 |
+
fleet
|
| 2270 |
+
claw
|
| 2271 |
+
beach
|
| 2272 |
+
pepperoni
|
| 2273 |
+
bin
|
| 2274 |
+
trainer
|
| 2275 |
+
buffet
|
| 2276 |
+
archive
|
| 2277 |
+
toddler
|
| 2278 |
+
referee
|
| 2279 |
+
bay window
|
| 2280 |
+
dove
|
| 2281 |
+
production company
|
| 2282 |
+
evening light
|
| 2283 |
+
gate
|
| 2284 |
+
farm
|
| 2285 |
+
reed
|
| 2286 |
+
fruit stand
|
| 2287 |
+
explorer
|
| 2288 |
+
snow storm
|
| 2289 |
+
throw pillow
|
| 2290 |
+
button
|
| 2291 |
+
display case
|
| 2292 |
+
bookcase
|
| 2293 |
+
lead
|
| 2294 |
+
lipstick
|
| 2295 |
+
basketball court
|
| 2296 |
+
cargo
|
| 2297 |
+
ensemble
|
| 2298 |
+
pope
|
| 2299 |
+
clock tower
|
| 2300 |
+
teen
|
| 2301 |
+
speaker
|
| 2302 |
+
rat
|
| 2303 |
+
laptop
|
| 2304 |
+
ski
|
| 2305 |
+
mess
|
| 2306 |
+
stadium
|
| 2307 |
+
ferry boat
|
| 2308 |
+
bunny
|
| 2309 |
+
waterfront
|
| 2310 |
+
downtown
|
| 2311 |
+
sink
|
| 2312 |
+
press conference
|
| 2313 |
+
dinner
|
| 2314 |
+
condiment
|
| 2315 |
+
thread
|
| 2316 |
+
audience
|
| 2317 |
+
grid
|
| 2318 |
+
car
|
| 2319 |
+
plastic
|
| 2320 |
+
people
|
| 2321 |
+
barbecue
|
| 2322 |
+
pigeon
|
| 2323 |
+
urinal
|
| 2324 |
+
seagull
|
| 2325 |
+
volunteer
|
| 2326 |
+
hockey
|
| 2327 |
+
fir tree
|
| 2328 |
+
pollution
|
| 2329 |
+
trial
|
| 2330 |
+
collar
|
| 2331 |
+
area
|
| 2332 |
+
meeting room
|
| 2333 |
+
circus
|
| 2334 |
+
yogurt
|
| 2335 |
+
orangutan
|
| 2336 |
+
viaduct
|
| 2337 |
+
comedian
|
| 2338 |
+
drone
|
| 2339 |
+
scissor
|
| 2340 |
+
pop rock artist
|
| 2341 |
+
biscuit
|
| 2342 |
+
panda
|
| 2343 |
+
water feature
|
| 2344 |
+
air balloon
|
| 2345 |
+
remote control
|
| 2346 |
+
watercolor painting
|
| 2347 |
+
show
|
| 2348 |
+
walk
|
| 2349 |
+
post office
|
| 2350 |
+
bike path
|
| 2351 |
+
rap gangsta artist
|
| 2352 |
+
microphone
|
| 2353 |
+
crack
|
| 2354 |
+
sunset sky
|
| 2355 |
+
glass
|
| 2356 |
+
tv show
|
| 2357 |
+
cartoon style
|
| 2358 |
+
stripe
|
| 2359 |
+
foyer
|
| 2360 |
+
signal
|
| 2361 |
+
calligraphy
|
| 2362 |
+
bulb
|
| 2363 |
+
gardener
|
| 2364 |
+
coffee bean
|
| 2365 |
+
spider
|
| 2366 |
+
tapestry
|
| 2367 |
+
city skyline
|
| 2368 |
+
necklace
|
| 2369 |
+
kitten
|
| 2370 |
+
traveler
|
| 2371 |
+
veteran
|
| 2372 |
+
frosting
|
| 2373 |
+
fry
|
| 2374 |
+
tennis court
|
| 2375 |
+
tank top
|
| 2376 |
+
butterfly house
|
| 2377 |
+
mist
|
| 2378 |
+
drummer
|
| 2379 |
+
water level
|
| 2380 |
+
scale
|
| 2381 |
+
baseball glove
|
| 2382 |
+
music video performer
|
| 2383 |
+
champagne
|
| 2384 |
+
camping
|
| 2385 |
+
clothing
|
| 2386 |
+
water drop
|
| 2387 |
+
telephone box
|
| 2388 |
+
pen
|
| 2389 |
+
morning mist
|
| 2390 |
+
fire engine
|
| 2391 |
+
porch
|
| 2392 |
+
opening ceremony
|
| 2393 |
+
style
|
| 2394 |
+
palm tree
|
| 2395 |
+
fashion show
|
| 2396 |
+
universe
|
| 2397 |
+
scratch
|
| 2398 |
+
axe
|
| 2399 |
+
ottoman
|
| 2400 |
+
explosion
|
| 2401 |
+
rib
|
| 2402 |
+
boutique
|
| 2403 |
+
game
|
| 2404 |
+
cucumber
|
| 2405 |
+
fruit
|
| 2406 |
+
stone bridge
|
| 2407 |
+
nature reserve
|
| 2408 |
+
track
|
| 2409 |
+
train window
|
| 2410 |
+
punch
|
| 2411 |
+
telephone pole
|
| 2412 |
+
velvet
|
| 2413 |
+
sauce
|
| 2414 |
+
moon
|
| 2415 |
+
contrast
|
| 2416 |
+
flamingo
|
| 2417 |
+
bat
|
| 2418 |
+
vending machine
|
| 2419 |
+
ship
|
| 2420 |
+
equestrian
|
| 2421 |
+
shade
|
| 2422 |
+
comforter
|
| 2423 |
+
pallet
|
| 2424 |
+
sparrow
|
| 2425 |
+
wii
|
| 2426 |
+
glaze
|
| 2427 |
+
grocery
|
| 2428 |
+
steeple
|
| 2429 |
+
soccer player
|
| 2430 |
+
contract
|
| 2431 |
+
advertising
|
| 2432 |
+
runner
|
| 2433 |
+
chimpanzee
|
| 2434 |
+
world
|
| 2435 |
+
seat
|
| 2436 |
+
project
|
| 2437 |
+
chihuahua
|
| 2438 |
+
bubble
|
| 2439 |
+
willow
|
| 2440 |
+
pedestal
|
| 2441 |
+
soul hip hop artist
|
| 2442 |
+
curb
|
| 2443 |
+
drawer
|
| 2444 |
+
leaf
|
| 2445 |
+
banner
|
| 2446 |
+
launch party
|
| 2447 |
+
coach
|
| 2448 |
+
government
|
| 2449 |
+
snowball
|
| 2450 |
+
toy
|
| 2451 |
+
portrait
|
| 2452 |
+
doctor
|
| 2453 |
+
whiteboard
|
| 2454 |
+
electronic
|
| 2455 |
+
tiger
|
| 2456 |
+
graffiti
|
| 2457 |
+
column
|
| 2458 |
+
nightstand
|
| 2459 |
+
whistle
|
| 2460 |
+
maxi dress
|
| 2461 |
+
bench
|
| 2462 |
+
wetsuit
|
| 2463 |
+
bird feeder
|
| 2464 |
+
football game
|
| 2465 |
+
basketball
|
| 2466 |
+
class
|
| 2467 |
+
bathroom door
|
| 2468 |
+
store window
|
| 2469 |
+
text message
|
| 2470 |
+
wreath
|
| 2471 |
+
street view
|
| 2472 |
+
binocular
|
| 2473 |
+
pet
|
| 2474 |
+
facade
|
| 2475 |
+
drought
|
| 2476 |
+
lemon
|
| 2477 |
+
new year
|
| 2478 |
+
night view
|
| 2479 |
+
airplane window
|
| 2480 |
+
specie
|
| 2481 |
+
rule
|
| 2482 |
+
jaw
|
| 2483 |
+
wheat field
|
| 2484 |
+
diet
|
| 2485 |
+
pop artist
|
| 2486 |
+
habitat
|
| 2487 |
+
screenshot
|
| 2488 |
+
scoreboard
|
| 2489 |
+
shore
|
| 2490 |
+
mane
|
| 2491 |
+
quilt
|
| 2492 |
+
ski lift
|
| 2493 |
+
orchid
|
| 2494 |
+
turban
|
| 2495 |
+
christmas
|
| 2496 |
+
airport
|
| 2497 |
+
marina
|
| 2498 |
+
glass door
|
| 2499 |
+
glass bottle
|
| 2500 |
+
restaurant
|
| 2501 |
+
conductor
|
| 2502 |
+
logo
|
| 2503 |
+
sleep
|
| 2504 |
+
tape
|
| 2505 |
+
tomato
|
| 2506 |
+
river bank
|
| 2507 |
+
lilac
|
| 2508 |
+
tooth
|
| 2509 |
+
training
|
| 2510 |
+
pottery
|
| 2511 |
+
shop
|
| 2512 |
+
steam engine
|
| 2513 |
+
mason jar
|
| 2514 |
+
base
|
| 2515 |
+
procession
|
| 2516 |
+
border
|
| 2517 |
+
shoot
|
| 2518 |
+
footprint
|
| 2519 |
+
hotdog
|
| 2520 |
+
bull
|
| 2521 |
+
stocking
|
| 2522 |
+
recreation
|
| 2523 |
+
automobile model
|
| 2524 |
+
design
|
| 2525 |
+
country pop artist
|
| 2526 |
+
river
|
| 2527 |
+
retriever
|
| 2528 |
+
department store
|
| 2529 |
+
auditorium
|
| 2530 |
+
sport car
|
| 2531 |
+
supermarket
|
| 2532 |
+
belt
|
| 2533 |
+
cricket
|
| 2534 |
+
window box
|
| 2535 |
+
dress shirt
|
| 2536 |
+
letter
|
| 2537 |
+
residence
|
| 2538 |
+
megaphone
|
| 2539 |
+
pant
|
| 2540 |
+
wildfire
|
| 2541 |
+
bird nest
|
| 2542 |
+
crab
|
| 2543 |
+
swimsuit
|
| 2544 |
+
candle
|
| 2545 |
+
funeral
|
| 2546 |
+
mill
|
| 2547 |
+
national park
|
| 2548 |
+
plant
|
| 2549 |
+
cop
|
| 2550 |
+
power line
|
| 2551 |
+
perch
|
| 2552 |
+
blue
|
| 2553 |
+
finger
|
| 2554 |
+
ferris wheel
|
| 2555 |
+
globe
|
| 2556 |
+
skateboard
|
| 2557 |
+
helmet
|
| 2558 |
+
movie theater
|
| 2559 |
+
uniform
|
| 2560 |
+
hammer
|
| 2561 |
+
material
|
| 2562 |
+
kid
|
| 2563 |
+
well
|
| 2564 |
+
butterfly
|
| 2565 |
+
sideline
|
| 2566 |
+
fashion fall show
|
| 2567 |
+
planet earth
|
| 2568 |
+
lift
|
| 2569 |
+
male
|
| 2570 |
+
sauna
|
| 2571 |
+
gray
|
| 2572 |
+
flour
|
| 2573 |
+
sand sculpture
|
| 2574 |
+
program
|
| 2575 |
+
cabinet
|
| 2576 |
+
infant
|
| 2577 |
+
wheel
|
| 2578 |
+
aircraft model
|
| 2579 |
+
dough
|
| 2580 |
+
garlic
|
| 2581 |
+
skate
|
| 2582 |
+
arrow
|
| 2583 |
+
wrapping paper
|
| 2584 |
+
ripple
|
| 2585 |
+
lamp
|
| 2586 |
+
iron
|
| 2587 |
+
banknote
|
| 2588 |
+
beaver
|
| 2589 |
+
ferry
|
| 2590 |
+
courtyard
|
| 2591 |
+
bassist
|
| 2592 |
+
countryside
|
| 2593 |
+
steak
|
| 2594 |
+
comfort
|
| 2595 |
+
boxer
|
| 2596 |
+
laundry room
|
| 2597 |
+
campsite
|
| 2598 |
+
brick building
|
| 2599 |
+
golf
|
| 2600 |
+
subway
|
| 2601 |
+
headphone
|
| 2602 |
+
fort
|
| 2603 |
+
handbag
|
| 2604 |
+
drum
|
| 2605 |
+
flood
|
| 2606 |
+
saddle
|
| 2607 |
+
bass
|
| 2608 |
+
labyrinth
|
| 2609 |
+
needle
|
| 2610 |
+
sun ray
|
| 2611 |
+
app
|
| 2612 |
+
menu
|
| 2613 |
+
president
|
| 2614 |
+
cardigan
|
| 2615 |
+
dandelion
|
| 2616 |
+
wetland
|
| 2617 |
+
ice hockey player
|
| 2618 |
+
number
|
| 2619 |
+
city hall
|
| 2620 |
+
fishing
|
| 2621 |
+
portrait session
|
| 2622 |
+
pug
|
| 2623 |
+
key
|
| 2624 |
+
art print
|
| 2625 |
+
minister
|
| 2626 |
+
hurdle
|
| 2627 |
+
emergency
|
| 2628 |
+
painting artist
|
| 2629 |
+
flag pole
|
| 2630 |
+
evening
|
| 2631 |
+
purse
|
| 2632 |
+
recipe
|
| 2633 |
+
golf ball
|
| 2634 |
+
coloring book
|
| 2635 |
+
mountain peak
|
| 2636 |
+
senior
|
| 2637 |
+
holiday
|
| 2638 |
+
bud
|
| 2639 |
+
cousin
|
| 2640 |
+
pantry
|
| 2641 |
+
lap
|
| 2642 |
+
skin
|
| 2643 |
+
flag
|
| 2644 |
+
tissue paper
|
| 2645 |
+
ridge
|
| 2646 |
+
wire fence
|
| 2647 |
+
surfer
|
| 2648 |
+
climber
|
| 2649 |
+
photograph
|
| 2650 |
+
sewing machine
|
| 2651 |
+
cooler
|
| 2652 |
+
actress
|
| 2653 |
+
apple tree
|
| 2654 |
+
cancer
|
| 2655 |
+
starfish
|
| 2656 |
+
automobile make
|
| 2657 |
+
dumbbell
|
| 2658 |
+
brace
|
| 2659 |
+
tunnel
|
| 2660 |
+
window
|
| 2661 |
+
paint artist
|
| 2662 |
+
composition
|
| 2663 |
+
school student
|
| 2664 |
+
condo
|
| 2665 |
+
convertible
|
| 2666 |
+
cushion
|
| 2667 |
+
selfie
|
| 2668 |
+
territory
|
| 2669 |
+
guide
|
| 2670 |
+
tree
|
| 2671 |
+
court
|
| 2672 |
+
shrimp
|
| 2673 |
+
stone house
|
| 2674 |
+
dress
|
| 2675 |
+
eyelash
|
| 2676 |
+
juice
|
| 2677 |
+
broccoli
|
| 2678 |
+
chain
|
| 2679 |
+
tourism
|
| 2680 |
+
mountain top
|
| 2681 |
+
concept car
|
| 2682 |
+
film premiere
|
| 2683 |
+
light bulb
|
| 2684 |
+
cafeteria
|
| 2685 |
+
badge
|
| 2686 |
+
flower bed
|
| 2687 |
+
theater
|
| 2688 |
+
root
|
| 2689 |
+
racecar driver
|
| 2690 |
+
basketball boy game
|
| 2691 |
+
glove
|
| 2692 |
+
skyline
|
| 2693 |
+
wall
|
| 2694 |
+
glacier
|
| 2695 |
+
airport terminal
|
| 2696 |
+
bug
|
| 2697 |
+
trim
|
| 2698 |
+
railway station
|
| 2699 |
+
briefcase
|
| 2700 |
+
flat
|
| 2701 |
+
fountain
|
| 2702 |
+
person
|
| 2703 |
+
lane
|
| 2704 |
+
asparagus
|
| 2705 |
+
art
|
| 2706 |
+
lantern
|
| 2707 |
+
dishwasher
|
| 2708 |
+
director
|
| 2709 |
+
snake
|
| 2710 |
+
lecture
|
| 2711 |
+
game controller
|
| 2712 |
+
tree branch
|
| 2713 |
+
pub
|
| 2714 |
+
bathing suit
|
| 2715 |
+
queue
|
| 2716 |
+
belly
|
| 2717 |
+
poppy
|
| 2718 |
+
bow
|
| 2719 |
+
pitcher
|
| 2720 |
+
ice cream cone
|
| 2721 |
+
cave
|
| 2722 |
+
candy
|
| 2723 |
+
road bridge
|
| 2724 |
+
host
|
| 2725 |
+
traffic jam
|
| 2726 |
+
earring
|
| 2727 |
+
file
|
| 2728 |
+
foot
|
| 2729 |
+
watermark overlay stamp
|
| 2730 |
+
mailbox
|
| 2731 |
+
supercar
|
| 2732 |
+
railing
|
| 2733 |
+
bedroom
|
| 2734 |
+
seafood
|
| 2735 |
+
waffle
|
| 2736 |
+
bronze statue
|
| 2737 |
+
plan
|
| 2738 |
+
flow
|
| 2739 |
+
marble
|
| 2740 |
+
basketball game
|
| 2741 |
+
automobile
|
| 2742 |
+
scene
|
| 2743 |
+
cypress tree
|
| 2744 |
+
soldier
|
| 2745 |
+
skateboarder
|
| 2746 |
+
glass building
|
| 2747 |
+
cherry tree
|
| 2748 |
+
pump
|
| 2749 |
+
grain
|
| 2750 |
+
wildebeest
|
| 2751 |
+
loop
|
| 2752 |
+
frame
|
| 2753 |
+
bathtub
|
| 2754 |
+
saxophone
|
| 2755 |
+
diver
|
| 2756 |
+
stalk
|
| 2757 |
+
lily
|
| 2758 |
+
bead
|
| 2759 |
+
alley
|
| 2760 |
+
flock
|
| 2761 |
+
family room
|
| 2762 |
+
manufacturing
|
| 2763 |
+
pointer
|
| 2764 |
+
worker
|
| 2765 |
+
navy
|
| 2766 |
+
potato
|
| 2767 |
+
teacher
|
| 2768 |
+
photography
|
| 2769 |
+
dolly
|
| 2770 |
+
boardwalk
|
| 2771 |
+
water fountain
|
| 2772 |
+
athlete
|
| 2773 |
+
side dish
|
| 2774 |
+
bay
|
| 2775 |
+
ice hockey
|
| 2776 |
+
phone
|
| 2777 |
+
hero
|
| 2778 |
+
face
|
| 2779 |
+
gold medal
|
| 2780 |
+
blind
|
| 2781 |
+
swamp
|
| 2782 |
+
researcher
|
| 2783 |
+
swim
|
| 2784 |
+
meatball
|
| 2785 |
+
iguana
|
| 2786 |
+
leather jacket
|
| 2787 |
+
jellyfish
|
| 2788 |
+
site
|
| 2789 |
+
smoke
|
| 2790 |
+
traffic signal
|
| 2791 |
+
melon
|
| 2792 |
+
beetle
|
| 2793 |
+
calculator
|
| 2794 |
+
skirt
|
| 2795 |
+
plantation
|
| 2796 |
+
sculptor
|
| 2797 |
+
barrier
|
| 2798 |
+
catcher
|
| 2799 |
+
security guard
|
| 2800 |
+
sketch
|
| 2801 |
+
awning
|
| 2802 |
+
steering wheel
|
| 2803 |
+
mountain view
|
| 2804 |
+
bus stop
|
| 2805 |
+
pool
|
| 2806 |
+
leg
|
| 2807 |
+
spotlight
|
| 2808 |
+
apron
|
| 2809 |
+
mineral
|
| 2810 |
+
inlet
|
| 2811 |
+
sleeve
|
| 2812 |
+
torch
|
| 2813 |
+
emotion
|
| 2814 |
+
march
|
| 2815 |
+
police officer
|
| 2816 |
+
performance
|
| 2817 |
+
lamp post
|
| 2818 |
+
fishing boat
|
| 2819 |
+
summer
|
| 2820 |
+
presentation
|
| 2821 |
+
saucer
|
| 2822 |
+
suitcase
|
| 2823 |
+
supermodel
|
| 2824 |
+
goalkeeper
|
| 2825 |
+
shrub
|
| 2826 |
+
rock artist
|
| 2827 |
+
document
|
| 2828 |
+
beach house
|
| 2829 |
+
man
|
| 2830 |
+
blue artist
|
| 2831 |
+
cigar
|
| 2832 |
+
railroad track
|
| 2833 |
+
gown
|
| 2834 |
+
mosaic
|
| 2835 |
+
bungalow
|
| 2836 |
+
alphabet
|
| 2837 |
+
baseball field
|
| 2838 |
+
shed
|
| 2839 |
+
pedestrian
|
| 2840 |
+
rail
|
| 2841 |
+
soap
|
| 2842 |
+
kitchen counter
|
| 2843 |
+
dessert
|
| 2844 |
+
dunk
|
| 2845 |
+
blossom
|
| 2846 |
+
conversation
|
| 2847 |
+
fruit market
|
| 2848 |
+
glass jar
|
| 2849 |
+
military
|
| 2850 |
+
beer bottle
|
| 2851 |
+
photographer
|
| 2852 |
+
tennis racket
|
| 2853 |
+
competition
|
| 2854 |
+
escalator
|
| 2855 |
+
bell tower
|
| 2856 |
+
stilt
|
| 2857 |
+
ballerina
|
| 2858 |
+
television
|
| 2859 |
+
feather
|
| 2860 |
+
fence post
|
| 2861 |
+
rear
|
| 2862 |
+
dahlia
|
| 2863 |
+
red carpet
|
| 2864 |
ram/data/tag_list.txt, added entries continued (file lines 2864-3429, one tag per line in the source file):

tub, hole, fortress, pack, telephone, cardboard, city park, platform, college student, arch bridge, wind, blender, bloom, ice rink,
birthday, raven, fairy, embankment, hall, flower shop, suburb, barrel, biker, steam, dragonfly, formation, electricity, business people,
symmetry, walkway, fisherman, gas mask, loch, youth, hanger, dot, fish, street market, animation film, crime fiction film, boar, emblem,
halloween costume, kangaroo, couple, spoon, squirrel, neon sign, sky, office desk, beauty salon, breakwater, fashion look, toaster, author, news conference,
outdoor, canoe, dragon, tool, shopping centre, ladybug, swimming pool, landscaping, ski pole, red, truck, fly, temple, level,
sunday, railroad bridge, car mirror, lawn mower, flute, aircraft carrier, fashion menswear london week, sunshine, tile floor, skull, fossil, flower arrangement, diaper, sea turtle,
cherry blossom, fireman, shack, lens, waiter, animal, basement, snow, autumn park, glass box, kick, head, anniversary, vine,
back, paper lantern, fish tank, cellphone, silk, coral, notebook, photo, gazebo, ketchup, driver, farmer, bonfire, chestnut,
photoshoot, football field, olive tree, pheasant, sandal, toilet, fireplace, music, deity, fish market, fig, bell, neck, grave,
villa, cyclist, crate, grey, asphalt road, soccer, hostel, municipality, courthouse, roof, end table, pot, sedan, structure,
folk artist, sport, sport team, protest, syringe, fashion designer, jersey, heart shape, kayak, stare, sit with, direct, read, photograph,
spin, teach, laugh, carve, grow on, warm, watch, stretch, smell, decorate, shine, light, dance, send,
park, chase, collect, lead, kiss, lead to, lick, smile, cheer, sit, point, block, rock, drop,
cut, ski, wrap, lose, serve, provide, sleep, dress, embrace, burn, pack, stir, create, touch,
wash, stick, reveal, shop, train, paint, groom, hunt, bloom, play, pay, brush, shoot, hold,
picture, carry, sip, contain, turn, pour, pitch, give, add, blow, look in, show, walk, illuminate,
kneel, cover, drag, post, present, fit, operate, fish, race, write, deliver, peel, push, run,
sit around, buy, jump, walk on, attend, clean, sell, ride on, mount, host, dry, plant, sing, row,
shake, perch, ride, fight, skateboard, live, call, surround, practice, play on, work on, step, relax, hit,
fall in, flow, greet, launch, wear, hang on, drive, sit in, break, learn, fly, connect, display, locate,
compete, go for, sail, lift, toast, help, run on, reflect, pose, scratch, frame, dribble, herd, enter,
exit, place, inspect, build, pick, fill, grind, skate, offer, float, sit by, stand, release, rest,
singe, climb, tie, mark, lay, stand around, capture, set, land, swinge, run in, kick, lean, head,
sign, approach, swim, close, crash, control, fall, remove, repair, open, appear, travel, load, miss,
check, surf, moor, smoke, drink, board, seat, feed, rise, sit on, swing, grow, strike, date,
slide, share, graze, jump in, lie, extrude, roll, move, gather, eat, pull, run through, squeeze, lay on,
draw, play with, wave, assemble, perform, march, score, attach, adjust, hang, hug, sleep on, throw, live in,
talk, pet, work, run with, see, flip, catch, cook, receive, celebrate, look, classic, bridal, indoor,
industrial, teenage, mini, grassy, aged, long, warm, light, handsome, happy, three, pregnant, circular, urban,
silver, ceramic, 3d, green, blonde, golden, dark, tropical, ripe, deep, fat, musical, giant, medical,
medieval, bare, stunning, bold, geographical, huge, plastic, foggy, stormy, gothic, biological, empty, clear, antique,
pink, steep, brown, striped, aerial, rainy, cool, flying, commercial, purple, trendy, blank, haired, dead,
wooden, flat, high, beige, panoramic, angry, dozen, rural, solar, big, small, stained, thick, many,
fresh, clean, strong, abstract, crowded, retro, dry, gorgeous, martial, modern, blue, cloudy, low, four,
outdoor, single, much, beautiful, snowy, pretty, new, short, sunny, closed, rocky, red, two, double,
male, gray, five, colorful, automotive, various, one, old, rusty, tall, wild, narrow, natural, several,
frozen, textured, lush, young, hot, mixed, white, float, quiet, round, bright, religious, female, historical,
shiny, traditional, tourist, yellow, bald, coastal, lovely, little, broken, romantic, wide, royal, rich, open,
cute, ancient, cold, political, elderly, gold, full, rustic, metallic, floral, sad, wet, fancy, senior,
tiny, stylish, large, frosty, orange, transparent, electronic, shallow, scared, armed, dirty, historic, black, few,
windy, some, square, ornamental, sandy, thin
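That completes the vocabulary file: object tags first, then action tags (stare, sit with, ...), then attribute tags (classic, bridal, ...), 3,429 entries in total per the diff summary. Where the tagging code needs this list at runtime, loading it is a one-liner; below is a minimal sketch, assuming only that the file ships at the path shown in this repo's tree (the helper name is illustrative, not from this repo):

# Minimal sketch (helper name is illustrative): load the newline-separated
# tag vocabulary shipped with the repo into a Python list.
from pathlib import Path

def load_tag_list(path: str = "ram/data/tag_list.txt") -> list:
    text = Path(path).read_text(encoding="utf-8")
    # one tag per line; drop blank lines and stray whitespace
    return [tag.strip() for tag in text.splitlines() if tag.strip()]

tags = load_tag_list()
print(len(tags))   # expected: 3429, per the diff summary for this file
print(tags[-1])    # 'thin', the final entry added above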
ram/inference.py
ADDED
@@ -0,0 +1,46 @@
'''
 * The Inference of RAM and Tag2Text Models
 * Written by Xinyu Huang
'''
import torch


def inference_tag2text(image, model, input_tag="None"):

    with torch.no_grad():
        caption, tag_predict = model.generate(image,
                                              tag_input=None,
                                              max_length=50,
                                              return_tag_predict=True)

    if input_tag == '' or input_tag == 'none' or input_tag == 'None':
        return tag_predict[0], None, caption[0]

    # If user input specified tags:
    else:
        input_tag_list = []
        input_tag_list.append(input_tag.replace(',', ' | '))

        with torch.no_grad():
            caption, input_tag = model.generate(image,
                                                tag_input=input_tag_list,
                                                max_length=50,
                                                return_tag_predict=True)

        return tag_predict[0], input_tag[0], caption[0]


def inference_ram(image, model):

    with torch.no_grad():
        tags, tags_chinese = model.generate_tag(image)

    return tags


def inference_ram_openset(image, model):

    with torch.no_grad():
        tags = model.generate_tag_openset(image)

    return tags[0]
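All three helpers take a preprocessed, batched image tensor plus an already-constructed model, and only wrap the model's own generate calls in torch.no_grad(). A hedged end-to-end sketch for inference_ram follows; the ram builder keywords and the get_transform helper are assumptions carried over from the upstream RAM project, not confirmed by this diff:

# Hedged usage sketch -- builder/transform signatures are assumptions based on
# the upstream RAM project, not confirmed by this repo's diff.
import torch
from PIL import Image

from ram.models import ram               # exported by ram/models/__init__.py below
from ram.inference import inference_ram
from ram.transform import get_transform  # assumed helper in ram/transform.py

device = "cuda" if torch.cuda.is_available() else "cpu"

transform = get_transform(image_size=384)
model = ram(pretrained="ram_swin_large_14m.pth",  # checkpoint name referenced in ckpt/RAM/
            image_size=384, vit="swin_l")
model = model.eval().to(device)

image = transform(Image.open("samples/0064.png")).unsqueeze(0).to(device)

tags = inference_ram(image, model)  # English tag string, e.g. "dog | grass | park"
print(tags)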
ram/models/__init__.py
ADDED
@@ -0,0 +1,2 @@
from .ram import ram
from .tag2text import tag2text
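Both builders ultimately sit on the modified BERT in ram/models/bert.py, reproduced next. The key changes over stock HuggingFace BERT are visible in that file: BertSelfAttention sizes its key/value projections from config.encoder_width when used as cross-attention, and BertLayer accepts a mode argument where mode='tagging' skips self-attention and cross-attends label queries directly to image features. A hedged configuration sketch follows (field values are illustrative assumptions; the real ones live in ram/configs/q2l_config.json):

# Hedged sketch: configure the modified BertModel (defined in ram/models/bert.py)
# as a query-to-label tagging head. Values below are illustrative assumptions.
from transformers import BertConfig
from ram.models.bert import BertModel

config = BertConfig.from_json_file("ram/configs/q2l_config.json")
config.encoder_width = 1536         # must equal the vision feature dim (Swin-L: 1536)
config.add_cross_attention = True   # gives every BertLayer a crossattention block

tag_head = BertModel(config, add_pooling_layer=False)

# label_embed: (batch, num_tags, hidden) learnable queries; image_embeds: (batch, n, 1536)
# out = tag_head(encoder_embeds=label_embed,
#                encoder_hidden_states=image_embeds,
#                encoder_attention_mask=image_atts,
#                return_dict=True, mode='tagging')
# out.last_hidden_state is then scored per tag by a linear classifier (assumption).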
ram/models/bert.py
ADDED
@@ -0,0 +1,1035 @@
| 1 |
+
'''
|
| 2 |
+
* Copyright (c) 2022, salesforce.com, inc.
|
| 3 |
+
* All rights reserved.
|
| 4 |
+
* SPDX-License-Identifier: BSD-3-Clause
|
| 5 |
+
* For full license text, see LICENSE.txt file in the repo root or https://opensource.org/licenses/BSD-3-Clause
|
| 6 |
+
* By Junnan Li
|
| 7 |
+
* Based on huggingface code base
|
| 8 |
+
* https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert
|
| 9 |
+
'''
|
| 10 |
+
|
| 11 |
+
import math
|
| 12 |
+
import os
|
| 13 |
+
import warnings
|
| 14 |
+
from dataclasses import dataclass
|
| 15 |
+
from typing import Optional, Tuple
|
| 16 |
+
|
| 17 |
+
import torch
|
| 18 |
+
from torch import Tensor, device, dtype, nn
|
| 19 |
+
import torch.utils.checkpoint
|
| 20 |
+
from torch import nn
|
| 21 |
+
from torch.nn import CrossEntropyLoss
|
| 22 |
+
import torch.nn.functional as F
|
| 23 |
+
|
| 24 |
+
from transformers.activations import ACT2FN
|
| 25 |
+
from transformers.file_utils import (
|
| 26 |
+
ModelOutput,
|
| 27 |
+
)
|
| 28 |
+
from transformers.modeling_outputs import (
|
| 29 |
+
BaseModelOutputWithPastAndCrossAttentions,
|
| 30 |
+
BaseModelOutputWithPoolingAndCrossAttentions,
|
| 31 |
+
CausalLMOutputWithCrossAttentions,
|
| 32 |
+
MaskedLMOutput,
|
| 33 |
+
MultipleChoiceModelOutput,
|
| 34 |
+
NextSentencePredictorOutput,
|
| 35 |
+
QuestionAnsweringModelOutput,
|
| 36 |
+
SequenceClassifierOutput,
|
| 37 |
+
TokenClassifierOutput,
|
| 38 |
+
)
|
| 39 |
+
from transformers.modeling_utils import (
|
| 40 |
+
PreTrainedModel,
|
| 41 |
+
apply_chunking_to_forward,
|
| 42 |
+
find_pruneable_heads_and_indices,
|
| 43 |
+
prune_linear_layer,
|
| 44 |
+
)
|
| 45 |
+
from transformers.utils import logging
|
| 46 |
+
from transformers.models.bert.configuration_bert import BertConfig
|
| 47 |
+
|
| 48 |
+
|
| 49 |
+
logger = logging.get_logger(__name__)
|
| 50 |
+
|
| 51 |
+
|
| 52 |
+
class BertEmbeddings_nopos(nn.Module):
|
| 53 |
+
"""Construct the embeddings from word and position embeddings."""
|
| 54 |
+
|
| 55 |
+
def __init__(self, config):
|
| 56 |
+
super().__init__()
|
| 57 |
+
self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=config.pad_token_id)
|
| 58 |
+
# self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)
|
| 59 |
+
|
| 60 |
+
# self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
|
| 61 |
+
# any TensorFlow checkpoint file
|
| 62 |
+
self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
|
| 63 |
+
self.dropout = nn.Dropout(config.hidden_dropout_prob)
|
| 64 |
+
|
| 65 |
+
# position_ids (1, len position emb) is contiguous in memory and exported when serialized
|
| 66 |
+
# self.register_buffer("position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)))
|
| 67 |
+
# self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")
|
| 68 |
+
|
| 69 |
+
self.config = config
|
| 70 |
+
|
| 71 |
+
def forward(
|
| 72 |
+
self, input_ids=None, position_ids=None, inputs_embeds=None, past_key_values_length=0
|
| 73 |
+
):
|
| 74 |
+
if input_ids is not None:
|
| 75 |
+
input_shape = input_ids.size()
|
| 76 |
+
else:
|
| 77 |
+
input_shape = inputs_embeds.size()[:-1]
|
| 78 |
+
|
| 79 |
+
seq_length = input_shape[1]
|
| 80 |
+
|
| 81 |
+
# if position_ids is None:
|
| 82 |
+
# position_ids = self.position_ids[:, past_key_values_length : seq_length + past_key_values_length]
|
| 83 |
+
|
| 84 |
+
if inputs_embeds is None:
|
| 85 |
+
inputs_embeds = self.word_embeddings(input_ids)
|
| 86 |
+
|
| 87 |
+
embeddings = inputs_embeds
|
| 88 |
+
|
| 89 |
+
# if self.position_embedding_type == "absolute":
|
| 90 |
+
# position_embeddings = self.position_embeddings(position_ids)
|
| 91 |
+
# # print('add position_embeddings!!!!')
|
| 92 |
+
# embeddings += position_embeddings
|
| 93 |
+
embeddings = self.LayerNorm(embeddings)
|
| 94 |
+
embeddings = self.dropout(embeddings)
|
| 95 |
+
return embeddings
|
| 96 |
+
|
| 97 |
+
|
| 98 |
+
|
| 99 |
+
|
| 100 |
+
class BertEmbeddings(nn.Module):
|
| 101 |
+
"""Construct the embeddings from word and position embeddings."""
|
| 102 |
+
|
| 103 |
+
def __init__(self, config):
|
| 104 |
+
super().__init__()
|
| 105 |
+
self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=config.pad_token_id)
|
| 106 |
+
self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)
|
| 107 |
+
|
| 108 |
+
# self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
|
| 109 |
+
# any TensorFlow checkpoint file
|
| 110 |
+
self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
|
| 111 |
+
self.dropout = nn.Dropout(config.hidden_dropout_prob)
|
| 112 |
+
|
| 113 |
+
# position_ids (1, len position emb) is contiguous in memory and exported when serialized
|
| 114 |
+
self.register_buffer("position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)))
|
| 115 |
+
self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")
|
| 116 |
+
|
| 117 |
+
self.config = config
|
| 118 |
+
|
| 119 |
+
def forward(
|
| 120 |
+
self, input_ids=None, position_ids=None, inputs_embeds=None, past_key_values_length=0
|
| 121 |
+
):
|
| 122 |
+
if input_ids is not None:
|
| 123 |
+
input_shape = input_ids.size()
|
| 124 |
+
else:
|
| 125 |
+
input_shape = inputs_embeds.size()[:-1]
|
| 126 |
+
|
| 127 |
+
seq_length = input_shape[1]
|
| 128 |
+
|
| 129 |
+
if position_ids is None:
|
| 130 |
+
position_ids = self.position_ids[:, past_key_values_length : seq_length + past_key_values_length]
|
| 131 |
+
|
| 132 |
+
if inputs_embeds is None:
|
| 133 |
+
inputs_embeds = self.word_embeddings(input_ids)
|
| 134 |
+
|
| 135 |
+
embeddings = inputs_embeds
|
| 136 |
+
|
| 137 |
+
if self.position_embedding_type == "absolute":
|
| 138 |
+
position_embeddings = self.position_embeddings(position_ids)
|
| 139 |
+
# print('add position_embeddings!!!!')
|
| 140 |
+
embeddings += position_embeddings
|
| 141 |
+
embeddings = self.LayerNorm(embeddings)
|
| 142 |
+
embeddings = self.dropout(embeddings)
|
| 143 |
+
return embeddings
|
| 144 |
+
|
| 145 |
+
|
| 146 |
+
class BertSelfAttention(nn.Module):
|
| 147 |
+
def __init__(self, config, is_cross_attention):
|
| 148 |
+
super().__init__()
|
| 149 |
+
self.config = config
|
| 150 |
+
if config.hidden_size % config.num_attention_heads != 0 and not hasattr(config, "embedding_size"):
|
| 151 |
+
raise ValueError(
|
| 152 |
+
"The hidden size (%d) is not a multiple of the number of attention "
|
| 153 |
+
"heads (%d)" % (config.hidden_size, config.num_attention_heads)
|
| 154 |
+
)
|
| 155 |
+
|
| 156 |
+
self.num_attention_heads = config.num_attention_heads
|
| 157 |
+
self.attention_head_size = int(config.hidden_size / config.num_attention_heads)
|
| 158 |
+
self.all_head_size = self.num_attention_heads * self.attention_head_size
|
| 159 |
+
|
| 160 |
+
self.query = nn.Linear(config.hidden_size, self.all_head_size)
|
| 161 |
+
if is_cross_attention:
|
| 162 |
+
self.key = nn.Linear(config.encoder_width, self.all_head_size)
|
| 163 |
+
self.value = nn.Linear(config.encoder_width, self.all_head_size)
|
| 164 |
+
else:
|
| 165 |
+
self.key = nn.Linear(config.hidden_size, self.all_head_size)
|
| 166 |
+
self.value = nn.Linear(config.hidden_size, self.all_head_size)
|
| 167 |
+
|
| 168 |
+
self.dropout = nn.Dropout(config.attention_probs_dropout_prob)
|
| 169 |
+
self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")
|
| 170 |
+
if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":
|
| 171 |
+
self.max_position_embeddings = config.max_position_embeddings
|
| 172 |
+
self.distance_embedding = nn.Embedding(2 * config.max_position_embeddings - 1, self.attention_head_size)
|
| 173 |
+
self.save_attention = False
|
| 174 |
+
|
| 175 |
+
def save_attn_gradients(self, attn_gradients):
|
| 176 |
+
self.attn_gradients = attn_gradients
|
| 177 |
+
|
| 178 |
+
def get_attn_gradients(self):
|
| 179 |
+
return self.attn_gradients
|
| 180 |
+
|
| 181 |
+
def save_attention_map(self, attention_map):
|
| 182 |
+
self.attention_map = attention_map
|
| 183 |
+
|
| 184 |
+
def get_attention_map(self):
|
| 185 |
+
return self.attention_map
|
| 186 |
+
|
| 187 |
+
def transpose_for_scores(self, x):
|
| 188 |
+
new_x_shape = x.size()[:-1] + (self.num_attention_heads, self.attention_head_size)
|
| 189 |
+
x = x.view(*new_x_shape)
|
| 190 |
+
return x.permute(0, 2, 1, 3)
|
| 191 |
+
|
| 192 |
+
def forward(
|
| 193 |
+
self,
|
| 194 |
+
hidden_states,
|
| 195 |
+
attention_mask=None,
|
| 196 |
+
head_mask=None,
|
| 197 |
+
encoder_hidden_states=None,
|
| 198 |
+
encoder_attention_mask=None,
|
| 199 |
+
past_key_value=None,
|
| 200 |
+
output_attentions=False,
|
| 201 |
+
):
|
| 202 |
+
mixed_query_layer = self.query(hidden_states)
|
| 203 |
+
|
| 204 |
+
# If this is instantiated as a cross-attention module, the keys
|
| 205 |
+
# and values come from an encoder; the attention mask needs to be
|
| 206 |
+
# such that the encoder's padding tokens are not attended to.
|
| 207 |
+
is_cross_attention = encoder_hidden_states is not None
|
| 208 |
+
|
| 209 |
+
if is_cross_attention:
|
| 210 |
+
# print(self.key.weight.shape)
|
| 211 |
+
key_layer = self.transpose_for_scores(self.key(encoder_hidden_states))
|
| 212 |
+
value_layer = self.transpose_for_scores(self.value(encoder_hidden_states))
|
| 213 |
+
attention_mask = encoder_attention_mask
|
| 214 |
+
elif past_key_value is not None:
|
| 215 |
+
key_layer = self.transpose_for_scores(self.key(hidden_states))
|
| 216 |
+
value_layer = self.transpose_for_scores(self.value(hidden_states))
|
| 217 |
+
key_layer = torch.cat([past_key_value[0], key_layer], dim=2)
|
| 218 |
+
value_layer = torch.cat([past_key_value[1], value_layer], dim=2)
|
| 219 |
+
else:
|
| 220 |
+
key_layer = self.transpose_for_scores(self.key(hidden_states))
|
| 221 |
+
value_layer = self.transpose_for_scores(self.value(hidden_states))
|
| 222 |
+
|
| 223 |
+
query_layer = self.transpose_for_scores(mixed_query_layer)
|
| 224 |
+
|
| 225 |
+
past_key_value = (key_layer, value_layer)
|
| 226 |
+
|
| 227 |
+
# compatible with higher versions of transformers
|
| 228 |
+
if key_layer.shape[0] > query_layer.shape[0]:
|
| 229 |
+
key_layer = key_layer[:query_layer.shape[0], :, :, :]
|
| 230 |
+
attention_mask = attention_mask[:query_layer.shape[0], :, :]
|
| 231 |
+
value_layer = value_layer[:query_layer.shape[0], :, :, :]
|
| 232 |
+
|
| 233 |
+
# Take the dot product between "query" and "key" to get the raw attention scores.
|
| 234 |
+
attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))
|
| 235 |
+
|
| 236 |
+
if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":
|
| 237 |
+
seq_length = hidden_states.size()[1]
|
| 238 |
+
position_ids_l = torch.arange(seq_length, dtype=torch.long, device=hidden_states.device).view(-1, 1)
|
| 239 |
+
position_ids_r = torch.arange(seq_length, dtype=torch.long, device=hidden_states.device).view(1, -1)
|
| 240 |
+
distance = position_ids_l - position_ids_r
|
| 241 |
+
positional_embedding = self.distance_embedding(distance + self.max_position_embeddings - 1)
|
| 242 |
+
positional_embedding = positional_embedding.to(dtype=query_layer.dtype) # fp16 compatibility
|
| 243 |
+
|
| 244 |
+
if self.position_embedding_type == "relative_key":
|
| 245 |
+
relative_position_scores = torch.einsum("bhld,lrd->bhlr", query_layer, positional_embedding)
|
| 246 |
+
attention_scores = attention_scores + relative_position_scores
|
| 247 |
+
elif self.position_embedding_type == "relative_key_query":
|
| 248 |
+
relative_position_scores_query = torch.einsum("bhld,lrd->bhlr", query_layer, positional_embedding)
|
| 249 |
+
relative_position_scores_key = torch.einsum("bhrd,lrd->bhlr", key_layer, positional_embedding)
|
| 250 |
+
attention_scores = attention_scores + relative_position_scores_query + relative_position_scores_key
|
| 251 |
+
|
| 252 |
+
attention_scores = attention_scores / math.sqrt(self.attention_head_size)
|
| 253 |
+
if attention_mask is not None:
|
| 254 |
+
# Apply the attention mask is (precomputed for all layers in BertModel forward() function)
|
| 255 |
+
attention_scores = attention_scores + attention_mask
|
| 256 |
+
|
| 257 |
+
# Normalize the attention scores to probabilities.
|
| 258 |
+
attention_probs = nn.Softmax(dim=-1)(attention_scores)
|
| 259 |
+
|
| 260 |
+
if is_cross_attention and self.save_attention:
|
| 261 |
+
self.save_attention_map(attention_probs)
|
| 262 |
+
attention_probs.register_hook(self.save_attn_gradients)
|
| 263 |
+
|
| 264 |
+
# This is actually dropping out entire tokens to attend to, which might
|
| 265 |
+
# seem a bit unusual, but is taken from the original Transformer paper.
|
| 266 |
+
attention_probs_dropped = self.dropout(attention_probs)
|
| 267 |
+
|
| 268 |
+
# Mask heads if we want to
|
| 269 |
+
if head_mask is not None:
|
| 270 |
+
attention_probs_dropped = attention_probs_dropped * head_mask
|
| 271 |
+
|
| 272 |
+
context_layer = torch.matmul(attention_probs_dropped, value_layer)
|
| 273 |
+
|
| 274 |
+
context_layer = context_layer.permute(0, 2, 1, 3).contiguous()
|
| 275 |
+
new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,)
|
| 276 |
+
context_layer = context_layer.view(*new_context_layer_shape)
|
| 277 |
+
|
| 278 |
+
outputs = (context_layer, attention_probs) if output_attentions else (context_layer,)
|
| 279 |
+
|
| 280 |
+
outputs = outputs + (past_key_value,)
|
| 281 |
+
return outputs
|
| 282 |
+
|
| 283 |
+
|
| 284 |
+
class BertSelfOutput(nn.Module):
|
| 285 |
+
def __init__(self, config):
|
| 286 |
+
super().__init__()
|
| 287 |
+
self.dense = nn.Linear(config.hidden_size, config.hidden_size)
|
| 288 |
+
self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
|
| 289 |
+
self.dropout = nn.Dropout(config.hidden_dropout_prob)
|
| 290 |
+
|
| 291 |
+
def forward(self, hidden_states, input_tensor):
|
| 292 |
+
hidden_states = self.dense(hidden_states)
|
| 293 |
+
hidden_states = self.dropout(hidden_states)
|
| 294 |
+
hidden_states = self.LayerNorm(hidden_states + input_tensor)
|
| 295 |
+
return hidden_states
|
| 296 |
+
|
| 297 |
+
|
| 298 |
+
class BertAttention(nn.Module):
|
| 299 |
+
def __init__(self, config, is_cross_attention=False):
|
| 300 |
+
super().__init__()
|
| 301 |
+
self.self = BertSelfAttention(config, is_cross_attention)
|
| 302 |
+
self.output = BertSelfOutput(config)
|
| 303 |
+
self.pruned_heads = set()
|
| 304 |
+
|
| 305 |
+
def prune_heads(self, heads):
|
| 306 |
+
if len(heads) == 0:
|
| 307 |
+
return
|
| 308 |
+
heads, index = find_pruneable_heads_and_indices(
|
| 309 |
+
heads, self.self.num_attention_heads, self.self.attention_head_size, self.pruned_heads
|
| 310 |
+
)
|
| 311 |
+
|
| 312 |
+
# Prune linear layers
|
| 313 |
+
self.self.query = prune_linear_layer(self.self.query, index)
|
| 314 |
+
self.self.key = prune_linear_layer(self.self.key, index)
|
| 315 |
+
self.self.value = prune_linear_layer(self.self.value, index)
|
| 316 |
+
self.output.dense = prune_linear_layer(self.output.dense, index, dim=1)
|
| 317 |
+
|
| 318 |
+
# Update hyper params and store pruned heads
|
| 319 |
+
self.self.num_attention_heads = self.self.num_attention_heads - len(heads)
|
| 320 |
+
self.self.all_head_size = self.self.attention_head_size * self.self.num_attention_heads
|
| 321 |
+
self.pruned_heads = self.pruned_heads.union(heads)
|
| 322 |
+
|
| 323 |
+
def forward(
|
| 324 |
+
self,
|
| 325 |
+
hidden_states,
|
| 326 |
+
attention_mask=None,
|
| 327 |
+
head_mask=None,
|
| 328 |
+
encoder_hidden_states=None,
|
| 329 |
+
encoder_attention_mask=None,
|
| 330 |
+
past_key_value=None,
|
| 331 |
+
output_attentions=False,
|
| 332 |
+
):
|
| 333 |
+
self_outputs = self.self(
|
| 334 |
+
hidden_states,
|
| 335 |
+
attention_mask,
|
| 336 |
+
head_mask,
|
| 337 |
+
encoder_hidden_states,
|
| 338 |
+
encoder_attention_mask,
|
| 339 |
+
past_key_value,
|
| 340 |
+
output_attentions,
|
| 341 |
+
)
|
| 342 |
+
attention_output = self.output(self_outputs[0], hidden_states)
|
| 343 |
+
outputs = (attention_output,) + self_outputs[1:] # add attentions if we output them
|
| 344 |
+
return outputs
|
| 345 |
+
|
| 346 |
+
|
| 347 |
+
class BertIntermediate(nn.Module):
|
| 348 |
+
def __init__(self, config):
|
| 349 |
+
super().__init__()
|
| 350 |
+
self.dense = nn.Linear(config.hidden_size, config.intermediate_size)
|
| 351 |
+
if isinstance(config.hidden_act, str):
|
| 352 |
+
self.intermediate_act_fn = ACT2FN[config.hidden_act]
|
| 353 |
+
else:
|
| 354 |
+
self.intermediate_act_fn = config.hidden_act
|
| 355 |
+
|
| 356 |
+
def forward(self, hidden_states):
|
| 357 |
+
hidden_states = self.dense(hidden_states)
|
| 358 |
+
hidden_states = self.intermediate_act_fn(hidden_states)
|
| 359 |
+
return hidden_states
|
| 360 |
+
|
| 361 |
+
|
| 362 |
+
class BertOutput(nn.Module):
|
| 363 |
+
def __init__(self, config):
|
| 364 |
+
super().__init__()
|
| 365 |
+
self.dense = nn.Linear(config.intermediate_size, config.hidden_size)
|
| 366 |
+
self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
|
| 367 |
+
self.dropout = nn.Dropout(config.hidden_dropout_prob)
|
| 368 |
+
|
| 369 |
+
def forward(self, hidden_states, input_tensor):
|
| 370 |
+
hidden_states = self.dense(hidden_states)
|
| 371 |
+
hidden_states = self.dropout(hidden_states)
|
| 372 |
+
hidden_states = self.LayerNorm(hidden_states + input_tensor)
|
| 373 |
+
return hidden_states
|
| 374 |
+
|
| 375 |
+
|
| 376 |
+
class BertLayer(nn.Module):
|
| 377 |
+
def __init__(self, config, layer_num):
|
| 378 |
+
super().__init__()
|
| 379 |
+
self.config = config
|
| 380 |
+
self.chunk_size_feed_forward = config.chunk_size_feed_forward
|
| 381 |
+
self.seq_len_dim = 1
|
| 382 |
+
self.attention = BertAttention(config)
|
| 383 |
+
self.layer_num = layer_num
|
| 384 |
+
if self.config.add_cross_attention:
|
| 385 |
+
self.crossattention = BertAttention(config, is_cross_attention=self.config.add_cross_attention)
|
| 386 |
+
self.intermediate = BertIntermediate(config)
|
| 387 |
+
self.output = BertOutput(config)
|
| 388 |
+
|
| 389 |
+
def forward(
|
| 390 |
+
self,
|
| 391 |
+
hidden_states,
|
| 392 |
+
attention_mask=None,
|
| 393 |
+
head_mask=None,
|
| 394 |
+
encoder_hidden_states=None,
|
| 395 |
+
encoder_attention_mask=None,
|
| 396 |
+
past_key_value=None,
|
| 397 |
+
output_attentions=False,
|
| 398 |
+
mode=None,
|
| 399 |
+
):
|
| 400 |
+
|
| 401 |
+
if mode == 'tagging':
|
| 402 |
+
|
| 403 |
+
assert encoder_hidden_states is not None, "encoder_hidden_states must be given for cross-attention layers"
|
| 404 |
+
|
| 405 |
+
cross_attention_outputs = self.crossattention(
|
| 406 |
+
hidden_states,
|
| 407 |
+
attention_mask,
|
| 408 |
+
head_mask,
|
| 409 |
+
encoder_hidden_states,
|
| 410 |
+
encoder_attention_mask,
|
| 411 |
+
output_attentions=output_attentions,
|
| 412 |
+
)
|
| 413 |
+
attention_output = cross_attention_outputs[0]
|
| 414 |
+
outputs = cross_attention_outputs[1:-1] # add cross attentions if we output attention weights
|
| 415 |
+
|
| 416 |
+
present_key_value = cross_attention_outputs[-1]
|
| 417 |
+
|
| 418 |
+
else:
|
| 419 |
+
# decoder uni-directional self-attention cached key/values tuple is at positions 1,2
|
| 420 |
+
self_attn_past_key_value = past_key_value[:2] if past_key_value is not None else None
|
| 421 |
+
self_attention_outputs = self.attention(
|
| 422 |
+
hidden_states,
|
| 423 |
+
attention_mask,
|
| 424 |
+
head_mask,
|
| 425 |
+
output_attentions=output_attentions,
|
| 426 |
+
past_key_value=self_attn_past_key_value,
|
| 427 |
+
)
|
| 428 |
+
attention_output = self_attention_outputs[0]
|
| 429 |
+
|
| 430 |
+
outputs = self_attention_outputs[1:-1]
|
| 431 |
+
present_key_value = self_attention_outputs[-1]
|
| 432 |
+
|
| 433 |
+
if mode=='multimodal':
|
| 434 |
+
assert encoder_hidden_states is not None, "encoder_hidden_states must be given for cross-attention layers"
|
| 435 |
+
|
| 436 |
+
cross_attention_outputs = self.crossattention(
|
| 437 |
+
attention_output,
|
| 438 |
+
attention_mask,
|
| 439 |
+
head_mask,
|
| 440 |
+
encoder_hidden_states,
|
| 441 |
+
encoder_attention_mask,
|
| 442 |
+
output_attentions=output_attentions,
|
| 443 |
+
)
|
| 444 |
+
attention_output = cross_attention_outputs[0]
|
| 445 |
+
outputs = outputs + cross_attention_outputs[1:-1] # add cross attentions if we output attention weights
|
| 446 |
+
layer_output = apply_chunking_to_forward(
|
| 447 |
+
self.feed_forward_chunk, self.chunk_size_feed_forward, self.seq_len_dim, attention_output
|
| 448 |
+
)
|
| 449 |
+
outputs = (layer_output,) + outputs
|
| 450 |
+
|
| 451 |
+
outputs = outputs + (present_key_value,)
|
| 452 |
+
|
| 453 |
+
return outputs
|
| 454 |
+
|
| 455 |
+
def feed_forward_chunk(self, attention_output):
|
| 456 |
+
intermediate_output = self.intermediate(attention_output)
|
| 457 |
+
layer_output = self.output(intermediate_output, attention_output)
|
| 458 |
+
return layer_output
|
| 459 |
+
|
| 460 |
+
|
| 461 |
+
class BertEncoder(nn.Module):
|
| 462 |
+
def __init__(self, config):
|
| 463 |
+
super().__init__()
|
| 464 |
+
self.config = config
|
| 465 |
+
self.layer = nn.ModuleList([BertLayer(config,i) for i in range(config.num_hidden_layers)])
|
| 466 |
+
self.gradient_checkpointing = False
|
| 467 |
+
|
| 468 |
+
def forward(
|
| 469 |
+
self,
|
| 470 |
+
hidden_states,
|
| 471 |
+
attention_mask=None,
|
| 472 |
+
head_mask=None,
|
| 473 |
+
encoder_hidden_states=None,
|
| 474 |
+
encoder_attention_mask=None,
|
| 475 |
+
past_key_values=None,
|
| 476 |
+
use_cache=None,
|
| 477 |
+
output_attentions=False,
|
| 478 |
+
output_hidden_states=False,
|
| 479 |
+
return_dict=True,
|
| 480 |
+
mode='multimodal',
|
| 481 |
+
):
|
| 482 |
+
all_hidden_states = () if output_hidden_states else None
|
| 483 |
+
all_self_attentions = () if output_attentions else None
|
| 484 |
+
all_cross_attentions = () if output_attentions and self.config.add_cross_attention else None
|
| 485 |
+
|
| 486 |
+
next_decoder_cache = () if use_cache else None
|
| 487 |
+
|
| 488 |
+
for i in range(self.config.num_hidden_layers):
|
| 489 |
+
layer_module = self.layer[i]
|
| 490 |
+
if output_hidden_states:
|
| 491 |
+
all_hidden_states = all_hidden_states + (hidden_states,)
|
| 492 |
+
|
| 493 |
+
layer_head_mask = head_mask[i] if head_mask is not None else None
|
| 494 |
+
past_key_value = past_key_values[i] if past_key_values is not None else None
|
| 495 |
+
|
| 496 |
+
if self.gradient_checkpointing and self.training:
|
| 497 |
+
|
| 498 |
+
if use_cache:
|
| 499 |
+
logger.warn(
|
| 500 |
+
"`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`..."
|
| 501 |
+
)
|
| 502 |
+
use_cache = False
|
| 503 |
+
|
| 504 |
+
def create_custom_forward(module):
|
| 505 |
+
def custom_forward(*inputs):
|
| 506 |
+
return module(*inputs, past_key_value, output_attentions)
|
| 507 |
+
|
| 508 |
+
return custom_forward
|
| 509 |
+
|
| 510 |
+
layer_outputs = torch.utils.checkpoint.checkpoint(
|
| 511 |
+
create_custom_forward(layer_module),
|
| 512 |
+
hidden_states,
|
| 513 |
+
attention_mask,
|
| 514 |
+
layer_head_mask,
|
| 515 |
+
encoder_hidden_states,
|
| 516 |
+
encoder_attention_mask,
|
| 517 |
+
mode=mode,
|
| 518 |
+
)
|
| 519 |
+
else:
|
| 520 |
+
layer_outputs = layer_module(
|
| 521 |
+
hidden_states,
|
| 522 |
+
attention_mask,
|
| 523 |
+
layer_head_mask,
|
| 524 |
+
encoder_hidden_states,
|
| 525 |
+
encoder_attention_mask,
|
| 526 |
+
past_key_value,
|
| 527 |
+
output_attentions,
|
| 528 |
+
mode=mode,
|
| 529 |
+
)
|
| 530 |
+
|
| 531 |
+
hidden_states = layer_outputs[0]
|
| 532 |
+
if use_cache:
|
| 533 |
+
next_decoder_cache += (layer_outputs[-1],)
|
| 534 |
+
if output_attentions:
|
| 535 |
+
all_self_attentions = all_self_attentions + (layer_outputs[1],)
|
| 536 |
+
|
| 537 |
+
if output_hidden_states:
|
| 538 |
+
all_hidden_states = all_hidden_states + (hidden_states,)
|
| 539 |
+
|
| 540 |
+
if not return_dict:
|
| 541 |
+
return tuple(
|
| 542 |
+
v
|
| 543 |
+
for v in [
|
| 544 |
+
hidden_states,
|
| 545 |
+
next_decoder_cache,
|
| 546 |
+
all_hidden_states,
|
| 547 |
+
all_self_attentions,
|
| 548 |
+
all_cross_attentions,
|
| 549 |
+
]
|
| 550 |
+
if v is not None
|
| 551 |
+
)
|
| 552 |
+
return BaseModelOutputWithPastAndCrossAttentions(
|
| 553 |
+
last_hidden_state=hidden_states,
|
| 554 |
+
past_key_values=next_decoder_cache,
|
| 555 |
+
hidden_states=all_hidden_states,
|
| 556 |
+
attentions=all_self_attentions,
|
| 557 |
+
cross_attentions=all_cross_attentions,
|
| 558 |
+
)
|
| 559 |
+
|
| 560 |
+
|
| 561 |
+
class BertPooler(nn.Module):
|
| 562 |
+
def __init__(self, config):
|
| 563 |
+
super().__init__()
|
| 564 |
+
self.dense = nn.Linear(config.hidden_size, config.hidden_size)
|
| 565 |
+
self.activation = nn.Tanh()
|
| 566 |
+
|
| 567 |
+
def forward(self, hidden_states):
|
| 568 |
+
# We "pool" the model by simply taking the hidden state corresponding
|
| 569 |
+
# to the first token.
|
| 570 |
+
first_token_tensor = hidden_states[:, 0]
|
| 571 |
+
pooled_output = self.dense(first_token_tensor)
|
| 572 |
+
pooled_output = self.activation(pooled_output)
|
| 573 |
+
return pooled_output
|
| 574 |
+
|
| 575 |
+
|
| 576 |
+
class BertPredictionHeadTransform(nn.Module):
|
| 577 |
+
def __init__(self, config):
|
| 578 |
+
super().__init__()
|
| 579 |
+
self.dense = nn.Linear(config.hidden_size, config.hidden_size)
|
| 580 |
+
if isinstance(config.hidden_act, str):
|
| 581 |
+
self.transform_act_fn = ACT2FN[config.hidden_act]
|
| 582 |
+
else:
|
| 583 |
+
self.transform_act_fn = config.hidden_act
|
| 584 |
+
self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
|
| 585 |
+
|
| 586 |
+
def forward(self, hidden_states):
|
| 587 |
+
hidden_states = self.dense(hidden_states)
|
| 588 |
+
hidden_states = self.transform_act_fn(hidden_states)
|
| 589 |
+
hidden_states = self.LayerNorm(hidden_states)
|
| 590 |
+
return hidden_states
|
| 591 |
+
|
| 592 |
+
|
| 593 |
+
class BertLMPredictionHead(nn.Module):
|
| 594 |
+
def __init__(self, config):
|
| 595 |
+
super().__init__()
|
| 596 |
+
self.transform = BertPredictionHeadTransform(config)
|
| 597 |
+
|
| 598 |
+
# The output weights are the same as the input embeddings, but there is
|
| 599 |
+
# an output-only bias for each token.
|
| 600 |
+
self.decoder = nn.Linear(config.hidden_size, config.vocab_size, bias=False)
|
| 601 |
+
|
| 602 |
+
self.bias = nn.Parameter(torch.zeros(config.vocab_size))
|
| 603 |
+
|
| 604 |
+
# Need a link between the two variables so that the bias is correctly resized with `resize_token_embeddings`
|
| 605 |
+
self.decoder.bias = self.bias
|
| 606 |
+
|
| 607 |
+
def forward(self, hidden_states):
|
| 608 |
+
hidden_states = self.transform(hidden_states)
|
| 609 |
+
hidden_states = self.decoder(hidden_states)
|
| 610 |
+
return hidden_states
|
| 611 |
+
|
| 612 |
+
|
| 613 |
+
class BertOnlyMLMHead(nn.Module):
|
| 614 |
+
def __init__(self, config):
|
| 615 |
+
super().__init__()
|
| 616 |
+
self.predictions = BertLMPredictionHead(config)
|
| 617 |
+
|
| 618 |
+
def forward(self, sequence_output):
|
| 619 |
+
prediction_scores = self.predictions(sequence_output)
|
| 620 |
+
return prediction_scores
|
| 621 |
+
|
| 622 |
+
|
| 623 |
+
class BertPreTrainedModel(PreTrainedModel):
|
| 624 |
+
"""
|
| 625 |
+
An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained
|
| 626 |
+
models.
|
| 627 |
+
"""
|
| 628 |
+
|
| 629 |
+
config_class = BertConfig
|
| 630 |
+
base_model_prefix = "bert"
|
| 631 |
+
_keys_to_ignore_on_load_missing = [r"position_ids"]
|
| 632 |
+
|
| 633 |
+
def _init_weights(self, module):
|
| 634 |
+
""" Initialize the weights """
|
| 635 |
+
if isinstance(module, (nn.Linear, nn.Embedding)):
|
| 636 |
+
# Slightly different from the TF version which uses truncated_normal for initialization
|
| 637 |
+
# cf https://github.com/pytorch/pytorch/pull/5617
|
| 638 |
+
module.weight.data.normal_(mean=0.0, std=self.config.initializer_range)
|
| 639 |
+
elif isinstance(module, nn.LayerNorm):
|
| 640 |
+
module.bias.data.zero_()
|
| 641 |
+
module.weight.data.fill_(1.0)
|
| 642 |
+
if isinstance(module, nn.Linear) and module.bias is not None:
|
| 643 |
+
module.bias.data.zero_()
|
| 644 |
+
|
| 645 |
+
|
| 646 |
+
class BertModel(BertPreTrainedModel):
|
| 647 |
+
"""
|
| 648 |
+
The model can behave as an encoder (with only self-attention) as well as a decoder, in which case a layer of
|
| 649 |
+
cross-attention is added between the self-attention layers, following the architecture described in `Attention is
|
| 650 |
+
all you need <https://arxiv.org/abs/1706.03762>`__ by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit,
|
| 651 |
+
Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin.
|
| 652 |
+
argument and :obj:`add_cross_attention` set to :obj:`True`; an :obj:`encoder_hidden_states` is then expected as an
|
| 653 |
+
input to the forward pass.
|
| 654 |
+
"""
|
| 655 |
+
|
| 656 |
+
def __init__(self, config, add_pooling_layer=True):
|
| 657 |
+
super().__init__(config)
|
| 658 |
+
self.config = config
|
| 659 |
+
|
| 660 |
+
self.embeddings = BertEmbeddings(config)
|
| 661 |
+
|
| 662 |
+
self.encoder = BertEncoder(config)
|
| 663 |
+
|
| 664 |
+
self.pooler = BertPooler(config) if add_pooling_layer else None
|
| 665 |
+
|
| 666 |
+
self.init_weights()
|
| 667 |
+
|
| 668 |
+
|
| 669 |
+
def get_input_embeddings(self):
|
| 670 |
+
return self.embeddings.word_embeddings
|
| 671 |
+
|
| 672 |
+
def set_input_embeddings(self, value):
|
| 673 |
+
self.embeddings.word_embeddings = value
|
| 674 |
+
|
| 675 |
+
def _prune_heads(self, heads_to_prune):
|
| 676 |
+
"""
|
| 677 |
+
Prunes heads of the model. heads_to_prune: dict of {layer_num: list of heads to prune in this layer} See base
|
| 678 |
+
class PreTrainedModel
|
| 679 |
+
"""
|
| 680 |
+
for layer, heads in heads_to_prune.items():
|
| 681 |
+
self.encoder.layer[layer].attention.prune_heads(heads)
|
| 682 |
+
|
| 683 |
+
|
| 684 |
+
def get_extended_attention_mask(self, attention_mask: Tensor, input_shape: Tuple[int], device: device, is_decoder: bool) -> Tensor:
|
| 685 |
+
"""
|
| 686 |
+
Makes broadcastable attention and causal masks so that future and masked tokens are ignored.
|
| 687 |
+
|
| 688 |
+
Arguments:
|
| 689 |
+
attention_mask (:obj:`torch.Tensor`):
|
| 690 |
+
Mask with ones indicating tokens to attend to, zeros for tokens to ignore.
|
| 691 |
+
input_shape (:obj:`Tuple[int]`):
|
| 692 |
+
The shape of the input to the model.
|
| 693 |
+
device: (:obj:`torch.device`):
|
| 694 |
+
The device of the input to the model.
|
| 695 |
+
|
| 696 |
+
Returns:
|
| 697 |
+
:obj:`torch.Tensor` The extended attention mask, with a the same dtype as :obj:`attention_mask.dtype`.
|
| 698 |
+
"""
|
| 699 |
+
# We can provide a self-attention mask of dimensions [batch_size, from_seq_length, to_seq_length]
|
| 700 |
+
# ourselves in which case we just need to make it broadcastable to all heads.
|
| 701 |
+
if attention_mask.dim() == 3:
|
| 702 |
+
extended_attention_mask = attention_mask[:, None, :, :]
|
| 703 |
+
elif attention_mask.dim() == 2:
|
| 704 |
+
# Provided a padding mask of dimensions [batch_size, seq_length]
|
| 705 |
+
# - if the model is a decoder, apply a causal mask in addition to the padding mask
|
| 706 |
+
# - if the model is an encoder, make the mask broadcastable to [batch_size, num_heads, seq_length, seq_length]
|
| 707 |
+
if is_decoder:
|
| 708 |
+
batch_size, seq_length = input_shape
|
| 709 |
+
|
| 710 |
+
seq_ids = torch.arange(seq_length, device=device)
|
| 711 |
+
causal_mask = seq_ids[None, None, :].repeat(batch_size, seq_length, 1) <= seq_ids[None, :, None]
|
| 712 |
+
# in case past_key_values are used we need to add a prefix ones mask to the causal mask
|
| 713 |
+
# causal and attention masks must have same type with pytorch version < 1.3
|
| 714 |
+
causal_mask = causal_mask.to(attention_mask.dtype)
|
| 715 |
+
|
| 716 |
+
if causal_mask.shape[1] < attention_mask.shape[1]:
|
| 717 |
+
prefix_seq_len = attention_mask.shape[1] - causal_mask.shape[1]
|
| 718 |
+
causal_mask = torch.cat(
|
| 719 |
+
[
|
| 720 |
+
torch.ones((batch_size, seq_length, prefix_seq_len), device=device, dtype=causal_mask.dtype),
|
| 721 |
+
causal_mask,
|
| 722 |
+
],
|
| 723 |
+
axis=-1,
|
| 724 |
+
)
|
| 725 |
+
|
| 726 |
+
extended_attention_mask = causal_mask[:, None, :, :] * attention_mask[:, None, None, :]
|
| 727 |
+
else:
|
| 728 |
+
extended_attention_mask = attention_mask[:, None, None, :]
|
| 729 |
+
else:
|
| 730 |
+
raise ValueError(
|
| 731 |
+
"Wrong shape for input_ids (shape {}) or attention_mask (shape {})".format(
|
| 732 |
+
input_shape, attention_mask.shape
|
| 733 |
+
)
|
| 734 |
+
)
|
| 735 |
+
|
| 736 |
+
# Since attention_mask is 1.0 for positions we want to attend and 0.0 for
|
| 737 |
+
# masked positions, this operation will create a tensor which is 0.0 for
|
| 738 |
+
# positions we want to attend and -10000.0 for masked positions.
|
| 739 |
+
# Since we are adding it to the raw scores before the softmax, this is
|
| 740 |
+
# effectively the same as removing these entirely.
|
| 741 |
+
extended_attention_mask = extended_attention_mask.to(dtype=self.dtype) # fp16 compatibility
|
| 742 |
+
extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
|
| 743 |
+
return extended_attention_mask
|
| 744 |
+
|
| 745 |
+
def forward(
|
| 746 |
+
self,
|
| 747 |
+
input_ids=None,
|
| 748 |
+
attention_mask=None,
|
| 749 |
+
position_ids=None,
|
| 750 |
+
head_mask=None,
|
| 751 |
+
inputs_embeds=None,
|
| 752 |
+
encoder_embeds=None,
|
| 753 |
+
encoder_hidden_states=None,
|
| 754 |
+
encoder_attention_mask=None,
|
| 755 |
+
past_key_values=None,
|
| 756 |
+
use_cache=None,
|
| 757 |
+
output_attentions=None,
|
| 758 |
+
output_hidden_states=None,
|
| 759 |
+
return_dict=None,
|
| 760 |
+
is_decoder=False,
|
| 761 |
+
mode='multimodal',
|
| 762 |
+
):
|
| 763 |
+
r"""
|
| 764 |
+
encoder_hidden_states (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length, hidden_size)`, `optional`):
|
| 765 |
+
Sequence of hidden-states at the output of the last layer of the encoder. Used in the cross-attention if
|
| 766 |
+
the model is configured as a decoder.
|
| 767 |
+
encoder_attention_mask (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
|
| 768 |
+
Mask to avoid performing attention on the padding token indices of the encoder input. This mask is used in
|
| 769 |
+
the cross-attention if the model is configured as a decoder. Mask values selected in ``[0, 1]``:
|
| 770 |
+
- 1 for tokens that are **not masked**,
|
| 771 |
+
- 0 for tokens that are **masked**.
|
| 772 |
+
past_key_values (:obj:`tuple(tuple(torch.FloatTensor))` of length :obj:`config.n_layers` with each tuple having 4 tensors of shape :obj:`(batch_size, num_heads, sequence_length - 1, embed_size_per_head)`):
|
| 773 |
+
Contains precomputed key and value hidden states of the attention blocks. Can be used to speed up decoding.
|
| 774 |
+
If :obj:`past_key_values` are used, the user can optionally input only the last :obj:`decoder_input_ids`
|
| 775 |
+
(those that don't have their past key value states given to this model) of shape :obj:`(batch_size, 1)`
|
| 776 |
+
instead of all :obj:`decoder_input_ids` of shape :obj:`(batch_size, sequence_length)`.
|
| 777 |
+
use_cache (:obj:`bool`, `optional`):
|
| 778 |
+
If set to :obj:`True`, :obj:`past_key_values` key value states are returned and can be used to speed up
|
| 779 |
+
decoding (see :obj:`past_key_values`).
|
| 780 |
+
"""
|
| 781 |
+
output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions
|
| 782 |
+
output_hidden_states = (
|
| 783 |
+
output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
|
| 784 |
+
)
|
| 785 |
+
return_dict = return_dict if return_dict is not None else self.config.use_return_dict
|
| 786 |
+
|
| 787 |
+
if is_decoder:
|
| 788 |
+
use_cache = use_cache if use_cache is not None else self.config.use_cache
|
| 789 |
+
else:
|
| 790 |
+
use_cache = False
|
| 791 |
+
|
| 792 |
+
if input_ids is not None and inputs_embeds is not None:
|
| 793 |
+
raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time")
|
| 794 |
+
elif input_ids is not None:
|
| 795 |
+
input_shape = input_ids.size()
|
| 796 |
+
batch_size, seq_length = input_shape
|
| 797 |
+
device = input_ids.device
|
| 798 |
+
elif inputs_embeds is not None:
|
| 799 |
+
input_shape = inputs_embeds.size()[:-1]
|
| 800 |
+
batch_size, seq_length = input_shape
|
| 801 |
+
device = inputs_embeds.device
|
| 802 |
+
elif encoder_embeds is not None:
|
| 803 |
+
input_shape = encoder_embeds.size()[:-1]
|
| 804 |
+
batch_size, seq_length = input_shape
|
| 805 |
+
device = encoder_embeds.device
|
| 806 |
+
else:
|
| 807 |
+
raise ValueError("You have to specify either input_ids or inputs_embeds or encoder_embeds")
|
| 808 |
+
|
| 809 |
+
# past_key_values_length
|
| 810 |
+
        past_key_values_length = past_key_values[0][0].shape[2] if past_key_values is not None else 0

        if attention_mask is None:
            attention_mask = torch.ones((batch_size, seq_length + past_key_values_length), device=device)

        # We can provide a self-attention mask of dimensions [batch_size, from_seq_length, to_seq_length]
        # ourselves in which case we just need to make it broadcastable to all heads.
        extended_attention_mask: torch.Tensor = self.get_extended_attention_mask(attention_mask, input_shape,
                                                                                 device, is_decoder)

        # If a 2D or 3D attention mask is provided for the cross-attention,
        # we need to make it broadcastable to [batch_size, num_heads, seq_length, seq_length]
        if encoder_hidden_states is not None:
            if type(encoder_hidden_states) == list:
                encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states[0].size()
            else:
                encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states.size()
            encoder_hidden_shape = (encoder_batch_size, encoder_sequence_length)

            if type(encoder_attention_mask) == list:
                encoder_extended_attention_mask = [self.invert_attention_mask(mask) for mask in encoder_attention_mask]
            elif encoder_attention_mask is None:
                encoder_attention_mask = torch.ones(encoder_hidden_shape, device=device)
                encoder_extended_attention_mask = self.invert_attention_mask(encoder_attention_mask)
            else:
                encoder_extended_attention_mask = self.invert_attention_mask(encoder_attention_mask)
        else:
            encoder_extended_attention_mask = None

        # Prepare head mask if needed
        # 1.0 in head_mask indicates we keep the head
        # attention_probs has shape bsz x n_heads x N x N
        # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]
        # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]
        head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)

        if encoder_embeds is None:
            embedding_output = self.embeddings(
                input_ids=input_ids,
                position_ids=position_ids,
                inputs_embeds=inputs_embeds,
                past_key_values_length=past_key_values_length,
            )
        else:
            embedding_output = encoder_embeds

        encoder_outputs = self.encoder(
            embedding_output,
            attention_mask=extended_attention_mask,
            head_mask=head_mask,
            encoder_hidden_states=encoder_hidden_states,
            encoder_attention_mask=encoder_extended_attention_mask,
            past_key_values=past_key_values,
            use_cache=use_cache,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
            mode=mode,
        )
        sequence_output = encoder_outputs[0]
        pooled_output = self.pooler(sequence_output) if self.pooler is not None else None

        if not return_dict:
            return (sequence_output, pooled_output) + encoder_outputs[1:]

        return BaseModelOutputWithPoolingAndCrossAttentions(
            last_hidden_state=sequence_output,
            pooler_output=pooled_output,
            past_key_values=encoder_outputs.past_key_values,
            hidden_states=encoder_outputs.hidden_states,
            attentions=encoder_outputs.attentions,
            cross_attentions=encoder_outputs.cross_attentions,
        )

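# [Editor's annotation, illustrative only; not part of bert.py.] The mask
# machinery above never multiplies attention probabilities by 0/1. Instead,
# get_extended_attention_mask()/invert_attention_mask() turn the 0/1 padding
# mask into an additive bias: kept positions get 0.0, masked positions get
# -10000.0, which the softmax over attention scores then maps to ~0 weight.
# A minimal sketch of that transform:
#
#     import torch
#     mask = torch.tensor([[1, 1, 0]], dtype=torch.float)  # 1 = attend, 0 = pad
#     bias = (1.0 - mask[:, None, None, :]) * -10000.0     # shape (1, 1, 1, 3)
#     # bias is 0.0 where mask == 1 and -10000.0 where mask == 0, so
#     # softmax(scores + bias) assigns (near-)zero probability to padded tokens.
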
class BertLMHeadModel(BertPreTrainedModel):

    _keys_to_ignore_on_load_unexpected = [r"pooler"]
    _keys_to_ignore_on_load_missing = [r"position_ids", r"predictions.decoder.bias"]

    def __init__(self, config):
        super().__init__(config)

        self.bert = BertModel(config, add_pooling_layer=False)
        self.cls = BertOnlyMLMHead(config)

        self.init_weights()

    def get_output_embeddings(self):
        return self.cls.predictions.decoder

    def set_output_embeddings(self, new_embeddings):
        self.cls.predictions.decoder = new_embeddings

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        labels=None,
        past_key_values=None,
        use_cache=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
        return_logits=False,
        is_decoder=True,
        reduction='mean',
        mode='multimodal',
    ):
        r"""
        encoder_hidden_states (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length, hidden_size)`, `optional`):
            Sequence of hidden-states at the output of the last layer of the encoder. Used in the cross-attention if
            the model is configured as a decoder.
        encoder_attention_mask (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
            Mask to avoid performing attention on the padding token indices of the encoder input. This mask is used in
            the cross-attention if the model is configured as a decoder. Mask values selected in ``[0, 1]``:
            - 1 for tokens that are **not masked**,
            - 0 for tokens that are **masked**.
        labels (:obj:`torch.LongTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
            Labels for computing the left-to-right language modeling loss (next word prediction). Indices should be in
            ``[-100, 0, ..., config.vocab_size]`` (see ``input_ids`` docstring). Tokens with indices set to ``-100`` are
            ignored (masked); the loss is only computed for the tokens with labels in ``[0, ..., config.vocab_size]``.
        past_key_values (:obj:`tuple(tuple(torch.FloatTensor))` of length :obj:`config.n_layers` with each tuple having 4 tensors of shape :obj:`(batch_size, num_heads, sequence_length - 1, embed_size_per_head)`):
            Contains precomputed key and value hidden states of the attention blocks. Can be used to speed up decoding.
            If :obj:`past_key_values` are used, the user can optionally input only the last :obj:`decoder_input_ids`
            (those that don't have their past key value states given to this model) of shape :obj:`(batch_size, 1)`
            instead of all :obj:`decoder_input_ids` of shape :obj:`(batch_size, sequence_length)`.
        use_cache (:obj:`bool`, `optional`):
            If set to :obj:`True`, :obj:`past_key_values` key value states are returned and can be used to speed up
            decoding (see :obj:`past_key_values`).
        Returns:
        Example::
            >>> from transformers import BertTokenizer, BertLMHeadModel, BertConfig
            >>> import torch
            >>> tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
            >>> config = BertConfig.from_pretrained("bert-base-cased")
            >>> model = BertLMHeadModel.from_pretrained('bert-base-cased', config=config)
            >>> inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
            >>> outputs = model(**inputs)
            >>> prediction_logits = outputs.logits
        """
        return_dict = return_dict if return_dict is not None else self.config.use_return_dict
        if labels is not None:
            use_cache = False

        outputs = self.bert(
            input_ids,
            attention_mask=attention_mask,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            encoder_hidden_states=encoder_hidden_states,
            encoder_attention_mask=encoder_attention_mask,
            past_key_values=past_key_values,
            use_cache=use_cache,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
            is_decoder=is_decoder,
            mode=mode,
        )

        sequence_output = outputs[0]
        prediction_scores = self.cls(sequence_output)
        # sequence_output.shape    torch.Size([85, 30, 768])
        # prediction_scores.shape  torch.Size([85, 30, 30524])
        # labels.shape             torch.Size([85, 30])

        if return_logits:
            return prediction_scores[:, :-1, :].contiguous()

        lm_loss = None
        if labels is not None:
            # we are doing next-token prediction; shift prediction scores and input ids by one
            shifted_prediction_scores = prediction_scores[:, :-1, :].contiguous()
            labels = labels[:, 1:].contiguous()
            loss_fct = CrossEntropyLoss(reduction=reduction, label_smoothing=0.1)
            lm_loss = loss_fct(shifted_prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
            if reduction == 'none':
                lm_loss = lm_loss.view(prediction_scores.size(0), -1).sum(1)

        if not return_dict:
            output = (prediction_scores,) + outputs[2:]
            return ((lm_loss,) + output) if lm_loss is not None else output

        return CausalLMOutputWithCrossAttentions(
            loss=lm_loss,
            logits=prediction_scores,
            past_key_values=outputs.past_key_values,
            hidden_states=outputs.hidden_states,
            attentions=outputs.attentions,
            cross_attentions=outputs.cross_attentions,
        )

    def prepare_inputs_for_generation(self, input_ids, past=None, attention_mask=None, **model_kwargs):
        input_shape = input_ids.shape
        # if model is used as a decoder in encoder-decoder model, the decoder attention mask is created on the fly
        if attention_mask is None:
            attention_mask = input_ids.new_ones(input_shape)

        # cut decoder_input_ids if past is used
        if past is not None:
            input_ids = input_ids[:, -1:]

        return {
            "input_ids": input_ids,
            "attention_mask": attention_mask,
            "past_key_values": past,
            "encoder_hidden_states": model_kwargs.get("encoder_hidden_states", None),
            "encoder_attention_mask": model_kwargs.get("encoder_attention_mask", None),
            "is_decoder": True,
        }

    def _reorder_cache(self, past, beam_idx):
        reordered_past = ()
        for layer_past in past:
            reordered_past += (tuple(past_state.index_select(0, beam_idx) for past_state in layer_past),)
        return reordered_past
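BertLMHeadModel.forward above trains the text decoder with a standard left-to-right objective: the logits at position t are scored against the token at position t+1, which is why `prediction_scores` drops its last step (`[:, :-1, :]`) while `labels` drops its first (`[:, 1:]`). A minimal, self-contained sketch of that shift in plain PyTorch (the batch/sequence/vocab sizes are illustrative, not taken from this repo):

    import torch
    from torch.nn import CrossEntropyLoss

    batch, seq, vocab = 4, 30, 30524
    logits = torch.randn(batch, seq, vocab)          # decoder output scores
    labels = torch.randint(0, vocab, (batch, seq))   # target token ids

    shifted = logits[:, :-1, :].contiguous()         # position t predicts token t+1
    targets = labels[:, 1:].contiguous()
    loss_fct = CrossEntropyLoss(reduction='mean', label_smoothing=0.1)
    loss = loss_fct(shifted.view(-1, vocab), targets.view(-1))
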
ram/models/bert_lora.py
ADDED
@@ -0,0 +1,1040 @@
'''
 * Copyright (c) 2022, salesforce.com, inc.
 * All rights reserved.
 * SPDX-License-Identifier: BSD-3-Clause
 * For full license text, see LICENSE.txt file in the repo root or https://opensource.org/licenses/BSD-3-Clause
 * By Junnan Li
 * Based on huggingface code base
 * https://github.com/huggingface/transformers/blob/v4.15.0/src/transformers/models/bert
'''

import math
import os
import warnings
from dataclasses import dataclass
from typing import Optional, Tuple

import torch
from torch import Tensor, device, dtype, nn
import torch.utils.checkpoint
from torch import nn
from torch.nn import CrossEntropyLoss
import torch.nn.functional as F

from transformers.activations import ACT2FN
from transformers.file_utils import (
    ModelOutput,
)
from transformers.modeling_outputs import (
    BaseModelOutputWithPastAndCrossAttentions,
    BaseModelOutputWithPoolingAndCrossAttentions,
    CausalLMOutputWithCrossAttentions,
    MaskedLMOutput,
    MultipleChoiceModelOutput,
    NextSentencePredictorOutput,
    QuestionAnsweringModelOutput,
    SequenceClassifierOutput,
    TokenClassifierOutput,
)
from transformers.modeling_utils import (
    PreTrainedModel,
    apply_chunking_to_forward,
    find_pruneable_heads_and_indices,
    prune_linear_layer,
)
from transformers.utils import logging
from transformers.models.bert.configuration_bert import BertConfig

import loralib as lora


logger = logging.get_logger(__name__)


class BertEmbeddings_nopos(nn.Module):
    """Construct the embeddings from word and position embeddings."""

    def __init__(self, config):
        super().__init__()
        self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=config.pad_token_id)
        # self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)

        # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
        # any TensorFlow checkpoint file
        self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)

        # position_ids (1, len position emb) is contiguous in memory and exported when serialized
        # self.register_buffer("position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)))
        # self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")

        self.config = config

    def forward(
        self, input_ids=None, position_ids=None, inputs_embeds=None, past_key_values_length=0
    ):
        if input_ids is not None:
            input_shape = input_ids.size()
        else:
            input_shape = inputs_embeds.size()[:-1]

        seq_length = input_shape[1]

        # if position_ids is None:
        #     position_ids = self.position_ids[:, past_key_values_length : seq_length + past_key_values_length]

        if inputs_embeds is None:
            inputs_embeds = self.word_embeddings(input_ids)

        embeddings = inputs_embeds

        # if self.position_embedding_type == "absolute":
        #     position_embeddings = self.position_embeddings(position_ids)
        #     embeddings += position_embeddings
        embeddings = self.LayerNorm(embeddings)
        embeddings = self.dropout(embeddings)
        return embeddings


class BertEmbeddings(nn.Module):
    """Construct the embeddings from word and position embeddings."""

    def __init__(self, config):
        super().__init__()
        self.word_embeddings = nn.Embedding(config.vocab_size, config.hidden_size, padding_idx=config.pad_token_id)
        self.position_embeddings = nn.Embedding(config.max_position_embeddings, config.hidden_size)

        # self.LayerNorm is not snake-cased to stick with TensorFlow model variable name and be able to load
        # any TensorFlow checkpoint file
        self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)

        # position_ids (1, len position emb) is contiguous in memory and exported when serialized
        self.register_buffer("position_ids", torch.arange(config.max_position_embeddings).expand((1, -1)))
        self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")

        self.config = config

    def forward(
        self, input_ids=None, position_ids=None, inputs_embeds=None, past_key_values_length=0
    ):
        if input_ids is not None:
            input_shape = input_ids.size()
        else:
            input_shape = inputs_embeds.size()[:-1]

        seq_length = input_shape[1]

        if position_ids is None:
            position_ids = self.position_ids[:, past_key_values_length : seq_length + past_key_values_length]

        if inputs_embeds is None:
            inputs_embeds = self.word_embeddings(input_ids)

        embeddings = inputs_embeds

        if self.position_embedding_type == "absolute":
            position_embeddings = self.position_embeddings(position_ids)
            embeddings += position_embeddings
        embeddings = self.LayerNorm(embeddings)
        embeddings = self.dropout(embeddings)
        return embeddings


class BertSelfAttention(nn.Module):
    def __init__(self, config, is_cross_attention):
        super().__init__()
        self.config = config
        if config.hidden_size % config.num_attention_heads != 0 and not hasattr(config, "embedding_size"):
            raise ValueError(
                "The hidden size (%d) is not a multiple of the number of attention "
                "heads (%d)" % (config.hidden_size, config.num_attention_heads)
            )

        self.num_attention_heads = config.num_attention_heads
        self.attention_head_size = int(config.hidden_size / config.num_attention_heads)
        self.all_head_size = self.num_attention_heads * self.attention_head_size

        # self.query = nn.Linear(config.hidden_size, self.all_head_size)
        self.query = lora.Linear(config.hidden_size, self.all_head_size, r=8)
        if is_cross_attention:
            # self.key = nn.Linear(config.encoder_width, self.all_head_size)
            self.key = lora.Linear(config.encoder_width, self.all_head_size, r=8)
            self.value = nn.Linear(config.encoder_width, self.all_head_size)
        else:
            # self.key = nn.Linear(config.hidden_size, self.all_head_size)
            self.key = lora.Linear(config.hidden_size, self.all_head_size, r=8)
            self.value = nn.Linear(config.hidden_size, self.all_head_size)

        self.dropout = nn.Dropout(config.attention_probs_dropout_prob)
        self.position_embedding_type = getattr(config, "position_embedding_type", "absolute")
        if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":
            self.max_position_embeddings = config.max_position_embeddings
            self.distance_embedding = nn.Embedding(2 * config.max_position_embeddings - 1, self.attention_head_size)
        self.save_attention = False

    def save_attn_gradients(self, attn_gradients):
        self.attn_gradients = attn_gradients

    def get_attn_gradients(self):
        return self.attn_gradients

    def save_attention_map(self, attention_map):
        self.attention_map = attention_map

    def get_attention_map(self):
        return self.attention_map

    def transpose_for_scores(self, x):
        new_x_shape = x.size()[:-1] + (self.num_attention_heads, self.attention_head_size)
        x = x.view(*new_x_shape)
        return x.permute(0, 2, 1, 3)

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        past_key_value=None,
        output_attentions=False,
    ):
        mixed_query_layer = self.query(hidden_states)

        # If this is instantiated as a cross-attention module, the keys
        # and values come from an encoder; the attention mask needs to be
        # such that the encoder's padding tokens are not attended to.
        is_cross_attention = encoder_hidden_states is not None

        if is_cross_attention:
            # print(self.key.weight.shape)
            key_layer = self.transpose_for_scores(self.key(encoder_hidden_states))
            value_layer = self.transpose_for_scores(self.value(encoder_hidden_states))
            attention_mask = encoder_attention_mask
        elif past_key_value is not None:
            key_layer = self.transpose_for_scores(self.key(hidden_states))
            value_layer = self.transpose_for_scores(self.value(hidden_states))
            key_layer = torch.cat([past_key_value[0], key_layer], dim=2)
            value_layer = torch.cat([past_key_value[1], value_layer], dim=2)
        else:
            key_layer = self.transpose_for_scores(self.key(hidden_states))
            value_layer = self.transpose_for_scores(self.value(hidden_states))

        query_layer = self.transpose_for_scores(mixed_query_layer)

        past_key_value = (key_layer, value_layer)

        # compatible with higher versions of transformers
        if key_layer.shape[0] > query_layer.shape[0]:
            key_layer = key_layer[:query_layer.shape[0], :, :, :]
            attention_mask = attention_mask[:query_layer.shape[0], :, :]
            value_layer = value_layer[:query_layer.shape[0], :, :, :]

        # Take the dot product between "query" and "key" to get the raw attention scores.
        attention_scores = torch.matmul(query_layer, key_layer.transpose(-1, -2))

        if self.position_embedding_type == "relative_key" or self.position_embedding_type == "relative_key_query":
            seq_length = hidden_states.size()[1]
            position_ids_l = torch.arange(seq_length, dtype=torch.long, device=hidden_states.device).view(-1, 1)
            position_ids_r = torch.arange(seq_length, dtype=torch.long, device=hidden_states.device).view(1, -1)
            distance = position_ids_l - position_ids_r
            positional_embedding = self.distance_embedding(distance + self.max_position_embeddings - 1)
            positional_embedding = positional_embedding.to(dtype=query_layer.dtype)  # fp16 compatibility

            if self.position_embedding_type == "relative_key":
                relative_position_scores = torch.einsum("bhld,lrd->bhlr", query_layer, positional_embedding)
                attention_scores = attention_scores + relative_position_scores
            elif self.position_embedding_type == "relative_key_query":
                relative_position_scores_query = torch.einsum("bhld,lrd->bhlr", query_layer, positional_embedding)
                relative_position_scores_key = torch.einsum("bhrd,lrd->bhlr", key_layer, positional_embedding)
                attention_scores = attention_scores + relative_position_scores_query + relative_position_scores_key

        attention_scores = attention_scores / math.sqrt(self.attention_head_size)
        if attention_mask is not None:
            # Apply the attention mask (precomputed for all layers in BertModel forward() function)
            attention_scores = attention_scores + attention_mask

        # Normalize the attention scores to probabilities.
        attention_probs = nn.Softmax(dim=-1)(attention_scores)

        if is_cross_attention and self.save_attention:
            self.save_attention_map(attention_probs)
            attention_probs.register_hook(self.save_attn_gradients)

        # This is actually dropping out entire tokens to attend to, which might
        # seem a bit unusual, but is taken from the original Transformer paper.
        attention_probs_dropped = self.dropout(attention_probs)

        # Mask heads if we want to
        if head_mask is not None:
            attention_probs_dropped = attention_probs_dropped * head_mask

        context_layer = torch.matmul(attention_probs_dropped, value_layer)

        context_layer = context_layer.permute(0, 2, 1, 3).contiguous()
        new_context_layer_shape = context_layer.size()[:-2] + (self.all_head_size,)
        context_layer = context_layer.view(*new_context_layer_shape)

        outputs = (context_layer, attention_probs) if output_attentions else (context_layer,)

        outputs = outputs + (past_key_value,)
        return outputs

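# [Editor's annotation, illustrative only; not part of bert_lora.py.] The
# lora.Linear swap in BertSelfAttention keeps the original dense weight W
# frozen and learns a rank-r update, so the query/key projections compute
# y = x @ (W + B @ A).T + bias, with A of shape (r, in) and B of shape (out, r).
# A minimal sketch, assuming the loralib package from
# https://github.com/microsoft/LoRA:
#
#     import torch
#     import loralib as lora
#
#     proj = lora.Linear(768, 768, r=8)   # same call as self.query above
#     x = torch.randn(2, 30, 768)
#     y = proj(x)                          # drop-in replacement for nn.Linear(768, 768)
#     print(proj.lora_A.shape, proj.lora_B.shape)   # torch.Size([8, 768]), torch.Size([768, 8])
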
class BertSelfOutput(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.dense = nn.Linear(config.hidden_size, config.hidden_size)
        self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)

    def forward(self, hidden_states, input_tensor):
        hidden_states = self.dense(hidden_states)
        hidden_states = self.dropout(hidden_states)
        hidden_states = self.LayerNorm(hidden_states + input_tensor)
        return hidden_states


class BertAttention(nn.Module):
    def __init__(self, config, is_cross_attention=False):
        super().__init__()
        self.self = BertSelfAttention(config, is_cross_attention)
        self.output = BertSelfOutput(config)
        self.pruned_heads = set()

    def prune_heads(self, heads):
        if len(heads) == 0:
            return
        heads, index = find_pruneable_heads_and_indices(
            heads, self.self.num_attention_heads, self.self.attention_head_size, self.pruned_heads
        )

        # Prune linear layers
        self.self.query = prune_linear_layer(self.self.query, index)
        self.self.key = prune_linear_layer(self.self.key, index)
        self.self.value = prune_linear_layer(self.self.value, index)
        self.output.dense = prune_linear_layer(self.output.dense, index, dim=1)

        # Update hyper params and store pruned heads
        self.self.num_attention_heads = self.self.num_attention_heads - len(heads)
        self.self.all_head_size = self.self.attention_head_size * self.self.num_attention_heads
        self.pruned_heads = self.pruned_heads.union(heads)

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        past_key_value=None,
        output_attentions=False,
    ):
        self_outputs = self.self(
            hidden_states,
            attention_mask,
            head_mask,
            encoder_hidden_states,
            encoder_attention_mask,
            past_key_value,
            output_attentions,
        )
        attention_output = self.output(self_outputs[0], hidden_states)
        outputs = (attention_output,) + self_outputs[1:]  # add attentions if we output them
        return outputs


class BertIntermediate(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.dense = nn.Linear(config.hidden_size, config.intermediate_size)
        if isinstance(config.hidden_act, str):
            self.intermediate_act_fn = ACT2FN[config.hidden_act]
        else:
            self.intermediate_act_fn = config.hidden_act

    def forward(self, hidden_states):
        hidden_states = self.dense(hidden_states)
        hidden_states = self.intermediate_act_fn(hidden_states)
        return hidden_states


class BertOutput(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.dense = nn.Linear(config.intermediate_size, config.hidden_size)
        self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)
        self.dropout = nn.Dropout(config.hidden_dropout_prob)

    def forward(self, hidden_states, input_tensor):
        hidden_states = self.dense(hidden_states)
        hidden_states = self.dropout(hidden_states)
        hidden_states = self.LayerNorm(hidden_states + input_tensor)
        return hidden_states


class BertLayer(nn.Module):
    def __init__(self, config, layer_num):
        super().__init__()
        self.config = config
        self.chunk_size_feed_forward = config.chunk_size_feed_forward
        self.seq_len_dim = 1
        self.attention = BertAttention(config)
        self.layer_num = layer_num
        if self.config.add_cross_attention:
            self.crossattention = BertAttention(config, is_cross_attention=self.config.add_cross_attention)
        self.intermediate = BertIntermediate(config)
        self.output = BertOutput(config)

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        past_key_value=None,
        output_attentions=False,
        mode=None,
    ):

        if mode == 'tagging':

            assert encoder_hidden_states is not None, "encoder_hidden_states must be given for cross-attention layers"

            cross_attention_outputs = self.crossattention(
                hidden_states,
                attention_mask,
                head_mask,
                encoder_hidden_states,
                encoder_attention_mask,
                output_attentions=output_attentions,
            )
            attention_output = cross_attention_outputs[0]
            outputs = cross_attention_outputs[1:-1]  # add cross attentions if we output attention weights

            present_key_value = cross_attention_outputs[-1]

        else:
            # decoder uni-directional self-attention cached key/values tuple is at positions 1,2
            self_attn_past_key_value = past_key_value[:2] if past_key_value is not None else None
            self_attention_outputs = self.attention(
                hidden_states,
                attention_mask,
                head_mask,
                output_attentions=output_attentions,
                past_key_value=self_attn_past_key_value,
            )
            attention_output = self_attention_outputs[0]

            outputs = self_attention_outputs[1:-1]
            present_key_value = self_attention_outputs[-1]

            if mode == 'multimodal':
                assert encoder_hidden_states is not None, "encoder_hidden_states must be given for cross-attention layers"

                cross_attention_outputs = self.crossattention(
                    attention_output,
                    attention_mask,
                    head_mask,
                    encoder_hidden_states,
                    encoder_attention_mask,
                    output_attentions=output_attentions,
                )
                attention_output = cross_attention_outputs[0]
                outputs = outputs + cross_attention_outputs[1:-1]  # add cross attentions if we output attention weights
        layer_output = apply_chunking_to_forward(
            self.feed_forward_chunk, self.chunk_size_feed_forward, self.seq_len_dim, attention_output
        )
        outputs = (layer_output,) + outputs

        outputs = outputs + (present_key_value,)

        return outputs

    def feed_forward_chunk(self, attention_output):
        intermediate_output = self.intermediate(attention_output)
        layer_output = self.output(intermediate_output, attention_output)
        return layer_output


class BertEncoder(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.config = config
        self.layer = nn.ModuleList([BertLayer(config, i) for i in range(config.num_hidden_layers)])
        self.gradient_checkpointing = False

    def forward(
        self,
        hidden_states,
        attention_mask=None,
        head_mask=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        past_key_values=None,
        use_cache=None,
        output_attentions=False,
        output_hidden_states=False,
        return_dict=True,
        mode='multimodal',
    ):
        all_hidden_states = () if output_hidden_states else None
        all_self_attentions = () if output_attentions else None
        all_cross_attentions = () if output_attentions and self.config.add_cross_attention else None

        next_decoder_cache = () if use_cache else None

        for i in range(self.config.num_hidden_layers):
            layer_module = self.layer[i]
            if output_hidden_states:
                all_hidden_states = all_hidden_states + (hidden_states,)

            layer_head_mask = head_mask[i] if head_mask is not None else None
            past_key_value = past_key_values[i] if past_key_values is not None else None

            if self.gradient_checkpointing and self.training:

                if use_cache:
                    logger.warn(
                        "`use_cache=True` is incompatible with gradient checkpointing. Setting `use_cache=False`..."
                    )
                    use_cache = False

                def create_custom_forward(module):
                    def custom_forward(*inputs):
                        return module(*inputs, past_key_value, output_attentions)

                    return custom_forward

                layer_outputs = torch.utils.checkpoint.checkpoint(
                    create_custom_forward(layer_module),
                    hidden_states,
                    attention_mask,
                    layer_head_mask,
                    encoder_hidden_states,
                    encoder_attention_mask,
                    mode=mode,
                )
            else:
                layer_outputs = layer_module(
                    hidden_states,
                    attention_mask,
                    layer_head_mask,
                    encoder_hidden_states,
                    encoder_attention_mask,
                    past_key_value,
                    output_attentions,
                    mode=mode,
                )

            hidden_states = layer_outputs[0]
            if use_cache:
                next_decoder_cache += (layer_outputs[-1],)
            if output_attentions:
                all_self_attentions = all_self_attentions + (layer_outputs[1],)

        if output_hidden_states:
            all_hidden_states = all_hidden_states + (hidden_states,)

        if not return_dict:
            return tuple(
                v
                for v in [
                    hidden_states,
                    next_decoder_cache,
                    all_hidden_states,
                    all_self_attentions,
                    all_cross_attentions,
                ]
                if v is not None
            )
        return BaseModelOutputWithPastAndCrossAttentions(
            last_hidden_state=hidden_states,
            past_key_values=next_decoder_cache,
            hidden_states=all_hidden_states,
            attentions=all_self_attentions,
            cross_attentions=all_cross_attentions,
        )

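# [Editor's annotation, illustrative only; not part of bert_lora.py.] The
# gradient-checkpointing branch in BertEncoder above trades compute for memory:
# layer activations are discarded in the forward pass and recomputed during
# backward. A minimal sketch of the same torch.utils.checkpoint pattern:
#
#     import torch
#     import torch.utils.checkpoint
#
#     layer = torch.nn.Linear(16, 16)
#     x = torch.randn(4, 16, requires_grad=True)
#     y = torch.utils.checkpoint.checkpoint(layer, x, use_reentrant=False)
#     y.sum().backward()   # layer's forward is re-run here to rebuild activations
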
class BertPooler(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.dense = nn.Linear(config.hidden_size, config.hidden_size)
        self.activation = nn.Tanh()

    def forward(self, hidden_states):
        # We "pool" the model by simply taking the hidden state corresponding
        # to the first token.
        first_token_tensor = hidden_states[:, 0]
        pooled_output = self.dense(first_token_tensor)
        pooled_output = self.activation(pooled_output)
        return pooled_output


class BertPredictionHeadTransform(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.dense = nn.Linear(config.hidden_size, config.hidden_size)
        if isinstance(config.hidden_act, str):
            self.transform_act_fn = ACT2FN[config.hidden_act]
        else:
            self.transform_act_fn = config.hidden_act
        self.LayerNorm = nn.LayerNorm(config.hidden_size, eps=config.layer_norm_eps)

    def forward(self, hidden_states):
        hidden_states = self.dense(hidden_states)
        hidden_states = self.transform_act_fn(hidden_states)
        hidden_states = self.LayerNorm(hidden_states)
        return hidden_states


class BertLMPredictionHead(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.transform = BertPredictionHeadTransform(config)

        # The output weights are the same as the input embeddings, but there is
        # an output-only bias for each token.
        self.decoder = nn.Linear(config.hidden_size, config.vocab_size, bias=False)

        self.bias = nn.Parameter(torch.zeros(config.vocab_size))

        # Need a link between the two variables so that the bias is correctly resized with `resize_token_embeddings`
        self.decoder.bias = self.bias

    def forward(self, hidden_states):
        hidden_states = self.transform(hidden_states)
        hidden_states = self.decoder(hidden_states)
        return hidden_states


class BertOnlyMLMHead(nn.Module):
    def __init__(self, config):
        super().__init__()
        self.predictions = BertLMPredictionHead(config)

    def forward(self, sequence_output):
        prediction_scores = self.predictions(sequence_output)
        return prediction_scores


class BertPreTrainedModel(PreTrainedModel):
    """
    An abstract class to handle weights initialization and a simple interface for downloading and loading pretrained
    models.
    """

    config_class = BertConfig
    base_model_prefix = "bert"
    _keys_to_ignore_on_load_missing = [r"position_ids"]

    def _init_weights(self, module):
        """ Initialize the weights """
        if isinstance(module, (nn.Linear, nn.Embedding)):
            # Slightly different from the TF version which uses truncated_normal for initialization
            # cf https://github.com/pytorch/pytorch/pull/5617
            module.weight.data.normal_(mean=0.0, std=self.config.initializer_range)
        elif isinstance(module, nn.LayerNorm):
            module.bias.data.zero_()
            module.weight.data.fill_(1.0)
        if isinstance(module, nn.Linear) and module.bias is not None:
            module.bias.data.zero_()


class BertModel(BertPreTrainedModel):
    """
    The model can behave as an encoder (with only self-attention) as well as a decoder, in which case a layer of
    cross-attention is added between the self-attention layers, following the architecture described in `Attention is
    all you need <https://arxiv.org/abs/1706.03762>`__ by Ashish Vaswani, Noam Shazeer, Niki Parmar, Jakob Uszkoreit,
    Llion Jones, Aidan N. Gomez, Lukasz Kaiser and Illia Polosukhin.
    To be used as a decoder, the model needs to be initialized with both the :obj:`is_decoder`
    argument and :obj:`add_cross_attention` set to :obj:`True`; an :obj:`encoder_hidden_states` is then expected as an
    input to the forward pass.
    """

    def __init__(self, config, add_pooling_layer=True):
        super().__init__(config)
        self.config = config

        self.embeddings = BertEmbeddings(config)

        self.encoder = BertEncoder(config)

        self.pooler = BertPooler(config) if add_pooling_layer else None

        self.init_weights()

    def get_input_embeddings(self):
        return self.embeddings.word_embeddings

    def set_input_embeddings(self, value):
        self.embeddings.word_embeddings = value

    def _prune_heads(self, heads_to_prune):
        """
        Prunes heads of the model. heads_to_prune: dict of {layer_num: list of heads to prune in this layer} See base
        class PreTrainedModel
        """
        for layer, heads in heads_to_prune.items():
            self.encoder.layer[layer].attention.prune_heads(heads)

    def get_extended_attention_mask(self, attention_mask: Tensor, input_shape: Tuple[int], device: device, is_decoder: bool) -> Tensor:
        """
        Makes broadcastable attention and causal masks so that future and masked tokens are ignored.

        Arguments:
            attention_mask (:obj:`torch.Tensor`):
                Mask with ones indicating tokens to attend to, zeros for tokens to ignore.
            input_shape (:obj:`Tuple[int]`):
                The shape of the input to the model.
            device: (:obj:`torch.device`):
                The device of the input to the model.

        Returns:
            :obj:`torch.Tensor` The extended attention mask, with the same dtype as :obj:`attention_mask.dtype`.
        """
        # We can provide a self-attention mask of dimensions [batch_size, from_seq_length, to_seq_length]
        # ourselves in which case we just need to make it broadcastable to all heads.
        if attention_mask.dim() == 3:
            extended_attention_mask = attention_mask[:, None, :, :]
        elif attention_mask.dim() == 2:
            # Provided a padding mask of dimensions [batch_size, seq_length]
            # - if the model is a decoder, apply a causal mask in addition to the padding mask
            # - if the model is an encoder, make the mask broadcastable to [batch_size, num_heads, seq_length, seq_length]
            if is_decoder:
                batch_size, seq_length = input_shape

                seq_ids = torch.arange(seq_length, device=device)
                causal_mask = seq_ids[None, None, :].repeat(batch_size, seq_length, 1) <= seq_ids[None, :, None]
                # in case past_key_values are used we need to add a prefix ones mask to the causal mask
                # causal and attention masks must have same type with pytorch version < 1.3
                causal_mask = causal_mask.to(attention_mask.dtype)

                if causal_mask.shape[1] < attention_mask.shape[1]:
                    prefix_seq_len = attention_mask.shape[1] - causal_mask.shape[1]
                    causal_mask = torch.cat(
                        [
                            torch.ones((batch_size, seq_length, prefix_seq_len), device=device, dtype=causal_mask.dtype),
                            causal_mask,
                        ],
                        axis=-1,
                    )

                extended_attention_mask = causal_mask[:, None, :, :] * attention_mask[:, None, None, :]
            else:
                extended_attention_mask = attention_mask[:, None, None, :]
        else:
            raise ValueError(
                "Wrong shape for input_ids (shape {}) or attention_mask (shape {})".format(
                    input_shape, attention_mask.shape
                )
            )

        # Since attention_mask is 1.0 for positions we want to attend and 0.0 for
        # masked positions, this operation will create a tensor which is 0.0 for
        # positions we want to attend and -10000.0 for masked positions.
        # Since we are adding it to the raw scores before the softmax, this is
        # effectively the same as removing these entirely.
        extended_attention_mask = extended_attention_mask.to(dtype=self.dtype)  # fp16 compatibility
        extended_attention_mask = (1.0 - extended_attention_mask) * -10000.0
        return extended_attention_mask

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        encoder_embeds=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        past_key_values=None,
        use_cache=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
        is_decoder=False,
        mode='multimodal',
    ):
        r"""
        encoder_hidden_states (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length, hidden_size)`, `optional`):
            Sequence of hidden-states at the output of the last layer of the encoder. Used in the cross-attention if
            the model is configured as a decoder.
        encoder_attention_mask (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
            Mask to avoid performing attention on the padding token indices of the encoder input. This mask is used in
            the cross-attention if the model is configured as a decoder. Mask values selected in ``[0, 1]``:
            - 1 for tokens that are **not masked**,
            - 0 for tokens that are **masked**.
        past_key_values (:obj:`tuple(tuple(torch.FloatTensor))` of length :obj:`config.n_layers` with each tuple having 4 tensors of shape :obj:`(batch_size, num_heads, sequence_length - 1, embed_size_per_head)`):
            Contains precomputed key and value hidden states of the attention blocks. Can be used to speed up decoding.
            If :obj:`past_key_values` are used, the user can optionally input only the last :obj:`decoder_input_ids`
            (those that don't have their past key value states given to this model) of shape :obj:`(batch_size, 1)`
            instead of all :obj:`decoder_input_ids` of shape :obj:`(batch_size, sequence_length)`.
        use_cache (:obj:`bool`, `optional`):
            If set to :obj:`True`, :obj:`past_key_values` key value states are returned and can be used to speed up
            decoding (see :obj:`past_key_values`).
        """
        output_attentions = output_attentions if output_attentions is not None else self.config.output_attentions
        output_hidden_states = (
            output_hidden_states if output_hidden_states is not None else self.config.output_hidden_states
        )
        return_dict = return_dict if return_dict is not None else self.config.use_return_dict

        if is_decoder:
            use_cache = use_cache if use_cache is not None else self.config.use_cache
        else:
            use_cache = False

        if input_ids is not None and inputs_embeds is not None:
            raise ValueError("You cannot specify both input_ids and inputs_embeds at the same time")
        elif input_ids is not None:
            input_shape = input_ids.size()
            batch_size, seq_length = input_shape
            device = input_ids.device
        elif inputs_embeds is not None:
            input_shape = inputs_embeds.size()[:-1]
            batch_size, seq_length = input_shape
            device = inputs_embeds.device
        elif encoder_embeds is not None:
            input_shape = encoder_embeds.size()[:-1]
            batch_size, seq_length = input_shape
            device = encoder_embeds.device
        else:
            raise ValueError("You have to specify either input_ids or inputs_embeds or encoder_embeds")

        # past_key_values_length
        past_key_values_length = past_key_values[0][0].shape[2] if past_key_values is not None else 0

        if attention_mask is None:
            attention_mask = torch.ones((batch_size, seq_length + past_key_values_length), device=device)

        # We can provide a self-attention mask of dimensions [batch_size, from_seq_length, to_seq_length]
        # ourselves in which case we just need to make it broadcastable to all heads.
        extended_attention_mask: torch.Tensor = self.get_extended_attention_mask(attention_mask, input_shape,
                                                                                 device, is_decoder)

        # If a 2D or 3D attention mask is provided for the cross-attention,
        # we need to make it broadcastable to [batch_size, num_heads, seq_length, seq_length]
        if encoder_hidden_states is not None:
            if type(encoder_hidden_states) == list:
                encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states[0].size()
            else:
                encoder_batch_size, encoder_sequence_length, _ = encoder_hidden_states.size()
            encoder_hidden_shape = (encoder_batch_size, encoder_sequence_length)

            if type(encoder_attention_mask) == list:
                encoder_extended_attention_mask = [self.invert_attention_mask(mask) for mask in encoder_attention_mask]
            elif encoder_attention_mask is None:
                encoder_attention_mask = torch.ones(encoder_hidden_shape, device=device)
                encoder_extended_attention_mask = self.invert_attention_mask(encoder_attention_mask)
            else:
                encoder_extended_attention_mask = self.invert_attention_mask(encoder_attention_mask)
        else:
            encoder_extended_attention_mask = None

        # Prepare head mask if needed
        # 1.0 in head_mask indicates we keep the head
        # attention_probs has shape bsz x n_heads x N x N
        # input head_mask has shape [num_heads] or [num_hidden_layers x num_heads]
        # and head_mask is converted to shape [num_hidden_layers x batch x num_heads x seq_length x seq_length]
        head_mask = self.get_head_mask(head_mask, self.config.num_hidden_layers)

        if encoder_embeds is None:
            embedding_output = self.embeddings(
                input_ids=input_ids,
                position_ids=position_ids,
                inputs_embeds=inputs_embeds,
                past_key_values_length=past_key_values_length,
            )
        else:
            embedding_output = encoder_embeds

        encoder_outputs = self.encoder(
            embedding_output,
            attention_mask=extended_attention_mask,
            head_mask=head_mask,
            encoder_hidden_states=encoder_hidden_states,
            encoder_attention_mask=encoder_extended_attention_mask,
            past_key_values=past_key_values,
            use_cache=use_cache,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
            mode=mode,
        )
        sequence_output = encoder_outputs[0]
        pooled_output = self.pooler(sequence_output) if self.pooler is not None else None

        if not return_dict:
            return (sequence_output, pooled_output) + encoder_outputs[1:]

        return BaseModelOutputWithPoolingAndCrossAttentions(
            last_hidden_state=sequence_output,
            pooler_output=pooled_output,
            past_key_values=encoder_outputs.past_key_values,
            hidden_states=encoder_outputs.hidden_states,
            attentions=encoder_outputs.attentions,
            cross_attentions=encoder_outputs.cross_attentions,
        )


class BertLMHeadModel(BertPreTrainedModel):

    _keys_to_ignore_on_load_unexpected = [r"pooler"]
    _keys_to_ignore_on_load_missing = [r"position_ids", r"predictions.decoder.bias"]

    def __init__(self, config):
        super().__init__(config)

        self.bert = BertModel(config, add_pooling_layer=False)
        self.cls = BertOnlyMLMHead(config)

        self.init_weights()

    def get_output_embeddings(self):
        return self.cls.predictions.decoder

    def set_output_embeddings(self, new_embeddings):
        self.cls.predictions.decoder = new_embeddings

    def forward(
        self,
        input_ids=None,
        attention_mask=None,
        position_ids=None,
        head_mask=None,
        inputs_embeds=None,
        encoder_hidden_states=None,
        encoder_attention_mask=None,
        labels=None,
        past_key_values=None,
        use_cache=None,
        output_attentions=None,
        output_hidden_states=None,
        return_dict=None,
        return_logits=False,
        is_decoder=True,
        reduction='mean',
        mode='multimodal',
    ):
        r"""
        encoder_hidden_states (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length, hidden_size)`, `optional`):
            Sequence of hidden-states at the output of the last layer of the encoder. Used in the cross-attention if
            the model is configured as a decoder.
        encoder_attention_mask (:obj:`torch.FloatTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
            Mask to avoid performing attention on the padding token indices of the encoder input. This mask is used in
            the cross-attention if the model is configured as a decoder. Mask values selected in ``[0, 1]``:
            - 1 for tokens that are **not masked**,
            - 0 for tokens that are **masked**.
        labels (:obj:`torch.LongTensor` of shape :obj:`(batch_size, sequence_length)`, `optional`):
            Labels for computing the left-to-right language modeling loss (next word prediction). Indices should be in
            ``[-100, 0, ..., config.vocab_size]`` (see ``input_ids`` docstring). Tokens with indices set to ``-100`` are
            ignored (masked); the loss is only computed for the tokens with labels in ``[0, ..., config.vocab_size]``.
        past_key_values (:obj:`tuple(tuple(torch.FloatTensor))` of length :obj:`config.n_layers` with each tuple having 4 tensors of shape :obj:`(batch_size, num_heads, sequence_length - 1, embed_size_per_head)`):
            Contains precomputed key and value hidden states of the attention blocks. Can be used to speed up decoding.
            If :obj:`past_key_values` are used, the user can optionally input only the last :obj:`decoder_input_ids`
            (those that don't have their past key value states given to this model) of shape :obj:`(batch_size, 1)`
            instead of all :obj:`decoder_input_ids` of shape :obj:`(batch_size, sequence_length)`.
        use_cache (:obj:`bool`, `optional`):
            If set to :obj:`True`, :obj:`past_key_values` key value states are returned and can be used to speed up
            decoding (see :obj:`past_key_values`).
        Returns:
        Example::
            >>> from transformers import BertTokenizer, BertLMHeadModel, BertConfig
            >>> import torch
            >>> tokenizer = BertTokenizer.from_pretrained('bert-base-cased')
            >>> config = BertConfig.from_pretrained("bert-base-cased")
            >>> model = BertLMHeadModel.from_pretrained('bert-base-cased', config=config)
            >>> inputs = tokenizer("Hello, my dog is cute", return_tensors="pt")
            >>> outputs = model(**inputs)
            >>> prediction_logits = outputs.logits
        """
        return_dict = return_dict if return_dict is not None else self.config.use_return_dict
        if labels is not None:
            use_cache = False

        outputs = self.bert(
            input_ids,
            attention_mask=attention_mask,
            position_ids=position_ids,
            head_mask=head_mask,
            inputs_embeds=inputs_embeds,
            encoder_hidden_states=encoder_hidden_states,
            encoder_attention_mask=encoder_attention_mask,
            past_key_values=past_key_values,
            use_cache=use_cache,
            output_attentions=output_attentions,
            output_hidden_states=output_hidden_states,
            return_dict=return_dict,
            is_decoder=is_decoder,
            mode=mode,
        )
+
sequence_output = outputs[0]
|
| 983 |
+
prediction_scores = self.cls(sequence_output)
|
| 984 |
+
# sequence_output.shape torch.Size([85, 30, 768])
|
| 985 |
+
# prediction_scores.shape torch.Size([85, 30, 30524])
|
| 986 |
+
# labels.shape torch.Size([85, 30])
|
| 987 |
+
|
| 988 |
+
|
| 989 |
+
if return_logits:
|
| 990 |
+
return prediction_scores[:, :-1, :].contiguous()
|
| 991 |
+
|
| 992 |
+
lm_loss = None
|
| 993 |
+
if labels is not None:
|
| 994 |
+
# we are doing next-token prediction; shift prediction scores and input ids by one
|
| 995 |
+
shifted_prediction_scores = prediction_scores[:, :-1, :].contiguous()
|
| 996 |
+
labels = labels[:, 1:].contiguous()
|
| 997 |
+
loss_fct = CrossEntropyLoss(reduction=reduction, label_smoothing=0.1)
|
| 998 |
+
lm_loss = loss_fct(shifted_prediction_scores.view(-1, self.config.vocab_size), labels.view(-1))
|
| 999 |
+
if reduction=='none':
|
| 1000 |
+
lm_loss = lm_loss.view(prediction_scores.size(0),-1).sum(1)
|
| 1001 |
+
|
| 1002 |
+
if not return_dict:
|
| 1003 |
+
output = (prediction_scores,) + outputs[2:]
|
| 1004 |
+
return ((lm_loss,) + output) if lm_loss is not None else output
|
| 1005 |
+
|
| 1006 |
+
return CausalLMOutputWithCrossAttentions(
|
| 1007 |
+
loss=lm_loss,
|
| 1008 |
+
logits=prediction_scores,
|
| 1009 |
+
past_key_values=outputs.past_key_values,
|
| 1010 |
+
hidden_states=outputs.hidden_states,
|
| 1011 |
+
attentions=outputs.attentions,
|
| 1012 |
+
cross_attentions=outputs.cross_attentions,
|
| 1013 |
+
)
|
| 1014 |
+
|
| 1015 |
+
def prepare_inputs_for_generation(self, input_ids, past=None, attention_mask=None, **model_kwargs):
|
| 1016 |
+
input_shape = input_ids.shape
|
| 1017 |
+
# if model is used as a decoder in encoder-decoder model, the decoder attention mask is created on the fly
|
| 1018 |
+
if attention_mask is None:
|
| 1019 |
+
attention_mask = input_ids.new_ones(input_shape)
|
| 1020 |
+
|
| 1021 |
+
# cut decoder_input_ids if past is used
|
| 1022 |
+
if past is not None:
|
| 1023 |
+
input_ids = input_ids[:, -1:]
|
| 1024 |
+
|
| 1025 |
+
return {
|
| 1026 |
+
"input_ids": input_ids,
|
| 1027 |
+
"attention_mask": attention_mask,
|
| 1028 |
+
"past_key_values": past,
|
| 1029 |
+
"encoder_hidden_states": model_kwargs.get("encoder_hidden_states", None),
|
| 1030 |
+
"encoder_attention_mask": model_kwargs.get("encoder_attention_mask", None),
|
| 1031 |
+
"is_decoder": True,
|
| 1032 |
+
}
|
| 1033 |
+
|
| 1034 |
+
def _reorder_cache(self, past, beam_idx):
|
| 1035 |
+
reordered_past = ()
|
| 1036 |
+
for layer_past in past:
|
| 1037 |
+
reordered_past += (tuple(past_state.index_select(0, beam_idx) for past_state in layer_past),)
|
| 1038 |
+
return reordered_past
|
| 1039 |
+
|
| 1040 |
+
|
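# A minimal, self-contained sketch (not from this repo) of the one-position shift used in
# BertLMHeadModel.forward above: the logits at position t are scored against the token at
# position t+1. Shapes below are made up; only torch is required.
import torch
from torch.nn import CrossEntropyLoss

vocab_size = 30524
logits = torch.randn(2, 5, vocab_size)            # (batch, seq_len, vocab)
labels = torch.randint(0, vocab_size, (2, 5))     # gold token ids

shifted_logits = logits[:, :-1, :].contiguous()   # drop the last position
shifted_labels = labels[:, 1:].contiguous()       # drop the first token
loss = CrossEntropyLoss(label_smoothing=0.1)(
    shifted_logits.view(-1, vocab_size), shifted_labels.view(-1))
print(loss.item())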
ram/models/ram.py
ADDED
@@ -0,0 +1,317 @@
'''
 * The Recognize Anything Model (RAM)
 * Written by Xinyu Huang
'''
import json
import warnings

import numpy as np
import torch
from torch import nn

from .bert import BertConfig, BertLMHeadModel, BertModel
from .swin_transformer import SwinTransformer
from .utils import *

warnings.filterwarnings("ignore")


class RAM(nn.Module):
    def __init__(self,
                 med_config=f'{CONFIG_PATH}/configs/med_config.json',
                 image_size=384,
                 vit='base',
                 vit_grad_ckpt=False,
                 vit_ckpt_layer=0,
                 prompt='a picture of ',
                 threshold=0.68,
                 delete_tag_index=[],
                 tag_list=f'{CONFIG_PATH}/data/ram_tag_list.txt',
                 tag_list_chinese=f'{CONFIG_PATH}/data/ram_tag_list_chinese.txt'):
        r""" The Recognize Anything Model (RAM) inference module.
        RAM is a strong image tagging model, which can recognize any common category with high accuracy.
        Described in the paper "Recognize Anything: A Strong Image Tagging Model" https://recognize-anything.github.io/

        Args:
            med_config (str): path for the mixture of encoder-decoder model's configuration file
            image_size (int): input image size
            vit (str): model size of vision transformer
            threshold (float): tagging threshold
            delete_tag_index (list): delete some tags that may disturb captioning
        """
        super().__init__()

        # create image encoder
        if vit == 'swin_b':
            if image_size == 224:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_224.json'
            elif image_size == 384:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_384.json'
            vision_config = read_json(vision_config_path)
            assert image_size == vision_config['image_res']
            # assert config['patch_size'] == 32
            vision_width = vision_config['vision_width']

            self.visual_encoder = SwinTransformer(
                img_size=vision_config['image_res'],
                patch_size=4,
                in_chans=3,
                embed_dim=vision_config['embed_dim'],
                depths=vision_config['depths'],
                num_heads=vision_config['num_heads'],
                window_size=vision_config['window_size'],
                mlp_ratio=4.,
                qkv_bias=True,
                drop_rate=0.0,
                drop_path_rate=0.1,
                ape=False,
                patch_norm=True,
                use_checkpoint=False)

        elif vit == 'swin_l':
            if image_size == 224:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_224.json'
            elif image_size == 384:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_384.json'
            elif image_size == 444:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_444.json'
            vision_config = read_json(vision_config_path)
            assert image_size == vision_config['image_res']
            # assert config['patch_size'] == 32
            vision_width = vision_config['vision_width']

            self.visual_encoder = SwinTransformer(
                img_size=vision_config['image_res'],
                patch_size=4,
                in_chans=3,
                embed_dim=vision_config['embed_dim'],
                depths=vision_config['depths'],
                num_heads=vision_config['num_heads'],
                window_size=vision_config['window_size'],
                mlp_ratio=4.,
                qkv_bias=True,
                drop_rate=0.0,
                drop_path_rate=0.1,
                ape=False,
                patch_norm=True,
                use_checkpoint=False)

        else:
            self.visual_encoder, vision_width = create_vit(
                vit, image_size, vit_grad_ckpt, vit_ckpt_layer)

        # create tokenizer
        self.tokenizer = init_tokenizer()

        # Tag2Text employs an encoder-decoder architecture for image-tag-text generation: an image-tag interaction encoder and an image-tag-text decoder
        # create image-tag interaction encoder
        encoder_config = BertConfig.from_json_file(med_config)
        encoder_config.encoder_width = 512
        self.tag_encoder = BertModel(config=encoder_config,
                                     add_pooling_layer=False)

        # create image-tag-text decoder
        decoder_config = BertConfig.from_json_file(med_config)
        self.text_decoder = BertLMHeadModel(config=decoder_config)

        self.delete_tag_index = delete_tag_index
        self.prompt = prompt
        self.prompt_length = len(self.tokenizer(self.prompt).input_ids) - 1

        # load tag list
        self.tag_list = self.load_tag_list(tag_list)
        self.tag_list_chinese = self.load_tag_list(tag_list_chinese)

        # create image-tag recognition decoder
        self.threshold = threshold
        self.num_class = len(self.tag_list)
        q2l_config = BertConfig.from_json_file(f'{CONFIG_PATH}/configs/q2l_config.json')
        q2l_config.encoder_width = 512
        self.tagging_head = BertModel(config=q2l_config,
                                      add_pooling_layer=False)
        self.tagging_head.resize_token_embeddings(len(self.tokenizer))
        # self.label_embed = nn.Embedding(self.num_class, q2l_config.hidden_size)
        self.label_embed = nn.Parameter(torch.zeros(self.num_class, q2l_config.encoder_width))

        if q2l_config.hidden_size != 512:
            self.wordvec_proj = nn.Linear(512, q2l_config.hidden_size)
        else:
            self.wordvec_proj = nn.Identity()

        self.fc = nn.Linear(q2l_config.hidden_size, 1)

        self.del_selfattention()

        # share the weights of the lowest 2 layers of the "image-tag interaction encoder" with the "image-tag recognition decoder"
        tie_encoder_decoder_weights(self.tag_encoder, self.tagging_head, '',
                                    ' ')
        self.image_proj = nn.Linear(vision_width, 512)
        # self.label_embed = nn.Parameter(torch.load(f'{CONFIG_PATH}/data/textual_label_embedding.pth',map_location='cpu').float())

        # adjust thresholds for some tags
        self.class_threshold = torch.ones(self.num_class) * self.threshold
        ram_class_threshold_path = f'{CONFIG_PATH}/data/ram_tag_list_threshold.txt'
        with open(ram_class_threshold_path, 'r', encoding='utf-8') as f:
            ram_class_threshold = [float(s.strip()) for s in f]
        for key, value in enumerate(ram_class_threshold):
            self.class_threshold[key] = value

    def load_tag_list(self, tag_list_file):
        with open(tag_list_file, 'r', encoding="utf-8") as f:
            tag_list = f.read().splitlines()
        tag_list = np.array(tag_list)
        return tag_list

    # delete the self-attention layer of the image-tag recognition decoder to reduce computation, following Query2Label
    def del_selfattention(self):
        del self.tagging_head.embeddings
        for layer in self.tagging_head.encoder.layer:
            del layer.attention

    def condition_forward(self,
                          image,
                          threshold=0.68,
                          condition_flag=None,
                          tag_input=None,
                          only_feature=True,
                          ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))

        image_embeds = self.image_proj(self.visual_encoder(image))
        if only_feature:
            return image_embeds
        else:
            image_atts = torch.ones(image_embeds.size()[:-1],
                                    dtype=torch.long).to(image.device)

            # recognize image tags using the image-tag recognition decoder
            image_cls_embeds = image_embeds[:, 0, :]
            image_spatial_embeds = image_embeds[:, 1:, :]

            bs = image_spatial_embeds.shape[0]
            label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
            tagging_embed = self.tagging_head(
                encoder_embeds=label_embed,
                encoder_hidden_states=image_embeds,
                encoder_attention_mask=image_atts,
                return_dict=False,
                mode='tagging',
            )

            logits = self.fc(tagging_embed[0]).squeeze(-1)

            targets = torch.where(
                torch.sigmoid(logits) > self.class_threshold.to(image.device),
                torch.tensor(1.0).to(image.device),
                torch.zeros(self.num_class).to(image.device))

            return image_embeds, logits, targets

    def generate_tag(self,
                     image,
                     threshold=0.68,
                     tag_input=None,
                     ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))

        image_embeds = self.image_proj(self.visual_encoder(image))
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # recognize image tags using the image-tag recognition decoder
        image_cls_embeds = image_embeds[:, 0, :]
        image_spatial_embeds = image_embeds[:, 1:, :]

        bs = image_spatial_embeds.shape[0]
        label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0]).squeeze(-1)

        targets = torch.where(
            torch.sigmoid(logits) > self.class_threshold.to(image.device),
            torch.tensor(1.0).to(image.device),
            torch.zeros(self.num_class).to(image.device))

        tag = targets.cpu().numpy()
        tag[:, self.delete_tag_index] = 0
        tag_output = []
        tag_output_chinese = []
        for b in range(bs):
            index = np.argwhere(tag[b] == 1)
            token = self.tag_list[index].squeeze(axis=1)
            # tag_output.append(' | '.join(token))
            tag_output.append(', '.join(token))
            token_chinese = self.tag_list_chinese[index].squeeze(axis=1)
            # tag_output_chinese.append(' | '.join(token_chinese))
            tag_output_chinese.append(', '.join(token_chinese))

        return tag_output, tag_output_chinese

    def generate_tag_openset(self,
                             image,
                             threshold=0.68,
                             tag_input=None,
                             ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))

        image_embeds = self.image_proj(self.visual_encoder(image))
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # recognize image tags using the image-tag recognition decoder
        image_cls_embeds = image_embeds[:, 0, :]
        image_spatial_embeds = image_embeds[:, 1:, :]

        bs = image_spatial_embeds.shape[0]
        label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0]).squeeze(-1)

        targets = torch.where(
            torch.sigmoid(logits) > self.class_threshold.to(image.device),
            torch.tensor(1.0).to(image.device),
            torch.zeros(self.num_class).to(image.device))

        tag = targets.cpu().numpy()
        tag[:, self.delete_tag_index] = 0
        tag_output = []
        for b in range(bs):
            index = np.argwhere(tag[b] == 1)
            token = self.tag_list[index].squeeze(axis=1)
            tag_output.append(' | '.join(token))

        return tag_output


# load RAM pretrained model parameters
def ram(pretrained='', **kwargs):
    model = RAM(**kwargs)
    if pretrained:
        if kwargs['vit'] == 'swin_b':
            model, msg = load_checkpoint_swinbase(model, pretrained, kwargs)
        elif kwargs['vit'] == 'swin_l':
            model, msg = load_checkpoint_swinlarge(model, pretrained, kwargs)
        else:
            model, msg = load_checkpoint(model, pretrained)
        print('vit:', kwargs['vit'])
        # print('msg', msg)
    return model
ram/models/ram_lora.py
ADDED
@@ -0,0 +1,344 @@
'''
 * The Recognize Anything Model (RAM)
 * Written by Xinyu Huang
'''
import json
import warnings

import numpy as np
import torch
from torch import nn


from .bert_lora import BertConfig, BertLMHeadModel, BertModel
from .swin_transformer_lora import SwinTransformer
from .utils import *

warnings.filterwarnings("ignore")


class RAMLora(nn.Module):
    def __init__(self,
                 condition_config=f'{CONFIG_PATH}/configs/condition_config.json',
                 med_config=f'{CONFIG_PATH}/configs/med_config.json',
                 image_size=384,
                 vit='base',
                 vit_grad_ckpt=False,
                 vit_ckpt_layer=0,
                 prompt='a picture of ',
                 threshold=0.68,
                 max_threthold=0.9,
                 add_threthold=0,
                 delete_tag_index=[],
                 tag_list=f'{CONFIG_PATH}/data/ram_tag_list.txt',
                 tag_list_chinese=f'{CONFIG_PATH}/data/ram_tag_list_chinese.txt'):
        r""" The Recognize Anything Model (RAM) inference module.
        RAM is a strong image tagging model, which can recognize any common category with high accuracy.
        Described in the paper "Recognize Anything: A Strong Image Tagging Model" https://recognize-anything.github.io/

        Args:
            med_config (str): path for the mixture of encoder-decoder model's configuration file
            image_size (int): input image size
            vit (str): model size of vision transformer
            threshold (float): tagging threshold
            delete_tag_index (list): delete some tags that may disturb captioning
        """
        super().__init__()

        # create image encoder
        if vit == 'swin_b':
            if image_size == 224:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_224.json'
            elif image_size == 384:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_384.json'
            vision_config = read_json(vision_config_path)
            assert image_size == vision_config['image_res']
            # assert config['patch_size'] == 32
            vision_width = vision_config['vision_width']

            self.visual_encoder = SwinTransformer(
                img_size=vision_config['image_res'],
                patch_size=4,
                in_chans=3,
                embed_dim=vision_config['embed_dim'],
                depths=vision_config['depths'],
                num_heads=vision_config['num_heads'],
                window_size=vision_config['window_size'],
                mlp_ratio=4.,
                qkv_bias=True,
                drop_rate=0.0,
                drop_path_rate=0.1,
                ape=False,
                patch_norm=True,
                use_checkpoint=False)

        elif vit == 'swin_l':
            if image_size == 224:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_224.json'
            elif image_size == 384:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_384.json'
            elif image_size == 444:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_444.json'
            vision_config = read_json(vision_config_path)
            assert image_size == vision_config['image_res']
            # assert config['patch_size'] == 32
            vision_width = vision_config['vision_width']

            self.visual_encoder = SwinTransformer(
                img_size=vision_config['image_res'],
                patch_size=4,
                in_chans=3,
                embed_dim=vision_config['embed_dim'],
                depths=vision_config['depths'],
                num_heads=vision_config['num_heads'],
                window_size=vision_config['window_size'],
                mlp_ratio=4.,
                qkv_bias=True,
                drop_rate=0.0,
                drop_path_rate=0.1,
                ape=False,
                patch_norm=True,
                use_checkpoint=False)

        else:
            self.visual_encoder, vision_width = create_vit(
                vit, image_size, vit_grad_ckpt, vit_ckpt_layer)

        # create tokenizer
        self.tokenizer = init_tokenizer()

        # Tag2Text employs an encoder-decoder architecture for image-tag-text generation: an image-tag interaction encoder and an image-tag-text decoder
        # create image-tag interaction encoder
        encoder_config = BertConfig.from_json_file(med_config)
        encoder_config.encoder_width = 512
        self.tag_encoder = BertModel(config=encoder_config,
                                     add_pooling_layer=False)

        # create image-tag-text decoder
        decoder_config = BertConfig.from_json_file(med_config)
        self.text_decoder = BertLMHeadModel(config=decoder_config)

        self.delete_tag_index = delete_tag_index
        self.prompt = prompt
        self.prompt_length = len(self.tokenizer(self.prompt).input_ids) - 1

        # load tag list
        self.tag_list = self.load_tag_list(tag_list)
        self.tag_list_chinese = self.load_tag_list(tag_list_chinese)

        # create image-tag recognition decoder
        self.threshold = threshold
        self.num_class = len(self.tag_list)
        q2l_config = BertConfig.from_json_file(f'{CONFIG_PATH}/configs/q2l_config.json')
        q2l_config.encoder_width = 512
        self.tagging_head = BertModel(config=q2l_config,
                                      add_pooling_layer=False)
        self.tagging_head.resize_token_embeddings(len(self.tokenizer))
        # self.label_embed = nn.Embedding(self.num_class, q2l_config.hidden_size)
        self.label_embed = nn.Parameter(torch.zeros(self.num_class, q2l_config.encoder_width))

        if q2l_config.hidden_size != 512:
            self.wordvec_proj = nn.Linear(512, q2l_config.hidden_size)
        else:
            self.wordvec_proj = nn.Identity()

        self.fc = nn.Linear(q2l_config.hidden_size, 1)

        self.del_selfattention()

        # share the weights of the lowest 2 layers of the "image-tag interaction encoder" with the "image-tag recognition decoder"
        tie_encoder_decoder_weights(self.tag_encoder, self.tagging_head, '',
                                    ' ')
        self.image_proj = nn.Linear(vision_width, 512)
        # self.label_embed = nn.Parameter(torch.load(f'{CONFIG_PATH}/data/textual_label_embedding.pth',map_location='cpu').float())

        # adjust thresholds for some tags
        self.class_threshold = torch.ones(self.num_class) * self.threshold

        print('Loading default thresholds from .txt....')
        ram_class_threshold_path = f'{CONFIG_PATH}/data/ram_tag_list_threshold.txt'
        with open(ram_class_threshold_path, 'r', encoding='utf-8') as f:
            ram_class_threshold = [float(s.strip()) for s in f]
        for key, value in enumerate(ram_class_threshold):
            if value > max_threthold:
                self.class_threshold[key] = value
            else:
                self.class_threshold[key] = min(value + add_threthold, max_threthold)

    def load_tag_list(self, tag_list_file):
        with open(tag_list_file, 'r', encoding="utf-8") as f:
            tag_list = f.read().splitlines()
        tag_list = np.array(tag_list)
        return tag_list

    # delete the self-attention layer of the image-tag recognition decoder to reduce computation, following Query2Label
    def del_selfattention(self):
        del self.tagging_head.embeddings
        for layer in self.tagging_head.encoder.layer:
            del layer.attention

    def generate_image_embeds(self,
                              image,
                              condition=False
                              ):

        image_embeds = self.image_proj(self.visual_encoder(image))

        return image_embeds

    def generate_tag(self,
                     image,
                     threshold=0.68,
                     tag_input=None,
                     ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))

        image_embeds = self.image_proj(self.visual_encoder(image))

        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # recognize image tags using the image-tag recognition decoder
        image_cls_embeds = image_embeds[:, 0, :]
        image_spatial_embeds = image_embeds[:, 1:, :]

        bs = image_spatial_embeds.shape[0]
        label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0]).squeeze(-1)
        targets = torch.where(
            torch.sigmoid(logits) > self.class_threshold.to(image.device),
            torch.tensor(1.0).to(image.device),
            torch.zeros(self.num_class).to(image.device))

        tag = targets.cpu().numpy()
        tag[:, self.delete_tag_index] = 0
        tag_output = []
        tag_output_chinese = []
        for b in range(bs):
            index = np.argwhere(tag[b] == 1)
            token = self.tag_list[index].squeeze(axis=1)
            # tag_output.append(' | '.join(token))
            tag_output.append(', '.join(token))
            token_chinese = self.tag_list_chinese[index].squeeze(axis=1)
            # tag_output_chinese.append(' | '.join(token_chinese))
            tag_output_chinese.append(', '.join(token_chinese))

        return tag_output, tag_output_chinese

    def condition_forward(self,
                          image,
                          threshold=0.68,
                          condition_flag=None,
                          tag_input=None,
                          only_feature=True
                          ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))
        image_embeds = self.image_proj(self.visual_encoder(image))

        if only_feature:
            return image_embeds
        else:
            image_atts = torch.ones(image_embeds.size()[:-1],
                                    dtype=torch.long).to(image.device)

            # recognize image tags using the image-tag recognition decoder
            image_cls_embeds = image_embeds[:, 0, :]
            image_spatial_embeds = image_embeds[:, 1:, :]

            bs = image_spatial_embeds.shape[0]
            label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
            tagging_embed = self.tagging_head(
                encoder_embeds=label_embed,
                encoder_hidden_states=image_embeds,
                encoder_attention_mask=image_atts,
                return_dict=False,
                mode='tagging',
            )

            logits = self.fc(tagging_embed[0]).squeeze(-1)

            targets = torch.where(
                torch.sigmoid(logits) > self.class_threshold.to(image.device),
                torch.tensor(1.0).to(image.device),
                torch.zeros(self.num_class).to(image.device))

            return image_embeds, logits, targets

    def generate_tag_openset(self,
                             image,
                             threshold=0.68,
                             tag_input=None,
                             ):

        label_embed = torch.nn.functional.relu(self.wordvec_proj(self.label_embed))

        image_embeds = self.image_proj(self.visual_encoder(image))
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # recognize image tags using the image-tag recognition decoder
        image_cls_embeds = image_embeds[:, 0, :]
        image_spatial_embeds = image_embeds[:, 1:, :]

        bs = image_spatial_embeds.shape[0]
        label_embed = label_embed.unsqueeze(0).repeat(bs, 1, 1)
        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0]).squeeze(-1)

        targets = torch.where(
            torch.sigmoid(logits) > self.class_threshold.to(image.device),
            torch.tensor(1.0).to(image.device),
            torch.zeros(self.num_class).to(image.device))

        tag = targets.cpu().numpy()
        tag[:, self.delete_tag_index] = 0
        tag_output = []
        for b in range(bs):
            index = np.argwhere(tag[b] == 1)
            token = self.tag_list[index].squeeze(axis=1)
            tag_output.append(' | '.join(token))

        return tag_output


# load RAM pretrained model parameters
def ram(pretrained='', pretrained_condition='', **kwargs):
    model = RAMLora(**kwargs)

    if pretrained:
        if kwargs['vit'] == 'swin_b':
            model, msg = load_checkpoint_swinbase(model, pretrained, kwargs)
        elif kwargs['vit'] == 'swin_l':
            model, msg = load_checkpoint_swinlarge(model, pretrained, kwargs)
        else:
            model, msg = load_checkpoint(model, pretrained)
        print('vit:', kwargs['vit'])

    if pretrained_condition:
        model.load_state_dict(torch.load(pretrained_condition), strict=False)
        print(f'load lora from {pretrained_condition}')

    return model
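# The per-class threshold rule above, restated standalone with toy numbers: defaults already
# above max_threthold are kept as-is; everything else is raised by add_threthold and clipped
# at max_threthold. The values below are hypothetical, not the shipped ones.
max_threthold, add_threthold = 0.9, 0.05
defaults = [0.68, 0.85, 0.95]   # stand-ins for entries of ram_tag_list_threshold.txt

adjusted = [v if v > max_threthold else min(v + add_threthold, max_threthold)
            for v in defaults]
print(adjusted)                 # approximately [0.73, 0.9, 0.95]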
ram/models/swin_transformer.py
ADDED
@@ -0,0 +1,696 @@
# --------------------------------------------------------
# Swin Transformer
# Copyright (c) 2021 Microsoft
# Licensed under The MIT License [see LICENSE for details]
# Written by Ze Liu
# --------------------------------------------------------

import numpy as np
from scipy import interpolate

import torch
import torch.nn as nn
import torch.utils.checkpoint as checkpoint
from timm.models.layers import DropPath, to_2tuple, trunc_normal_


class Mlp(nn.Module):
    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):
        super().__init__()
        out_features = out_features or in_features
        hidden_features = hidden_features or in_features
        self.fc1 = nn.Linear(in_features, hidden_features)
        self.act = act_layer()
        self.fc2 = nn.Linear(hidden_features, out_features)
        self.drop = nn.Dropout(drop)

    def forward(self, x):
        x = self.fc1(x)
        x = self.act(x)
        x = self.drop(x)
        x = self.fc2(x)
        x = self.drop(x)
        return x


def window_partition(x, window_size):
    """
    Args:
        x: (B, H, W, C)
        window_size (int): window size

    Returns:
        windows: (num_windows*B, window_size, window_size, C)
    """
    B, H, W, C = x.shape
    x = x.view(B, H // window_size, window_size, W // window_size, window_size, C)
    windows = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, window_size, window_size, C)
    return windows


def window_reverse(windows, window_size, H, W):
    """
    Args:
        windows: (num_windows*B, window_size, window_size, C)
        window_size (int): Window size
        H (int): Height of image
        W (int): Width of image

    Returns:
        x: (B, H, W, C)
    """
    B = int(windows.shape[0] / (H * W / window_size / window_size))
    x = windows.view(B, H // window_size, W // window_size, window_size, window_size, -1)
    x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, H, W, -1)
    return x
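# Quick sanity check (toy shapes, a sketch rather than repo code): window_partition and
# window_reverse are exact inverses whenever H and W are divisible by window_size.
x_demo = torch.randn(2, 8, 8, 3)                               # (B, H, W, C)
w_demo = window_partition(x_demo, window_size=4)               # 2 images * 4 windows each
assert w_demo.shape == (8, 4, 4, 3)
assert torch.equal(window_reverse(w_demo, 4, 8, 8), x_demo)    # lossless round trip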
class WindowAttention(nn.Module):
    r""" Window based multi-head self attention (W-MSA) module with relative position bias.
    It supports both shifted and non-shifted windows.

    Args:
        dim (int): Number of input channels.
        window_size (tuple[int]): The height and width of the window.
        num_heads (int): Number of attention heads.
        qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
        qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set
        attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0
        proj_drop (float, optional): Dropout ratio of output. Default: 0.0
    """

    def __init__(self, dim, window_size, num_heads, qkv_bias=True, qk_scale=None, attn_drop=0., proj_drop=0.):

        super().__init__()
        self.dim = dim
        self.window_size = window_size  # Wh, Ww
        self.num_heads = num_heads
        head_dim = dim // num_heads
        self.scale = qk_scale or head_dim ** -0.5

        # define a parameter table of relative position bias
        self.relative_position_bias_table = nn.Parameter(
            torch.zeros((2 * window_size[0] - 1) * (2 * window_size[1] - 1), num_heads))  # 2*Wh-1 * 2*Ww-1, nH

        # get pair-wise relative position index for each token inside the window
        coords_h = torch.arange(self.window_size[0])
        coords_w = torch.arange(self.window_size[1])
        coords = torch.stack(torch.meshgrid([coords_h, coords_w]))  # 2, Wh, Ww
        coords_flatten = torch.flatten(coords, 1)  # 2, Wh*Ww
        relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :]  # 2, Wh*Ww, Wh*Ww
        relative_coords = relative_coords.permute(1, 2, 0).contiguous()  # Wh*Ww, Wh*Ww, 2
        relative_coords[:, :, 0] += self.window_size[0] - 1  # shift to start from 0
        relative_coords[:, :, 1] += self.window_size[1] - 1
        relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1
        relative_position_index = relative_coords.sum(-1)  # Wh*Ww, Wh*Ww
        self.register_buffer("relative_position_index", relative_position_index)

        self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
        self.attn_drop = nn.Dropout(attn_drop)
        self.proj = nn.Linear(dim, dim)
        self.proj_drop = nn.Dropout(proj_drop)

        trunc_normal_(self.relative_position_bias_table, std=.02)
        self.softmax = nn.Softmax(dim=-1)

    def forward(self, x, mask=None):
        """
        Args:
            x: input features with shape of (num_windows*B, N, C)
            mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None
        """
        B_, N, C = x.shape
        qkv = self.qkv(x).reshape(B_, N, 3, self.num_heads, C // self.num_heads).permute(2, 0, 3, 1, 4)
        q, k, v = qkv[0], qkv[1], qkv[2]  # make torchscript happy (cannot use tensor as tuple)

        q = q * self.scale
        attn = (q @ k.transpose(-2, -1))

        relative_position_bias = self.relative_position_bias_table[self.relative_position_index.view(-1)].view(
            self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1], -1)  # Wh*Ww,Wh*Ww,nH
        relative_position_bias = relative_position_bias.permute(2, 0, 1).contiguous()  # nH, Wh*Ww, Wh*Ww
        attn = attn + relative_position_bias.unsqueeze(0)

        if mask is not None:
            nW = mask.shape[0]
            attn = attn.view(B_ // nW, nW, self.num_heads, N, N) + mask.unsqueeze(1).unsqueeze(0)
            attn = attn.view(-1, self.num_heads, N, N)
            attn = self.softmax(attn)
        else:
            attn = self.softmax(attn)

        attn = self.attn_drop(attn)

        x = (attn @ v).transpose(1, 2).reshape(B_, N, C)
        x = self.proj(x)
        x = self.proj_drop(x)
        return x

    def extra_repr(self) -> str:
        return f'dim={self.dim}, window_size={self.window_size}, num_heads={self.num_heads}'

    def flops(self, N):
        # calculate flops for 1 window with token length of N
        flops = 0
        # qkv = self.qkv(x)
        flops += N * self.dim * 3 * self.dim
        # attn = (q @ k.transpose(-2, -1))
        flops += self.num_heads * N * (self.dim // self.num_heads) * N
        # x = (attn @ v)
        flops += self.num_heads * N * N * (self.dim // self.num_heads)
        # x = self.proj(x)
        flops += N * self.dim * self.dim
        return flops
|
| 167 |
+
r""" Swin Transformer Block.
|
| 168 |
+
|
| 169 |
+
Args:
|
| 170 |
+
dim (int): Number of input channels.
|
| 171 |
+
input_resolution (tuple[int]): Input resulotion.
|
| 172 |
+
num_heads (int): Number of attention heads.
|
| 173 |
+
window_size (int): Window size.
|
| 174 |
+
shift_size (int): Shift size for SW-MSA.
|
| 175 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.
|
| 176 |
+
qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
|
| 177 |
+
qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.
|
| 178 |
+
drop (float, optional): Dropout rate. Default: 0.0
|
| 179 |
+
attn_drop (float, optional): Attention dropout rate. Default: 0.0
|
| 180 |
+
drop_path (float, optional): Stochastic depth rate. Default: 0.0
|
| 181 |
+
act_layer (nn.Module, optional): Activation layer. Default: nn.GELU
|
| 182 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 183 |
+
"""
|
| 184 |
+
|
| 185 |
+
def __init__(self, dim, input_resolution, num_heads, window_size=7, shift_size=0,
|
| 186 |
+
mlp_ratio=4., qkv_bias=True, qk_scale=None, drop=0., attn_drop=0., drop_path=0.,
|
| 187 |
+
act_layer=nn.GELU, norm_layer=nn.LayerNorm):
|
| 188 |
+
super().__init__()
|
| 189 |
+
self.dim = dim
|
| 190 |
+
self.input_resolution = input_resolution
|
| 191 |
+
self.num_heads = num_heads
|
| 192 |
+
self.window_size = window_size
|
| 193 |
+
self.shift_size = shift_size
|
| 194 |
+
self.mlp_ratio = mlp_ratio
|
| 195 |
+
if min(self.input_resolution) <= self.window_size:
|
| 196 |
+
# if window size is larger than input resolution, we don't partition windows
|
| 197 |
+
self.shift_size = 0
|
| 198 |
+
self.window_size = min(self.input_resolution)
|
| 199 |
+
assert 0 <= self.shift_size < self.window_size, "shift_size must in 0-window_size"
|
| 200 |
+
|
| 201 |
+
self.norm1 = norm_layer(dim)
|
| 202 |
+
self.attn = WindowAttention(
|
| 203 |
+
dim, window_size=to_2tuple(self.window_size), num_heads=num_heads,
|
| 204 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale, attn_drop=attn_drop, proj_drop=drop)
|
| 205 |
+
|
| 206 |
+
self.drop_path = DropPath(drop_path) if drop_path > 0. else nn.Identity()
|
| 207 |
+
self.norm2 = norm_layer(dim)
|
| 208 |
+
mlp_hidden_dim = int(dim * mlp_ratio)
|
| 209 |
+
self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)
|
| 210 |
+
|
| 211 |
+
if self.shift_size > 0:
|
| 212 |
+
# calculate attention mask for SW-MSA
|
| 213 |
+
H, W = self.input_resolution
|
| 214 |
+
img_mask = torch.zeros((1, H, W, 1)) # 1 H W 1
|
| 215 |
+
h_slices = (slice(0, -self.window_size),
|
| 216 |
+
slice(-self.window_size, -self.shift_size),
|
| 217 |
+
slice(-self.shift_size, None))
|
| 218 |
+
w_slices = (slice(0, -self.window_size),
|
| 219 |
+
slice(-self.window_size, -self.shift_size),
|
| 220 |
+
slice(-self.shift_size, None))
|
| 221 |
+
cnt = 0
|
| 222 |
+
for h in h_slices:
|
| 223 |
+
for w in w_slices:
|
| 224 |
+
img_mask[:, h, w, :] = cnt
|
| 225 |
+
cnt += 1
|
| 226 |
+
|
| 227 |
+
mask_windows = window_partition(img_mask, self.window_size) # nW, window_size, window_size, 1
|
| 228 |
+
mask_windows = mask_windows.view(-1, self.window_size * self.window_size)
|
| 229 |
+
attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)
|
| 230 |
+
attn_mask = attn_mask.masked_fill(attn_mask != 0, float(-100.0)).masked_fill(attn_mask == 0, float(0.0))
|
| 231 |
+
else:
|
| 232 |
+
attn_mask = None
|
| 233 |
+
|
| 234 |
+
self.register_buffer("attn_mask", attn_mask)
|
| 235 |
+
|
| 236 |
+
## condition from LR
|
| 237 |
+
self.condition_attention = nn.Sequential(
|
| 238 |
+
nn.Linear(256, dim*2, bias=False),
|
| 239 |
+
)
|
| 240 |
+
self.condition_ffn = nn.Sequential(
|
| 241 |
+
nn.Linear(256, dim*2, bias=False),
|
| 242 |
+
)
|
| 243 |
+
|
| 244 |
+
zero_module(self.condition_attention)
|
| 245 |
+
zero_module(self.condition_ffn)
|
| 246 |
+
|
| 247 |
+
def forward(self, x, condition=None):
|
| 248 |
+
H, W = self.input_resolution
|
| 249 |
+
B, L, C = x.shape
|
| 250 |
+
assert L == H * W, "input feature has wrong size"
|
| 251 |
+
|
| 252 |
+
shortcut = x
|
| 253 |
+
x = self.norm1(x)
|
| 254 |
+
x = x.view(B, H, W, C)
|
| 255 |
+
|
| 256 |
+
# add condition before attention
|
| 257 |
+
# input B,H,W,C
|
| 258 |
+
if condition is not None:
|
| 259 |
+
x = x.permute(0, 3, 1, 2) # BCHW
|
| 260 |
+
condition_attention = self.condition_attention(condition).view(-1, 2*C, 1, 1)
|
| 261 |
+
condition_attn_multiplication, condition_attn_addition = condition_attention.chunk(2, dim=1)
|
| 262 |
+
x = x*condition_attn_multiplication + condition_attn_multiplication
|
| 263 |
+
x = x.permute(0, 2, 3, 1)
|
| 264 |
+
|
| 265 |
+
|
| 266 |
+
|
| 267 |
+
# cyclic shift
|
| 268 |
+
if self.shift_size > 0:
|
| 269 |
+
shifted_x = torch.roll(x, shifts=(-self.shift_size, -self.shift_size), dims=(1, 2))
|
| 270 |
+
else:
|
| 271 |
+
shifted_x = x
|
| 272 |
+
|
| 273 |
+
# partition windows
|
| 274 |
+
x_windows = window_partition(shifted_x, self.window_size) # nW*B, window_size, window_size, C
|
| 275 |
+
x_windows = x_windows.view(-1, self.window_size * self.window_size, C) # nW*B, window_size*window_size, C
|
| 276 |
+
|
| 277 |
+
# W-MSA/SW-MSA
|
| 278 |
+
attn_windows = self.attn(x_windows, mask=self.attn_mask) # nW*B, window_size*window_size, C
|
| 279 |
+
|
| 280 |
+
# merge windows
|
| 281 |
+
attn_windows = attn_windows.view(-1, self.window_size, self.window_size, C)
|
| 282 |
+
shifted_x = window_reverse(attn_windows, self.window_size, H, W) # B H' W' C
|
| 283 |
+
|
| 284 |
+
# reverse cyclic shift
|
| 285 |
+
if self.shift_size > 0:
|
| 286 |
+
x = torch.roll(shifted_x, shifts=(self.shift_size, self.shift_size), dims=(1, 2))
|
| 287 |
+
else:
|
| 288 |
+
x = shifted_x
|
| 289 |
+
x = x.view(B, H * W, C)
|
| 290 |
+
|
| 291 |
+
# FFN
|
| 292 |
+
x = shortcut + self.drop_path(x)
|
| 293 |
+
# x = x + self.drop_path(self.mlp(self.norm2(x)))
|
| 294 |
+
|
| 295 |
+
# add condition before ffn
|
| 296 |
+
# input B,H*W,C
|
| 297 |
+
if condition is not None:
|
| 298 |
+
res = x
|
| 299 |
+
x = self.norm2(x)
|
| 300 |
+
x = x.view(B, H, W, C)
|
| 301 |
+
x = x.permute(0, 3, 1, 2) # BCHW
|
| 302 |
+
condition_ffn = self.condition_ffn(condition).view(-1, 2*C, 1, 1)
|
| 303 |
+
condition_ffn_multiplication, condition_ffn_addition = condition_ffn.chunk(2, dim=1)
|
| 304 |
+
x = x*condition_ffn_multiplication + condition_ffn_addition
|
| 305 |
+
x = x.permute(0, 2, 3, 1)
|
| 306 |
+
x = x.view(B, H*W, C)
|
| 307 |
+
x = res + self.drop_path(self.mlp(x))
|
| 308 |
+
else:
|
| 309 |
+
x = x + self.drop_path(self.mlp(self.norm2(x)))
|
| 310 |
+
return x
|
| 311 |
+
|
| 312 |
+
def extra_repr(self) -> str:
|
| 313 |
+
return f"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, " \
|
| 314 |
+
f"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}"
|
| 315 |
+
|
| 316 |
+
def flops(self):
|
| 317 |
+
flops = 0
|
| 318 |
+
H, W = self.input_resolution
|
| 319 |
+
# norm1
|
| 320 |
+
flops += self.dim * H * W
|
| 321 |
+
# W-MSA/SW-MSA
|
| 322 |
+
nW = H * W / self.window_size / self.window_size
|
| 323 |
+
flops += nW * self.attn.flops(self.window_size * self.window_size)
|
| 324 |
+
# mlp
|
| 325 |
+
flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio
|
| 326 |
+
# norm2
|
| 327 |
+
flops += self.dim * H * W
|
| 328 |
+
return flops
|
| 329 |
+
|
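The block above depends on window_partition and window_reverse being exact inverses of each other around the cyclic shift. A minimal round-trip check, assuming only torch and the two helper functions defined earlier in this file:

import torch

# shift -> partition -> (attention would run here) -> reverse -> unshift
B, H, W, C, window_size, shift = 2, 8, 8, 4, 4, 2
x = torch.randn(B, H, W, C)

shifted = torch.roll(x, shifts=(-shift, -shift), dims=(1, 2))        # cyclic shift
windows = window_partition(shifted, window_size)                     # (nW*B, ws, ws, C)
restored = window_reverse(windows, window_size, H, W)                # (B, H, W, C)
restored = torch.roll(restored, shifts=(shift, shift), dims=(1, 2))  # undo the shift

assert torch.equal(restored, x)  # the round trip is lossless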
| 330 |
+
|
| 331 |
+
class PatchMerging(nn.Module):
|
| 332 |
+
r""" Patch Merging Layer.
|
| 333 |
+
|
| 334 |
+
Args:
|
| 335 |
+
input_resolution (tuple[int]): Resolution of input feature.
|
| 336 |
+
dim (int): Number of input channels.
|
| 337 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 338 |
+
"""
|
| 339 |
+
|
| 340 |
+
def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):
|
| 341 |
+
super().__init__()
|
| 342 |
+
self.input_resolution = input_resolution
|
| 343 |
+
self.dim = dim
|
| 344 |
+
self.reduction = nn.Linear(4 * dim, 2 * dim, bias=False)
|
| 345 |
+
self.norm = norm_layer(4 * dim)
|
| 346 |
+
|
| 347 |
+
def forward(self, x):
|
| 348 |
+
"""
|
| 349 |
+
x: B, H*W, C
|
| 350 |
+
"""
|
| 351 |
+
H, W = self.input_resolution
|
| 352 |
+
B, L, C = x.shape
|
| 353 |
+
assert L == H * W, "input feature has wrong size"
|
| 354 |
+
assert H % 2 == 0 and W % 2 == 0, f"x size ({H}*{W}) is not even."
|
| 355 |
+
|
| 356 |
+
x = x.view(B, H, W, C)
|
| 357 |
+
|
| 358 |
+
x0 = x[:, 0::2, 0::2, :] # B H/2 W/2 C
|
| 359 |
+
x1 = x[:, 1::2, 0::2, :] # B H/2 W/2 C
|
| 360 |
+
x2 = x[:, 0::2, 1::2, :] # B H/2 W/2 C
|
| 361 |
+
x3 = x[:, 1::2, 1::2, :] # B H/2 W/2 C
|
| 362 |
+
x = torch.cat([x0, x1, x2, x3], -1) # B H/2 W/2 4*C
|
| 363 |
+
x = x.view(B, -1, 4 * C) # B H/2*W/2 4*C
|
| 364 |
+
|
| 365 |
+
x = self.norm(x)
|
| 366 |
+
x = self.reduction(x)
|
| 367 |
+
|
| 368 |
+
return x
|
| 369 |
+
|
| 370 |
+
def extra_repr(self) -> str:
|
| 371 |
+
return f"input_resolution={self.input_resolution}, dim={self.dim}"
|
| 372 |
+
|
| 373 |
+
def flops(self):
|
| 374 |
+
H, W = self.input_resolution
|
| 375 |
+
flops = H * W * self.dim
|
| 376 |
+
flops += (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim
|
| 377 |
+
return flops
|
| 378 |
+
|
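PatchMerging halves the spatial resolution and doubles the channel width: (B, H*W, C) goes in and (B, (H/2)*(W/2), 2*C) comes out. A quick shape check, assuming the class defined above:

import torch

merge = PatchMerging(input_resolution=(8, 8), dim=96)
tokens = torch.randn(2, 8 * 8, 96)  # (B, H*W, C)
out = merge(tokens)
print(out.shape)  # torch.Size([2, 16, 192]) -> (B, (H/2)*(W/2), 2*C)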
| 379 |
+
|
| 380 |
+
class BasicLayer(nn.Module):
|
| 381 |
+
""" A basic Swin Transformer layer for one stage.
|
| 382 |
+
|
| 383 |
+
Args:
|
| 384 |
+
dim (int): Number of input channels.
|
| 385 |
+
input_resolution (tuple[int]): Input resolution.
|
| 386 |
+
depth (int): Number of blocks.
|
| 387 |
+
num_heads (int): Number of attention heads.
|
| 388 |
+
window_size (int): Local window size.
|
| 389 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.
|
| 390 |
+
qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
|
| 391 |
+
qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.
|
| 392 |
+
drop (float, optional): Dropout rate. Default: 0.0
|
| 393 |
+
attn_drop (float, optional): Attention dropout rate. Default: 0.0
|
| 394 |
+
drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0
|
| 395 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 396 |
+
downsample (nn.Module | None, optional): Downsample layer at the end of the layer. Default: None
|
| 397 |
+
use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False.
|
| 398 |
+
"""
|
| 399 |
+
|
| 400 |
+
def __init__(self, dim, input_resolution, depth, num_heads, window_size,
|
| 401 |
+
mlp_ratio=4., qkv_bias=True, qk_scale=None, drop=0., attn_drop=0.,
|
| 402 |
+
drop_path=0., norm_layer=nn.LayerNorm, downsample=None, use_checkpoint=False):
|
| 403 |
+
|
| 404 |
+
super().__init__()
|
| 405 |
+
self.dim = dim
|
| 406 |
+
self.input_resolution = input_resolution
|
| 407 |
+
self.depth = depth
|
| 408 |
+
self.use_checkpoint = use_checkpoint
|
| 409 |
+
|
| 410 |
+
# build blocks
|
| 411 |
+
self.blocks = nn.ModuleList([
|
| 412 |
+
SwinTransformerBlock(dim=dim, input_resolution=input_resolution,
|
| 413 |
+
num_heads=num_heads, window_size=window_size,
|
| 414 |
+
shift_size=0 if (i % 2 == 0) else window_size // 2,
|
| 415 |
+
mlp_ratio=mlp_ratio,
|
| 416 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale,
|
| 417 |
+
drop=drop, attn_drop=attn_drop,
|
| 418 |
+
drop_path=drop_path[i] if isinstance(drop_path, list) else drop_path,
|
| 419 |
+
norm_layer=norm_layer)
|
| 420 |
+
for i in range(depth)])
|
| 421 |
+
|
| 422 |
+
# patch merging layer
|
| 423 |
+
if downsample is not None:
|
| 424 |
+
self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)
|
| 425 |
+
else:
|
| 426 |
+
self.downsample = None
|
| 427 |
+
|
| 428 |
+
def forward(self, x, condition=None):
|
| 429 |
+
for blk in self.blocks:
|
| 430 |
+
if self.use_checkpoint:
|
| 431 |
+
x = checkpoint.checkpoint(blk, x, condition)
|
| 432 |
+
else:
|
| 433 |
+
x = blk(x, condition=condition)
|
| 434 |
+
if self.downsample is not None:
|
| 435 |
+
x = self.downsample(x)
|
| 436 |
+
return x
|
| 437 |
+
|
| 438 |
+
def extra_repr(self) -> str:
|
| 439 |
+
return f"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}"
|
| 440 |
+
|
| 441 |
+
def flops(self):
|
| 442 |
+
flops = 0
|
| 443 |
+
for blk in self.blocks:
|
| 444 |
+
flops += blk.flops()
|
| 445 |
+
if self.downsample is not None:
|
| 446 |
+
flops += self.downsample.flops()
|
| 447 |
+
return flops
|
| 448 |
+
|
| 449 |
+
|
| 450 |
+
class PatchEmbed(nn.Module):
|
| 451 |
+
r""" Image to Patch Embedding
|
| 452 |
+
|
| 453 |
+
Args:
|
| 454 |
+
img_size (int): Image size. Default: 224.
|
| 455 |
+
patch_size (int): Patch token size. Default: 4.
|
| 456 |
+
in_chans (int): Number of input image channels. Default: 3.
|
| 457 |
+
embed_dim (int): Number of linear projection output channels. Default: 96.
|
| 458 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: None
|
| 459 |
+
"""
|
| 460 |
+
|
| 461 |
+
def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):
|
| 462 |
+
super().__init__()
|
| 463 |
+
img_size = to_2tuple(img_size)
|
| 464 |
+
patch_size = to_2tuple(patch_size)
|
| 465 |
+
patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]
|
| 466 |
+
self.img_size = img_size
|
| 467 |
+
self.patch_size = patch_size
|
| 468 |
+
self.patches_resolution = patches_resolution
|
| 469 |
+
self.num_patches = patches_resolution[0] * patches_resolution[1]
|
| 470 |
+
|
| 471 |
+
self.in_chans = in_chans
|
| 472 |
+
self.embed_dim = embed_dim
|
| 473 |
+
|
| 474 |
+
self.proj = nn.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size)
|
| 475 |
+
if norm_layer is not None:
|
| 476 |
+
self.norm = norm_layer(embed_dim)
|
| 477 |
+
else:
|
| 478 |
+
self.norm = None
|
| 479 |
+
|
| 480 |
+
def forward(self, x):
|
| 481 |
+
B, C, H, W = x.shape
|
| 482 |
+
# FIXME look at relaxing size constraints
|
| 483 |
+
assert H == self.img_size[0] and W == self.img_size[1], \
|
| 484 |
+
f"Input image size ({H}*{W}) doesn't match model ({self.img_size[0]}*{self.img_size[1]})."
|
| 485 |
+
x = self.proj(x).flatten(2).transpose(1, 2) # B Ph*Pw C
|
| 486 |
+
if self.norm is not None:
|
| 487 |
+
x = self.norm(x)
|
| 488 |
+
return x
|
| 489 |
+
|
| 490 |
+
def flops(self):
|
| 491 |
+
Ho, Wo = self.patches_resolution
|
| 492 |
+
flops = Ho * Wo * self.embed_dim * self.in_chans * (self.patch_size[0] * self.patch_size[1])
|
| 493 |
+
if self.norm is not None:
|
| 494 |
+
flops += Ho * Wo * self.embed_dim
|
| 495 |
+
return flops
|
| 496 |
+
|
| 497 |
+
|
| 498 |
+
class SwinTransformer(nn.Module):
|
| 499 |
+
r""" Swin Transformer
|
| 500 |
+
A PyTorch impl of : `Swin Transformer: Hierarchical Vision Transformer using Shifted Windows` -
|
| 501 |
+
https://arxiv.org/pdf/2103.14030
|
| 502 |
+
|
| 503 |
+
Args:
|
| 504 |
+
img_size (int | tuple(int)): Input image size. Default 224
|
| 505 |
+
patch_size (int | tuple(int)): Patch size. Default: 4
|
| 506 |
+
in_chans (int): Number of input image channels. Default: 3
|
| 507 |
+
num_classes (int): Number of classes for classification head. Default: 1000
|
| 508 |
+
embed_dim (int): Patch embedding dimension. Default: 96
|
| 509 |
+
depths (tuple(int)): Depth of each Swin Transformer layer.
|
| 510 |
+
num_heads (tuple(int)): Number of attention heads in different layers.
|
| 511 |
+
window_size (int): Window size. Default: 7
|
| 512 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4
|
| 513 |
+
qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True
|
| 514 |
+
qk_scale (float): Override default qk scale of head_dim ** -0.5 if set. Default: None
|
| 515 |
+
drop_rate (float): Dropout rate. Default: 0
|
| 516 |
+
attn_drop_rate (float): Attention dropout rate. Default: 0
|
| 517 |
+
drop_path_rate (float): Stochastic depth rate. Default: 0.1
|
| 518 |
+
norm_layer (nn.Module): Normalization layer. Default: nn.LayerNorm.
|
| 519 |
+
ape (bool): If True, add absolute position embedding to the patch embedding. Default: False
|
| 520 |
+
patch_norm (bool): If True, add normalization after patch embedding. Default: True
|
| 521 |
+
use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False
|
| 522 |
+
"""
|
| 523 |
+
|
| 524 |
+
def __init__(self, img_size=224, patch_size=4, in_chans=3, num_classes=1000,
|
| 525 |
+
embed_dim=96, depths=[2, 2, 6, 2], num_heads=[3, 6, 12, 24],
|
| 526 |
+
window_size=7, mlp_ratio=4., qkv_bias=True, qk_scale=None,
|
| 527 |
+
drop_rate=0., attn_drop_rate=0., drop_path_rate=0.1,
|
| 528 |
+
norm_layer=nn.LayerNorm, ape=False, patch_norm=True,
|
| 529 |
+
use_checkpoint=False, **kwargs):
|
| 530 |
+
super().__init__()
|
| 531 |
+
|
| 532 |
+
self.num_classes = num_classes
|
| 533 |
+
self.num_layers = len(depths)
|
| 534 |
+
self.embed_dim = embed_dim
|
| 535 |
+
self.ape = ape
|
| 536 |
+
self.patch_norm = patch_norm
|
| 537 |
+
self.num_features = int(embed_dim * 2 ** (self.num_layers - 1))
|
| 538 |
+
self.mlp_ratio = mlp_ratio
|
| 539 |
+
|
| 540 |
+
# split image into non-overlapping patches
|
| 541 |
+
self.patch_embed = PatchEmbed(
|
| 542 |
+
img_size=img_size, patch_size=patch_size, in_chans=in_chans, embed_dim=embed_dim,
|
| 543 |
+
norm_layer=norm_layer if self.patch_norm else None)
|
| 544 |
+
num_patches = self.patch_embed.num_patches
|
| 545 |
+
patches_resolution = self.patch_embed.patches_resolution
|
| 546 |
+
self.patches_resolution = patches_resolution
|
| 547 |
+
|
| 548 |
+
# absolute position embedding
|
| 549 |
+
if self.ape:
|
| 550 |
+
self.absolute_pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))
|
| 551 |
+
trunc_normal_(self.absolute_pos_embed, std=.02)
|
| 552 |
+
|
| 553 |
+
self.pos_drop = nn.Dropout(p=drop_rate)
|
| 554 |
+
|
| 555 |
+
# stochastic depth
|
| 556 |
+
dpr = [x.item() for x in torch.linspace(0, drop_path_rate, sum(depths))] # stochastic depth decay rule
|
| 557 |
+
|
| 558 |
+
# build layers
|
| 559 |
+
self.layers = nn.ModuleList()
|
| 560 |
+
for i_layer in range(self.num_layers):
|
| 561 |
+
layer = BasicLayer(dim=int(embed_dim * 2 ** i_layer),
|
| 562 |
+
input_resolution=(patches_resolution[0] // (2 ** i_layer),
|
| 563 |
+
patches_resolution[1] // (2 ** i_layer)),
|
| 564 |
+
depth=depths[i_layer],
|
| 565 |
+
num_heads=num_heads[i_layer],
|
| 566 |
+
window_size=window_size,
|
| 567 |
+
mlp_ratio=self.mlp_ratio,
|
| 568 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale,
|
| 569 |
+
drop=drop_rate, attn_drop=attn_drop_rate,
|
| 570 |
+
drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],
|
| 571 |
+
norm_layer=norm_layer,
|
| 572 |
+
downsample=PatchMerging if (i_layer < self.num_layers - 1) else None,
|
| 573 |
+
use_checkpoint=use_checkpoint)
|
| 574 |
+
self.layers.append(layer)
|
| 575 |
+
|
| 576 |
+
self.norm = norm_layer(self.num_features)
|
| 577 |
+
self.avgpool = nn.AdaptiveAvgPool1d(1)
|
| 578 |
+
# self.head = nn.Linear(self.num_features, num_classes) if num_classes > 0 else nn.Identity()
|
| 579 |
+
|
| 580 |
+
self.apply(self._init_weights)
|
| 581 |
+
|
| 582 |
+
def _init_weights(self, m):
|
| 583 |
+
if isinstance(m, nn.Linear):
|
| 584 |
+
trunc_normal_(m.weight, std=.02)
|
| 585 |
+
if isinstance(m, nn.Linear) and m.bias is not None:
|
| 586 |
+
nn.init.constant_(m.bias, 0)
|
| 587 |
+
elif isinstance(m, nn.LayerNorm):
|
| 588 |
+
nn.init.constant_(m.bias, 0)
|
| 589 |
+
nn.init.constant_(m.weight, 1.0)
|
| 590 |
+
|
| 591 |
+
@torch.jit.ignore
|
| 592 |
+
def no_weight_decay(self):
|
| 593 |
+
return {'absolute_pos_embed'}
|
| 594 |
+
|
| 595 |
+
@torch.jit.ignore
|
| 596 |
+
def no_weight_decay_keywords(self):
|
| 597 |
+
return {'relative_position_bias_table'}
|
| 598 |
+
|
| 599 |
+
def forward(self, x, idx_to_group_img=None, image_atts=None, condition=None, **kwargs):
|
| 600 |
+
x = self.patch_embed(x)
|
| 601 |
+
if self.ape:
|
| 602 |
+
x = x + self.absolute_pos_embed
|
| 603 |
+
x = self.pos_drop(x)
|
| 604 |
+
|
| 605 |
+
for layer in self.layers:
|
| 606 |
+
x = layer(x, condition=condition)
|
| 607 |
+
|
| 608 |
+
x = self.norm(x) # B L C
|
| 609 |
+
|
| 610 |
+
x_cls = self.avgpool(x.transpose(1, 2)) # B C 1
|
| 611 |
+
|
| 612 |
+
if idx_to_group_img is None:
|
| 613 |
+
return torch.cat([x_cls.transpose(1, 2), x], dim=1)
|
| 614 |
+
else:
|
| 615 |
+
x_bs = torch.gather(x, dim=0, index=idx_to_group_img.view(-1, 1, 1).expand(-1, x.shape[1], x.shape[2]))
|
| 616 |
+
weights = image_atts[:, 1:].unsqueeze(2) # B L 1
|
| 617 |
+
x_bs_cls = torch.sum((weights * x_bs).transpose(1, 2), dim=-1, keepdim=True) # B C 1
|
| 618 |
+
x_bs_cls = x_bs_cls / torch.sum(weights.transpose(1, 2), dim=-1, keepdim=True) # avgpool
|
| 619 |
+
|
| 620 |
+
return torch.cat([x_bs_cls.transpose(1, 2), x_bs], dim=1), \
|
| 621 |
+
torch.cat([x_cls.transpose(1, 2), x], dim=1)
|
| 622 |
+
|
| 623 |
+
def flops(self):
|
| 624 |
+
flops = 0
|
| 625 |
+
flops += self.patch_embed.flops()
|
| 626 |
+
for i, layer in enumerate(self.layers):
|
| 627 |
+
flops += layer.flops()
|
| 628 |
+
flops += self.num_features * self.patches_resolution[0] * self.patches_resolution[1] // (2 ** self.num_layers)
|
| 629 |
+
flops += self.num_features * self.num_classes
|
| 630 |
+
return flops
|
| 631 |
+
|
| 632 |
+
|
| 633 |
+
def interpolate_relative_pos_embed(rel_pos_bias, dst_num_pos, param_name=''):
|
| 634 |
+
# from: https://github.com/microsoft/unilm/blob/8a0a1c1f4e7326938ea7580a00d56d7f17d65612/beit/run_class_finetuning.py#L348
|
| 635 |
+
|
| 636 |
+
# rel_pos_bias: relative_position_bias_table
|
| 637 |
+
src_num_pos, num_attn_heads = rel_pos_bias.size()
|
| 638 |
+
|
| 639 |
+
num_extra_tokens = 0
|
| 640 |
+
src_size = int((src_num_pos - num_extra_tokens) ** 0.5)
|
| 641 |
+
dst_size = int((dst_num_pos - num_extra_tokens) ** 0.5)
|
| 642 |
+
if src_size != dst_size:
|
| 643 |
+
print("Position interpolate %s from %dx%d to %dx%d" % (param_name, src_size, src_size, dst_size, dst_size))
|
| 644 |
+
|
| 645 |
+
# extra_tokens = rel_pos_bias[-num_extra_tokens:, :]
|
| 646 |
+
# rel_pos_bias = rel_pos_bias[:-num_extra_tokens, :]
|
| 647 |
+
|
| 648 |
+
def geometric_progression(a, r, n):
|
| 649 |
+
return a * (1.0 - r ** n) / (1.0 - r)
|
| 650 |
+
|
| 651 |
+
left, right = 1.01, 1.5
|
| 652 |
+
while right - left > 1e-6:
|
| 653 |
+
q = (left + right) / 2.0
|
| 654 |
+
gp = geometric_progression(1, q, src_size // 2)
|
| 655 |
+
if gp > dst_size // 2:
|
| 656 |
+
right = q
|
| 657 |
+
else:
|
| 658 |
+
left = q
|
| 659 |
+
|
| 660 |
+
# if q > 1.090307:
|
| 661 |
+
# q = 1.090307
|
| 662 |
+
|
| 663 |
+
dis = []
|
| 664 |
+
cur = 1
|
| 665 |
+
for i in range(src_size // 2):
|
| 666 |
+
dis.append(cur)
|
| 667 |
+
cur += q ** (i + 1)
|
| 668 |
+
|
| 669 |
+
r_ids = [-_ for _ in reversed(dis)]
|
| 670 |
+
|
| 671 |
+
x = r_ids + [0] + dis
|
| 672 |
+
y = r_ids + [0] + dis
|
| 673 |
+
|
| 674 |
+
t = dst_size // 2.0
|
| 675 |
+
dx = np.arange(-t, t + 0.1, 1.0)
|
| 676 |
+
dy = np.arange(-t, t + 0.1, 1.0)
|
| 677 |
+
|
| 678 |
+
# print("Original positions = %s" % str(x))
|
| 679 |
+
# print("Target positions = %s" % str(dx))
|
| 680 |
+
|
| 681 |
+
all_rel_pos_bias = []
|
| 682 |
+
|
| 683 |
+
for i in range(num_attn_heads):
|
| 684 |
+
z = rel_pos_bias[:, i].view(src_size, src_size).float().numpy()
|
| 685 |
+
f = interpolate.interp2d(x, y, z, kind='cubic')
|
| 686 |
+
all_rel_pos_bias.append(
|
| 687 |
+
torch.Tensor(f(dx, dy)).contiguous().view(-1, 1).to(rel_pos_bias.device))
|
| 688 |
+
|
| 689 |
+
rel_pos_bias = torch.cat(all_rel_pos_bias, dim=-1)
|
| 690 |
+
|
| 691 |
+
return rel_pos_bias
|
| 692 |
+
|
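interpolate_relative_pos_embed resizes a relative-position bias table when the window size changes, by placing the source offsets on a geometric progression and resampling with cubic interpolation. A usage sketch (the head count and layer name are illustrative; note that scipy.interpolate.interp2d, used above, was removed in SciPy 1.14, so this path needs an older SciPy):

import torch

# grow a 7x7-window table ((2*7-1)**2 = 169 positions) to a 12x12 window
# ((2*12-1)**2 = 529 positions); one column per attention head
old_table = torch.randn(169, 4)
new_table = interpolate_relative_pos_embed(old_table, 529, param_name='blocks.0.attn')
print(new_table.shape)  # torch.Size([529, 4])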
| 693 |
+
def zero_module(module):
|
| 694 |
+
for p in module.parameters():
|
| 695 |
+
nn.init.zeros_(p)
|
| 696 |
+
return module
|
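The conditioning added to SwinTransformerBlock in this file is a per-channel scale-and-shift: a projection of the condition vector yields a multiplier and a bias, and zero_module zeroes that projection so the conditioned branch contributes nothing at initialization while the block's residual connection preserves the features. A stripped-down sketch of the same mechanism (the ConditionModulation class and its single-linear projection are illustrative, not part of this repo):

import torch
import torch.nn as nn

def zero_module(module):
    # zero every parameter so the module initially outputs zeros
    for p in module.parameters():
        nn.init.zeros_(p)
    return module

class ConditionModulation(nn.Module):
    def __init__(self, cond_dim, channels):
        super().__init__()
        self.proj = zero_module(nn.Linear(cond_dim, 2 * channels))

    def forward(self, x, cond):
        # x: (B, C, H, W); cond: (B, cond_dim)
        scale, shift = self.proj(cond).view(x.size(0), -1, 1, 1).chunk(2, dim=1)
        return x * scale + shift

mod = ConditionModulation(cond_dim=16, channels=8)
x, cond = torch.randn(2, 8, 4, 4), torch.randn(2, 16)
out = mod(x, cond)
assert torch.equal(out, torch.zeros_like(x))  # inert at init; training moves it away from zero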
ram/models/swin_transformer_lora.py
ADDED
|
@@ -0,0 +1,660 @@
|
| 1 |
+
# --------------------------------------------------------
|
| 2 |
+
# Swin Transformer
|
| 3 |
+
# Copyright (c) 2021 Microsoft
|
| 4 |
+
# Licensed under The MIT License [see LICENSE for details]
|
| 5 |
+
# Written by Ze Liu
|
| 6 |
+
# --------------------------------------------------------
|
| 7 |
+
|
| 8 |
+
import numpy as np
|
| 9 |
+
from scipy import interpolate
|
| 10 |
+
|
| 11 |
+
import torch
|
| 12 |
+
import torch.nn as nn
|
| 13 |
+
import torch.utils.checkpoint as checkpoint
|
| 14 |
+
from timm.models.layers import DropPath, to_2tuple, trunc_normal_
|
| 15 |
+
|
| 16 |
+
import loralib as lora
|
| 17 |
+
|
| 18 |
+
|
| 19 |
+
class Mlp(nn.Module):
|
| 20 |
+
def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):
|
| 21 |
+
super().__init__()
|
| 22 |
+
out_features = out_features or in_features
|
| 23 |
+
hidden_features = hidden_features or in_features
|
| 24 |
+
self.fc1 = nn.Linear(in_features, hidden_features)
|
| 25 |
+
# self.fc1 = lora.Linear(in_features, hidden_features, r=16)
|
| 26 |
+
self.act = act_layer()
|
| 27 |
+
self.fc2 = nn.Linear(hidden_features, out_features)
|
| 28 |
+
# self.fc2 = lora.Linear(hidden_features, out_features, r=16)
|
| 29 |
+
self.drop = nn.Dropout(drop)
|
| 30 |
+
|
| 31 |
+
def forward(self, x):
|
| 32 |
+
x = self.fc1(x)
|
| 33 |
+
x = self.act(x)
|
| 34 |
+
x = self.drop(x)
|
| 35 |
+
x = self.fc2(x)
|
| 36 |
+
x = self.drop(x)
|
| 37 |
+
return x
|
| 38 |
+
|
| 39 |
+
|
| 40 |
+
def window_partition(x, window_size):
|
| 41 |
+
"""
|
| 42 |
+
Args:
|
| 43 |
+
x: (B, H, W, C)
|
| 44 |
+
window_size (int): window size
|
| 45 |
+
|
| 46 |
+
Returns:
|
| 47 |
+
windows: (num_windows*B, window_size, window_size, C)
|
| 48 |
+
"""
|
| 49 |
+
B, H, W, C = x.shape
|
| 50 |
+
x = x.view(B, H // window_size, window_size, W // window_size, window_size, C)
|
| 51 |
+
windows = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(-1, window_size, window_size, C)
|
| 52 |
+
return windows
|
| 53 |
+
|
| 54 |
+
|
| 55 |
+
def window_reverse(windows, window_size, H, W):
|
| 56 |
+
"""
|
| 57 |
+
Args:
|
| 58 |
+
windows: (num_windows*B, window_size, window_size, C)
|
| 59 |
+
window_size (int): Window size
|
| 60 |
+
H (int): Height of image
|
| 61 |
+
W (int): Width of image
|
| 62 |
+
|
| 63 |
+
Returns:
|
| 64 |
+
x: (B, H, W, C)
|
| 65 |
+
"""
|
| 66 |
+
B = int(windows.shape[0] / (H * W / window_size / window_size))
|
| 67 |
+
x = windows.view(B, H // window_size, W // window_size, window_size, window_size, -1)
|
| 68 |
+
x = x.permute(0, 1, 3, 2, 4, 5).contiguous().view(B, H, W, -1)
|
| 69 |
+
return x
|
| 70 |
+
|
| 71 |
+
|
| 72 |
+
class WindowAttention(nn.Module):
|
| 73 |
+
r""" Window based multi-head self attention (W-MSA) module with relative position bias.
|
| 74 |
+
It supports both of shifted and non-shifted window.
|
| 75 |
+
|
| 76 |
+
Args:
|
| 77 |
+
dim (int): Number of input channels.
|
| 78 |
+
window_size (tuple[int]): The height and width of the window.
|
| 79 |
+
num_heads (int): Number of attention heads.
|
| 80 |
+
qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
|
| 81 |
+
qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set
|
| 82 |
+
attn_drop (float, optional): Dropout ratio of attention weight. Default: 0.0
|
| 83 |
+
proj_drop (float, optional): Dropout ratio of output. Default: 0.0
|
| 84 |
+
"""
|
| 85 |
+
|
| 86 |
+
def __init__(self, dim, window_size, num_heads, qkv_bias=True, qk_scale=None, attn_drop=0., proj_drop=0.):
|
| 87 |
+
|
| 88 |
+
super().__init__()
|
| 89 |
+
self.dim = dim
|
| 90 |
+
self.window_size = window_size # Wh, Ww
|
| 91 |
+
self.num_heads = num_heads
|
| 92 |
+
head_dim = dim // num_heads
|
| 93 |
+
self.scale = qk_scale or head_dim ** -0.5
|
| 94 |
+
|
| 95 |
+
# define a parameter table of relative position bias
|
| 96 |
+
self.relative_position_bias_table = nn.Parameter(
|
| 97 |
+
torch.zeros((2 * window_size[0] - 1) * (2 * window_size[1] - 1), num_heads)) # 2*Wh-1 * 2*Ww-1, nH
|
| 98 |
+
|
| 99 |
+
# get pair-wise relative position index for each token inside the window
|
| 100 |
+
coords_h = torch.arange(self.window_size[0])
|
| 101 |
+
coords_w = torch.arange(self.window_size[1])
|
| 102 |
+
coords = torch.stack(torch.meshgrid([coords_h, coords_w])) # 2, Wh, Ww
|
| 103 |
+
coords_flatten = torch.flatten(coords, 1) # 2, Wh*Ww
|
| 104 |
+
relative_coords = coords_flatten[:, :, None] - coords_flatten[:, None, :] # 2, Wh*Ww, Wh*Ww
|
| 105 |
+
relative_coords = relative_coords.permute(1, 2, 0).contiguous() # Wh*Ww, Wh*Ww, 2
|
| 106 |
+
relative_coords[:, :, 0] += self.window_size[0] - 1 # shift to start from 0
|
| 107 |
+
relative_coords[:, :, 1] += self.window_size[1] - 1
|
| 108 |
+
relative_coords[:, :, 0] *= 2 * self.window_size[1] - 1
|
| 109 |
+
relative_position_index = relative_coords.sum(-1) # Wh*Ww, Wh*Ww
|
| 110 |
+
self.register_buffer("relative_position_index", relative_position_index)
|
| 111 |
+
|
| 112 |
+
# self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
|
| 113 |
+
# lora version
|
| 114 |
+
self.qkv = lora.MergedLinear(dim, 3*dim, r=8, enable_lora=[True, False, True])
|
| 115 |
+
self.attn_drop = nn.Dropout(attn_drop)
|
| 116 |
+
self.proj = nn.Linear(dim, dim)
|
| 117 |
+
self.proj_drop = nn.Dropout(proj_drop)
|
| 118 |
+
|
| 119 |
+
trunc_normal_(self.relative_position_bias_table, std=.02)
|
| 120 |
+
self.softmax = nn.Softmax(dim=-1)
|
| 121 |
+
|
| 122 |
+
def forward(self, x, mask=None):
|
| 123 |
+
"""
|
| 124 |
+
Args:
|
| 125 |
+
x: input features with shape of (num_windows*B, N, C)
|
| 126 |
+
mask: (0/-inf) mask with shape of (num_windows, Wh*Ww, Wh*Ww) or None
|
| 127 |
+
"""
|
| 128 |
+
B_, N, C = x.shape
|
| 129 |
+
qkv = self.qkv(x).reshape(B_, N, 3, self.num_heads, C // self.num_heads).permute(2, 0, 3, 1, 4)
|
| 130 |
+
q, k, v = qkv[0], qkv[1], qkv[2] # make torchscript happy (cannot use tensor as tuple)
|
| 131 |
+
|
| 132 |
+
q = q * self.scale
|
| 133 |
+
attn = (q @ k.transpose(-2, -1))
|
| 134 |
+
|
| 135 |
+
relative_position_bias = self.relative_position_bias_table[self.relative_position_index.view(-1)].view(
|
| 136 |
+
self.window_size[0] * self.window_size[1], self.window_size[0] * self.window_size[1], -1) # Wh*Ww,Wh*Ww,nH
|
| 137 |
+
relative_position_bias = relative_position_bias.permute(2, 0, 1).contiguous() # nH, Wh*Ww, Wh*Ww
|
| 138 |
+
attn = attn + relative_position_bias.unsqueeze(0)
|
| 139 |
+
|
| 140 |
+
if mask is not None:
|
| 141 |
+
nW = mask.shape[0]
|
| 142 |
+
attn = attn.view(B_ // nW, nW, self.num_heads, N, N) + mask.unsqueeze(1).unsqueeze(0)
|
| 143 |
+
attn = attn.view(-1, self.num_heads, N, N)
|
| 144 |
+
attn = self.softmax(attn)
|
| 145 |
+
else:
|
| 146 |
+
attn = self.softmax(attn)
|
| 147 |
+
|
| 148 |
+
attn = self.attn_drop(attn)
|
| 149 |
+
|
| 150 |
+
x = (attn @ v).transpose(1, 2).reshape(B_, N, C)
|
| 151 |
+
x = self.proj(x)
|
| 152 |
+
x = self.proj_drop(x)
|
| 153 |
+
return x
|
| 154 |
+
|
| 155 |
+
def extra_repr(self) -> str:
|
| 156 |
+
return f'dim={self.dim}, window_size={self.window_size}, num_heads={self.num_heads}'
|
| 157 |
+
|
| 158 |
+
def flops(self, N):
|
| 159 |
+
# calculate flops for 1 window with token length of N
|
| 160 |
+
flops = 0
|
| 161 |
+
# qkv = self.qkv(x)
|
| 162 |
+
flops += N * self.dim * 3 * self.dim
|
| 163 |
+
# attn = (q @ k.transpose(-2, -1))
|
| 164 |
+
flops += self.num_heads * N * (self.dim // self.num_heads) * N
|
| 165 |
+
# x = (attn @ v)
|
| 166 |
+
flops += self.num_heads * N * N * (self.dim // self.num_heads)
|
| 167 |
+
# x = self.proj(x)
|
| 168 |
+
flops += N * self.dim * self.dim
|
| 169 |
+
return flops
|
| 170 |
+
|
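The only functional change in this LoRA variant is the qkv projection above: lora.MergedLinear fuses the query, key, and value projections into one matrix, and enable_lora=[True, False, True] attaches rank-8 adapters to the query and value thirds while the key third stays frozen. A small sketch of the resulting trainable-parameter budget, assuming the loralib package imported at the top of this file:

import loralib as lora

dim = 1024
qkv = lora.MergedLinear(dim, 3 * dim, r=8, enable_lora=[True, False, True])
lora.mark_only_lora_as_trainable(qkv)  # freeze the pretrained weight; train only lora_A/lora_B

trainable = sum(p.numel() for p in qkv.parameters() if p.requires_grad)
total = sum(p.numel() for p in qkv.parameters())
print(f"trainable {trainable} / total {total}")  # the adapters are a small fraction of the full matrix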
| 171 |
+
|
| 172 |
+
class SwinTransformerBlock(nn.Module):
|
| 173 |
+
r""" Swin Transformer Block.
|
| 174 |
+
|
| 175 |
+
Args:
|
| 176 |
+
dim (int): Number of input channels.
|
| 177 |
+
input_resolution (tuple[int]): Input resolution.
|
| 178 |
+
num_heads (int): Number of attention heads.
|
| 179 |
+
window_size (int): Window size.
|
| 180 |
+
shift_size (int): Shift size for SW-MSA.
|
| 181 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.
|
| 182 |
+
qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
|
| 183 |
+
qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.
|
| 184 |
+
drop (float, optional): Dropout rate. Default: 0.0
|
| 185 |
+
attn_drop (float, optional): Attention dropout rate. Default: 0.0
|
| 186 |
+
drop_path (float, optional): Stochastic depth rate. Default: 0.0
|
| 187 |
+
act_layer (nn.Module, optional): Activation layer. Default: nn.GELU
|
| 188 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 189 |
+
"""
|
| 190 |
+
|
| 191 |
+
def __init__(self, dim, input_resolution, num_heads, window_size=7, shift_size=0,
|
| 192 |
+
mlp_ratio=4., qkv_bias=True, qk_scale=None, drop=0., attn_drop=0., drop_path=0.,
|
| 193 |
+
act_layer=nn.GELU, norm_layer=nn.LayerNorm):
|
| 194 |
+
super().__init__()
|
| 195 |
+
self.dim = dim
|
| 196 |
+
self.input_resolution = input_resolution
|
| 197 |
+
self.num_heads = num_heads
|
| 198 |
+
self.window_size = window_size
|
| 199 |
+
self.shift_size = shift_size
|
| 200 |
+
self.mlp_ratio = mlp_ratio
|
| 201 |
+
if min(self.input_resolution) <= self.window_size:
|
| 202 |
+
# if window size is larger than input resolution, we don't partition windows
|
| 203 |
+
self.shift_size = 0
|
| 204 |
+
self.window_size = min(self.input_resolution)
|
| 205 |
+
assert 0 <= self.shift_size < self.window_size, "shift_size must be in [0, window_size)"
|
| 206 |
+
|
| 207 |
+
self.norm1 = norm_layer(dim)
|
| 208 |
+
self.attn = WindowAttention(
|
| 209 |
+
dim, window_size=to_2tuple(self.window_size), num_heads=num_heads,
|
| 210 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale, attn_drop=attn_drop, proj_drop=drop)
|
| 211 |
+
|
| 212 |
+
self.drop_path = DropPath(drop_path) if drop_path > 0. else nn.Identity()
|
| 213 |
+
self.norm2 = norm_layer(dim)
|
| 214 |
+
mlp_hidden_dim = int(dim * mlp_ratio)
|
| 215 |
+
self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)
|
| 216 |
+
|
| 217 |
+
if self.shift_size > 0:
|
| 218 |
+
# calculate attention mask for SW-MSA
|
| 219 |
+
H, W = self.input_resolution
|
| 220 |
+
img_mask = torch.zeros((1, H, W, 1)) # 1 H W 1
|
| 221 |
+
h_slices = (slice(0, -self.window_size),
|
| 222 |
+
slice(-self.window_size, -self.shift_size),
|
| 223 |
+
slice(-self.shift_size, None))
|
| 224 |
+
w_slices = (slice(0, -self.window_size),
|
| 225 |
+
slice(-self.window_size, -self.shift_size),
|
| 226 |
+
slice(-self.shift_size, None))
|
| 227 |
+
cnt = 0
|
| 228 |
+
for h in h_slices:
|
| 229 |
+
for w in w_slices:
|
| 230 |
+
img_mask[:, h, w, :] = cnt
|
| 231 |
+
cnt += 1
|
| 232 |
+
|
| 233 |
+
mask_windows = window_partition(img_mask, self.window_size) # nW, window_size, window_size, 1
|
| 234 |
+
mask_windows = mask_windows.view(-1, self.window_size * self.window_size)
|
| 235 |
+
attn_mask = mask_windows.unsqueeze(1) - mask_windows.unsqueeze(2)
|
| 236 |
+
attn_mask = attn_mask.masked_fill(attn_mask != 0, float(-100.0)).masked_fill(attn_mask == 0, float(0.0))
|
| 237 |
+
else:
|
| 238 |
+
attn_mask = None
|
| 239 |
+
|
| 240 |
+
self.register_buffer("attn_mask", attn_mask)
|
| 241 |
+
|
| 242 |
+
def forward(self, x):
|
| 243 |
+
H, W = self.input_resolution
|
| 244 |
+
B, L, C = x.shape
|
| 245 |
+
assert L == H * W, "input feature has wrong size"
|
| 246 |
+
|
| 247 |
+
shortcut = x
|
| 248 |
+
x = self.norm1(x)
|
| 249 |
+
x = x.view(B, H, W, C)
|
| 250 |
+
|
| 251 |
+
# cyclic shift
|
| 252 |
+
if self.shift_size > 0:
|
| 253 |
+
shifted_x = torch.roll(x, shifts=(-self.shift_size, -self.shift_size), dims=(1, 2))
|
| 254 |
+
else:
|
| 255 |
+
shifted_x = x
|
| 256 |
+
|
| 257 |
+
# partition windows
|
| 258 |
+
x_windows = window_partition(shifted_x, self.window_size) # nW*B, window_size, window_size, C
|
| 259 |
+
x_windows = x_windows.view(-1, self.window_size * self.window_size, C) # nW*B, window_size*window_size, C
|
| 260 |
+
|
| 261 |
+
# W-MSA/SW-MSA
|
| 262 |
+
attn_windows = self.attn(x_windows, mask=self.attn_mask) # nW*B, window_size*window_size, C
|
| 263 |
+
|
| 264 |
+
# merge windows
|
| 265 |
+
attn_windows = attn_windows.view(-1, self.window_size, self.window_size, C)
|
| 266 |
+
shifted_x = window_reverse(attn_windows, self.window_size, H, W) # B H' W' C
|
| 267 |
+
|
| 268 |
+
# reverse cyclic shift
|
| 269 |
+
if self.shift_size > 0:
|
| 270 |
+
x = torch.roll(shifted_x, shifts=(self.shift_size, self.shift_size), dims=(1, 2))
|
| 271 |
+
else:
|
| 272 |
+
x = shifted_x
|
| 273 |
+
x = x.view(B, H * W, C)
|
| 274 |
+
|
| 275 |
+
# FFN
|
| 276 |
+
x = shortcut + self.drop_path(x)
|
| 277 |
+
x = x + self.drop_path(self.mlp(self.norm2(x)))
|
| 278 |
+
|
| 279 |
+
return x
|
| 280 |
+
|
| 281 |
+
def extra_repr(self) -> str:
|
| 282 |
+
return f"dim={self.dim}, input_resolution={self.input_resolution}, num_heads={self.num_heads}, " \
|
| 283 |
+
f"window_size={self.window_size}, shift_size={self.shift_size}, mlp_ratio={self.mlp_ratio}"
|
| 284 |
+
|
| 285 |
+
def flops(self):
|
| 286 |
+
flops = 0
|
| 287 |
+
H, W = self.input_resolution
|
| 288 |
+
# norm1
|
| 289 |
+
flops += self.dim * H * W
|
| 290 |
+
# W-MSA/SW-MSA
|
| 291 |
+
nW = H * W / self.window_size / self.window_size
|
| 292 |
+
flops += nW * self.attn.flops(self.window_size * self.window_size)
|
| 293 |
+
# mlp
|
| 294 |
+
flops += 2 * H * W * self.dim * self.dim * self.mlp_ratio
|
| 295 |
+
# norm2
|
| 296 |
+
flops += self.dim * H * W
|
| 297 |
+
return flops
|
| 298 |
+
|
| 299 |
+
|
| 300 |
+
class PatchMerging(nn.Module):
|
| 301 |
+
r""" Patch Merging Layer.
|
| 302 |
+
|
| 303 |
+
Args:
|
| 304 |
+
input_resolution (tuple[int]): Resolution of input feature.
|
| 305 |
+
dim (int): Number of input channels.
|
| 306 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 307 |
+
"""
|
| 308 |
+
|
| 309 |
+
def __init__(self, input_resolution, dim, norm_layer=nn.LayerNorm):
|
| 310 |
+
super().__init__()
|
| 311 |
+
self.input_resolution = input_resolution
|
| 312 |
+
self.dim = dim
|
| 313 |
+
self.reduction = nn.Linear(4 * dim, 2 * dim, bias=False)
|
| 314 |
+
self.norm = norm_layer(4 * dim)
|
| 315 |
+
|
| 316 |
+
def forward(self, x):
|
| 317 |
+
"""
|
| 318 |
+
x: B, H*W, C
|
| 319 |
+
"""
|
| 320 |
+
H, W = self.input_resolution
|
| 321 |
+
B, L, C = x.shape
|
| 322 |
+
assert L == H * W, "input feature has wrong size"
|
| 323 |
+
assert H % 2 == 0 and W % 2 == 0, f"x size ({H}*{W}) is not even."
|
| 324 |
+
|
| 325 |
+
x = x.view(B, H, W, C)
|
| 326 |
+
|
| 327 |
+
x0 = x[:, 0::2, 0::2, :] # B H/2 W/2 C
|
| 328 |
+
x1 = x[:, 1::2, 0::2, :] # B H/2 W/2 C
|
| 329 |
+
x2 = x[:, 0::2, 1::2, :] # B H/2 W/2 C
|
| 330 |
+
x3 = x[:, 1::2, 1::2, :] # B H/2 W/2 C
|
| 331 |
+
x = torch.cat([x0, x1, x2, x3], -1) # B H/2 W/2 4*C
|
| 332 |
+
x = x.view(B, -1, 4 * C) # B H/2*W/2 4*C
|
| 333 |
+
|
| 334 |
+
x = self.norm(x)
|
| 335 |
+
x = self.reduction(x)
|
| 336 |
+
|
| 337 |
+
return x
|
| 338 |
+
|
| 339 |
+
def extra_repr(self) -> str:
|
| 340 |
+
return f"input_resolution={self.input_resolution}, dim={self.dim}"
|
| 341 |
+
|
| 342 |
+
def flops(self):
|
| 343 |
+
H, W = self.input_resolution
|
| 344 |
+
flops = H * W * self.dim
|
| 345 |
+
flops += (H // 2) * (W // 2) * 4 * self.dim * 2 * self.dim
|
| 346 |
+
return flops
|
| 347 |
+
|
| 348 |
+
|
| 349 |
+
class BasicLayer(nn.Module):
|
| 350 |
+
""" A basic Swin Transformer layer for one stage.
|
| 351 |
+
|
| 352 |
+
Args:
|
| 353 |
+
dim (int): Number of input channels.
|
| 354 |
+
input_resolution (tuple[int]): Input resolution.
|
| 355 |
+
depth (int): Number of blocks.
|
| 356 |
+
num_heads (int): Number of attention heads.
|
| 357 |
+
window_size (int): Local window size.
|
| 358 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim.
|
| 359 |
+
qkv_bias (bool, optional): If True, add a learnable bias to query, key, value. Default: True
|
| 360 |
+
qk_scale (float | None, optional): Override default qk scale of head_dim ** -0.5 if set.
|
| 361 |
+
drop (float, optional): Dropout rate. Default: 0.0
|
| 362 |
+
attn_drop (float, optional): Attention dropout rate. Default: 0.0
|
| 363 |
+
drop_path (float | tuple[float], optional): Stochastic depth rate. Default: 0.0
|
| 364 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: nn.LayerNorm
|
| 365 |
+
downsample (nn.Module | None, optional): Downsample layer at the end of the layer. Default: None
|
| 366 |
+
use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False.
|
| 367 |
+
"""
|
| 368 |
+
|
| 369 |
+
def __init__(self, dim, input_resolution, depth, num_heads, window_size,
|
| 370 |
+
mlp_ratio=4., qkv_bias=True, qk_scale=None, drop=0., attn_drop=0.,
|
| 371 |
+
drop_path=0., norm_layer=nn.LayerNorm, downsample=None, use_checkpoint=False):
|
| 372 |
+
|
| 373 |
+
super().__init__()
|
| 374 |
+
self.dim = dim
|
| 375 |
+
self.input_resolution = input_resolution
|
| 376 |
+
self.depth = depth
|
| 377 |
+
self.use_checkpoint = use_checkpoint
|
| 378 |
+
|
| 379 |
+
# build blocks
|
| 380 |
+
self.blocks = nn.ModuleList([
|
| 381 |
+
SwinTransformerBlock(dim=dim, input_resolution=input_resolution,
|
| 382 |
+
num_heads=num_heads, window_size=window_size,
|
| 383 |
+
shift_size=0 if (i % 2 == 0) else window_size // 2,
|
| 384 |
+
mlp_ratio=mlp_ratio,
|
| 385 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale,
|
| 386 |
+
drop=drop, attn_drop=attn_drop,
|
| 387 |
+
drop_path=drop_path[i] if isinstance(drop_path, list) else drop_path,
|
| 388 |
+
norm_layer=norm_layer)
|
| 389 |
+
for i in range(depth)])
|
| 390 |
+
|
| 391 |
+
# patch merging layer
|
| 392 |
+
if downsample is not None:
|
| 393 |
+
self.downsample = downsample(input_resolution, dim=dim, norm_layer=norm_layer)
|
| 394 |
+
else:
|
| 395 |
+
self.downsample = None
|
| 396 |
+
|
| 397 |
+
def forward(self, x):
|
| 398 |
+
for blk in self.blocks:
|
| 399 |
+
if self.use_checkpoint:
|
| 400 |
+
x = checkpoint.checkpoint(blk, x)
|
| 401 |
+
else:
|
| 402 |
+
x = blk(x)
|
| 403 |
+
if self.downsample is not None:
|
| 404 |
+
x = self.downsample(x)
|
| 405 |
+
return x
|
| 406 |
+
|
| 407 |
+
def extra_repr(self) -> str:
|
| 408 |
+
return f"dim={self.dim}, input_resolution={self.input_resolution}, depth={self.depth}"
|
| 409 |
+
|
| 410 |
+
def flops(self):
|
| 411 |
+
flops = 0
|
| 412 |
+
for blk in self.blocks:
|
| 413 |
+
flops += blk.flops()
|
| 414 |
+
if self.downsample is not None:
|
| 415 |
+
flops += self.downsample.flops()
|
| 416 |
+
return flops
|
| 417 |
+
|
| 418 |
+
|
| 419 |
+
class PatchEmbed(nn.Module):
|
| 420 |
+
r""" Image to Patch Embedding
|
| 421 |
+
|
| 422 |
+
Args:
|
| 423 |
+
img_size (int): Image size. Default: 224.
|
| 424 |
+
patch_size (int): Patch token size. Default: 4.
|
| 425 |
+
in_chans (int): Number of input image channels. Default: 3.
|
| 426 |
+
embed_dim (int): Number of linear projection output channels. Default: 96.
|
| 427 |
+
norm_layer (nn.Module, optional): Normalization layer. Default: None
|
| 428 |
+
"""
|
| 429 |
+
|
| 430 |
+
def __init__(self, img_size=224, patch_size=4, in_chans=3, embed_dim=96, norm_layer=None):
|
| 431 |
+
super().__init__()
|
| 432 |
+
img_size = to_2tuple(img_size)
|
| 433 |
+
patch_size = to_2tuple(patch_size)
|
| 434 |
+
patches_resolution = [img_size[0] // patch_size[0], img_size[1] // patch_size[1]]
|
| 435 |
+
self.img_size = img_size
|
| 436 |
+
self.patch_size = patch_size
|
| 437 |
+
self.patches_resolution = patches_resolution
|
| 438 |
+
self.num_patches = patches_resolution[0] * patches_resolution[1]
|
| 439 |
+
|
| 440 |
+
self.in_chans = in_chans
|
| 441 |
+
self.embed_dim = embed_dim
|
| 442 |
+
|
| 443 |
+
self.proj = nn.Conv2d(in_chans, embed_dim, kernel_size=patch_size, stride=patch_size)
|
| 444 |
+
if norm_layer is not None:
|
| 445 |
+
self.norm = norm_layer(embed_dim)
|
| 446 |
+
else:
|
| 447 |
+
self.norm = None
|
| 448 |
+
|
| 449 |
+
def forward(self, x):
|
| 450 |
+
B, C, H, W = x.shape
|
| 451 |
+
# FIXME look at relaxing size constraints
|
| 452 |
+
assert H == self.img_size[0] and W == self.img_size[1], \
|
| 453 |
+
f"Input image size ({H}*{W}) doesn't match model ({self.img_size[0]}*{self.img_size[1]})."
|
| 454 |
+
x = self.proj(x).flatten(2).transpose(1, 2) # B Ph*Pw C
|
| 455 |
+
if self.norm is not None:
|
| 456 |
+
x = self.norm(x)
|
| 457 |
+
return x
|
| 458 |
+
|
| 459 |
+
def flops(self):
|
| 460 |
+
Ho, Wo = self.patches_resolution
|
| 461 |
+
flops = Ho * Wo * self.embed_dim * self.in_chans * (self.patch_size[0] * self.patch_size[1])
|
| 462 |
+
if self.norm is not None:
|
| 463 |
+
flops += Ho * Wo * self.embed_dim
|
| 464 |
+
return flops
|
| 465 |
+
|
| 466 |
+
|
| 467 |
+
class SwinTransformer(nn.Module):
|
| 468 |
+
r""" Swin Transformer
|
| 469 |
+
A PyTorch impl of : `Swin Transformer: Hierarchical Vision Transformer using Shifted Windows` -
|
| 470 |
+
https://arxiv.org/pdf/2103.14030
|
| 471 |
+
|
| 472 |
+
Args:
|
| 473 |
+
img_size (int | tuple(int)): Input image size. Default 224
|
| 474 |
+
patch_size (int | tuple(int)): Patch size. Default: 4
|
| 475 |
+
in_chans (int): Number of input image channels. Default: 3
|
| 476 |
+
num_classes (int): Number of classes for classification head. Default: 1000
|
| 477 |
+
embed_dim (int): Patch embedding dimension. Default: 96
|
| 478 |
+
depths (tuple(int)): Depth of each Swin Transformer layer.
|
| 479 |
+
num_heads (tuple(int)): Number of attention heads in different layers.
|
| 480 |
+
window_size (int): Window size. Default: 7
|
| 481 |
+
mlp_ratio (float): Ratio of mlp hidden dim to embedding dim. Default: 4
|
| 482 |
+
qkv_bias (bool): If True, add a learnable bias to query, key, value. Default: True
|
| 483 |
+
qk_scale (float): Override default qk scale of head_dim ** -0.5 if set. Default: None
|
| 484 |
+
drop_rate (float): Dropout rate. Default: 0
|
| 485 |
+
attn_drop_rate (float): Attention dropout rate. Default: 0
|
| 486 |
+
drop_path_rate (float): Stochastic depth rate. Default: 0.1
|
| 487 |
+
norm_layer (nn.Module): Normalization layer. Default: nn.LayerNorm.
|
| 488 |
+
ape (bool): If True, add absolute position embedding to the patch embedding. Default: False
|
| 489 |
+
patch_norm (bool): If True, add normalization after patch embedding. Default: True
|
| 490 |
+
use_checkpoint (bool): Whether to use checkpointing to save memory. Default: False
|
| 491 |
+
"""
|
| 492 |
+
|
| 493 |
+
def __init__(self, img_size=224, patch_size=4, in_chans=3, num_classes=1000,
|
| 494 |
+
embed_dim=96, depths=[2, 2, 6, 2], num_heads=[3, 6, 12, 24],
|
| 495 |
+
window_size=7, mlp_ratio=4., qkv_bias=True, qk_scale=None,
|
| 496 |
+
drop_rate=0., attn_drop_rate=0., drop_path_rate=0.1,
|
| 497 |
+
norm_layer=nn.LayerNorm, ape=False, patch_norm=True,
|
| 498 |
+
use_checkpoint=False, **kwargs):
|
| 499 |
+
super().__init__()
|
| 500 |
+
|
| 501 |
+
self.num_classes = num_classes
|
| 502 |
+
self.num_layers = len(depths)
|
| 503 |
+
self.embed_dim = embed_dim
|
| 504 |
+
self.ape = ape
|
| 505 |
+
self.patch_norm = patch_norm
|
| 506 |
+
self.num_features = int(embed_dim * 2 ** (self.num_layers - 1))
|
| 507 |
+
self.mlp_ratio = mlp_ratio
|
| 508 |
+
|
| 509 |
+
# split image into non-overlapping patches
|
| 510 |
+
self.patch_embed = PatchEmbed(
|
| 511 |
+
img_size=img_size, patch_size=patch_size, in_chans=in_chans, embed_dim=embed_dim,
|
| 512 |
+
norm_layer=norm_layer if self.patch_norm else None)
|
| 513 |
+
num_patches = self.patch_embed.num_patches
|
| 514 |
+
patches_resolution = self.patch_embed.patches_resolution
|
| 515 |
+
self.patches_resolution = patches_resolution
|
| 516 |
+
|
| 517 |
+
# absolute position embedding
|
| 518 |
+
if self.ape:
|
| 519 |
+
self.absolute_pos_embed = nn.Parameter(torch.zeros(1, num_patches, embed_dim))
|
| 520 |
+
trunc_normal_(self.absolute_pos_embed, std=.02)
|
| 521 |
+
|
| 522 |
+
self.pos_drop = nn.Dropout(p=drop_rate)
|
| 523 |
+
|
| 524 |
+
# stochastic depth
|
| 525 |
+
dpr = [x.item() for x in torch.linspace(0, drop_path_rate, sum(depths))] # stochastic depth decay rule
|
| 526 |
+
|
| 527 |
+
# build layers
|
| 528 |
+
self.layers = nn.ModuleList()
|
| 529 |
+
for i_layer in range(self.num_layers):
|
| 530 |
+
layer = BasicLayer(dim=int(embed_dim * 2 ** i_layer),
|
| 531 |
+
input_resolution=(patches_resolution[0] // (2 ** i_layer),
|
| 532 |
+
patches_resolution[1] // (2 ** i_layer)),
|
| 533 |
+
depth=depths[i_layer],
|
| 534 |
+
num_heads=num_heads[i_layer],
|
| 535 |
+
window_size=window_size,
|
| 536 |
+
mlp_ratio=self.mlp_ratio,
|
| 537 |
+
qkv_bias=qkv_bias, qk_scale=qk_scale,
|
| 538 |
+
drop=drop_rate, attn_drop=attn_drop_rate,
|
| 539 |
+
drop_path=dpr[sum(depths[:i_layer]):sum(depths[:i_layer + 1])],
|
| 540 |
+
norm_layer=norm_layer,
|
| 541 |
+
downsample=PatchMerging if (i_layer < self.num_layers - 1) else None,
|
| 542 |
+
use_checkpoint=use_checkpoint)
|
| 543 |
+
self.layers.append(layer)
|
| 544 |
+
|
| 545 |
+
self.norm = norm_layer(self.num_features)
|
| 546 |
+
self.avgpool = nn.AdaptiveAvgPool1d(1)
|
| 547 |
+
# self.head = nn.Linear(self.num_features, num_classes) if num_classes > 0 else nn.Identity()
|
| 548 |
+
|
| 549 |
+
self.apply(self._init_weights)
|
| 550 |
+
|
| 551 |
+
def _init_weights(self, m):
|
| 552 |
+
if isinstance(m, nn.Linear):
|
| 553 |
+
trunc_normal_(m.weight, std=.02)
|
| 554 |
+
if isinstance(m, nn.Linear) and m.bias is not None:
|
| 555 |
+
nn.init.constant_(m.bias, 0)
|
| 556 |
+
elif isinstance(m, nn.LayerNorm):
|
| 557 |
+
nn.init.constant_(m.bias, 0)
|
| 558 |
+
nn.init.constant_(m.weight, 1.0)
|
| 559 |
+
|
| 560 |
+
@torch.jit.ignore
|
| 561 |
+
def no_weight_decay(self):
|
| 562 |
+
return {'absolute_pos_embed'}
|
| 563 |
+
|
| 564 |
+
@torch.jit.ignore
|
| 565 |
+
def no_weight_decay_keywords(self):
|
| 566 |
+
return {'relative_position_bias_table'}
|
| 567 |
+
|
| 568 |
+
def forward(self, x, idx_to_group_img=None, image_atts=None, **kwargs):
|
| 569 |
+
x = self.patch_embed(x)
|
| 570 |
+
if self.ape:
|
| 571 |
+
x = x + self.absolute_pos_embed
|
| 572 |
+
x = self.pos_drop(x)
|
| 573 |
+
|
| 574 |
+
for layer in self.layers:
|
| 575 |
+
x = layer(x)
|
| 576 |
+
|
| 577 |
+
x = self.norm(x) # B L C
|
| 578 |
+
|
| 579 |
+
x_cls = self.avgpool(x.transpose(1, 2)) # B C 1
|
| 580 |
+
|
| 581 |
+
if idx_to_group_img is None:
|
| 582 |
+
return torch.cat([x_cls.transpose(1, 2), x], dim=1)
|
| 583 |
+
else:
|
| 584 |
+
x_bs = torch.gather(x, dim=0, index=idx_to_group_img.view(-1, 1, 1).expand(-1, x.shape[1], x.shape[2]))
|
| 585 |
+
weights = image_atts[:, 1:].unsqueeze(2) # B L 1
|
| 586 |
+
x_bs_cls = torch.sum((weights * x_bs).transpose(1, 2), dim=-1, keepdim=True) # B C 1
|
| 587 |
+
x_bs_cls = x_bs_cls / torch.sum(weights.transpose(1, 2), dim=-1, keepdim=True) # avgpool
|
| 588 |
+
|
| 589 |
+
return torch.cat([x_bs_cls.transpose(1, 2), x_bs], dim=1), \
|
| 590 |
+
torch.cat([x_cls.transpose(1, 2), x], dim=1)
|
| 591 |
+
|
| 592 |
+
def flops(self):
|
| 593 |
+
flops = 0
|
| 594 |
+
flops += self.patch_embed.flops()
|
| 595 |
+
for i, layer in enumerate(self.layers):
|
| 596 |
+
flops += layer.flops()
|
| 597 |
+
flops += self.num_features * self.patches_resolution[0] * self.patches_resolution[1] // (2 ** self.num_layers)
|
| 598 |
+
flops += self.num_features * self.num_classes
|
| 599 |
+
return flops
|
| 600 |
+
|
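For reference, the forward pass above prepends the average-pooled token to the patch tokens, so the output has one extra token. A minimal shape check with a Swin-B-like configuration (these hyperparameters are illustrative, not read from the repo's config files):

import torch

model = SwinTransformer(img_size=224, patch_size=4, in_chans=3, embed_dim=128,
                        depths=[2, 2, 18, 2], num_heads=[4, 8, 16, 32], window_size=7)
x = torch.randn(1, 3, 224, 224)
with torch.no_grad():
    out = model(x)
# 224/4 = 56 patches per side, downsampled three times -> 7*7 = 49 tokens,
# plus the pooled token -> 50 tokens of width 128 * 2**3 = 1024
print(out.shape)  # torch.Size([1, 50, 1024])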
| 601 |
+
|
| 602 |
+
def interpolate_relative_pos_embed(rel_pos_bias, dst_num_pos, param_name=''):
|
| 603 |
+
# from: https://github.com/microsoft/unilm/blob/8a0a1c1f4e7326938ea7580a00d56d7f17d65612/beit/run_class_finetuning.py#L348
|
| 604 |
+
|
| 605 |
+
# rel_pos_bias: relative_position_bias_table
|
| 606 |
+
src_num_pos, num_attn_heads = rel_pos_bias.size()
|
| 607 |
+
|
| 608 |
+
num_extra_tokens = 0
|
| 609 |
+
src_size = int((src_num_pos - num_extra_tokens) ** 0.5)
|
| 610 |
+
dst_size = int((dst_num_pos - num_extra_tokens) ** 0.5)
|
| 611 |
+
if src_size != dst_size:
|
| 612 |
+
print("Position interpolate %s from %dx%d to %dx%d" % (param_name, src_size, src_size, dst_size, dst_size))
|
| 613 |
+
|
| 614 |
+
# extra_tokens = rel_pos_bias[-num_extra_tokens:, :]
|
| 615 |
+
# rel_pos_bias = rel_pos_bias[:-num_extra_tokens, :]
|
| 616 |
+
|
| 617 |
+
def geometric_progression(a, r, n):
|
| 618 |
+
return a * (1.0 - r ** n) / (1.0 - r)
|
| 619 |
+
|
| 620 |
+
left, right = 1.01, 1.5
|
| 621 |
+
while right - left > 1e-6:
|
| 622 |
+
q = (left + right) / 2.0
|
| 623 |
+
gp = geometric_progression(1, q, src_size // 2)
|
| 624 |
+
if gp > dst_size // 2:
|
| 625 |
+
right = q
|
| 626 |
+
else:
|
| 627 |
+
left = q
|
| 628 |
+
|
| 629 |
+
# if q > 1.090307:
|
| 630 |
+
# q = 1.090307
|
| 631 |
+
|
| 632 |
+
dis = []
|
| 633 |
+
cur = 1
|
| 634 |
+
for i in range(src_size // 2):
|
| 635 |
+
dis.append(cur)
|
| 636 |
+
cur += q ** (i + 1)
|
| 637 |
+
|
| 638 |
+
r_ids = [-_ for _ in reversed(dis)]
|
| 639 |
+
|
| 640 |
+
x = r_ids + [0] + dis
|
| 641 |
+
y = r_ids + [0] + dis
|
| 642 |
+
|
| 643 |
+
t = dst_size // 2.0
|
| 644 |
+
dx = np.arange(-t, t + 0.1, 1.0)
|
| 645 |
+
dy = np.arange(-t, t + 0.1, 1.0)
|
| 646 |
+
|
| 647 |
+
# print("Original positions = %s" % str(x))
|
| 648 |
+
# print("Target positions = %s" % str(dx))
|
| 649 |
+
|
| 650 |
+
all_rel_pos_bias = []
|
| 651 |
+
|
| 652 |
+
for i in range(num_attn_heads):
|
| 653 |
+
z = rel_pos_bias[:, i].view(src_size, src_size).float().numpy()
|
| 654 |
+
f = interpolate.interp2d(x, y, z, kind='cubic')
|
| 655 |
+
all_rel_pos_bias.append(
|
| 656 |
+
torch.Tensor(f(dx, dy)).contiguous().view(-1, 1).to(rel_pos_bias.device))
|
| 657 |
+
|
| 658 |
+
rel_pos_bias = torch.cat(all_rel_pos_bias, dim=-1)
|
| 659 |
+
|
| 660 |
+
return rel_pos_bias
|
ram/models/tag2text.py
ADDED
|
@@ -0,0 +1,419 @@
|
| 1 |
+
'''
|
| 2 |
+
* The Tag2Text Model
|
| 3 |
+
* Written by Xinyu Huang
|
| 4 |
+
'''
|
| 5 |
+
import numpy as np
|
| 6 |
+
import json
|
| 7 |
+
import torch
|
| 8 |
+
import warnings
|
| 9 |
+
|
| 10 |
+
from torch import nn
|
| 11 |
+
from .bert import BertConfig, BertModel, BertLMHeadModel
|
| 12 |
+
from .swin_transformer import SwinTransformer
|
| 13 |
+
|
| 14 |
+
from .utils import *
|
| 15 |
+
|
| 16 |
+
warnings.filterwarnings("ignore")
|
| 17 |
+
|
| 18 |
+
|
| 19 |
+
class Tag2Text(nn.Module):
|
| 20 |
+
|
| 21 |
+
def __init__(self,
|
| 22 |
+
med_config=f'{CONFIG_PATH}/configs/med_config.json',
|
| 23 |
+
image_size=384,
|
| 24 |
+
vit='base',
|
| 25 |
+
vit_grad_ckpt=False,
|
| 26 |
+
vit_ckpt_layer=0,
|
| 27 |
+
prompt='a picture of ',
|
| 28 |
+
threshold=0.68,
|
| 29 |
+
delete_tag_index=[127,2961, 3351, 3265, 3338, 3355, 3359],
|
| 30 |
+
tag_list=f'{CONFIG_PATH}/data/tag_list.txt'):
|
| 31 |
+
r""" Tag2Text inference module, both captioning and tagging are included.
|
| 32 |
+
Tag2Text is an efficient and controllable vision-language pre-training framework.
|
| 33 |
+
Described in the paper "Tag2Text: Guiding Vision-Language Model via Image Tagging" https://arxiv.org/abs/2303.05657
|
| 34 |
+
|
| 35 |
+
Args:
|
| 36 |
+
med_config (str): path for the mixture of encoder-decoder model's configuration file
|
| 37 |
+
image_size (int): input image size
|
| 38 |
+
vit (str): model size of vision transformer
|
| 39 |
+
threshold (float): tagging threshold
|
| 40 |
+
delete_tag_index (list): delete some tags that may disturb captioning
|
| 41 |
+
"""
|
| 42 |
+
super().__init__()
|
| 43 |
+
|
| 44 |
+
# create image encoder
|
| 45 |
+
if vit == 'swin_b':
|
| 46 |
+
if image_size == 224:
|
| 47 |
+
vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_224.json'
|
| 48 |
+
elif image_size == 384:
|
| 49 |
+
vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_384.json'
|
| 50 |
+
vision_config = read_json(vision_config_path)
|
| 51 |
+
assert image_size == vision_config['image_res']
|
| 52 |
+
# assert config['patch_size'] == 32
|
| 53 |
+
vision_width = vision_config['vision_width']
|
| 54 |
+
|
| 55 |
+
self.visual_encoder = SwinTransformer(
|
| 56 |
+
img_size=vision_config['image_res'],
|
| 57 |
+
patch_size=4,
|
| 58 |
+
in_chans=3,
|
| 59 |
+
embed_dim=vision_config['embed_dim'],
|
| 60 |
+
depths=vision_config['depths'],
|
| 61 |
+
num_heads=vision_config['num_heads'],
|
| 62 |
+
window_size=vision_config['window_size'],
|
| 63 |
+
mlp_ratio=4.,
|
| 64 |
+
qkv_bias=True,
|
| 65 |
+
drop_rate=0.0,
|
| 66 |
+
drop_path_rate=0.1,
|
| 67 |
+
ape=False,
|
| 68 |
+
patch_norm=True,
|
| 69 |
+
use_checkpoint=False)
|
| 70 |
+
|
| 71 |
+
else:
|
| 72 |
+
self.visual_encoder, vision_width = create_vit(
|
| 73 |
+
vit, image_size, vit_grad_ckpt, vit_ckpt_layer)
|
| 74 |
+
|
| 75 |
+
# create tokenizer
|
| 76 |
+
self.tokenizer = init_tokenizer()
|
| 77 |
+
|
| 78 |
+
# Tag2Text employs an encoder-decoder architecture for image-tag-text generation: an image-tag interaction encoder and an image-tag-text decoder
|
| 79 |
+
# create image-tag interaction encoder
|
| 80 |
+
encoder_config = BertConfig.from_json_file(med_config)
|
| 81 |
+
encoder_config.encoder_width = vision_width
|
| 82 |
+
self.tag_encoder = BertModel(config=encoder_config,
|
| 83 |
+
add_pooling_layer=False)
|
| 84 |
+
|
| 85 |
+
# create image-tag-text decoder
|
| 86 |
+
decoder_config = BertConfig.from_json_file(med_config)
|
| 87 |
+
self.text_decoder = BertLMHeadModel(config=decoder_config)
|
| 88 |
+
|
| 89 |
+
# delete some tags that may disturb captioning
|
| 90 |
+
# 127: "quarter"; 2961: "back"; 3351: "two"; 3265: "three"; 3338: "four"; 3355: "five"; 3359: "one"
|
| 91 |
+
self.delete_tag_index = delete_tag_index
|
| 92 |
+
self.prompt = prompt
|
| 93 |
+
self.prompt_length = len(self.tokenizer(self.prompt).input_ids) - 1
|
| 94 |
+
|
| 95 |
+
# load tag list
|
| 96 |
+
self.tag_list = self.load_tag_list(tag_list)
|
| 97 |
+
|
| 98 |
+
# create image-tag recognition decoder
|
| 99 |
+
self.threshold = threshold
|
| 100 |
+
self.num_class = len(self.tag_list)
|
| 101 |
+
q2l_config = BertConfig.from_json_file(f'{CONFIG_PATH}/configs/q2l_config.json')
|
| 102 |
+
q2l_config.encoder_width = vision_width
|
| 103 |
+
self.tagging_head = BertModel(config=q2l_config,
|
| 104 |
+
add_pooling_layer=False)
|
| 105 |
+
self.tagging_head.resize_token_embeddings(len(self.tokenizer))
|
| 106 |
+
self.label_embed = nn.Embedding(self.num_class, q2l_config.hidden_size)
|
| 107 |
+
self.fc = GroupWiseLinear(self.num_class,
|
| 108 |
+
q2l_config.hidden_size,
|
| 109 |
+
bias=True)
|
| 110 |
+
self.del_selfattention()
|
| 111 |
+
|
| 112 |
+
self.tagging_loss_function = AsymmetricLoss(gamma_neg=7,
|
| 113 |
+
gamma_pos=0,
|
| 114 |
+
clip=0.05)
|
| 115 |
+
|
| 116 |
+
# share weights of the lowest 2-layer of "image-tag interaction encoder" with the "image-tag recogntion decoder"
|
| 117 |
+
tie_encoder_decoder_weights(self.tag_encoder, self.tagging_head, '',
|
| 118 |
+
' ')
|
| 119 |
+
|
| 120 |
+
# adjust thresholds for some tags
|
| 121 |
+
# default threshold: 0.68
|
| 122 |
+
# 2701: "person"; 2828: "man"; 1167: "woman";
|
| 123 |
+
tag_thrshold = {2701:0.7, 2828: 0.7, 1167: 0.7}
|
| 124 |
+
self.class_threshold = torch.ones(self.num_class) * self.threshold
|
| 125 |
+
for key,value in tag_thrshold.items():
|
| 126 |
+
self.class_threshold[key] = value
|
| 127 |
+
|
| 128 |
+
def load_tag_list(self, tag_list_file):
|
| 129 |
+
with open(tag_list_file, 'r') as f:
|
| 130 |
+
tag_list = f.read().splitlines()
|
| 131 |
+
tag_list = np.array(tag_list)
|
| 132 |
+
return tag_list
|
| 133 |
+
|
| 134 |
+
# delete self-attention layer of image-tag recognition decoder to reduce computation, follower Query2Label
|
| 135 |
+
def del_selfattention(self):
|
| 136 |
+
del self.tagging_head.embeddings
|
| 137 |
+
for layer in self.tagging_head.encoder.layer:
|
| 138 |
+
del layer.attention
|
| 139 |
+
|
| 140 |
+
|
| 141 |
+
def forward(self, image, caption, tag):
|
| 142 |
+
"""
|
| 143 |
+
call function as forward
|
| 144 |
+
|
| 145 |
+
Args:
|
| 146 |
+
image: type: torch.Tensor shape: batch_size * 3 * 384 * 384
|
| 147 |
+
caption: type: list[string] len: batch_size
|
| 148 |
+
tag: type: torch.Tensor shape: batch * class_num (e.g. 3429) value: positive sample is 1.0, negative sample is 0.0
|
| 149 |
+
|
| 150 |
+
Returns:
|
| 151 |
+
loss: type: torch.Tensor
|
| 152 |
+
"""
|
| 153 |
+
|
| 154 |
+
image_embeds = self.visual_encoder(image)
|
| 155 |
+
image_atts = torch.ones(image_embeds.size()[:-1],
|
| 156 |
+
dtype=torch.long).to(image.device)
|
| 157 |
+
|
| 158 |
+
##================= Image Tagging ================##
|
| 159 |
+
bs = image_embeds.shape[0]
|
| 160 |
+
label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)
|
| 161 |
+
|
| 162 |
+
tagging_embed = self.tagging_head(
|
| 163 |
+
encoder_embeds=label_embed,
|
| 164 |
+
encoder_hidden_states=image_embeds,
|
| 165 |
+
encoder_attention_mask=image_atts,
|
| 166 |
+
return_dict=False,
|
| 167 |
+
mode='tagging',
|
| 168 |
+
)
|
| 169 |
+
|
| 170 |
+
logits = self.fc(tagging_embed[0])
|
| 171 |
+
|
| 172 |
+
loss_tag = self.tagging_loss_function(logits, tag)
|
| 173 |
+
|
| 174 |
+
##================= Image-Tag-Text Generation ================##
|
| 175 |
+
tag = tag.cpu().numpy()
|
| 176 |
+
tag_input = []
|
| 177 |
+
for b in range(bs):
|
| 178 |
+
index = np.argwhere(tag[b] == 1)
|
| 179 |
+
token = self.tag_list[index].squeeze(axis=1)
|
| 180 |
+
tag_input.append(' | '.join(token))
|
| 181 |
+
|
| 182 |
+
# tokenizer input tags
|
| 183 |
+
tag_input_tokenzier = self.tokenizer(tag_input,
|
| 184 |
+
padding='max_length',
|
| 185 |
+
truncation=True,
|
| 186 |
+
max_length=40,
|
| 187 |
+
return_tensors="pt").to(
|
| 188 |
+
image.device)
|
| 189 |
+
encoder_input_ids = tag_input_tokenzier.input_ids
|
| 190 |
+
encoder_input_ids[:, 0] = self.tokenizer.enc_token_id
|
| 191 |
+
|
| 192 |
+
# put input tag into image-tag interaction encoder to interact with image embeddings
|
| 193 |
+
output_tagembedding = self.tag_encoder(
|
| 194 |
+
encoder_input_ids,
|
| 195 |
+
attention_mask=tag_input_tokenzier.attention_mask,
|
| 196 |
+
encoder_hidden_states=image_embeds,
|
| 197 |
+
encoder_attention_mask=image_atts,
|
| 198 |
+
return_dict=True,
|
| 199 |
+
)
|
| 200 |
+
|
| 201 |
+
text = self.tokenizer(caption,
|
| 202 |
+
padding='longest',
|
| 203 |
+
truncation=True,
|
| 204 |
+
max_length=40,
|
| 205 |
+
return_tensors="pt").to(
|
| 206 |
+
image.device)
|
| 207 |
+
|
| 208 |
+
decoder_input_ids = text.input_ids
|
| 209 |
+
decoder_input_ids[:,0] = self.tokenizer.bos_token_id
|
| 210 |
+
|
| 211 |
+
decoder_targets = decoder_input_ids.masked_fill(
|
| 212 |
+
decoder_input_ids == self.tokenizer.pad_token_id, -100)
|
| 213 |
+
decoder_targets[:,:self.prompt_length] = -100
|
| 214 |
+
|
| 215 |
+
decoder_output = self.text_decoder(decoder_input_ids,
|
| 216 |
+
attention_mask = text.attention_mask,
|
| 217 |
+
encoder_hidden_states = output_tagembedding.last_hidden_state,
|
| 218 |
+
encoder_attention_mask = None,
|
| 219 |
+
labels = decoder_targets,
|
| 220 |
+
return_dict = True,
|
| 221 |
+
)
|
| 222 |
+
|
| 223 |
+
loss_t2t = decoder_output.loss
|
| 224 |
+
|
| 225 |
+
# balance loss scale
|
| 226 |
+
loss = loss_t2t + loss_tag/(loss_tag/loss_t2t).detach()
|
| 227 |
+
|
| 228 |
+
return loss
|
| 229 |
+
|
| 230 |
+
def generate_image_embeds(self,
|
| 231 |
+
image,
|
| 232 |
+
condition=False
|
| 233 |
+
):
|
| 234 |
+
|
| 235 |
+
image_embeds = self.visual_encoder(image)
|
| 236 |
+
|
| 237 |
+
return image_embeds
|
| 238 |
+
|
| 239 |
+
def condition_forward(self,
|
| 240 |
+
image,
|
| 241 |
+
sample=False,
|
| 242 |
+
num_beams=3,
|
| 243 |
+
max_length=30,
|
| 244 |
+
min_length=10,
|
| 245 |
+
top_p=0.9,
|
| 246 |
+
repetition_penalty=1.0,
|
| 247 |
+
tag_input=None,
|
| 248 |
+
return_tag_predict=False):
|
| 249 |
+
|
| 250 |
+
image_embeds = self.visual_encoder(image)
|
| 251 |
+
image_atts = torch.ones(image_embeds.size()[:-1],
|
| 252 |
+
dtype=torch.long).to(image.device)
|
| 253 |
+
|
| 254 |
+
# if not user specified tags, recognized image tags using image-tag recogntiion decoder
|
| 255 |
+
|
| 256 |
+
|
| 257 |
+
bs = image_embeds.shape[0]
|
| 258 |
+
label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)
|
| 259 |
+
tagging_embed = self.tagging_head(
|
| 260 |
+
encoder_embeds=label_embed,
|
| 261 |
+
encoder_hidden_states=image_embeds,
|
| 262 |
+
encoder_attention_mask=image_atts,
|
| 263 |
+
return_dict=False,
|
| 264 |
+
mode='tagging',
|
| 265 |
+
)
|
| 266 |
+
|
| 267 |
+
logits = self.fc(tagging_embed[0])
|
| 268 |
+
|
| 269 |
+
targets = torch.where(
|
| 270 |
+
torch.sigmoid(logits) > self.class_threshold.to(image.device),
|
| 271 |
+
torch.tensor(1.0).to(image.device),
|
| 272 |
+
torch.zeros(self.num_class).to(image.device))
|
| 273 |
+
|
| 274 |
+
# delete some tags that may disturb captioning
|
| 275 |
+
targets[:, self.delete_tag_index] = 0
|
| 276 |
+
|
| 277 |
+
return image_embeds, logits, targets
|
| 278 |
+
|
| 279 |
+
|
| 280 |
+
def generate(self,
|
| 281 |
+
image,
|
| 282 |
+
sample=False,
|
| 283 |
+
num_beams=3,
|
| 284 |
+
max_length=30,
|
| 285 |
+
min_length=10,
|
| 286 |
+
top_p=0.9,
|
| 287 |
+
repetition_penalty=1.0,
|
| 288 |
+
tag_input=None,
|
| 289 |
+
return_tag_predict=False):
|
| 290 |
+
|
| 291 |
+
image_embeds = self.visual_encoder(image)
|
| 292 |
+
image_atts = torch.ones(image_embeds.size()[:-1],
|
| 293 |
+
dtype=torch.long).to(image.device)
|
| 294 |
+
|
| 295 |
+
# if not user specified tags, recognized image tags using image-tag recogntiion decoder
|
| 296 |
+
if tag_input == None:
|
| 297 |
+
|
| 298 |
+
bs = image_embeds.shape[0]
|
| 299 |
+
label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)
|
| 300 |
+
tagging_embed = self.tagging_head(
|
| 301 |
+
encoder_embeds=label_embed,
|
| 302 |
+
encoder_hidden_states=image_embeds,
|
| 303 |
+
encoder_attention_mask=image_atts,
|
| 304 |
+
return_dict=False,
|
| 305 |
+
mode='tagging',
|
| 306 |
+
)
|
| 307 |
+
|
| 308 |
+
logits = self.fc(tagging_embed[0])
|
| 309 |
+
|
| 310 |
+
targets = torch.where(
|
| 311 |
+
torch.sigmoid(logits) > self.class_threshold.to(image.device),
|
| 312 |
+
torch.tensor(1.0).to(image.device),
|
| 313 |
+
torch.zeros(self.num_class).to(image.device))
|
| 314 |
+
|
| 315 |
+
tag = targets.cpu().numpy()
|
| 316 |
+
|
| 317 |
+
# delete some tags that may disturb captioning
|
| 318 |
+
tag[:, self.delete_tag_index] = 0
|
| 319 |
+
|
| 320 |
+
tag_input = []
|
| 321 |
+
for b in range(bs):
|
| 322 |
+
index = np.argwhere(tag[b] == 1)
|
| 323 |
+
token = self.tag_list[index].squeeze(axis=1)
|
| 324 |
+
tag_input.append(', '.join(token))
|
| 325 |
+
|
| 326 |
+
tag_output = tag_input
|
| 327 |
+
|
| 328 |
+
# beam search for text generation(default)
|
| 329 |
+
if not sample:
|
| 330 |
+
image_embeds = image_embeds.repeat_interleave(num_beams, dim=0)
|
| 331 |
+
tag_input_temp = []
|
| 332 |
+
for tag in tag_input:
|
| 333 |
+
for i in range(num_beams):
|
| 334 |
+
tag_input_temp.append(tag)
|
| 335 |
+
tag_input = tag_input_temp
|
| 336 |
+
|
| 337 |
+
image_atts = torch.ones(image_embeds.size()[:-1],
|
| 338 |
+
dtype=torch.long).to(image.device)
|
| 339 |
+
|
| 340 |
+
# tokenizer input tags
|
| 341 |
+
tag_input_tokenzier = self.tokenizer(tag_input,
|
| 342 |
+
padding='max_length',
|
| 343 |
+
truncation=True,
|
| 344 |
+
max_length=40,
|
| 345 |
+
return_tensors="pt").to(
|
| 346 |
+
image.device)
|
| 347 |
+
encoder_input_ids = tag_input_tokenzier.input_ids
|
| 348 |
+
encoder_input_ids[:, 0] = self.tokenizer.enc_token_id
|
| 349 |
+
|
| 350 |
+
# put input tag into image-tag interaction encoder to interact with image embeddings
|
| 351 |
+
output_tagembedding = self.tag_encoder(
|
| 352 |
+
encoder_input_ids,
|
| 353 |
+
attention_mask=tag_input_tokenzier.attention_mask,
|
| 354 |
+
encoder_hidden_states=image_embeds,
|
| 355 |
+
encoder_attention_mask=image_atts,
|
| 356 |
+
return_dict=True,
|
| 357 |
+
)
|
| 358 |
+
|
| 359 |
+
# prompt trick for better captioning, followed BLIP
|
| 360 |
+
prompt = [self.prompt] * image.size(0)
|
| 361 |
+
input_ids = self.tokenizer(prompt, return_tensors="pt").input_ids.to(
|
| 362 |
+
image.device)
|
| 363 |
+
input_ids[:, 0] = self.tokenizer.bos_token_id
|
| 364 |
+
input_ids = input_ids[:, :-1]
|
| 365 |
+
|
| 366 |
+
if sample:
|
| 367 |
+
# nucleus sampling
|
| 368 |
+
model_kwargs = {
|
| 369 |
+
"encoder_hidden_states": output_tagembedding.last_hidden_state,
|
| 370 |
+
"encoder_attention_mask": None
|
| 371 |
+
}
|
| 372 |
+
outputs = self.text_decoder.generate(
|
| 373 |
+
input_ids=input_ids,
|
| 374 |
+
max_length=max_length,
|
| 375 |
+
min_length=min_length,
|
| 376 |
+
do_sample=True,
|
| 377 |
+
top_p=top_p,
|
| 378 |
+
num_return_sequences=1,
|
| 379 |
+
eos_token_id=self.tokenizer.sep_token_id,
|
| 380 |
+
pad_token_id=self.tokenizer.pad_token_id,
|
| 381 |
+
repetition_penalty=1.1,
|
| 382 |
+
**model_kwargs)
|
| 383 |
+
else:
|
| 384 |
+
# beam search (default)
|
| 385 |
+
model_kwargs = {
|
| 386 |
+
"encoder_hidden_states": output_tagembedding.last_hidden_state,
|
| 387 |
+
"encoder_attention_mask": None
|
| 388 |
+
}
|
| 389 |
+
outputs = self.text_decoder.generate(
|
| 390 |
+
input_ids=input_ids,
|
| 391 |
+
max_length=max_length,
|
| 392 |
+
min_length=min_length,
|
| 393 |
+
num_beams=num_beams,
|
| 394 |
+
eos_token_id=self.tokenizer.sep_token_id,
|
| 395 |
+
pad_token_id=self.tokenizer.pad_token_id,
|
| 396 |
+
repetition_penalty=repetition_penalty,
|
| 397 |
+
**model_kwargs)
|
| 398 |
+
|
| 399 |
+
captions = []
|
| 400 |
+
for output in outputs:
|
| 401 |
+
caption = self.tokenizer.decode(output, skip_special_tokens=True)
|
| 402 |
+
captions.append(caption[len(self.prompt):])
|
| 403 |
+
if return_tag_predict == True:
|
| 404 |
+
return captions, tag_output
|
| 405 |
+
return captions
|
| 406 |
+
|
| 407 |
+
|
| 408 |
+
# load Tag2Text pretrained model parameters
|
| 409 |
+
def tag2text(pretrained='', **kwargs):
|
| 410 |
+
model = Tag2Text(**kwargs)
|
| 411 |
+
if pretrained:
|
| 412 |
+
if kwargs['vit'] == 'swin_b':
|
| 413 |
+
model, msg = load_checkpoint_swinbase(model, pretrained, kwargs)
|
| 414 |
+
else:
|
| 415 |
+
model, msg = load_checkpoint(model, pretrained)
|
| 416 |
+
print('vit:', kwargs['vit'])
|
| 417 |
+
# print('msg', msg)
|
| 418 |
+
return model
|
| 419 |
+
|
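For reference, a minimal inference sketch for the module above. The checkpoint filename is hypothetical, and the 384×384 normalization constants are the standard ImageNet values assumed here rather than read from this repo; the sample image path is one of the files shipped under samples/.

import torch
from PIL import Image
from torchvision import transforms
from ram.models.tag2text import tag2text

# standard 384x384 preprocessing (assumed ImageNet mean/std)
transform = transforms.Compose([
    transforms.Resize((384, 384)),
    transforms.ToTensor(),
    transforms.Normalize(mean=[0.485, 0.456, 0.406],
                         std=[0.229, 0.224, 0.225]),
])

# hypothetical local checkpoint path
model = tag2text(pretrained='tag2text_swin_14m.pth',
                 image_size=384, vit='swin_b').eval()

image = transform(Image.open('samples/0064.png').convert('RGB')).unsqueeze(0)
with torch.no_grad():
    captions, tags = model.generate(image, return_tag_predict=True)
print(tags[0], '->', captions[0])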
ram/models/tag2text_lora.py
ADDED
@@ -0,0 +1,419 @@
'''
 * The Tag2Text Model
 * Written by Xinyu Huang
'''
import numpy as np
import json
import torch
import warnings

from torch import nn
from .bert_lora import BertConfig, BertModel, BertLMHeadModel
from .swin_transformer_lora import SwinTransformer

from .utils import *

warnings.filterwarnings("ignore")


class Tag2Text(nn.Module):

    def __init__(self,
                 med_config=f'{CONFIG_PATH}/configs/med_config.json',
                 image_size=384,
                 vit='base',
                 vit_grad_ckpt=False,
                 vit_ckpt_layer=0,
                 prompt='a picture of ',
                 threshold=0.68,
                 delete_tag_index=[127, 2961, 3351, 3265, 3338, 3355, 3359],
                 tag_list=f'{CONFIG_PATH}/data/tag_list.txt'):
        r""" Tag2Text inference module; both captioning and tagging are included.
        Tag2Text is an efficient and controllable vision-language pre-training framework.
        Described in the paper "Tag2Text: Guiding Vision-Language Model via Image Tagging" https://arxiv.org/abs/2303.05657

        Args:
            med_config (str): path to the mixture-of-encoder-decoder model's configuration file
            image_size (int): input image size
            vit (str): model size of the vision transformer
            threshold (float): tagging threshold
            delete_tag_index (list): tags to delete because they may disturb captioning
        """
        super().__init__()

        # create image encoder
        if vit == 'swin_b':
            if image_size == 224:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_224.json'
            elif image_size == 384:
                vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_384.json'
            vision_config = read_json(vision_config_path)
            assert image_size == vision_config['image_res']
            # assert config['patch_size'] == 32
            vision_width = vision_config['vision_width']

            self.visual_encoder = SwinTransformer(
                img_size=vision_config['image_res'],
                patch_size=4,
                in_chans=3,
                embed_dim=vision_config['embed_dim'],
                depths=vision_config['depths'],
                num_heads=vision_config['num_heads'],
                window_size=vision_config['window_size'],
                mlp_ratio=4.,
                qkv_bias=True,
                drop_rate=0.0,
                drop_path_rate=0.1,
                ape=False,
                patch_norm=True,
                use_checkpoint=False)

        else:
            self.visual_encoder, vision_width = create_vit(
                vit, image_size, vit_grad_ckpt, vit_ckpt_layer)

        # create tokenizer
        self.tokenizer = init_tokenizer()

        # Tag2Text employs an encoder-decoder architecture for image-tag-text generation: an image-tag interaction encoder and an image-tag-text decoder
        # create image-tag interaction encoder
        encoder_config = BertConfig.from_json_file(med_config)
        encoder_config.encoder_width = vision_width
        self.tag_encoder = BertModel(config=encoder_config,
                                     add_pooling_layer=False)

        # create image-tag-text decoder
        decoder_config = BertConfig.from_json_file(med_config)
        self.text_decoder = BertLMHeadModel(config=decoder_config)

        # delete some tags that may disturb captioning
        # 127: "quarter"; 2961: "back"; 3351: "two"; 3265: "three"; 3338: "four"; 3355: "five"; 3359: "one"
        self.delete_tag_index = delete_tag_index
        self.prompt = prompt
        self.prompt_length = len(self.tokenizer(self.prompt).input_ids) - 1

        # load tag list
        self.tag_list = self.load_tag_list(tag_list)

        # create image-tag recognition decoder
        self.threshold = threshold
        self.num_class = len(self.tag_list)
        q2l_config = BertConfig.from_json_file(f'{CONFIG_PATH}/configs/q2l_config.json')
        q2l_config.encoder_width = vision_width
        self.tagging_head = BertModel(config=q2l_config,
                                      add_pooling_layer=False)
        self.tagging_head.resize_token_embeddings(len(self.tokenizer))
        self.label_embed = nn.Embedding(self.num_class, q2l_config.hidden_size)
        self.fc = GroupWiseLinear(self.num_class,
                                  q2l_config.hidden_size,
                                  bias=True)
        self.del_selfattention()

        self.tagging_loss_function = AsymmetricLoss(gamma_neg=7,
                                                    gamma_pos=0,
                                                    clip=0.05)

        # share the weights of the lowest 2 layers of the "image-tag interaction encoder" with the "image-tag recognition decoder"
        tie_encoder_decoder_weights(self.tag_encoder, self.tagging_head, '',
                                    ' ')

        # adjust thresholds for some tags
        # default threshold: 0.68
        # 2701: "person"; 2828: "man"; 1167: "woman"
        tag_threshold = {2701: 0.7, 2828: 0.7, 1167: 0.7}
        self.class_threshold = torch.ones(self.num_class) * self.threshold
        for key, value in tag_threshold.items():
            self.class_threshold[key] = value

    def load_tag_list(self, tag_list_file):
        with open(tag_list_file, 'r') as f:
            tag_list = f.read().splitlines()
        tag_list = np.array(tag_list)
        return tag_list

    # delete the self-attention layers of the image-tag recognition decoder to reduce computation, following Query2Label
    def del_selfattention(self):
        del self.tagging_head.embeddings
        for layer in self.tagging_head.encoder.layer:
            del layer.attention

    def forward(self, image, caption, tag):
        """
        Training forward pass.

        Args:
            image: type: torch.Tensor  shape: batch_size * 3 * 384 * 384
            caption: type: list[string]  len: batch_size
            tag: type: torch.Tensor  shape: batch * class_num (e.g. 3429)  value: positive sample is 1.0, negative sample is 0.0

        Returns:
            loss: type: torch.Tensor
        """

        image_embeds = self.visual_encoder(image)
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        ##================= Image Tagging ================##
        bs = image_embeds.shape[0]
        label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)

        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0])

        loss_tag = self.tagging_loss_function(logits, tag)

        ##================= Image-Tag-Text Generation ================##
        tag = tag.cpu().numpy()
        tag_input = []
        for b in range(bs):
            index = np.argwhere(tag[b] == 1)
            token = self.tag_list[index].squeeze(axis=1)
            tag_input.append(' | '.join(token))

        # tokenize input tags
        tag_input_tokenizer = self.tokenizer(tag_input,
                                             padding='max_length',
                                             truncation=True,
                                             max_length=40,
                                             return_tensors="pt").to(image.device)
        encoder_input_ids = tag_input_tokenizer.input_ids
        encoder_input_ids[:, 0] = self.tokenizer.enc_token_id

        # feed the input tags into the image-tag interaction encoder to interact with the image embeddings
        output_tagembedding = self.tag_encoder(
            encoder_input_ids,
            attention_mask=tag_input_tokenizer.attention_mask,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=True,
        )

        text = self.tokenizer(caption,
                              padding='longest',
                              truncation=True,
                              max_length=40,
                              return_tensors="pt").to(image.device)

        decoder_input_ids = text.input_ids
        decoder_input_ids[:, 0] = self.tokenizer.bos_token_id

        decoder_targets = decoder_input_ids.masked_fill(
            decoder_input_ids == self.tokenizer.pad_token_id, -100)
        decoder_targets[:, :self.prompt_length] = -100

        decoder_output = self.text_decoder(decoder_input_ids,
                                           attention_mask=text.attention_mask,
                                           encoder_hidden_states=output_tagembedding.last_hidden_state,
                                           encoder_attention_mask=None,
                                           labels=decoder_targets,
                                           return_dict=True,
                                           )

        loss_t2t = decoder_output.loss

        # balance the loss scales: rescale loss_tag to the magnitude of loss_t2t without changing its gradient direction
        loss = loss_t2t + loss_tag / (loss_tag / loss_t2t).detach()

        return loss

    def generate_image_embeds(self, image, condition=False):

        image_embeds = self.visual_encoder(image)

        return image_embeds

    def condition_forward(self,
                          image,
                          sample=False,
                          num_beams=3,
                          max_length=30,
                          min_length=10,
                          top_p=0.9,
                          repetition_penalty=1.0,
                          tag_input=None,
                          return_tag_predict=False):

        image_embeds = self.visual_encoder(image)
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # recognize image tags using the image-tag recognition decoder

        bs = image_embeds.shape[0]
        label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)
        tagging_embed = self.tagging_head(
            encoder_embeds=label_embed,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=False,
            mode='tagging',
        )

        logits = self.fc(tagging_embed[0])

        targets = torch.where(
            torch.sigmoid(logits) > self.class_threshold.to(image.device),
            torch.tensor(1.0).to(image.device),
            torch.zeros(self.num_class).to(image.device))

        # delete some tags that may disturb captioning
        targets[:, self.delete_tag_index] = 0

        return image_embeds, logits, targets

    def generate(self,
                 image,
                 sample=False,
                 num_beams=3,
                 max_length=30,
                 min_length=10,
                 top_p=0.9,
                 repetition_penalty=1.0,
                 tag_input=None,
                 return_tag_predict=False):

        image_embeds = self.visual_encoder(image)
        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # if the user did not specify tags, recognize image tags using the image-tag recognition decoder
        if tag_input is None:

            bs = image_embeds.shape[0]
            label_embed = self.label_embed.weight.unsqueeze(0).repeat(bs, 1, 1)
            tagging_embed = self.tagging_head(
                encoder_embeds=label_embed,
                encoder_hidden_states=image_embeds,
                encoder_attention_mask=image_atts,
                return_dict=False,
                mode='tagging',
            )

            logits = self.fc(tagging_embed[0])

            targets = torch.where(
                torch.sigmoid(logits) > self.class_threshold.to(image.device),
                torch.tensor(1.0).to(image.device),
                torch.zeros(self.num_class).to(image.device))

            tag = targets.cpu().numpy()

            # delete some tags that may disturb captioning
            tag[:, self.delete_tag_index] = 0

            tag_input = []
            for b in range(bs):
                index = np.argwhere(tag[b] == 1)
                token = self.tag_list[index].squeeze(axis=1)
                tag_input.append(', '.join(token))

        tag_output = tag_input

        # beam search for text generation (default)
        if not sample:
            image_embeds = image_embeds.repeat_interleave(num_beams, dim=0)
            tag_input_temp = []
            for tag in tag_input:
                for i in range(num_beams):
                    tag_input_temp.append(tag)
            tag_input = tag_input_temp

        image_atts = torch.ones(image_embeds.size()[:-1],
                                dtype=torch.long).to(image.device)

        # tokenize input tags
        tag_input_tokenizer = self.tokenizer(tag_input,
                                             padding='max_length',
                                             truncation=True,
                                             max_length=40,
                                             return_tensors="pt").to(image.device)
        encoder_input_ids = tag_input_tokenizer.input_ids
        encoder_input_ids[:, 0] = self.tokenizer.enc_token_id

        # feed the input tags into the image-tag interaction encoder to interact with the image embeddings
        output_tagembedding = self.tag_encoder(
            encoder_input_ids,
            attention_mask=tag_input_tokenizer.attention_mask,
            encoder_hidden_states=image_embeds,
            encoder_attention_mask=image_atts,
            return_dict=True,
        )

        # prompt trick for better captioning, following BLIP
        prompt = [self.prompt] * image.size(0)
        input_ids = self.tokenizer(prompt, return_tensors="pt").input_ids.to(image.device)
        input_ids[:, 0] = self.tokenizer.bos_token_id
        input_ids = input_ids[:, :-1]

        if sample:
            # nucleus sampling
            model_kwargs = {
                "encoder_hidden_states": output_tagembedding.last_hidden_state,
                "encoder_attention_mask": None
            }
            outputs = self.text_decoder.generate(
                input_ids=input_ids,
                max_length=max_length,
                min_length=min_length,
                do_sample=True,
                top_p=top_p,
                num_return_sequences=1,
                eos_token_id=self.tokenizer.sep_token_id,
                pad_token_id=self.tokenizer.pad_token_id,
                repetition_penalty=1.1,
                **model_kwargs)
        else:
            # beam search (default)
            model_kwargs = {
                "encoder_hidden_states": output_tagembedding.last_hidden_state,
                "encoder_attention_mask": None
            }
            outputs = self.text_decoder.generate(
                input_ids=input_ids,
                max_length=max_length,
                min_length=min_length,
                num_beams=num_beams,
                eos_token_id=self.tokenizer.sep_token_id,
                pad_token_id=self.tokenizer.pad_token_id,
                repetition_penalty=repetition_penalty,
                **model_kwargs)

        captions = []
        for output in outputs:
            caption = self.tokenizer.decode(output, skip_special_tokens=True)
            captions.append(caption[len(self.prompt):])
        if return_tag_predict:
            return captions, tag_output
        return captions


# load Tag2Text pretrained model parameters
def tag2text(pretrained='', **kwargs):
    model = Tag2Text(**kwargs)
    if pretrained:
        if kwargs['vit'] == 'swin_b':
            model, msg = load_checkpoint_swinbase(model, pretrained, kwargs)
        else:
            model, msg = load_checkpoint(model, pretrained)
        print('vit:', kwargs['vit'])
        # print('msg', msg)
    return model
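This LoRA variant is constructed the same way as the base model; the usual fine-tuning pattern is to freeze everything except the injected low-rank parameters. The sketch below assumes those parameters carry 'lora' in their names, which is the convention of lora_layers.py-style injection but is not confirmed here; check lora/lora_utils.py before relying on it.

import torch
from ram.models.tag2text_lora import tag2text

model = tag2text(image_size=384, vit='swin_b')

# assumption: LoRA parameters are identifiable by a 'lora' substring
for name, param in model.named_parameters():
    param.requires_grad = 'lora' in name.lower()

trainable = [p for p in model.parameters() if p.requires_grad]
optimizer = torch.optim.AdamW(trainable, lr=1e-4)
print(f'trainable params: {sum(p.numel() for p in trainable):,}')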
ram/models/utils.py
ADDED
@@ -0,0 +1,365 @@
import os
import json
import torch
import math
import logging

from torch import nn
from typing import List
from transformers import BertTokenizer
from urllib.parse import urlparse
from timm.models.hub import download_cached_file
# VisionTransformer is needed by create_vit below
from .vit import VisionTransformer, interpolate_pos_embed
from .swin_transformer import interpolate_relative_pos_embed
from pathlib import Path

CONFIG_PATH = Path(__file__).resolve().parents[1]

# module-level logger, used by tie_encoder_decoder_weights
logger = logging.getLogger(__name__)


def read_json(rpath):
    with open(rpath, 'r') as f:
        return json.load(f)


def tie_encoder_decoder_weights(encoder: nn.Module, decoder: nn.Module,
                                base_model_prefix: str, skip_key: str):
    uninitialized_encoder_weights: List[str] = []
    if decoder.__class__ != encoder.__class__:
        logger.info(
            f"{decoder.__class__} and {encoder.__class__} are not equal. In this case make sure that all encoder weights are correctly initialized."
        )

    def tie_encoder_to_decoder_recursively(
        decoder_pointer: nn.Module,
        encoder_pointer: nn.Module,
        module_name: str,
        uninitialized_encoder_weights: List[str],
        skip_key: str,
        depth=0,
    ):
        assert isinstance(decoder_pointer, nn.Module) and isinstance(
            encoder_pointer, nn.Module
        ), f"{decoder_pointer} and {encoder_pointer} have to be of type torch.nn.Module"
        if hasattr(decoder_pointer, "weight") and skip_key not in module_name:
            assert hasattr(encoder_pointer, "weight")
            encoder_pointer.weight = decoder_pointer.weight
            if hasattr(decoder_pointer, "bias"):
                assert hasattr(encoder_pointer, "bias")
                encoder_pointer.bias = decoder_pointer.bias
            print(module_name + ' is tied')
            return

        encoder_modules = encoder_pointer._modules
        decoder_modules = decoder_pointer._modules
        if len(decoder_modules) > 0:
            assert (
                len(encoder_modules) > 0
            ), f"Encoder module {encoder_pointer} does not match decoder module {decoder_pointer}"

            all_encoder_weights = set([
                module_name + "/" + sub_name
                for sub_name in encoder_modules.keys()
            ])
            encoder_layer_pos = 0
            for name, module in decoder_modules.items():
                if name.isdigit():
                    encoder_name = str(int(name) + encoder_layer_pos)
                    decoder_name = name
                    if not isinstance(
                            decoder_modules[decoder_name],
                            type(encoder_modules[encoder_name])) and len(
                                encoder_modules) != len(decoder_modules):
                        # this can happen if the name corresponds to the position in a list of module layers;
                        # in this case the decoder has added a cross-attention layer that the encoder does not have,
                        # so skip this step and subtract one layer position from the encoder
                        encoder_layer_pos -= 1
                        continue
                elif name not in encoder_modules:
                    continue
                elif depth > 500:
                    raise ValueError(
                        "Max depth of recursive function `tie_encoder_to_decoder` reached. It seems that there is a circular dependency between two or more `nn.Modules` of your model."
                    )
                else:
                    decoder_name = encoder_name = name
                tie_encoder_to_decoder_recursively(
                    decoder_modules[decoder_name],
                    encoder_modules[encoder_name],
                    module_name + "/" + name,
                    uninitialized_encoder_weights,
                    skip_key,
                    depth=depth + 1,
                )
                all_encoder_weights.remove(module_name + "/" + encoder_name)

            uninitialized_encoder_weights += list(all_encoder_weights)

    # tie weights recursively
    tie_encoder_to_decoder_recursively(decoder, encoder, base_model_prefix,
                                       uninitialized_encoder_weights, skip_key)


class GroupWiseLinear(nn.Module):
    # could be changed to:
    # output = torch.einsum('ijk,zjk->ij', x, self.W)
    # or output = torch.einsum('ijk,jk->ij', x, self.W[0])
    def __init__(self, num_class, hidden_dim, bias=True):
        super().__init__()
        self.num_class = num_class
        self.hidden_dim = hidden_dim
        self.bias = bias

        self.W = nn.Parameter(torch.Tensor(1, num_class, hidden_dim))
        if bias:
            self.b = nn.Parameter(torch.Tensor(1, num_class))
        self.reset_parameters()

    def reset_parameters(self):
        stdv = 1. / math.sqrt(self.W.size(2))
        for i in range(self.num_class):
            self.W[0][i].data.uniform_(-stdv, stdv)
        if self.bias:
            for i in range(self.num_class):
                self.b[0][i].data.uniform_(-stdv, stdv)

    def forward(self, x):
        # x: B,K,d
        x = (self.W * x).sum(-1)
        if self.bias:
            x = x + self.b
        return x


def init_tokenizer():
    tokenizer = BertTokenizer.from_pretrained('bert-base-uncased')
    # tokenizer = BertTokenizer.from_pretrained('/home/notebook/data/group/LowLevelLLM/LLM/bert-base-uncased', local_files_only=True)
    tokenizer.add_special_tokens({'bos_token': '[DEC]'})
    tokenizer.add_special_tokens({'additional_special_tokens': ['[ENC]']})
    tokenizer.enc_token_id = tokenizer.additional_special_tokens_ids[0]
    return tokenizer


def create_vit(vit,
               image_size,
               use_grad_checkpointing=False,
               ckpt_layer=0,
               drop_path_rate=0):

    assert vit in ['base', 'large'], "vit parameter must be base or large"
    if vit == 'base':
        vision_width = 768
        visual_encoder = VisionTransformer(
            img_size=image_size,
            patch_size=16,
            embed_dim=vision_width,
            depth=12,
            num_heads=12,
            use_grad_checkpointing=use_grad_checkpointing,
            ckpt_layer=ckpt_layer,
            drop_path_rate=0 or drop_path_rate)
    elif vit == 'large':
        vision_width = 1024
        visual_encoder = VisionTransformer(
            img_size=image_size,
            patch_size=16,
            embed_dim=vision_width,
            depth=24,
            num_heads=16,
            use_grad_checkpointing=use_grad_checkpointing,
            ckpt_layer=ckpt_layer,
            drop_path_rate=0.1 or drop_path_rate)
    return visual_encoder, vision_width


def is_url(url_or_filename):
    parsed = urlparse(url_or_filename)
    return parsed.scheme in ("http", "https")


def load_checkpoint(model, url_or_filename):
    if is_url(url_or_filename):
        cached_file = download_cached_file(url_or_filename,
                                           check_hash=False,
                                           progress=True)
        checkpoint = torch.load(cached_file, map_location='cpu')
    elif os.path.isfile(url_or_filename):
        checkpoint = torch.load(url_or_filename, map_location='cpu')
    else:
        raise RuntimeError('checkpoint url or path is invalid')

    state_dict = checkpoint['model']

    state_dict['visual_encoder.pos_embed'] = interpolate_pos_embed(
        state_dict['visual_encoder.pos_embed'], model.visual_encoder)
    if 'visual_encoder_m.pos_embed' in model.state_dict().keys():
        state_dict['visual_encoder_m.pos_embed'] = interpolate_pos_embed(
            state_dict['visual_encoder_m.pos_embed'], model.visual_encoder_m)
    for key in model.state_dict().keys():
        if key in state_dict.keys():
            if state_dict[key].shape != model.state_dict()[key].shape:
                del state_dict[key]

    msg = model.load_state_dict(state_dict, strict=False)
    print('load checkpoint from %s' % url_or_filename)
    return model, msg


# def load_checkpoint_condition(model, url_or_filename):
def load_checkpoint_swinlarge_condition(model, url_or_filename, kwargs):
    if kwargs['image_size'] == 224:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_224.json'
    elif kwargs['image_size'] == 384:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_384.json'
    window_size = read_json(vision_config_path)['window_size']
    print('--------------')
    print(url_or_filename)
    print('--------------')
    if is_url(url_or_filename):
        cached_file = download_cached_file(url_or_filename,
                                           check_hash=False,
                                           progress=True)
        checkpoint = torch.load(cached_file, map_location='cpu')
    elif os.path.isfile(url_or_filename):
        checkpoint = torch.load(url_or_filename, map_location='cpu')
    else:
        raise RuntimeError('checkpoint url or path is invalid')

    state_dict = checkpoint['params']

    for k in list(state_dict.keys()):
        if 'relative_position_bias_table' in k:
            dst_num_pos = (2 * window_size - 1)**2
            state_dict[k] = interpolate_relative_pos_embed(state_dict[k],
                                                           dst_num_pos,
                                                           param_name=k)
        elif ('relative_position_index' in k) or ('attn_mask' in k):
            del state_dict[k]
        elif "vision_multi" in k:
            state_dict[k.replace("vision_multi",
                                 "tagging_head")] = state_dict.pop(k)

    msg = model.load_state_dict(state_dict, strict=False)
    print('load checkpoint from %s' % url_or_filename)
    return model, msg


def load_checkpoint_swinbase(model, url_or_filename, kwargs):
    if kwargs['image_size'] == 224:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_224.json'
    elif kwargs['image_size'] == 384:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinB_384.json'
    window_size = read_json(vision_config_path)['window_size']
    print('--------------')
    print(url_or_filename)
    print('--------------')
    if is_url(url_or_filename):
        cached_file = download_cached_file(url_or_filename,
                                           check_hash=False,
                                           progress=True)
        checkpoint = torch.load(cached_file, map_location='cpu')
    elif os.path.isfile(url_or_filename):
        checkpoint = torch.load(url_or_filename, map_location='cpu')
    else:
        raise RuntimeError('checkpoint url or path is invalid')

    state_dict = checkpoint['model']

    for k in list(state_dict.keys()):
        if 'relative_position_bias_table' in k:
            dst_num_pos = (2 * window_size - 1)**2
            state_dict[k] = interpolate_relative_pos_embed(state_dict[k],
                                                           dst_num_pos,
                                                           param_name=k)
        elif ('relative_position_index' in k) or ('attn_mask' in k):
            del state_dict[k]
        elif "vision_multi" in k:
            state_dict[k.replace("vision_multi",
                                 "tagging_head")] = state_dict.pop(k)

    msg = model.load_state_dict(state_dict, strict=False)
    print('load checkpoint from %s' % url_or_filename)
    return model, msg


def load_checkpoint_swinlarge(model, url_or_filename, kwargs):
    if kwargs['image_size'] == 224:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_224.json'
    elif kwargs['image_size'] == 384:
        vision_config_path = f'{CONFIG_PATH}/configs/swin/config_swinL_384.json'
    window_size = read_json(vision_config_path)['window_size']
    print('--------------')
    print(url_or_filename)
    print('--------------')
    if is_url(url_or_filename):
        cached_file = download_cached_file(url_or_filename,
                                           check_hash=False,
                                           progress=True)
        checkpoint = torch.load(cached_file, map_location='cpu')
    elif os.path.isfile(url_or_filename):
        checkpoint = torch.load(url_or_filename, map_location='cpu')
    else:
        raise RuntimeError('checkpoint url or path is invalid')

    state_dict = checkpoint['model']

    for k in list(state_dict.keys()):
        if 'relative_position_bias_table' in k:
            dst_num_pos = (2 * window_size - 1)**2
            state_dict[k] = interpolate_relative_pos_embed(state_dict[k],
                                                           dst_num_pos,
                                                           param_name=k)
        elif ('relative_position_index' in k) or ('attn_mask' in k):
            del state_dict[k]
        elif "vision_multi" in k:
            state_dict[k.replace("vision_multi",
                                 "tagging_head")] = state_dict.pop(k)

    msg = model.load_state_dict(state_dict, strict=False)
    print('load checkpoint from %s' % url_or_filename)
    return model, msg


# Tagging loss function
# copied from https://github.com/Alibaba-MIIL/ASL/blob/main/src/loss_functions/losses.py
class AsymmetricLoss(nn.Module):
    def __init__(self, gamma_neg=4, gamma_pos=1, clip=0.05, eps=1e-8, disable_torch_grad_focal_loss=True):
        super(AsymmetricLoss, self).__init__()

        self.gamma_neg = gamma_neg
        self.gamma_pos = gamma_pos
        self.clip = clip
        self.disable_torch_grad_focal_loss = disable_torch_grad_focal_loss
        self.eps = eps

    def forward(self, x, y):
        """
        Parameters
        ----------
        x: input logits
        y: targets (multi-label binarized vector)
        """

        # Calculating Probabilities
        x_sigmoid = torch.sigmoid(x)
        xs_pos = x_sigmoid
        xs_neg = 1 - x_sigmoid

        # Asymmetric Clipping
        if self.clip is not None and self.clip > 0:
            xs_neg = (xs_neg + self.clip).clamp(max=1)

        # Basic CE calculation
        los_pos = y * torch.log(xs_pos.clamp(min=self.eps))
        los_neg = (1 - y) * torch.log(xs_neg.clamp(min=self.eps))
        loss = los_pos + los_neg

        # Asymmetric Focusing
        if self.gamma_neg > 0 or self.gamma_pos > 0:
            if self.disable_torch_grad_focal_loss:
                torch.set_grad_enabled(False)
            pt0 = xs_pos * y
            pt1 = xs_neg * (1 - y)  # pt = p if t > 0 else 1-p
            pt = pt0 + pt1
            one_sided_gamma = self.gamma_pos * y + self.gamma_neg * (1 - y)
            one_sided_w = torch.pow(1 - pt, one_sided_gamma)
            if self.disable_torch_grad_focal_loss:
                torch.set_grad_enabled(True)
            loss *= one_sided_w

        return -loss.sum()
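A quick sanity check of AsymmetricLoss on random multi-label data. The shapes mirror the tagging head (batch_size × num_class logits, with 3429 matching the length of data/tag_list.txt), and the gamma/clip values are the ones Tag2Text passes above; the sparsity threshold is an arbitrary toy value.

import torch
from ram.models.utils import AsymmetricLoss

criterion = AsymmetricLoss(gamma_neg=7, gamma_pos=0, clip=0.05)  # Tag2Text settings
logits = torch.randn(4, 3429, requires_grad=True)   # raw tagging logits
targets = (torch.rand(4, 3429) > 0.99).float()      # sparse positive labels
loss = criterion(logits, targets)
loss.backward()                                     # gradients flow despite the detached focusing weight
print(loss.item())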
ram/models/vit.py
ADDED
@@ -0,0 +1,305 @@
'''
 * Copyright (c) 2022, salesforce.com, inc.
 * All rights reserved.
 * SPDX-License-Identifier: BSD-3-Clause
 * For full license text, see LICENSE.txt file in the repo root or https://opensource.org/licenses/BSD-3-Clause
 * By Junnan Li
 * Based on timm code base
 * https://github.com/rwightman/pytorch-image-models/tree/master/timm
'''

import torch
import torch.nn as nn
import torch.nn.functional as F
from functools import partial

from timm.models.vision_transformer import _cfg, PatchEmbed
from timm.models.registry import register_model
from timm.models.layers import trunc_normal_, DropPath
from timm.models.helpers import named_apply, adapt_input_conv

from fairscale.nn.checkpoint.checkpoint_activations import checkpoint_wrapper


class Mlp(nn.Module):
    """ MLP as used in Vision Transformer, MLP-Mixer and related networks
    """
    def __init__(self, in_features, hidden_features=None, out_features=None, act_layer=nn.GELU, drop=0.):
        super().__init__()
        out_features = out_features or in_features
        hidden_features = hidden_features or in_features
        self.fc1 = nn.Linear(in_features, hidden_features)
        self.act = act_layer()
        self.fc2 = nn.Linear(hidden_features, out_features)
        self.drop = nn.Dropout(drop)

    def forward(self, x):
        x = self.fc1(x)
        x = self.act(x)
        x = self.drop(x)
        x = self.fc2(x)
        x = self.drop(x)
        return x


class Attention(nn.Module):
    def __init__(self, dim, num_heads=8, qkv_bias=False, qk_scale=None, attn_drop=0., proj_drop=0.):
        super().__init__()
        self.num_heads = num_heads
        head_dim = dim // num_heads
        # NOTE scale factor was wrong in my original version, can set manually to be compatible with prev weights
        self.scale = qk_scale or head_dim ** -0.5
        self.qkv = nn.Linear(dim, dim * 3, bias=qkv_bias)
        self.attn_drop = nn.Dropout(attn_drop)
        self.proj = nn.Linear(dim, dim)
        self.proj_drop = nn.Dropout(proj_drop)
        self.attn_gradients = None
        self.attention_map = None

    def save_attn_gradients(self, attn_gradients):
        self.attn_gradients = attn_gradients

    def get_attn_gradients(self):
        return self.attn_gradients

    def save_attention_map(self, attention_map):
        self.attention_map = attention_map

    def get_attention_map(self):
        return self.attention_map

    def forward(self, x, register_hook=False):
        B, N, C = x.shape
        qkv = self.qkv(x).reshape(B, N, 3, self.num_heads, C // self.num_heads).permute(2, 0, 3, 1, 4)
        q, k, v = qkv[0], qkv[1], qkv[2]  # make torchscript happy (cannot use tensor as tuple)

        attn = (q @ k.transpose(-2, -1)) * self.scale
        attn = attn.softmax(dim=-1)
        attn = self.attn_drop(attn)

        if register_hook:
            self.save_attention_map(attn)
            attn.register_hook(self.save_attn_gradients)

        x = (attn @ v).transpose(1, 2).reshape(B, N, C)
        x = self.proj(x)
        x = self.proj_drop(x)
        return x


class Block(nn.Module):

    def __init__(self, dim, num_heads, mlp_ratio=4., qkv_bias=False, qk_scale=None, drop=0., attn_drop=0.,
                 drop_path=0., act_layer=nn.GELU, norm_layer=nn.LayerNorm, use_grad_checkpointing=False):
        super().__init__()
        self.norm1 = norm_layer(dim)
        self.attn = Attention(
            dim, num_heads=num_heads, qkv_bias=qkv_bias, qk_scale=qk_scale, attn_drop=attn_drop, proj_drop=drop)
        # NOTE: drop path for stochastic depth, we shall see if this is better than dropout here
        self.drop_path = DropPath(drop_path) if drop_path > 0. else nn.Identity()
        self.norm2 = norm_layer(dim)
        mlp_hidden_dim = int(dim * mlp_ratio)
        self.mlp = Mlp(in_features=dim, hidden_features=mlp_hidden_dim, act_layer=act_layer, drop=drop)

        if use_grad_checkpointing:
            self.attn = checkpoint_wrapper(self.attn)
            self.mlp = checkpoint_wrapper(self.mlp)

    def forward(self, x, register_hook=False):
        x = x + self.drop_path(self.attn(self.norm1(x), register_hook=register_hook))
        x = x + self.drop_path(self.mlp(self.norm2(x)))
        return x


class VisionTransformer(nn.Module):
    """ Vision Transformer
    A PyTorch impl of : `An Image is Worth 16x16 Words: Transformers for Image Recognition at Scale` -
        https://arxiv.org/abs/2010.11929
    """
    def __init__(self, img_size=224, patch_size=16, in_chans=3, num_classes=1000, embed_dim=768, depth=12,
                 num_heads=12, mlp_ratio=4., qkv_bias=True, qk_scale=None, representation_size=None,
                 drop_rate=0., attn_drop_rate=0., drop_path_rate=0., norm_layer=None,
                 use_grad_checkpointing=False, ckpt_layer=0):
        """
        Args:
            img_size (int, tuple): input image size
            patch_size (int, tuple): patch size
            in_chans (int): number of input channels
            num_classes (int): number of classes for classification head
            embed_dim (int): embedding dimension
            depth (int): depth of transformer
            num_heads (int): number of attention heads
            mlp_ratio (int): ratio of mlp hidden dim to embedding dim
            qkv_bias (bool): enable bias for qkv if True
            qk_scale (float): override default qk scale of head_dim ** -0.5 if set
            representation_size (Optional[int]): enable and set representation layer (pre-logits) to this value if set
            drop_rate (float): dropout rate
            attn_drop_rate (float): attention dropout rate
            drop_path_rate (float): stochastic depth rate
            norm_layer: (nn.Module): normalization layer
        """
        super().__init__()
        self.num_features = self.embed_dim = embed_dim  # num_features for consistency with other models
        norm_layer = norm_layer or partial(nn.LayerNorm, eps=1e-6)

        self.patch_embed = PatchEmbed(
            img_size=img_size, patch_size=patch_size, in_chans=in_chans, embed_dim=embed_dim)

        num_patches = self.patch_embed.num_patches

        self.cls_token = nn.Parameter(torch.zeros(1, 1, embed_dim))
        self.pos_embed = nn.Parameter(torch.zeros(1, num_patches + 1, embed_dim))
        self.pos_drop = nn.Dropout(p=drop_rate)

        dpr = [x.item() for x in torch.linspace(0, drop_path_rate, depth)]  # stochastic depth decay rule
        self.blocks = nn.ModuleList([
            Block(
                dim=embed_dim, num_heads=num_heads, mlp_ratio=mlp_ratio, qkv_bias=qkv_bias, qk_scale=qk_scale,
                drop=drop_rate, attn_drop=attn_drop_rate, drop_path=dpr[i], norm_layer=norm_layer,
                use_grad_checkpointing=(use_grad_checkpointing and i >= depth - ckpt_layer)
            )
            for i in range(depth)])
        self.norm = norm_layer(embed_dim)

        trunc_normal_(self.pos_embed, std=.02)
        trunc_normal_(self.cls_token, std=.02)
        self.apply(self._init_weights)

    def _init_weights(self, m):
        if isinstance(m, nn.Linear):
            trunc_normal_(m.weight, std=.02)
            if isinstance(m, nn.Linear) and m.bias is not None:
                nn.init.constant_(m.bias, 0)
        elif isinstance(m, nn.LayerNorm):
            nn.init.constant_(m.bias, 0)
            nn.init.constant_(m.weight, 1.0)

    @torch.jit.ignore
    def no_weight_decay(self):
        return {'pos_embed', 'cls_token'}

    def forward(self, x, register_blk=-1):
        B = x.shape[0]
        x = self.patch_embed(x)

        cls_tokens = self.cls_token.expand(B, -1, -1)  # stole cls_tokens impl from Phil Wang, thanks
        x = torch.cat((cls_tokens, x), dim=1)

        x = x + self.pos_embed[:, :x.size(1), :]
        x = self.pos_drop(x)

        for i, blk in enumerate(self.blocks):
            x = blk(x, register_blk == i)
        x = self.norm(x)

        return x

    @torch.jit.ignore()
    def load_pretrained(self, checkpoint_path, prefix=''):
| 198 |
+
_load_weights(self, checkpoint_path, prefix)
|
| 199 |
+
|
| 200 |
+
|
| 201 |
+
@torch.no_grad()
|
| 202 |
+
def _load_weights(model: VisionTransformer, checkpoint_path: str, prefix: str = ''):
|
| 203 |
+
""" Load weights from .npz checkpoints for official Google Brain Flax implementation
|
| 204 |
+
"""
|
| 205 |
+
import numpy as np
|
| 206 |
+
|
| 207 |
+
def _n2p(w, t=True):
|
| 208 |
+
if w.ndim == 4 and w.shape[0] == w.shape[1] == w.shape[2] == 1:
|
| 209 |
+
w = w.flatten()
|
| 210 |
+
if t:
|
| 211 |
+
if w.ndim == 4:
|
| 212 |
+
w = w.transpose([3, 2, 0, 1])
|
| 213 |
+
elif w.ndim == 3:
|
| 214 |
+
w = w.transpose([2, 0, 1])
|
| 215 |
+
elif w.ndim == 2:
|
| 216 |
+
w = w.transpose([1, 0])
|
| 217 |
+
return torch.from_numpy(w)
|
| 218 |
+
|
| 219 |
+
w = np.load(checkpoint_path)
|
| 220 |
+
if not prefix and 'opt/target/embedding/kernel' in w:
|
| 221 |
+
prefix = 'opt/target/'
|
| 222 |
+
|
| 223 |
+
if hasattr(model.patch_embed, 'backbone'):
|
| 224 |
+
# hybrid
|
| 225 |
+
backbone = model.patch_embed.backbone
|
| 226 |
+
stem_only = not hasattr(backbone, 'stem')
|
| 227 |
+
stem = backbone if stem_only else backbone.stem
|
| 228 |
+
stem.conv.weight.copy_(adapt_input_conv(stem.conv.weight.shape[1], _n2p(w[f'{prefix}conv_root/kernel'])))
|
| 229 |
+
stem.norm.weight.copy_(_n2p(w[f'{prefix}gn_root/scale']))
|
| 230 |
+
stem.norm.bias.copy_(_n2p(w[f'{prefix}gn_root/bias']))
|
| 231 |
+
if not stem_only:
|
| 232 |
+
for i, stage in enumerate(backbone.stages):
|
| 233 |
+
for j, block in enumerate(stage.blocks):
|
| 234 |
+
bp = f'{prefix}block{i + 1}/unit{j + 1}/'
|
| 235 |
+
for r in range(3):
|
| 236 |
+
getattr(block, f'conv{r + 1}').weight.copy_(_n2p(w[f'{bp}conv{r + 1}/kernel']))
|
| 237 |
+
getattr(block, f'norm{r + 1}').weight.copy_(_n2p(w[f'{bp}gn{r + 1}/scale']))
|
| 238 |
+
getattr(block, f'norm{r + 1}').bias.copy_(_n2p(w[f'{bp}gn{r + 1}/bias']))
|
| 239 |
+
if block.downsample is not None:
|
| 240 |
+
block.downsample.conv.weight.copy_(_n2p(w[f'{bp}conv_proj/kernel']))
|
| 241 |
+
block.downsample.norm.weight.copy_(_n2p(w[f'{bp}gn_proj/scale']))
|
| 242 |
+
block.downsample.norm.bias.copy_(_n2p(w[f'{bp}gn_proj/bias']))
|
| 243 |
+
embed_conv_w = _n2p(w[f'{prefix}embedding/kernel'])
|
| 244 |
+
else:
|
| 245 |
+
embed_conv_w = adapt_input_conv(
|
| 246 |
+
model.patch_embed.proj.weight.shape[1], _n2p(w[f'{prefix}embedding/kernel']))
|
| 247 |
+
model.patch_embed.proj.weight.copy_(embed_conv_w)
|
| 248 |
+
model.patch_embed.proj.bias.copy_(_n2p(w[f'{prefix}embedding/bias']))
|
| 249 |
+
model.cls_token.copy_(_n2p(w[f'{prefix}cls'], t=False))
|
| 250 |
+
pos_embed_w = _n2p(w[f'{prefix}Transformer/posembed_input/pos_embedding'], t=False)
|
| 251 |
+
if pos_embed_w.shape != model.pos_embed.shape:
|
| 252 |
+
pos_embed_w = resize_pos_embed( # resize pos embedding when different size from pretrained weights
|
| 253 |
+
pos_embed_w, model.pos_embed, getattr(model, 'num_tokens', 1), model.patch_embed.grid_size)
|
| 254 |
+
model.pos_embed.copy_(pos_embed_w)
|
| 255 |
+
model.norm.weight.copy_(_n2p(w[f'{prefix}Transformer/encoder_norm/scale']))
|
| 256 |
+
model.norm.bias.copy_(_n2p(w[f'{prefix}Transformer/encoder_norm/bias']))
|
| 257 |
+
# if isinstance(model.head, nn.Linear) and model.head.bias.shape[0] == w[f'{prefix}head/bias'].shape[-1]:
|
| 258 |
+
# model.head.weight.copy_(_n2p(w[f'{prefix}head/kernel']))
|
| 259 |
+
# model.head.bias.copy_(_n2p(w[f'{prefix}head/bias']))
|
| 260 |
+
# if isinstance(getattr(model.pre_logits, 'fc', None), nn.Linear) and f'{prefix}pre_logits/bias' in w:
|
| 261 |
+
# model.pre_logits.fc.weight.copy_(_n2p(w[f'{prefix}pre_logits/kernel']))
|
| 262 |
+
# model.pre_logits.fc.bias.copy_(_n2p(w[f'{prefix}pre_logits/bias']))
|
| 263 |
+
for i, block in enumerate(model.blocks.children()):
|
| 264 |
+
block_prefix = f'{prefix}Transformer/encoderblock_{i}/'
|
| 265 |
+
mha_prefix = block_prefix + 'MultiHeadDotProductAttention_1/'
|
| 266 |
+
block.norm1.weight.copy_(_n2p(w[f'{block_prefix}LayerNorm_0/scale']))
|
| 267 |
+
block.norm1.bias.copy_(_n2p(w[f'{block_prefix}LayerNorm_0/bias']))
|
| 268 |
+
block.attn.qkv.weight.copy_(torch.cat([
|
| 269 |
+
_n2p(w[f'{mha_prefix}{n}/kernel'], t=False).flatten(1).T for n in ('query', 'key', 'value')]))
|
| 270 |
+
block.attn.qkv.bias.copy_(torch.cat([
|
| 271 |
+
_n2p(w[f'{mha_prefix}{n}/bias'], t=False).reshape(-1) for n in ('query', 'key', 'value')]))
|
| 272 |
+
block.attn.proj.weight.copy_(_n2p(w[f'{mha_prefix}out/kernel']).flatten(1))
|
| 273 |
+
block.attn.proj.bias.copy_(_n2p(w[f'{mha_prefix}out/bias']))
|
| 274 |
+
for r in range(2):
|
| 275 |
+
getattr(block.mlp, f'fc{r + 1}').weight.copy_(_n2p(w[f'{block_prefix}MlpBlock_3/Dense_{r}/kernel']))
|
| 276 |
+
getattr(block.mlp, f'fc{r + 1}').bias.copy_(_n2p(w[f'{block_prefix}MlpBlock_3/Dense_{r}/bias']))
|
| 277 |
+
block.norm2.weight.copy_(_n2p(w[f'{block_prefix}LayerNorm_2/scale']))
|
| 278 |
+
block.norm2.bias.copy_(_n2p(w[f'{block_prefix}LayerNorm_2/bias']))
|
| 279 |
+
|
| 280 |
+
|
| 281 |
+
def interpolate_pos_embed(pos_embed_checkpoint, visual_encoder):
|
| 282 |
+
# interpolate position embedding
|
| 283 |
+
embedding_size = pos_embed_checkpoint.shape[-1]
|
| 284 |
+
num_patches = visual_encoder.patch_embed.num_patches
|
| 285 |
+
num_extra_tokens = visual_encoder.pos_embed.shape[-2] - num_patches
|
| 286 |
+
# height (== width) for the checkpoint position embedding
|
| 287 |
+
orig_size = int((pos_embed_checkpoint.shape[-2] - num_extra_tokens) ** 0.5)
|
| 288 |
+
# height (== width) for the new position embedding
|
| 289 |
+
new_size = int(num_patches ** 0.5)
|
| 290 |
+
|
| 291 |
+
if orig_size!=new_size:
|
| 292 |
+
# class_token and dist_token are kept unchanged
|
| 293 |
+
extra_tokens = pos_embed_checkpoint[:, :num_extra_tokens]
|
| 294 |
+
# only the position tokens are interpolated
|
| 295 |
+
pos_tokens = pos_embed_checkpoint[:, num_extra_tokens:]
|
| 296 |
+
pos_tokens = pos_tokens.reshape(-1, orig_size, orig_size, embedding_size).permute(0, 3, 1, 2)
|
| 297 |
+
pos_tokens = torch.nn.functional.interpolate(
|
| 298 |
+
pos_tokens, size=(new_size, new_size), mode='bicubic', align_corners=False)
|
| 299 |
+
pos_tokens = pos_tokens.permute(0, 2, 3, 1).flatten(1, 2)
|
| 300 |
+
new_pos_embed = torch.cat((extra_tokens, pos_tokens), dim=1)
|
| 301 |
+
print('reshape position embedding from %d to %d'%(orig_size ** 2,new_size ** 2))
|
| 302 |
+
|
| 303 |
+
return new_pos_embed
|
| 304 |
+
else:
|
| 305 |
+
return pos_embed_checkpoint
|
ram/transform.py
ADDED
@@ -0,0 +1,13 @@
from torchvision.transforms import Normalize, Compose, Resize, ToTensor


def convert_to_rgb(image):
    return image.convert("RGB")

def get_transform(image_size=384):
    return Compose([
        convert_to_rgb,
        Resize((image_size, image_size)),
        ToTensor(),
        Normalize(mean=[0.485, 0.456, 0.406], std=[0.229, 0.224, 0.225])
    ])
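
For reference, a minimal sketch (illustrative only) of how get_transform is applied before tagging; samples/0064.png is one of the sample images bundled below:

from PIL import Image

transform = get_transform(image_size=384)
pixel_values = transform(Image.open("samples/0064.png")).unsqueeze(0)  # (1, 3, 384, 384) batch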
ram/utils/__init__.py
ADDED
@@ -0,0 +1,2 @@
from .metrics import get_mAP, get_PR
from .openset_utils import build_openset_label_embedding
ram/utils/metrics.py
ADDED
@@ -0,0 +1,102 @@
from typing import List, Tuple

import numpy as np
from numpy import ndarray


def get_mAP(
    preds: ndarray,
    gt_file: str,
    taglist: List[str]
) -> Tuple[float, ndarray]:
    assert preds.shape[1] == len(taglist)

    # When mapping categories from test datasets to our system, there might be
    # multiple vs one situation due to different semantic definitions of tags.
    # So there can be duplicate tags in `taglist`. This special case is taken
    # into account.
    tag2idxs = {}
    for idx, tag in enumerate(taglist):
        if tag not in tag2idxs:
            tag2idxs[tag] = []
        tag2idxs[tag].append(idx)

    # build targets
    targets = np.zeros_like(preds)
    with open(gt_file, "r") as f:
        lines = [line.strip("\n").split(",") for line in f.readlines()]
    assert len(lines) == targets.shape[0]
    for i, line in enumerate(lines):
        for tag in line[1:]:
            targets[i, tag2idxs[tag]] = 1.0

    # compute average precision for each class
    APs = np.zeros(preds.shape[1])
    for k in range(preds.shape[1]):
        APs[k] = _average_precision(preds[:, k], targets[:, k])

    return APs.mean(), APs


def _average_precision(output: ndarray, target: ndarray) -> float:
    epsilon = 1e-8

    # sort examples
    indices = output.argsort()[::-1]
    # Computes prec@i
    total_count_ = np.cumsum(np.ones((len(output), 1)))

    target_ = target[indices]
    ind = target_ == 1
    pos_count_ = np.cumsum(ind)
    total = pos_count_[-1]
    pos_count_[np.logical_not(ind)] = 0
    pp = pos_count_ / total_count_
    precision_at_i_ = np.sum(pp)
    precision_at_i = precision_at_i_ / (total + epsilon)

    return precision_at_i


def get_PR(
    pred_file: str,
    gt_file: str,
    taglist: List[str]
) -> Tuple[float, float, ndarray, ndarray]:
    # When mapping categories from test datasets to our system, there might be
    # multiple vs one situation due to different semantic definitions of tags.
    # So there can be duplicate tags in `taglist`. This special case is taken
    # into account.
    tag2idxs = {}
    for idx, tag in enumerate(taglist):
        if tag not in tag2idxs:
            tag2idxs[tag] = []
        tag2idxs[tag].append(idx)

    # build preds
    with open(pred_file, "r", encoding="utf-8") as f:
        lines = [line.strip().split(",") for line in f.readlines()]
    preds = np.zeros((len(lines), len(tag2idxs)), dtype=bool)
    for i, line in enumerate(lines):
        for tag in line[1:]:
            preds[i, tag2idxs[tag]] = True

    # build targets
    with open(gt_file, "r", encoding="utf-8") as f:
        lines = [line.strip().split(",") for line in f.readlines()]
    targets = np.zeros((len(lines), len(tag2idxs)), dtype=bool)
    for i, line in enumerate(lines):
        for tag in line[1:]:
            targets[i, tag2idxs[tag]] = True

    assert preds.shape == targets.shape

    # calculate P and R
    TPs = ( preds &  targets).sum(axis=0)  # noqa: E201, E222
    FPs = ( preds & ~targets).sum(axis=0)  # noqa: E201, E222
    FNs = (~preds &  targets).sum(axis=0)  # noqa: E201, E222
    eps = 1.e-9
    Ps = TPs / (TPs + FPs + eps)
    Rs = TPs / (TPs + FNs + eps)

    return Ps.mean(), Rs.mean(), Ps, Rs
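
A self-contained sketch (illustrative only) of the inputs get_mAP expects; "gt.csv" is a hypothetical ground-truth file whose rows are "image_id,tag1,tag2,...":

import numpy as np

# Two images, three candidate tags; "tree" never occurs in the ground truth.
with open("gt.csv", "w") as f:
    f.write("img0,cat\nimg1,cat,dog\n")

scores = np.array([[0.9, 0.2, 0.1],   # one row per image, one column per tag
                   [0.8, 0.7, 0.3]])
mAP, APs = get_mAP(scores, "gt.csv", taglist=["cat", "dog", "tree"])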
ram/utils/openset_utils.py
ADDED
@@ -0,0 +1,333 @@
import torch
import torch.nn as nn
from clip import clip


def article(name):
    return "an" if name[0] in "aeiou" else "a"


def processed_name(name, rm_dot=False):
    # _ for lvis
    # / for obj365
    res = name.replace("_", " ").replace("/", " or ").lower()
    if rm_dot:
        res = res.rstrip(".")
    return res


single_template = ["a photo of a {}."]

multiple_templates = [
    "There is {article} {} in the scene.",
    "There is the {} in the scene.",
    "a photo of {article} {} in the scene.",
    "a photo of the {} in the scene.",
    "a photo of one {} in the scene.",
    "itap of {article} {}.",
    "itap of my {}.",  # itap: I took a picture of
    "itap of the {}.",
    "a photo of {article} {}.",
    "a photo of my {}.",
    "a photo of the {}.",
    "a photo of one {}.",
    "a photo of many {}.",
    "a good photo of {article} {}.",
    "a good photo of the {}.",
    "a bad photo of {article} {}.",
    "a bad photo of the {}.",
    "a photo of a nice {}.",
    "a photo of the nice {}.",
    "a photo of a cool {}.",
    "a photo of the cool {}.",
    "a photo of a weird {}.",
    "a photo of the weird {}.",
    "a photo of a small {}.",
    "a photo of the small {}.",
    "a photo of a large {}.",
    "a photo of the large {}.",
    "a photo of a clean {}.",
    "a photo of the clean {}.",
    "a photo of a dirty {}.",
    "a photo of the dirty {}.",
    "a bright photo of {article} {}.",
    "a bright photo of the {}.",
    "a dark photo of {article} {}.",
    "a dark photo of the {}.",
    "a photo of a hard to see {}.",
    "a photo of the hard to see {}.",
    "a low resolution photo of {article} {}.",
    "a low resolution photo of the {}.",
    "a cropped photo of {article} {}.",
    "a cropped photo of the {}.",
    "a close-up photo of {article} {}.",
    "a close-up photo of the {}.",
    "a jpeg corrupted photo of {article} {}.",
    "a jpeg corrupted photo of the {}.",
    "a blurry photo of {article} {}.",
    "a blurry photo of the {}.",
    "a pixelated photo of {article} {}.",
    "a pixelated photo of the {}.",
    "a black and white photo of the {}.",
    "a black and white photo of {article} {}.",
    "a plastic {}.",
    "the plastic {}.",
    "a toy {}.",
    "the toy {}.",
    "a plushie {}.",
    "the plushie {}.",
    "a cartoon {}.",
    "the cartoon {}.",
    "an embroidered {}.",
    "the embroidered {}.",
    "a painting of the {}.",
    "a painting of a {}.",
]


openimages_rare_unseen = ['Aerial photography',
                          'Aircraft engine',
                          'Ale',
                          'Aloe',
                          'Amphibian',
                          'Angling',
                          'Anole',
                          'Antique car',
                          'Arcade game',
                          'Arthropod',
                          'Assault rifle',
                          'Athletic shoe',
                          'Auto racing',
                          'Backlighting',
                          'Bagpipes',
                          'Ball game',
                          'Barbecue chicken',
                          'Barechested',
                          'Barquentine',
                          'Beef tenderloin',
                          'Billiard room',
                          'Billiards',
                          'Bird of prey',
                          'Black swan',
                          'Black-and-white',
                          'Blond',
                          'Boating',
                          'Bonbon',
                          'Bottled water',
                          'Bouldering',
                          'Bovine',
                          'Bratwurst',
                          'Breadboard',
                          'Briefs',
                          'Brisket',
                          'Brochette',
                          'Calabaza',
                          'Camera operator',
                          'Canola',
                          'Childbirth',
                          'Chordophone',
                          'Church bell',
                          'Classical sculpture',
                          'Close-up',
                          'Cobblestone',
                          'Coca-cola',
                          'Combat sport',
                          'Comics',
                          'Compact car',
                          'Computer speaker',
                          'Cookies and crackers',
                          'Coral reef fish',
                          'Corn on the cob',
                          'Cosmetics',
                          'Crocodilia',
                          'Digital camera',
                          'Dishware',
                          'Divemaster',
                          'Dobermann',
                          'Dog walking',
                          'Domestic rabbit',
                          'Domestic short-haired cat',
                          'Double-decker bus',
                          'Drums',
                          'Electric guitar',
                          'Electric piano',
                          'Electronic instrument',
                          'Equestrianism',
                          'Equitation',
                          'Erinaceidae',
                          'Extreme sport',
                          'Falafel',
                          'Figure skating',
                          'Filling station',
                          'Fire apparatus',
                          'Firearm',
                          'Flatbread',
                          'Floristry',
                          'Forklift truck',
                          'Freight transport',
                          'Fried food',
                          'Fried noodles',
                          'Frigate',
                          'Frozen yogurt',
                          'Frying',
                          'Full moon',
                          'Galleon',
                          'Glacial landform',
                          'Gliding',
                          'Go-kart',
                          'Goats',
                          'Grappling',
                          'Great white shark',
                          'Gumbo',
                          'Gun turret',
                          'Hair coloring',
                          'Halter',
                          'Headphones',
                          'Heavy cruiser',
                          'Herding',
                          'High-speed rail',
                          'Holding hands',
                          'Horse and buggy',
                          'Horse racing',
                          'Hound',
                          'Hunting knife',
                          'Hurdling',
                          'Inflatable',
                          'Jackfruit',
                          'Jeans',
                          'Jiaozi',
                          'Junk food',
                          'Khinkali',
                          'Kitesurfing',
                          'Lawn game',
                          'Leaf vegetable',
                          'Lechon',
                          'Lifebuoy',
                          'Locust',
                          'Lumpia',
                          'Luxury vehicle',
                          'Machine tool',
                          'Medical imaging',
                          'Melee weapon',
                          'Microcontroller',
                          'Middle ages',
                          'Military person',
                          'Military vehicle',
                          'Milky way',
                          'Miniature Poodle',
                          'Modern dance',
                          'Molluscs',
                          'Monoplane',
                          'Motorcycling',
                          'Musical theatre',
                          'Narcissus',
                          'Nest box',
                          'Newsagent\'s shop',
                          'Nile crocodile',
                          'Nordic skiing',
                          'Nuclear power plant',
                          'Orator',
                          'Outdoor shoe',
                          'Parachuting',
                          'Pasta salad',
                          'Peafowl',
                          'Pelmeni',
                          'Perching bird',
                          'Performance car',
                          'Personal water craft',
                          'Pit bull',
                          'Plant stem',
                          'Pork chop',
                          'Portrait photography',
                          'Primate',
                          'Procyonidae',
                          'Prosciutto',
                          'Public speaking',
                          'Racewalking',
                          'Ramen',
                          'Rear-view mirror',
                          'Residential area',
                          'Ribs',
                          'Rice ball',
                          'Road cycling',
                          'Roller skating',
                          'Roman temple',
                          'Rowing',
                          'Rural area',
                          'Sailboat racing',
                          'Scaled reptile',
                          'Scuba diving',
                          'Senior citizen',
                          'Shallot',
                          'Shinto shrine',
                          'Shooting range',
                          'Siberian husky',
                          'Sledding',
                          'Soba',
                          'Solar energy',
                          'Sport climbing',
                          'Sport utility vehicle',
                          'Steamed rice',
                          'Stemware',
                          'Sumo',
                          'Surfing Equipment',
                          'Team sport',
                          'Touring car',
                          'Toy block',
                          'Trampolining',
                          'Underwater diving',
                          'Vegetarian food',
                          'Wallaby',
                          'Water polo',
                          'Watercolor paint',
                          'Whiskers',
                          'Wind wave',
                          'Woodwind instrument',
                          'Yakitori',
                          'Zeppelin']


def build_openset_label_embedding(categories=None):
    if categories is None:
        categories = openimages_rare_unseen
    # model, _ = clip.load("ViT-B/16")
    model, _ = clip.load("ViT-B-16.pt")
    templates = multiple_templates

    run_on_gpu = torch.cuda.is_available()

    with torch.no_grad():
        openset_label_embedding = []
        for category in categories:
            texts = [
                template.format(
                    processed_name(category, rm_dot=True), article=article(category)
                )
                for template in templates
            ]
            texts = [
                "This is " + text if text.startswith("a") or text.startswith("the") else text
                for text in texts
            ]
            texts = clip.tokenize(texts)  # tokenize
            if run_on_gpu:
                texts = texts.cuda()
                model = model.cuda()
            text_embeddings = model.encode_text(texts)
            text_embeddings /= text_embeddings.norm(dim=-1, keepdim=True)
            text_embedding = text_embeddings.mean(dim=0)
            text_embedding /= text_embedding.norm()
            openset_label_embedding.append(text_embedding)
        openset_label_embedding = torch.stack(openset_label_embedding, dim=1)
        if run_on_gpu:
            openset_label_embedding = openset_label_embedding.cuda()

    openset_label_embedding = openset_label_embedding.t()
    return openset_label_embedding, categories
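
Usage sketch (illustrative only; it requires the local "ViT-B-16.pt" CLIP checkpoint that clip.load reads above):

embeddings, cats = build_openset_label_embedding(["espresso machine", "tram"])
print(embeddings.shape)  # (2, 512) for CLIP ViT-B/16: one prompt-ensembled, normalized row per category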
requirements.txt
ADDED
@@ -0,0 +1,52 @@
accelerate==1.4.0
certifi==2025.1.31
charset-normalizer==3.4.1
diffusers==0.32.1
fairscale==0.4.13
filelock==3.13.1
fsspec==2024.6.1
huggingface-hub==0.29.3
idna==3.10
importlib_metadata==8.6.1
Jinja2==3.1.4
loralib==0.1.2
lpips==0.1.4
MarkupSafe==2.1.5
mpmath==1.3.0
munch==4.0.0
networkx==3.3
numpy==2.1.2
nvidia-cublas-cu11==11.11.3.6
nvidia-cuda-cupti-cu11==11.8.87
nvidia-cuda-nvrtc-cu11==11.8.89
nvidia-cuda-runtime-cu11==11.8.89
nvidia-cudnn-cu11==9.1.0.70
nvidia-cufft-cu11==10.9.0.58
nvidia-curand-cu11==10.3.0.86
nvidia-cusolver-cu11==11.4.1.48
nvidia-cusparse-cu11==11.7.5.86
nvidia-nccl-cu11==2.20.5
nvidia-nvtx-cu11==11.8.86
packaging==24.2
peft==0.15.2
pillow==11.1.0
protobuf==6.30.0
psutil==7.0.0
PyYAML==6.0.2
qwen-vl-utils[decord]==0.0.8
regex==2024.11.6
requests==2.32.3
safetensors==0.5.3
scipy==1.15.2
sentencepiece==0.2.0
sympy==1.13.1
timm==1.0.15
tokenizers==0.21.0
torch==2.4.1
torchvision==0.19.1
tqdm==4.67.1
transformers==4.49.0
triton==3.0.0
typing_extensions==4.12.2
urllib3==2.3.0
zipp==3.21.0
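
These pins target torch 2.4.1 with CUDA 11.8 wheels (the nvidia-*-cu11 packages); installing them into a fresh environment with `pip install -r requirements.txt` is the usual route.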
samples/0064.png
ADDED
Git LFS Details
samples/0245.png
ADDED
Git LFS Details
samples/0393.png
ADDED
Git LFS Details
samples/0457.png
ADDED
Git LFS Details
samples/0479.png
ADDED
Git LFS Details
scripts/inference/inference_coz_dapeprompt.sh
ADDED
@@ -0,0 +1,17 @@
#----------------- CoZ with DAPE Prompts -----------------#
# REQUIRED ENVIRONMENT: coz

INPUT_FOLDER="samples"
OUTPUT_FOLDER="inference_results/coz_dapeprompt"

CUDA_VISIBLE_DEVICES=0,1 python inference_coz.py \
    -i $INPUT_FOLDER \
    -o $OUTPUT_FOLDER \
    --rec_type recursive \
    --prompt_type dape \
    --lora_path ckpt/SR_LoRA/model_20001.pkl \
    --vae_path ckpt/SR_VAE/vae_encoder_20001.pt \
    --pretrained_model_name_or_path 'stabilityai/stable-diffusion-3-medium-diffusers' \
    --ram_ft_path ckpt/DAPE/DAPE.pth \
    --ram_path ckpt/RAM/ram_swin_large_14m.pth
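
Both inference scripts are meant to be launched from the repository root, e.g. `bash scripts/inference/inference_coz_dapeprompt.sh`. This DAPE variant needs the RAM checkpoints (`--ram_path`, `--ram_ft_path`) for prompt extraction; the null-prompt variant below omits them and passes `--prompt_type null`.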
scripts/inference/inference_coz_nullprompt.sh
ADDED
@@ -0,0 +1,15 @@
#----------------- CoZ with Null Prompts -----------------#
# REQUIRED ENVIRONMENT: coz

INPUT_FOLDER="samples"
OUTPUT_FOLDER="inference_results/coz_nullprompt"

CUDA_VISIBLE_DEVICES=0,1 python inference_coz.py \
    -i $INPUT_FOLDER \
    -o $OUTPUT_FOLDER \
    --rec_type recursive \
    --prompt_type null \
    --lora_path ckpt/SR_LoRA/model_20001.pkl \
    --vae_path ckpt/SR_VAE/vae_encoder_20001.pt \
    --pretrained_model_name_or_path 'stabilityai/stable-diffusion-3-medium-diffusers'